Evaluating Our Evaluation – Data At Work


How can we develop the evidence base to guide future action? Our workshop session is kicked off by moderator Rob Thomas, with participants working in smaller round-table groups.


Rob asks: where do we want to go? What does the future look like for evidence-based science communication?

As group discussion starts, our table begins at the basics. What is evaluation? Why are we evaluating? We agree that there are different types of evaluation for different purposes.

Evaluation is often treated as an obligation or an afterthought: people focus on what their funder wants to see, proving that what they did was successful, rather than planning how to deliver better to their audience in the future. There is a perception that funders want reports on numbers, which are easier to obtain, but others also want qualitative stories, which require more resources and skills to gather.


After some debate and mind map doodling, the room comes together to identify the key impediments:

  • Knowing what we are trying to achieve through evaluating and having defined objectives and outcomes.
  • Knowing what we are measuring and how we will do this.
  • Having the resources to evaluate (a point the group strongly emphasised).
  • Lack of standards for evaluation and quality metrics. What is good vs. bad evaluation?

We soon realise that brainstorming impediments is much easier than identifying workable solutions. In the second part of the workshop, participants change tables and discuss solutions to the impediments.

Key solutions from around the room:

Need for defined objectives and outcomes

  • Be explicit about why you are evaluating and choose appropriate evaluation tools.
  • Define objectives at the start of the program.
  • Recognise multiple objectives and multiple perspectives.

Need for standards

  • Develop a set of evaluation standards, guidelines or a checklist that is easily available.
  • Look at other evaluation standards: what can be adapted?
  • Have a conversation about standards and best practice.

Need to identify needs and interests for scientific engagement

  • Collaborate with other disciplines (easier said than done?).
  • Identify data that exists about audiences and identify gaps.
  • Engage stakeholders in a conversation.

Lack of resources

  • Ongoing recognition of science communication as a research area.
  • Ongoing funding (also supported by funding applications).

Participants voted on their favourite solutions, and only the fittest will survive. More to come when the top solutions are collated this afternoon.


