“Workshop feedback? I was facilitating, I know how it went!”… said no smart workshopper, ever.
We talk a lot about putting data and outcomes before gut feelings and hunches, and there’s no better time to put that into practice than when you’ve just run a workshop!
Not only does positive workshop feedback help you understand the best ways to get results from your team sessions, but negative (and constructive) feedback is one of the most powerful tools you have for improvement.
It can be as simple as ‘just’ sending a survey, or emailing a couple of questions to attendees, but there’s a lot to think about. Follow these steps to make sure you’re getting the most out of your workshop feedback.
The benefits of workshop feedback
When you ask participants for feedback, you’re letting them know that their time and contributions are valuable. You’re also learning how to get the most value out of their attendance possible.
Often, and particularly if the feedback is anonymous, you’ll discover that there was plenty that individuals didn’t share in the session that could be useful: ideas they were too shy to put out there, or a thought they had after mulling things over.
And if you’re running repeated sessions (project check-ins or retros) then you can start to build a useful set of data about what works, what doesn’t and how your own facilitation improves over time.
How to gather workshop feedback
There are lots of options for gathering feedback from your attendees. Here are some of the key questions you’ll need to answer before getting started and the pros and cons of each of the possible choices.
What is the most important data for you to collect?
If you set a clear outcomes-based goal at the beginning of the workshop, you’ll know for yourself whether or not this has been achieved. But if you were hoping to improve communication or team cohesion, better understand your customers and their journey or something else that’s tricky to measure, this is a great place to start.
For yourself, as a facilitator, it’s also useful to find out things like:
- Was everyone who attended able to add value?
- Did everyone who attended receive or contribute enough value to make it worth their time?
- Did your attendees perceive it to be valuable, even if you know that it technically was (or wasn’t)?
It’s also important to think of the feedback as the data that it is. That will help you decide what format to receive, store and analyse it in. For example, if you run this session every month (or three months, or year), you’ll know you need to keep at least some of the questions the same, so that you can spot trends. If one of your performance objectives is around facilitation, you’ll want data to back up your achievements in this area.
And if it’s just a one-off, such as a brainstorming workshop, you’ll know you can have more fun with the feedback you gather!
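If you do run repeated sessions, even a tiny script can surface the trends mentioned above. Here’s a minimal sketch (with entirely hypothetical session dates and scores) that averages responses to one repeated survey question, assuming a 1–5 scale where 5 means “very useful”:

```python
# Minimal sketch (hypothetical data): tracking one repeated survey
# question across sessions so trends are easy to spot.
# Assumes a 1-5 scale where 5 = "very useful".

from statistics import mean

# Each session's responses to the same question,
# e.g. "How valuable did you personally find this session?"
responses = {
    "2024-01": [3, 4, 3, 2, 4],
    "2024-02": [4, 4, 3, 4, 5],
    "2024-03": [4, 5, 5, 4, 5],
}

def session_averages(responses):
    """Return the mean score per session, keyed by session."""
    return {session: round(mean(scores), 2)
            for session, scores in responses.items()}

averages = session_averages(responses)
for session, avg in averages.items():
    print(f"{session}: {avg}")
```

The key design point is the one made above: the question wording and scale must stay identical between sessions, otherwise the averages aren’t comparable and the trend is meaningless.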
Workshop feedback survey, or something else?
Surveys aren’t the only way to gather feedback – although they are incredibly powerful. Some other options include:
- A personal email asking for attendees’ thoughts after the session.
- An anonymous ‘comments’ box for people to place their feedback in at the end of the session (if in person).
- Building a ROTI (return on time invested) activity into the end of your session.
But surveys – well, they have so many benefits. Here are some of the reasons we think they’re often the best approach:
- They can be completely anonymous, so people are more likely to say what they really think.
- They give people the opportunity to take some time to reflect before filling them in.
- For recurring meetings, they can be duplicated to allow accurate comparison of data between sessions.
- They’re quicker to set up and analyse the results from than comments in emails or in a suggestion box.
- Many survey providers offer a free option, and often include a tonne of useful advice about survey structure and questions.
Whatever process you choose, you’ll likely ask some form of question (even if you use an anonymous suggestion box) to guide your participants towards providing something useful. Here’s how to ask the right questions in the right way.
Encouraging answers (and honest ones!) to your workshop feedback questions
A quick note here on how to improve your chances of getting lots of responses that include good information.
Send the survey as part of an email that thanks people for participating, and let them know how long the survey will take. And, of course, keep it as short as possible to get what you need without boring your respondents.
If at all possible, keep the survey anonymous and state that it is in your first contact.
A simple message works well: “Thank you for taking part today, it was so valuable to have you there. Please take this two-question survey about how you feel it went; it will take approximately one minute and really helps me improve these sessions.”
Best workshop feedback question formats
The problem with asking people for their input is that, often, there will be plenty that they want to say, but not all of it will be useful. Tailor your workshop feedback questions to draw out the information you actually need instead. We’ll come to the questions themselves later, but here are some question-and-answer formats to consider, and how to make the best use of each one.
- Simple question with an either/or answer. For these, make doubly sure that the possible answers really are mutually exclusive and that there are no other possible answers. There’s nothing more frustrating than being asked to choose between two options when the answer is something in the middle, or not represented at all.
- Multiple choice, select one. As with above, unless you include an ‘Other’ answer (and are fully prepared for most people to use it), you’ll need to make sure the answers are exhaustive and mutually exclusive.
- Multiple choice, select X number/as many as apply. These questions, especially with an ‘Other’ option, are really useful. Especially if you say you’re only going to ask one or two questions! For example: “What did you find useful about the session?” with options like: the background information provided; the expert introduction; the format of the activities; the voting section; other (please specify).
- Some sort of ‘scale’. Likert scales, for example, are very popular. This is useful where the answer is not either/or, but likely sits on a spectrum. For example, “How valuable did you personally find this session?” with a scale of ‘not useful at all’ to ‘very useful’.
- Ranking questions. For example, “Rank the sections in order of how useful you found them”. These can be frustrating to fill in, both practically and if you found two or more items equally useful.
Try to use neutral language, as leading questions will render your feedback far less useful. If you wouldn’t ask “How frustrating did you find the session?” then don’t ask how ‘great’ they thought it was, either!
Think very carefully about what you will do with the feedback. For example, if people rank the different activities in order of usefulness, what will your action be if something consistently comes out last? Swap it for something else?
For example, if people say the ice-breaker wasn’t useful, you could try a new one. But it may just be that the rest of the session was very practically applicable to their roles and… well, something had to go last! And if there’s nothing you can realistically do in response to the answers, don’t waste a question on asking.
And we’d always add a free text box to the end with an open-ended question like “Is there anything else you’d like to say about the session or project?” – this is where you’ll get some real eye-openers!
Best workshop feedback questions
While leading questions are not best practice, you can start with a positive to help ease people in and make them more comfortable with sharing more constructive thoughts later.
What did you most enjoy about the workshop session?
Leave this as a blank text box, or add a ‘Choose one from the list’ option and be sure to include ‘Other, please specify’.
How would you rate each of the following elements of the workshop?
Give enough of a prompt to jog their memory instead of just the name of the activity, and include descriptions of what the scales mean (not just numbers). Stick to low numbers for poor outcomes; if you must have them the other way around, use high numbers for poor outcomes in every question. Mixing the two directions means people will forget which way the scale runs and give feedback that is the opposite of how they feel!
Some examples:

How would you rate the icebreaker exercise (draw your job)?
- It made me feel uncomfortable about engaging with the workshop and other attendees.
- It made me feel slightly uncomfortable about engaging with the workshop and other attendees.
- It did not affect how comfortable I was about engaging with the workshop and other attendees.
- It made me feel slightly more comfortable about engaging with the workshop and other attendees.
- It made me feel very comfortable about engaging with the workshop and other attendees.
Notice that this wasn’t about how much they enjoyed it or whether they thought it was ‘good’; the aim of the icebreaker is to help people feel comfortable about taking part in the workshop, and this is how you find out whether it did that or not. Keep the wording of each scale option very similar so there’s no confusion about how uniform the scale is.
How would you rate the idea generation session (e.g., Crazy Eights)?
- It made it much harder for me to come up with ideas.
- It neither helped nor hindered my ability to come up with ideas.
- It made it easier for me to come up with ideas.
You don’t need to include all of the sections. For example, if you ran Who, What, When to get people to commit to a list of actions, you will already know if it was effective because you’ll have a list of names against actions (or not). But if you want to, and your survey is not already too long, you can ask an open-ended question here.
What did you think of the Who, What, When exercise to assign people to actions?
Next, ask people about their own personal contributions. Keep these simple and use a free text box, as follows.
Do you feel that you made a valuable contribution to the workshop?
Add your answer, and include any other information that you’d like to share.
Do you feel that you received value from attending the workshop?
Add your answer, and include any other information that you’d like to share.
This helps you work out if you invited the right people, and if you made good use of their time.
Now that people are warmed up, this is the bit that will help you to make improvements next time you run a workshop.
Is there anything you think could have been done differently to improve the workshop?
- Yes (please specify)
It’s so tempting to add ‘No, it was great!’, but that implies that if you click ‘Yes’ it means you did not think it was great, and people won’t do that very often unless it was really not great. It is possible to think it was a great workshop and still have constructive criticism. In fact, that’s the best possible outcome!
Is there anything that surprised you about the session?
This is a good way to find out what people’s expectations were, and whether you met them or delivered something completely different. If there were any surprises, think about whether that element of surprise was helpful, and – if not – how you can prepare people more effectively next time round.
And finally, it’s always good to add an optional “Is there anything else you’d like to say about the workshop? Remember, this survey is completely anonymous.” question. This allows people to share that little nugget of information they’ve been holding onto that could help you transform your practice.
For example “I found it really hard to concentrate for two hours with only a short break. I’d have loved to do this as two sessions either side of lunch, or on different days.” or “It would have been really valuable to invite the interns to this session too.”
If you can nail asking for the right feedback – and acting upon it – the incremental improvements to your facilitation could see you at the top of your game in no time.