If we came up with a list of 'what we (think we) do well' as an organization, somewhere near the top of the list would be designing interactive learning experiences and workshops. We love to pay attention to all the details – how people are welcomed into the space, the use of design and 'tactiles', the importance of changing body positions, mixing up divergent and convergent thinking, the creative use of case studies and more. So, when the restrictions on gathering due to COVID-19 became apparent, we thought we would attempt to recreate this (as much as possible) in an online context, using Zoom.
We picked our Impact Measurement workshop – the Measurement Master Class – which we were used to delivering as part of our Make Good accelerator course for social entrepreneurs. Normally, there are about 15-20 participants, seated in groups of four around tables with large, designed interactive sheets that have graphics, text and illustrations based on the major points of the class. This is cluster-based learning. Whilst there is an overall arc from the beginning to the end of the workshop, it is not linear. Participants have pens and sticky notes, play a game, discuss in small groups and as a large group and, generally, have a fun and meaningful experience. They leave the workshop with increased understanding, some measurement tricks and tools, and what they need to get started in measuring the impact of their project.

Our plan was to stretch the possibilities of Zoom to its limits in terms of interactivity. We turned the large brown sheets into slides for a shared screen, intended to use the annotation tools for people to fill in the slides, draw on them and make notes, designed a poll for the beginning and end of the workshop, and planned breakout rooms for all the discussion. We asked people to have their own sticky notes at hand. We worked with Learning Forte – a digital learning agency – to facilitate Zoom for the workshop. We put the word out that we would test this new approach and had enough participants for two tests. We practiced. We prepared. Look out digital learning, RootedGood is coming! (Or so we thought…)
It started quite well. Of course, there were the awkward introductions. With no discernible way to give order to people introducing themselves (apart from – 'Ben, please tell us your name…') there is the inevitable silence and then that moment where three-people-speak-at-once-and-all-are-then-too-polite-to-speak-next. Then the polling didn't work. We had a few co-hosts who could manage the workshop, and one of us closed the poll early before others had voted. Then, we couldn't see the results. Never mind – we ran the poll again and chatted about the results. The initial content was fine, and then it came to the first breakout room. We had carefully selected the groups and looked forward to the moment where they would discuss and we would have the chance to reflect on how the workshop was going.

Unfortunately, for some reason, the breakout rooms did not work. Participants found themselves:
We had to move on. Then we tried again for the second planned breakout room, and it failed again. We had to pivot and cut the three remaining breakout rooms – and, with them, the vast majority of the discussion. The group was too large for a conversation involving all the participants, so, as the main presenters, we had to talk far more than we had planned. So much for conversation.
We had no back-up plan. The interactive, discursive pedagogy we value quickly turned into a webinar! The shock. The horror. In trying to fix the problem, we lost time. We rushed some of the sections, and some of the content felt clunky and less suitable in this new format. A lot of this could have been mitigated by a back-up plan for the breakout rooms – the digital equivalent of having the caterer's number on speed dial in case there is an issue with lunch.

The workshop ended. We thanked the participants (and apologized again). The Zoom call finished with that awkward moment where everyone says 'goodbye' then stares at the screen as the participants find their way to the 'leave' option.
We probably apologized too much. We acted as if any technological challenge was necessarily a complete failure on our part. We had unrealistic expectations. An analogy might be helpful. I used to live and work in London (the UK version, not the Ontario one). 90% of people who turn up for a meeting in Central London use public transport to get there (or they used to!). Public transport is always incredible in theory, tolerable in practice, and a nightmare on occasion. If you turn up late to a meeting and mention the delayed train or broken-down bus, there is no condemnation, just knowing and empathetic replies. These things just happen.
We couldn't pull out of the second test, though we were tempted. We investigated the problems with Zoom and were (somewhat) confident the technology would work better this time. It did. The polling, the annotation, the breakout rooms – they all worked! We revised the content a little, didn't feel too rushed and got some really good feedback. Participants enjoyed the experience and there was a remarkable increase in the NPS (Net Promoter Score). After the first test, we asked the participants how likely they would be to recommend the Measurement Master Class. The NPS was -15. Not great. After the second test, the NPS was 82. A really significant increase, and almost entirely due to the technology working more smoothly. Clearly, the second test was much improved and people enjoyed it. But did it work as well as we hoped?
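For readers unfamiliar with how those NPS figures are derived, here is a quick sketch of the standard calculation: respondents rate likelihood-to-recommend on a 0-10 scale, and the score is the percentage of promoters (9-10) minus the percentage of detractors (0-6). The scores below are illustrative only, not our actual survey data.

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 survey responses.

    Promoters score 9-10, detractors 0-6 (7-8 are passives and are
    ignored). The result ranges from -100 to 100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical example: 20 responses with 4 promoters, 7 detractors
# and 9 passives yield an NPS of -15.
sample = [9, 9, 10, 10, 6, 5, 4, 3, 2, 1, 0, 7, 8, 7, 8, 7, 8, 7, 8, 7]
print(nps(sample))  # → -15
```

Because detractors are subtracted, a small shift in participant sentiment can swing the score dramatically – which is why a move from -15 to 82 between two runs of the same workshop is so striking.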
Whilst we knew it would be a challenge, we didn't appreciate the extent to which a Zoom call is a qualitatively different space from a meeting room or coffee shop. Not everything that can happen in a room can happen on a call. Conversely, there are limitations to being in a meeting room that are not relevant to a Zoom call. Crucially, there are documented reasons why you feel different after a Zoom call than after a similar conversation in person [https://www.bbc.com/worklife/article/20200421-why-zoom-video-chats-are-so-exhausting]. The possibilities and limitations are different; the learning styles and required approaches are different. The length of the workshop needs to change. We learned that the content will come across in a more linear fashion if you use PowerPoint slides – so it needs to be designed that way. Additionally, participants will be more distracted and harder to engage. However, they will likely be more relaxed and have lower expectations. Right now, of course, people are also available. Our starting challenge was about how we take our current measurement workshop – the content and pedagogy – and put it online. In light of what we have learned, we have redefined this challenge: how do we design an online workshop to share what we have learned about impact measurement, and what content and pedagogy are needed to do this well?
Our primary observation is that in the rush to 'go digital', we naively assumed that the technology would be 'neutral' – that if we were creative enough, the same content and pedagogy would achieve the same results. However, as Marshall McLuhan famously said, 'the medium is the message'. The context and method are not neutral. Our COVID-19 reality means that we all need to constantly experiment and try new things – to learn, together, how to learn in this new environment.