Combined Everest — Kilimanjaro — Fuji zones in Bangalore

How to run a Design Sprint for 1,000 people

Part 4: Test running the Design Dash

Robin Wong
3 min read · Aug 2, 2019

With 4 weeks to go until our first 1,000+ person Design Dash (our condensed Design Sprint), we had a clear plan for our 3 sessions, a script to follow, draft materials prepared, and over 30 people recruited to help us test the workshop format.

Setting up for a test run

But at this stage, because our research was still underway, we had no real research findings to kickstart the Design Thinking process and no pre-recorded videos of us talking testers through the activities, so we needed to improvise.

Improvising

The first thing we had to improvise on was research. How might we create a short video illustrating a key problem that people in the business face? Luckily, because we had already identified the challenges, we simply found someone willing to talk about their experience of that particular challenge and filmed a short interview with them to use during the test.

The next part we had to improvise around was the lack of videos introducing each of the activities. This was much easier because we had the next best thing: ourselves, armed with the scripts of what we were going to say. Yes, it might not be as polished as a video, but it would still get the message across, and it would give us invaluable feedback on what people were able to understand and take away.

Testing, testing…

We decided to test with 3 teams of 10–12 people, using printed materials of varying sizes, and with team leads and human-centred design (HCD) practitioners of varying levels of proficiency, to see how each team would fare on the activities we set out.

Given that we were limiting ourselves to MCing at the live event, we also stated upfront that we would not provide any help to the teams, so that we could see if and where they got stuck, and how well their team leads and HCD experts supported them.

The plan for the testing day was for it to run much as the live events would run, with 1 session in the morning and 2 in the afternoon. This was also an opportunity to work with the events company, OWL Live (who incidentally were great), to plan how the stage management would work with lighting and sound.

What we learned

The testing day was invaluable. It not only informed how we would structure and present our pre-recorded research videos for each of the challenges, but also confirmed that the format we were going for, mixing pre-recorded and live segments, allowed us to balance the learning part of the experience with a fun, collaborative activity.

Every team made it to the end of the experience and delivered a video pitch that clearly demonstrated their understanding of the challenge presented in our improvised research video, all without help or facilitation from us.

With the prototype of the live event validated, we felt confident going into the first live event at the start of July, provided, of course, that we could get all the recordings for the research and activities completed in time!

The Story continues…

This is part 4 of a 5-part blog post; read on to find out how we went about

Like this story? If so, please give a clap; it only takes a second!

Want to hear more from me? Follow me with one click via @robinow.medium.com/follow

Robin Wong

I help people turn ideas into human- and humanity-centric ventures. Global Head of Service Design at BT.