3 lessons from working on a 72,000-user country-wide online quiz
We recently participated in an unprecedented project in Colombia's education history: ICFES, the Colombian Institute for the Evaluation of Education, took on the challenge of administering an electronic state test from home for the first time.
We were humbled to join this initiative with our partner Cognosonline and our friends from Sumadi (the proctoring solution), bringing in the Open LMS platform as the quiz engine for the task, especially because of what it meant for the 72,000 learners who were able to stay at home for the test rather than expose themselves or their families to COVID-19.
The project's scope was a considerable challenge: an audience of 72,000 learners on a single weekend, 12 different versions of the evaluation, more than 100 questions per quiz, extreme concurrency conditions, live support during the tests, and more. Preparations happened in a record time of one month. We learned a lot in the process, and we want to share some of those lessons for anyone facing a similar challenge.
Map out a path and focus
When you operate in high-concurrency situations, you need to focus on the essential steps of the user experience and prioritize.
In this case, it was clear: we needed to provide learners an easy-to-use and reliable way to take their exams and finish them successfully in the given time. To optimize this scenario, we mapped out a path with all the required steps that learners would take for the quiz. We took screenshots and notes on the components involved in each step, labeling them in categories so we could decide later what was required, what could be turned off, and what could be simplified.
In very practical terms, this meant that we turned off activities and resources that were not needed (assignments, pages, etc.), streamlined forms, and eliminated information and navigation elements that were not essential. We also provided navigation shortcuts so users would access the quiz directly from the proctoring view.
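As a rough illustration of that audit step, here is a minimal Python sketch that lists every activity in a course so you can label it keep/hide/simplify. It assumes a Moodle/Open LMS web service token with access to the standard core_course_get_contents function; the site URL, token, course id, and the "essential" set are all placeholders, not our actual configuration:

```python
import requests

# Placeholders: point these at your own Open LMS/Moodle site and web service token.
SITE = "https://example.openlms.net"
TOKEN = "your-webservice-token"
COURSE_ID = 42

# core_course_get_contents returns the course sections with their activities (modules).
resp = requests.get(
    f"{SITE}/webservice/rest/server.php",
    params={
        "wstoken": TOKEN,
        "wsfunction": "core_course_get_contents",
        "moodlewsrestformat": "json",
        "courseid": COURSE_ID,
    },
    timeout=30,
)
resp.raise_for_status()

# Label each activity so we can decide later what to keep, hide, or simplify.
ESSENTIAL = {"quiz"}  # illustrative: only the quiz is on the critical path here
for section in resp.json():
    for module in section.get("modules", []):
        label = "keep" if module["modname"] in ESSENTIAL else "review: hide or simplify"
        print(f"[{label}] section '{section['name']}': {module['modname']} - {module['name']}")
```

Even a simple listing like this makes it easier to walk the course with fresh eyes and decide, component by component, what stays on the critical path.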
For your project, ask yourself:
- What's the expected outcome?
- What's the path for that outcome?
- What can I remove, simplify, or improve to get to that outcome with minimum friction?
Communication and performance support are critical
Providing real-time help to 72,000 users, most of them interacting with the platform for the first time, requires a deliberate strategy to help them finish their quizzes properly and to minimize issues during the tests. That was one of the biggest challenges our partner Cognosonline faced, and the approach to maximizing the learners' success rate had two perspectives: before the quiz, via training, and during the quiz, via performance support.
For the "before" part, our partner Cognosonline created amazing videos that described the process. Those videos were shared with learners via email, YouTube, Twitter, SMS, and other channels. Practice tests were also sent to users so they could familiarize themselves with the tool and the navigation.
On the performance support side, instructions, videos, and prompts were added to Open LMS along the mapped path, so learners always had context on where they were, what they could do in a given view, and how to accomplish what was required of them at any given moment. Users could always drop into a live chat with our partner for real-time assistance, but these elements provided a self-service way to minimize possible confusion. We used labels, language strings, personalization fields in the theme, and page components as performance support guides.
For your project, ask yourself:
- What questions could a user have on the path towards the expected outcomes?
- What elements could I use to provide context and guidance at a given moment without cluttering the view?
- Can I provide a practice run so users can familiarize themselves before the real thing?
Ideally, go through the planned path with a real user, observe, and learn from what you identify.
Think like a robot
Inputs, outputs, processes, steps, times, etc.: decomposing your quiz event into atomic pieces can help you identify things to optimize, standardize, or automate.
Here are a couple of examples:
- We treated provisioning and enrolment files as inputs to a system. We needed to standardize and process the information to get it ready for scale; consistency was key (see the first sketch after this list).
- The 72,000 learners were divided into cohorts across different courses and quizzes by identifying patterns in the user files and quizzes. This model helped us during the reporting phase and also encapsulated potential issues. Divide and conquer!
- We automated everything we could. Via PLD, we automated course flows and notifications, and using Conduit, we automated content replication, user provisioning/updates, and enrolments. We also automated real-time report generation as a monitoring strategy.
- Once we identified and connected all the components of the quiz experience, it was time to test everything. And we automated that too, using JMeter. Along with our friends from Sumadi, we ran multiple load and integration tests for different user numbers and concurrency levels (see the second sketch after this list).
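To make the first two points concrete, here is a minimal sketch of treating enrolment files as inputs to be standardized and then partitioned into cohorts. The column names, cohort size, and normalization rules are illustrative assumptions, not the actual ICFES data format:

```python
import csv

COHORT_SIZE = 6000  # illustrative: 72,000 learners split across 12 quiz versions

def normalize(row):
    """Standardize one enrolment record: trim whitespace, lowercase emails, zero-pad ids."""
    return {
        "username": row["username"].strip().lower(),
        "email": row["email"].strip().lower(),
        "document": row["document"].strip().zfill(10),
    }

with open("enrolments_raw.csv", newline="", encoding="utf-8") as src:
    records = [normalize(row) for row in csv.DictReader(src)]

# Reject duplicates early: inconsistent inputs at this scale become support tickets on test day.
seen, clean = set(), []
for rec in records:
    if rec["username"] in seen:
        print(f"duplicate skipped: {rec['username']}")
        continue
    seen.add(rec["username"])
    clean.append(rec)

# Partition into fixed-size cohorts, one output file per cohort/course.
for i in range(0, len(clean), COHORT_SIZE):
    cohort = clean[i:i + COHORT_SIZE]
    name = f"cohort_{i // COHORT_SIZE + 1:02d}.csv"
    with open(name, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=["username", "email", "document"])
        writer.writeheader()
        writer.writerows(cohort)
```

Fixed-size cohorts also mean that any problem stays contained to one course/quiz, which is exactly the "divide and conquer" benefit mentioned above.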
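And for the last point, a small sketch of how the load-test runs themselves can be scripted. It assumes a JMeter test plan (here the hypothetical quiz_load_test.jmx) that reads its thread count and ramp-up from properties via ${__P(users)} and ${__P(rampup)}; our actual plans and numbers differed:

```python
import subprocess
from pathlib import Path

PLAN = "quiz_load_test.jmx"  # placeholder: your JMeter test plan
RESULTS = Path("results")
RESULTS.mkdir(exist_ok=True)

# Run the same plan at increasing concurrency levels, keeping one results file per run.
for users in (1000, 5000, 10000, 20000):
    out = RESULTS / f"run_{users}_users.jtl"
    subprocess.run(
        [
            "jmeter",
            "-n",                # non-GUI mode
            "-t", PLAN,          # test plan to execute
            "-l", str(out),      # results log for this run
            f"-Jusers={users}",  # custom property read by the plan via ${__P(users)}
            "-Jrampup=300",      # custom ramp-up property, also read by the plan
        ],
        check=True,
    )
    print(f"completed load test with {users} simulated users -> {out}")
```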
For your project, ask yourself:
- What are the building blocks of the path for your outcome?
- What elements could be considered as inputs and outputs?
- Which elements could adopt a standard? What could be automated?
I hope you find this useful and that it helps with your initiatives. For us, it was a project full of learning.
In addition, a big portion of our team is from Colombia, and for us it was absolutely meaningful to work on an initiative with such an impact on our community.