How Did I Measure Learner Engagement in an Online Classroom?

In 2018, when we at GreyAtom made a complete 180 from the traditional offline learning experience to an online remote learning environment, I knew it was time to “Flip the Classroom”. What I primarily flipped was the instructor’s active role in delivering concepts and content knowledge. Simply put, we wanted learners to learn at their own pace and use the valuable in-class time for more collaborative activities.

For me, this flip was essential for learners to develop their own train of thought. It motivates them to take learning beyond the boundaries of the classroom, apply critical thinking and create something out of what they have learned. This flip ensured that we were training them the right way – training them to think.

But what exactly was the process? Let’s take a look.

During the week, the learner goes through the platform content. The instructor helps them develop their own understanding during the live classroom time – through discussions, group code-alongs and projects. They work together in breakout rooms on mini challenges and projects shared by the instructor. Towards the end of the session, the instructor spends 15 minutes on concept onboarding for the next sprint, so learners get a quick overview of what is coming.

As we put ourselves through this ‘flip the classroom’ transformation, we focused on one of the primary catalysts to this change – Engagement.

Engagement was key to measuring how effective the learning had been. It is a leading indicator of learner success, whereas completion rate is a lagging indicator. If we can drive better engagement, we have a lever for better learner outcomes.

So what does engagement really mean? In retrospect, the answer was hidden in first principles: for us, the quantity and quality of a learner’s intrinsic participation in the program were the key indicators of engagement. Based on learners’ activities on the platform and their participation in the Slack channel and discussion forums, we broadly defined engagement in the following two categories:

1. Cognitive Engagement

Better understood as the learner’s mental connection to the learning process, cognitive engagement is the learner’s intrinsic and immersive reflection on what they are learning.

We started thinking about the extent to which learners were

  • Paying attention to program content

  • Processing information

  • Performing tasks, assignments and projects

2. Behavioral Engagement

If learners are interactively involved in the program, it shows up in the daily interactions between learners and their learning contexts.

Broadly, we expected this to be evident in:

  • Learners helping each other

  • Participating in discussions

  • Responding to GreyAtom conversations

Now, I wanted to start small – and discover the indicators that reflected Cognitive and Behavioral engagement of our learners.

The best way to do this was quantitatively – to come up with an Engagement Score. So I started with the cognitive aspects.

How did I define the Cognitive Engagement Score?

  1. Identified key events on the platform that signified engagement. These were tracked as CleverTap events (CleverTap is a real-time analytics and segmentation platform).

  2. Performed EDA on current cohorts. We kept our mentors involved as we dived deep into the EDA of our learner data. Apart from what the platform data was telling us, our instructors and mentors weighed in with valuable insights from their in-class interactions. We were constantly trying to validate what we were seeing in the data against what our mentors were seeing in the classroom.

  3. Defined buckets and thresholds for engagement based on the data. We ran unsupervised machine learning (clustering) on the data we had for the key indicators and arrived at the following learner-engagement buckets:

    • Low Engaged – Score 1–9

    • Moderately Engaged – Score 10–20

    • Highly Engaged – Score 21+

    • Not Engaged – No activity on platform
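To make the clustering step concrete, here is a minimal sketch of deriving buckets from weekly scores with 1-D k-means (Lloyd’s algorithm). The score values and initial centers below are illustrative, not our real cohort data:

```python
def kmeans_1d(values, centers, iters=50):
    """Cluster 1-D values around k initial centers (Lloyd's algorithm)."""
    labels = [0] * len(values)
    for _ in range(iters):
        # Assign each value to its nearest center
        labels = [min(range(len(centers)), key=lambda c: abs(v - centers[c]))
                  for v in values]
        # Recompute each center as the mean of its assigned values
        new = []
        for c in range(len(centers)):
            members = [v for v, lab in zip(values, labels) if lab == c]
            new.append(sum(members) / len(members) if members else centers[c])
        if new == centers:  # converged
            break
        centers = new
    return centers, labels

# Hypothetical weekly engagement scores for a small cohort
scores = [1, 2, 3, 5, 8, 9, 11, 12, 15, 18, 20, 22, 25, 30, 34]
centers, labels = kmeans_1d(scores, centers=[5.0, 15.0, 28.0])

for idx, name in enumerate(["Low", "Moderate", "High"]):
    members = [v for v, lab in zip(scores, labels) if lab == idx]
    print(f"{name}: scores {min(members)}-{max(members)}")
```

On real data we would look at the ranges each cluster spans and round them into human-readable thresholds like the buckets above.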

  4. Identified the activities/events of our power learners (we knew who they were from our live interactions) and re-validated our hypothesis on key events.

Some interesting insights came out:

  • Learners submitting questions prior to the sessions were also the most interactive during the sessions

  • Learners who were gritty would have 10+ incorrect submissions, but also wouldn’t look at the Hints & Solutions on the platform

  • Learners with the highest engagement also had the highest number of task submissions

  5. Baselined the ranges of the engagement buckets – we kept running this on three months of data to iterate towards the ideal buckets.

  6. Identified targets and triggers to increase engagement – we tried to take the weekly engagement score higher with the help of our learner-success team.

    Through their focused efforts, we also identified the key triggers to prioritize for increasing engagement.

  7. Took the learner data into Zoho dashboards and started building reports for mentors, our careers team and program managers – so they could make data-driven decisions around learner activity.
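Putting steps 1 and 3 together, the scoring itself can be sketched as a weighted sum of event counts mapped onto the buckets. The event names and weights here are hypothetical stand-ins; the real ones came from the CleverTap events we identified and validated with mentors:

```python
# Hypothetical event weights (the real weights came from EDA + mentor input)
EVENT_WEIGHTS = {
    "content_viewed": 1,
    "task_submitted": 3,
    "question_asked": 2,
    "project_completed": 5,
}

def weekly_score(event_counts):
    """Weighted sum of a learner's platform event counts for the week."""
    return sum(EVENT_WEIGHTS.get(e, 0) * n for e, n in event_counts.items())

def bucket(score):
    """Map a weekly score onto the engagement buckets from step 3."""
    if score == 0:
        return "Not Engaged"
    if score <= 9:
        return "Low Engaged"
    if score <= 20:
        return "Moderately Engaged"
    return "Highly Engaged"

learner = {"content_viewed": 4, "task_submitted": 3, "question_asked": 1}
print(weekly_score(learner), bucket(weekly_score(learner)))
```

A simple additive score like this is easy for mentors and the learner-success team to reason about, which mattered more to us than modeling sophistication at this stage.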

The next thing I wanted to pick up was the signals for the behavioral data – we are yet to start there.

The desired end state I had in mind for our Analytics Engine was to gather learner data from different sources – build a predictive model based on the learner engagement data and have a personalized learning plan developed for each learner.
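That predictive model does not exist yet, but one simple starting point would be a logistic model predicting a success label (say, course completion) from engagement features. Everything below – the features, labels and data – is a hypothetical sketch, not our actual model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=500):
    """Logistic regression trained with per-sample gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical features per learner: [weekly_score / 10, task_submissions / 5]
X = [[0.2, 0.2], [0.5, 0.4], [1.5, 1.0], [2.5, 1.6], [0.3, 0.0], [2.1, 1.2]]
y = [0, 0, 1, 1, 0, 1]  # 1 = completed the program (made-up labels)

w, b = train(X, y)
risk = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.4, 0.2])) + b)
print(f"P(success) for a low-engagement learner: {risk:.2f}")
```

With predicted probabilities per learner, the personalized learning plan could then prioritize interventions – nudges, mentor check-ins, easier entry tasks – for those the model flags as at risk.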

This work is far from complete, but I am very excited about the possibilities it opens up.

Would love to know your thoughts as well.


Some papers we referred to for a deep dive on engagement:


education-08-00025.pdf


Thanks for reading all the way to the bottom. I know your attention is demanded everywhere, so I’m grateful that you even gave a minute of it to this post. If you ever want to reach out, please feel free to reply here, find me on LinkedIn or message me on Twitter.

Here’s a crazy idea: if you received this in your inbox, can you please forward it to a friend who you think might enjoy my writing? It’s easy. And I would be forever grateful.