
Scaling Quality: Leading QA for Hundreds of Online Courses

During my time as Head of Evaluations at GetSmarter, I was tasked with an ambitious goal: ensuring the instructional quality of hundreds of online short courses across disciplines—each with its own subject matter experts, unique delivery needs, and evolving learner expectations.

To meet this challenge, I led the development and implementation of a robust, scalable quality assurance (QA) process. At the heart of it was a model I refined and put into practice: a structured, repeatable evaluation method grounded in the “E” of ADDIE—Evaluation.

A Practical QA Model in Action

The QA process was not abstract theory—it was operationalized through detailed steps, evaluative rubrics, and clear performance metrics. As described in this worked example, each course review followed a methodical sequence:

  1. Set Up Evaluation Goals
    What are we trying to evaluate? Is it engagement? Content accuracy? Assessment alignment? We established these goals at the outset and tailored the evaluation for both new and existing courses.
  2. Collect Data from Multiple Sources
    We gathered:
    • Learner feedback (surveys, support tickets, completion data)
    • Instructor and SME insights
    • Technical performance metrics (e.g., SCORM tracking, media playback rates)
    This multi-source approach ensured a holistic understanding of course performance.
  3. Evaluate Against a Rubric
    Our evaluation rubric included categories such as:
    • Clarity and coherence of learning outcomes
    • Alignment between outcomes, content, and assessments
    • Accessibility and media usability
    • Consistency of tone and instructional voice
    • Scaffolding and cognitive load balance
  4. Synthesize Key Themes
    Findings weren’t presented as isolated critiques but grouped into overarching themes. For example:
    • “Learners are confused about course pacing”
    • “Assessment instructions require clarification”
    • “Media needs to be updated for accessibility”
  5. Prioritize and Recommend
    Based on evaluation outcomes, we tagged items as:
    • Critical: Needs urgent revision before relaunch
    • Moderate: Fix in the next update cycle
    • Minor: Noted for future enhancement

Each report included actionable recommendations, often accompanied by screenshots or annotated documents. These were shared with project managers, instructional designers, and SMEs, ensuring everyone had a clear pathway to improvement.
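To make the shape of these reports concrete, the sketch below models one way a single evaluation could be represented in code. It is a minimal Python illustration only: the class and field names (Finding, CourseEvaluation, rubric_scores) are assumptions made for this example, not the internal tooling we actually used, which lived in rubric documents and shared reports.

# Illustrative data model for one course evaluation.
# All names here are hypothetical; they simply mirror the rubric,
# synthesized themes, and Critical/Moderate/Minor tags described above.
from dataclasses import dataclass, field
from enum import Enum

class Priority(Enum):
    CRITICAL = "Critical: needs urgent revision before relaunch"
    MODERATE = "Moderate: fix in the next update cycle"
    MINOR = "Minor: noted for future enhancement"

@dataclass
class Finding:
    theme: str            # e.g. "Learners are confused about course pacing"
    evidence: list[str]   # survey quotes, support-ticket IDs, SCORM metrics
    priority: Priority
    recommendation: str   # the actionable fix shared with the design team

@dataclass
class CourseEvaluation:
    course_code: str
    rubric_scores: dict[str, int] = field(default_factory=dict)  # rubric category -> score
    findings: list[Finding] = field(default_factory=list)

    def report_order(self) -> list[Finding]:
        """Order findings Critical, then Moderate, then Minor for the written report."""
        rank = {Priority.CRITICAL: 0, Priority.MODERATE: 1, Priority.MINOR: 2}
        return sorted(self.findings, key=lambda f: rank[f.priority])

Keeping evidence separate from the recommendation mirrors how the reports themselves were written: every suggested change could be traced back to learner data rather than evaluator opinion.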

A System for Scaling Up

This QA model became the basis for our annual review cycle, tracked across hundreds of courses using a central dashboard. Courses were tagged by review status, last revision date, and feedback priority—allowing us to deploy evaluator resources efficiently and respond to quality issues with speed and precision.
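As a rough illustration of the triage logic behind that dashboard, the Python sketch below filters and orders courses by review status, revision age, and feedback priority. The field names and the twelve-month review window are assumptions for the example; the actual dashboard was internal tooling rather than this code.

# Hypothetical sketch of dashboard triage: surface the courses an
# evaluator should look at first. Field names and the annual review
# window are illustrative assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class CourseRecord:
    course_code: str
    review_status: str       # e.g. "reviewed", "in_review", "not_reviewed"
    last_revision: date
    feedback_priority: str   # "Critical", "Moderate", or "Minor"

def triage(courses: list[CourseRecord]) -> list[CourseRecord]:
    """Courses due for attention: critical feedback first, then the longest unrevised."""
    today = date.today()
    review_window = timedelta(days=365)  # assumed annual review cycle
    due = [
        c for c in courses
        if c.feedback_priority == "Critical"
        or c.review_status == "not_reviewed"
        or today - c.last_revision > review_window
    ]
    rank = {"Critical": 0, "Moderate": 1, "Minor": 2}
    return sorted(due, key=lambda c: (rank[c.feedback_priority], c.last_revision))

Running a filter like this over the full course list gives the same effect we got from the dashboard: a short, ordered queue that tells each evaluator where their time matters most.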

Over time, the model became a key internal benchmark: consistent enough to ensure quality, yet flexible enough to adapt to course-specific needs.

Leadership in QA Practice

As Acting Head of Evaluations, I wasn’t just overseeing rubrics—I was actively shaping a culture of quality:

  • Training my evaluation team to apply rubrics fairly and insightfully
  • Mentoring designers and SMEs in best practices derived from evaluation data
  • Advocating for learner-centered design decisions in cross-functional teams

I also collaborated with internal leadership to integrate evaluation findings into broader program decisions—ensuring that QA wasn’t just a checkpoint, but a catalyst for innovation.

Conclusion: QA as a Design Mindset

Leading the QA function at GetSmarter taught me that evaluation isn’t the end of a process—it’s a design tool in its own right. When embedded into the lifecycle of course creation and revision, it drives clarity, cohesion, and continuous improvement. And when scaled thoughtfully, it becomes a strategic lever—not just for maintaining standards, but for raising them.
