How Educators Can Drastically Boost Learner Outcomes Through Agile Educational Design (AED)

Blog
July 15, 2022

Since the dawn of the internet, there has been one fundamental truth:

The internet is a tool that enables increasingly faster information exchanges.

From emails and blogs to Tweets and Instagram stories, knowledge and experiences can be dispersed worldwide in seconds.

This allows for faster communication, collaboration, and learning.

While this is a net positive for society and industries that manage to keep pace through continuous innovation, it exposes a fundamental weakness within our existing educational system.

Specifically:

The education we pick up becomes irrelevant every few years.

I know this to be true from my own experience. Software engineering today isn’t what it was back in 2016, when my university was still teaching a curriculum reminiscent of the early 2000s.

In order for us to move forward, we need a more adaptive system that is in tune with the needs of the industry.

The status quo won’t do.

Today, I propose a new philosophy for how educators should develop curriculum and learner experiences: “Agile Educational Design,” or AED. This philosophy draws on my experience speaking with hundreds of online schools and education leaders as the Founder & CEO of Virtually and the host of the Reshaping Education podcast.

Introducing Agile Educational Design

Just like agile product development, which has taken over teams across Silicon Valley, educators need an adaptive methodology that can continuously evolve in response to industry feedback.

AED consists of four main steps:

  1. Program Design
  2. Program Delivery
  3. Measurement
  4. Reflection

The goal of AED is to continuously go through this cycle to improve learner outcomes.

Step 1: Program Design

AED starts with the design of the program.

A program isn’t just about the topics that are covered in class but also includes the student experience.

It’s how students are supported throughout their entire learning journey, whether that’s through office hours, TA discussion sections, or team projects.

The curriculum is the delivery mechanism for students to undergo transformation.

In order to build an impactful learner experience, we must first start off with a hypothesis.

A hypothesis is the set of assumptions an educator makes about what will deliver a transformation for the student.

Problem:

Students struggle to grasp X, Y, and Z concepts leading to poor performance during final assessments.

Hypothesis:

By introducing a team project around X, Y, and Z concepts and adding additional TA-led office hours, we’ll see a 30% improvement in final assessment scores.

This is what is referred to as the “Learning Journey.”
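A hypothesis like this can be captured as structured data so it can be checked mechanically at the end of the cycle. Here is a minimal sketch in Python; the problem, intervention, and 30% target come from the example above, while the class and field names are illustrative, not part of AED itself:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """A falsifiable statement about a program change."""
    problem: str               # what learners are struggling with
    intervention: str          # the change we will make to the program
    metric: str                # the measurement that validates/invalidates it
    target_improvement: float  # e.g. 0.30 for a 30% improvement

    def is_validated(self, baseline: float, observed: float) -> bool:
        """True if the observed metric improved by at least the target."""
        return (observed - baseline) / baseline >= self.target_improvement

# The hypothesis from the example above:
h = Hypothesis(
    problem="Students struggle to grasp X, Y, and Z concepts",
    intervention="Team project on X/Y/Z + additional TA-led office hours",
    metric="final assessment score",
    target_improvement=0.30,
)
```

Writing the hypothesis down in this form forces the design step to name a single metric and a concrete target, which is exactly what makes the later measurement and reflection steps straightforward.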

Step 2: Program Delivery

This is where the “teaching” happens.

Once a curriculum has been designed, it’s up to the teaching staff to execute the program plan.

Just like any scientific experiment, there need to be dependent and independent variables. If we change too many elements of the program at once, we won’t know what to attribute failure (or success) to.

Otherwise the experiment won’t be repeatable, making any progress meaningless.

Step 3: Measurement

While the actual measurement of your program experiment takes place during program delivery, the infrastructure must be set up ahead of time.

In fact, it’s during the design step that we need to lay out which metrics we will track to validate (or invalidate) the results of our experiment.

What to track is entirely dependent on the hypothesis we laid out during the design step.

While our hypothesis really only depends on the final assessment scores, it would be unwise not to take any other measurements.

For this particular experiment, the following metrics could be appropriate to track:

Measurements:

  • Attendance during added office hours
  • Team project scores
  • Final assessment scores

Should our program fail to deliver on its initial objectives, we’ll need to use the data we collected to determine the root cause and build a new hypothesis.
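Concretely, the measurement infrastructure can start as something very simple: one record per student holding the tracked metrics, aggregated at the end of the program. A minimal sketch, where the three metric names come from the list above and the data values are made up for illustration:

```python
from statistics import mean

# One record per student; the three tracked metrics from the list above.
records = [
    {"office_hours_attended": 4, "project_score": 85, "final_score": 78},
    {"office_hours_attended": 1, "project_score": 70, "final_score": 62},
    {"office_hours_attended": 6, "project_score": 92, "final_score": 88},
]

def summarize(records):
    """Aggregate each tracked metric across the cohort."""
    return {key: mean(r[key] for r in records) for key in records[0]}

summary = summarize(records)
```

Having per-student records (rather than only cohort averages) matters for root-cause analysis: if final scores miss the target, you can check whether the students who attended the added office hours actually scored higher.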

Explicit vs Implicit Feedback

Explicit feedback is what we’re most familiar with; it includes:

  • Surveys
  • Verbal/written feedback

Implicit feedback is how we learn through more objective measures like:

  • Community engagement
  • Attendance
  • Assessment scores

While explicit feedback is important, implicit feedback is drastically underutilized. To create a truly remarkable experience for learners, we need to leave behind our internal biases in favor of objective measures of student success.

Step 4: Reflection

This is the step that is most often neglected.

Collecting data is not enough. Data must be monitored and leveraged. Reflection is when the teaching staff comes together to interpret data collected over the program.

If assumptions were clearly stated during the design process, this step becomes quite straightforward.

Hypothesis:

By introducing a team project around X, Y, and Z concepts and adding additional TA-led office hours, we’ll see a 30% improvement in final assessment scores.

Reflection:

  • Did we achieve the goal?
  • If not, what did we achieve?
  • What were the driving indicators of missing the target?
  • What adjustments, if any, can be made to achieve the target on the next iteration?
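When assumptions are stated clearly up front, the first two reflection questions can be answered directly from the data. A sketch of that check, using the 30% target from the running hypothesis; the baseline and observed scores are made up:

```python
def reflect(baseline_score: float, observed_score: float,
            target: float = 0.30) -> dict:
    """Answer the first reflection questions from the collected data."""
    improvement = (observed_score - baseline_score) / baseline_score
    return {
        "achieved_goal": improvement >= target,
        "achieved_improvement": round(improvement, 3),
        "gap_to_target": round(max(target - improvement, 0.0), 3),
    }

# Example: the baseline cohort averaged 60 on the final assessment;
# this cohort averaged 72 after the intervention.
result = reflect(60, 72)
```

In this example the cohort improved by 20%, so the goal was missed by 10 percentage points; the remaining reflection questions (driving indicators, next adjustments) are where the teaching staff’s judgment, informed by the other metrics collected, comes in.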

Benefits of AED

AED will become a necessity for learning during the digital age.

Career paths like YouTuber, social media manager, or course creator didn’t exist a decade ago, meaning you can’t train for these roles in traditional educational settings.

Instead, we need lean programs that are receptive to the needs of industry.

Examples of AED in the real-world

While AED isn’t widely documented, it is starting to gain adoption in certain corners of the education world, specifically online education.

A few examples:

Skillful

Skillful is a learning community where ambitious early-mid career professionals go to launch and accelerate their careers in tech.

Its programs consist of 5-7 week learning sprints intended to simulate being on the job. Students are placed into a cohort of learners on a similar journey to their own. Each cohort is divided into smaller teams, each led by a mentor who works with them on a project. Mentors are peer-level, accessible experts who work in the roles that learners want to upskill into.

Rather than spending a fortune learning outdated curriculum in graduate programs, Skillful students learn from accessible industry experts and accelerate their careers in a condensed amount of time. Because mentors work in the roles students are training for, what students learn is up to date, continuously evolving, and closely mirrors what mentors are doing on the job.

Skillful leverages all sorts of data to ensure their programming meets the needs of mentors and mentees alike, leaning heavily on feedback from surveys and Slack. They often source new sprint ideas directly from their existing community, test them out on alumni before opening them to the public, and consistently make tweaks and pivot along the way.

Skillful’s feedback loop goes full circle, as they not only collect survey data from mentors and mentees associated with Skillful’s programming, but also have access to feedback from employer partners on how Skillful-trained employees are faring in interviews and during the first few months on the job. Skillful can leverage this employer feedback to further tweak its sprints to ensure they’re effectively preparing alumni for the job.

What’s more, Skillful also tracks quantitative metrics like attendance, weekly milestone completion, and job placement.

All of these data points enable the team to consistently reflect on programming (most often during “sprint retros”) and iterate quickly in order to provide the best possible learning experience for the Skillful community.

Learn more about Skillful’s learning philosophy on this episode of the Reshaping Education Podcast.

Part-time YouTuber Academy

Ali Abdaal’s Part-time YouTuber Academy (PTYA) is one of the most successful online cohort-based courses out there. Although they only got started back in Fall 2020, they’ve already grossed $2.5M teaching 1,000+ aspiring YouTubers how to build and grow their own channels.

PTYA’s key to success? They put their students first. They constantly ask themselves what they can improve, leveraging quantitative and qualitative feedback data to offer more value to students.

They’ve created a constant feedback loop: they collect feedback via surveys, focus groups, and calls with individual learners, and check in on what learners are discussing in Circle, all to deeply understand their students, their motivations, and how PTYA can further support their goals.

When the PTYA team receives important feedback, they don’t simply wait until the next cohort to implement changes; they often make changes right within the active cohort. They quickly create action plans to address issues and implement changes as swiftly and completely as possible.

PTYA’s data-driven approach helps them spot areas of improvement within a cohort in real time so they can act quickly to fix what’s broken.

This is true Agile Educational Design (AED) in action.

Listen to the PTYA team talk about this in depth on this episode of the Reshaping Education Podcast.

On Deck

On Deck runs 20+ cohort-based communities that bring together top talent to network, build their careers, or even start a new company. They leverage action-oriented programming and sector-specific events to help members explore and launch ideas, build deep connections, and learn and grow rapidly alongside other professionals with similar goals.

On Deck’s internal structure is decentralized, so each program acts as its own testing ground for running Fellowships with maximum efficiency and impact.

On Deck collects data on the performance of each Fellowship to uncover insights on what programmatic changes should be made to make sure fellows get maximum value out of their experience.

For example, the first cohort of On Deck Community Builders ran as an 8-week cohort. After reviewing feedback from periodic experience surveys, event feedback, and general engagement in Slack, they shifted the program to a 4-week onboarding cohort plus a yearlong community. If changes like this prove successful, the structure can then be applied to other Fellowships across On Deck.

Collecting data and iterating on programming is a general practice across On Deck. They leverage all data collected (event attendance, Slack engagement, custom platform engagement, survey responses, and more) to constantly iterate on how programs are structured, what programs are offered, which speakers get invited to events, and what changes need to be made to internal processes in order to truly optimize the fellow experience.

Learn more about On Deck in this episode of the Reshaping Education Podcast featuring Joe Penn, On Deck’s Head of Operations.

Conclusion

Educators have never had a more difficult obstacle in front of them: change.

Specifically, change at a rate that makes their curriculum obsolete in a matter of months. This means they need to be students and educators at the same time. They need to learn about changes in industry while simultaneously passing those lessons on.

Our current system for educational development won't do. We need something better.

AED is perhaps the first step.

Ish Baid

Ish is the Founder & CEO of Virtually (YC S20).