Course Feedback and Analysis

Deploying our Courses

For any given program we run, there are a number of steps involved in successfully deploying it to students:

  1. Creation and Design of Curriculum
  2. Course Staffing (Instructor, TechFellow and/or Volunteer)
  3. Course Deployment (Marketing, Admissions, Operations)
  4. Course Maintenance and Iteration

Check out this separate document on program phases for more details.

Analyzing and Improving our Courses

For every course we run, how do we practically evaluate success and improve over time? We track the following top-level metrics:

  • Cohort Completion - What % of students complete the course objectives successfully? What % of admitted students, and what % of those still active at week 3? How do URMs compare to the overall student population on these metrics?
  • Subjective Satisfaction (NPS) - How do students self-report their likelihood to recommend the course to others? (e.g. "How likely are you to recommend this course to other students?")
  • Perceived Impact - How do students self-report the impact of the course in the specific ways we want it to help? (e.g. "How well did this course increase your confidence in technical interviewing?")
  • Student Proficiency - How well does the program achieve its desired proficiency outcomes? For example, if this is an inspire course, how well are students inspired to continue? If this is a software fundamentals course, how well are students reporting on interview success? If this is a special topics course, how are students faring in their ability to build real projects during and after the course?
  • Follow-through - How likely are students who complete the course to take additional courses and complete their journey over multiple semesters?

These five metrics help us understand the different ways our programs are working and where we need to improve. We should always be asking ourselves:

  1. How do we ensure that a higher percentage of students who are admitted into a course (and make it to week 3) go on to complete the course objectives successfully?
  2. How do we ensure that a higher percentage of students are promoters and believers who are likely to evangelize our courses to others?
  3. How do we ensure that a higher percentage of students report a strong impact on their confidence around key course objectives?
  4. How do we ensure that students achieve higher levels of personal proficiency by the end of the course?

For more details on gathering feedback, analyzing and improving courses, check out the fourth section of the deploying programs doc.

What exactly is NPS?

The Net Promoter Score (NPS) is the world’s leading metric for measuring customer satisfaction and loyalty with a product or service. At its core, it is designed to gauge customers' willingness to recommend that product or service to others.

Scores range from -100 to 100:

NPS = (Number of Promoters - Number of Detractors) / (Number of Respondents) x 100

Respondents are grouped by their 0-10 rating:

  • 0 – 6: Detractors
  • 7 – 8: Passives
  • 9 – 10: Promoters
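
To make the calculation concrete, here is a minimal Python sketch (not part of the original doc; the nps helper and the sample ratings are illustrative) that buckets 0-10 survey ratings and applies the formula above:

```python
def nps(ratings):
    """Return the Net Promoter Score (-100 to 100) for a list of 0-10 ratings."""
    if not ratings:
        raise ValueError("need at least one response")
    promoters = sum(1 for r in ratings if r >= 9)   # ratings of 9 or 10
    detractors = sum(1 for r in ratings if r <= 6)  # ratings of 0 through 6
    # Passives (7 or 8) count toward the respondent total but add nothing.
    return (promoters - detractors) / len(ratings) * 100

# Hypothetical cohort: 12 promoters, 5 passives, 3 detractors out of 20 respondents
ratings = [10] * 7 + [9] * 5 + [8] * 3 + [7] * 2 + [6] * 2 + [4]
print(nps(ratings))  # (12 - 3) / 20 * 100 = 45.0
```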

Interesting notes on why a high NPS is hard to achieve:

  • Students rating a course 7 or 8 contribute no positive points
  • Only a 9 or 10 adds any positive points
  • Anything from 0 - 6 subtracts points from the score (if every respondent scored in that range, the overall score would be negative)
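
A quick worked example of why that matters: if 20 students respond and every one of them rates the course a 7 or 8, the NPS is (0 - 0) / 20 x 100 = 0, even though every individual rating looks positive; swap a single one of those 7s for a 6 and the score drops to (0 - 1) / 20 x 100 = -5.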