Sure, we could tell you all about our in-depth curriculum, the enormous number of in-class hours we give you, and our brilliant, good-looking instructors, but at the end of the day, you're not going to be fooled by silly things like our instructors' witty personalities or perfect smiles. You want results. Data. Hard numbers.
And as you'd imagine based on the fact that we have this page here, we're not ones to disappoint in that regard:
Qualifying students who took our course averaged a 12.2-point increase from their first to their best practice test.
The first question that probably comes to mind is: what is this 'qualifying students' bit? Well, we've got nothing to hide. Our criteria for what qualifies a student are pretty simple:
- We included all students who took at least four of our seven scheduled proctored diagnostic tests (seven because we often host an extra test the week before the LSAT). While other courses only count students who took every single diagnostic test, we feel that's too restrictive to yield a reasonable sample (for the record - students who took all seven tests averaged a whopping 19-point improvement, but it's a silly statistic since only ~12% of students took all seven).
- We excluded any retaking students. Although their overall score increases would have further raised our numbers, we felt it wasn't fair to include them, since their improvements would have come from more than one pass through the course.
Based on this, we simply took the difference between each student's first and best scores to establish their 'improvement', and averaged those numbers. While we believe the first-to-best approach is the best available way to assess a student's improvement, others might prefer a first-to-last approach (whereby each student's first score is compared to their last). On a first-to-last basis, the average improvement for students who took at least four tests was 10.8 points, and for students who took all seven tests it was 18.9 points (though to reiterate, that last figure doesn't mean much, since only ~12% of students actually took all seven tests).
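For the statistically curious, the calculation described above is simple enough to sketch in a few lines of code. This is purely illustrative (the scores below are made up, and this is not our actual analysis pipeline): it filters out students with fewer than four tests, then averages each remaining student's first-to-best (or first-to-last) difference.

```python
def average_improvement(score_histories, min_tests=4, method="first_to_best"):
    """Average per-student score improvement.

    score_histories: a list of lists, each holding one student's diagnostic
    scores in chronological order. Students with fewer than `min_tests`
    scores are excluded, mirroring the qualifying rule described above.
    """
    improvements = []
    for scores in score_histories:
        if len(scores) < min_tests:
            continue  # student doesn't qualify
        first = scores[0]
        target = max(scores) if method == "first_to_best" else scores[-1]
        improvements.append(target - first)
    return sum(improvements) / len(improvements)

# Hypothetical example: three students; the second doesn't qualify.
histories = [
    [150, 155, 158, 162],       # first-to-best: +12
    [148, 151],                 # only two tests: excluded
    [152, 160, 159, 163, 161],  # first-to-best: +11
]
print(average_improvement(histories))                          # 11.5
print(average_improvement(histories, method="first_to_last"))  # 10.5
```

Note that first-to-best can never be lower than first-to-last for the same students (the best score is at least the last score), which is consistent with the 12.2 vs 10.8 figures above.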
Feel free to look around. Most of our competitors don't dare release their score improvement data, and for the few that do, the numbers speak for themselves (and for us, they're quite flattering). For more information on how our score improvements stack up to our competitors', click here.