“Simple practice effects” and the SAT

Useful article in the Washington Post re: standardized testing and fairness: No one likes the SAT. It’s still the fairest thing about admissions.

I’ll post some of the sections on income and scores in a bit, but this section on tutoring caught my eye:

Highly paid tutors make bold claims about how much they can raise SAT scores (“my students routinely improve their scores by more than 400 points”), but there is no peer-reviewed scientific evidence that coaching can reliably provide more than a modest boost — especially once simple practice effects and other expected improvements from retaking a test are accounted for. For the typical rich kid, a more realistic gain of 50 points would represent the difference between the average students at Syracuse and No. 197 University of Colorado at Boulder — significant, perhaps, but not dramatic.

By Jonathan Wai, Matt Brown and Christopher Chabris | 3/22/2019

Simple practice effects!

yeesh


How can you tell when a student has mastered a skill?

How can you tell whether someone has truly mastered a skill? What is the measurable indicator that a person really knows how to do something? These questions should be at the heart of every teaching decision . . . and every evaluation we make about the success of an educational program. Yet for many educators, and certainly for most parents, answers to these questions are anything but clear. Most of us have grown up in a “percentage correct world” where 100% correct is the best anyone can do. But is perfect accuracy the definition of mastery? . . . In fact, we see many children and adults who can perform skills and demonstrate knowledge accurately enough – given unlimited time to do so. But the real difference that we see in expert performers is that they behave fluently – both accurately and quickly, without hesitation.
