Correcting Our 2014 Best High Schools Lists

A note from our editor.


September 12, 2014: Earlier this week, we published a series of corrections to our list of Greater Boston’s top private high schools (see below). Today, after further investigation, we’re retracting the list and taking it offline. As part of an ongoing audit, we found additional inaccuracies, as well as inconsistencies in the way our information was vetted; neither is up to the magazine’s standards.

We found that during our fact-checking process, we had attempted to confirm some (but not all) of the schools’ publicly reported numbers by contacting them directly. In some of those cases, we accepted updated numbers from schools—but in other cases, we didn’t. The result was that some data came from the 2013 school year, and other data from the 2014 school year. While we believe the data collected is accurate, our standard wasn’t applied consistently, which could call the rankings into question.

Some of the errors we found could have been corrected: For instance, a column marked “Upper-School Enrollment” should have been marked, simply, “Enrollment,” because in some cases the numbers provided came from schools with classes below high school and counted those students as well. That’s consistent with how we’ve reported those numbers in past lists, and it would not have affected the rankings, but the label was clearly inaccurate. Likewise, we should have clarified that schools report “Student-Teacher Ratio” for their entire student body, which in some cases includes non-high-school grades. In another case, the problem clearly could have affected the rankings: schools reported their AP classes inconsistently, and AP offerings are one of our weighted categories. The information we reported doesn’t meet our standards for accuracy.

In addition, a Boston Globe reporter discovered that our list included four schools that primarily serve students with special needs—including the Willow Hill School, which was ranked in our top 15. It’s not fair to either type of school to rank them together, since they serve populations with different needs. We found, independently, that at least one school that should have been evaluated—Malden Catholic—wasn’t.

Before taking the list down, we considered a different option: keeping the information up, but leaving schools unranked—and also continuing to update information as we received it. However, the inconsistencies our audit has turned up thus far suggest that the data, as a whole, need to be more thoroughly vetted—and, obviously, should have been more thoroughly vetted before we published our top-15 list in the September issue of the magazine. Our readers, and our schools, deserve a better effort than this one. For now, we’re re-examining our process for vetting schools with the intention of strengthening our commitment to fairness and accuracy.

—Carly Carioli, Editor-in-Chief, Boston magazine

September 9, 2014: We made four significant errors in the reporting and editing of our 2014 lists of Greater Boston’s best public and private schools. As a result, we have updated the rankings online, and will be publishing a corrected list of the top private schools in the October issue of Boston magazine. After the list was published, numerous schools contacted us to point out these errors, and we thank them for their vigilance. We also apologize to the schools, and to our readers. To be clear, these were errors on the part of Boston magazine’s staff—not an attempt by schools to inflate their scores.

Two of the errors occurred in our ranking of Greater Boston’s public schools, and were simple errors of omission: We accidentally left out Burlington High School and Chelmsford High School. We’ve now added both back—Chelmsford is number 44, and Burlington is number 67. The remainder of the public school rankings have been adjusted accordingly.

The remaining two errors, in our ranking of Greater Boston’s private schools, were more significant, and deserve a longer explanation. After the publication of the list, we discovered that in two cases—the Roxbury Latin School, ranked number one, and Austin Preparatory School, ranked number three—we had ranked schools based on what we believed, inaccurately, to be average SAT scores. In fact, at the time neither school disclosed its students’ average SAT score. Instead, Roxbury Latin reported a “median” SAT score—which was higher than its average. And Austin Prep reported a number that it says is the average SAT score of the top 10 percent of its students—a number that is certainly higher than the average for the entire student body.

Austin Prep reported its data to us accurately—that is, it provided us with SAT scores and the caveat that the number was drawn from the top 10 percent of its students. However, in the final accounting we failed to catch that caveat. In the case of Roxbury Latin, we used a number clearly identified on the school’s website as a median score rather than an average score—but we failed to note the difference. In both cases, the errors should have been caught during our editing process, and were not. The fault clearly lies with the magazine, and not with the schools.

After we realized the errors, we reached out to both schools and asked for their average SAT scores. Austin Prep declined, but Roxbury Latin agreed to disclose its scores. We then re-crunched our numbers with the same algorithm we used to determine the original rankings. As a result, our revised top-15 list of private schools looks very different. Roxbury Latin fell from first to second, and Austin Prep fell from third to 48th. And those weren’t the only changes: some schools moved up the list, while others moved down.

Why the steep change to Austin Prep, and why did other schools’ rankings change even if their scores didn’t? Advance warning: To explain, we’ll have to get into some math. To compile our rankings, we gathered data from schools, and then employed a statistician to crunch the numbers. That process, as we explained in the methodology note accompanying the rankings, “calculated the mean scores for each category, and then ranked the schools based on a weighted average of each school data point’s difference from the mean, using mean values when data was unavailable.” That last part is the most important: When private schools don’t provide us with SAT scores, we use an estimated data point that is equivalent to the average of all the schools’ scores. So instead of using the partial score provided by Austin Prep—the top-10-percent number—we’re now using a mean value to calculate its ranking. And there were also follow-on effects. The absence of Austin Prep’s SAT scores—and the lowering of Roxbury Latin’s scores—also changed the mean, from which we calculate the overall SAT rankings. That change helps some schools (Phillips Academy jumps from 12 to nine) and hurts others (Buckingham Browne & Nichols School falls from nine to 13).
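For readers who want to see the mechanics, here is a minimal sketch of that calculation. The school names, category weights, and figures below are hypothetical, and the sketch omits any per-category normalization our statistician may have applied; it only illustrates the two behaviors described above: a missing value is replaced with the category mean (so it contributes a difference of zero), and changing one school's data shifts the mean, and therefore everyone else's score.

```python
from statistics import mean

# Hypothetical inputs: school -> {category: value, or None if not reported}.
schools = {
    "School A": {"sat": 2100, "ap_classes": 22},
    "School B": {"sat": 1900, "ap_classes": 15},
    "School C": {"sat": None, "ap_classes": 18},  # declined to report SAT scores
}

# Hypothetical weights; the magazine's actual weights are not published here.
weights = {"sat": 0.6, "ap_classes": 0.4}

def rank(schools, weights):
    # Mean of each category, computed from the schools that reported it.
    means = {
        cat: mean(v[cat] for v in schools.values() if v[cat] is not None)
        for cat in weights
    }
    # Score each school on the weighted average of its data points'
    # differences from the mean, using the mean itself (a difference of
    # zero) when a data point is unavailable.
    scores = {}
    for name, values in schools.items():
        score = 0.0
        for cat, weight in weights.items():
            value = values[cat] if values[cat] is not None else means[cat]
            score += weight * (value - means[cat])
        scores[name] = score
    # Highest composite score ranks first.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(rank(schools, weights))
# [('School A', 61.47), ('School C', -0.13), ('School B', -61.33)], approximately
```

Because the means are recomputed from whatever data is on hand, lowering Roxbury Latin’s SAT figure and dropping Austin Prep’s entirely moves the SAT mean, which is why other schools’ positions shifted even though their own numbers did not.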

One criticism of our private-schools list is that SAT scores factor too heavily in the rankings—especially given that an increasing number of private schools refuse to provide their test scores. The last time we compiled private-school rankings, in 2009, 17 of the 61 schools refused to provide SAT data. This year it was 26 of 65. And while we believe the practice of using mean values to be statistically relevant and useful, one effect is that the schools that don’t report their scores tend to cluster in the middle of the rankings. The schools that do report data tend to cluster at the top and at the bottom. After our correction, 14 of the top 15 schools in our list of the best private schools provided us with their SAT scores.
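To make that clustering effect concrete with the hypothetical figures from the sketch above: a school that withholds its SAT scores is assigned the SAT mean, so its SAT term contributes exactly zero to its composite score, and it can stand out from the pack only in the categories it does report.

```python
# Continuing the hypothetical example above: School C withheld its SAT score,
# so its "difference from the mean" in that category is zero by construction.
sat_mean = (2100 + 1900) / 2                     # mean of the reported scores -> 2000.0
school_c_sat_term = 0.6 * (sat_mean - sat_mean)  # imputed value minus mean -> 0.0
# With no contribution from the most heavily weighted category, School C's
# composite score stays near zero, i.e., near the middle of the pack.
```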

Why don’t more private schools report SAT data? Quite simply: because they don’t have to. Public schools are required by law to report their test scores—private schools are not. Of course, private schools don’t receive public funds, either. But as a result, it’s getting harder and harder to make a statistical comparison between public and private education, or even between one private school and another.

None of which excuses our errors in the 2014 rankings—all were preventable errors of reporting and editing, as opposed to deviations from our standard methodology. Still, the public discussion about our private-school rankings engendered by those errors has sparked an internal one here at the magazine: Should we do these rankings differently? Should we do them at all?

My personal opinion is that the data we compile and report is a useful tool for parents to compare schools—and that the rankings function both as an incentive for schools to participate and as a useful framing device to help readers approach the data. I’m also persuaded that, given the reluctance of many private schools to release testing scores, in the future we’ll have to get smarter about how we evaluate and present our information. Not coincidentally, our 2014 schools issue featured in-depth looks at a wide and growing ambivalence about standardized test scores and rankings. In the pages following our public- and private-school lists, we profiled the incoming president of the Massachusetts Teachers Association, who is calling for a moratorium on standardized tests themselves. And we looked into Northeastern University’s efforts to game the influential college rankings published by U.S. News and World Report. Now, perhaps not soon enough, we’re examining the way we treat rankings ourselves.

—Carly Carioli, Editor-in-Chief, Boston magazine