Does Kiplinger’s claim of “weak” Madison schools compared to “suburban” schools hold up? Would I be writing about it if it did?

 The State Journal has an editorial today about another high national ranking for Madison.  Kiplinger’s, which the State Journal describes as “a monthly business and personal finance periodical,” has a slideshow on its website on ten “Great Cities for Raising Families.”  Madison makes the list.  But Kiplinger is not unreserved in its praise.  It ranks Madison’s schools “4” on a scale of 10 (the next lowest ranking for any of the other nine selected cities is “6”) and asserts “Madison city schools are weak relative to the suburban schools.”

Say what? 

The Kiplinger piece states that the school rankings come from an outfit called GreatSchools.org.  I looked at the GreatSchools website and could find no information on its methodology for ranking the quality of a city’s schools compared to other cities in the country, as it apparently purports to do.  So, I have no way of knowing why Madison’s schools are ranked “4” on a scale of 10 while the schools in, say, Pocatello, Idaho rate a “6” and College Station, Texas schools rack up an “8”.  (Anyone anxious to move from Madison to College Station?)

We do have some basis to examine Kiplinger’s statement that Madison schools are “weak” compared to suburban schools. Before we turn to that examination, though, a couple of side notes. First, I certainly recognize that it is disingenuous to complain about one arbitrary input into a generally high ranking for Madison that is itself based on nothing but arbitrary inputs.

Second, let’s pause for a moment to give some props to the State Journal.  I’m not always a fan of the paper’s editorials, but I agree wholeheartedly with these three right-on-the-money paragraphs of today’s effort:

And Madison, in some ways, is ahead of the ’burbs.  It consistently graduates some of the highest-achieving students in the state.  It offers more kinds of classes and clubs.  Its diverse student population can help prepare children for an increasingly diverse world.

 How much the negative view of Madison schools is based on perception and how much on reality is unclear.  What is clear is that Madison can’t let its schools slip.  A city that loses its schools is in trouble.  Just ask Milwaukee.

That’s why Madison schools need to tout their strengths while aggressively tackling their vexing problems.

Now on to the data.  DPI’s website contains a treasure trove of information on school performance on the WKCE, Wisconsin’s state-wide standardized test, and other measures of academic performance.  I spent some time on a quick-and-dirty analysis of the performance of students at MMSD schools compared to the students enrolled in the 15 other Dane County school districts.

Since I don’t have the time to analyze WKCE results for all grades, I looked only at eighth grade scores.  A student’s eighth grade performance is more significant than his or her performance in, say, third grade.  You’d rather see a student test in the proficient range in third grade and the advanced range in eighth grade than in the advanced range in third grade and the proficient range in eighth.

Overall test results can be seriously skewed by demographics.  On average, economically disadvantaged students score lower than non-economically-disadvantaged students.  A district that has relatively more economically disadvantaged students will tend to have lower average test results than a district with relatively fewer economically disadvantaged students, all else equal.  In order to make my comparison more meaningful, I limited it to non-economically-disadvantaged students.   
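To see why in concrete terms, here is a minimal sketch (Python, with made-up numbers): two districts whose subgroups perform identically can still post very different overall averages purely because of enrollment mix.

```python
# Hypothetical illustration only: two districts whose subgroups score
# identically, but whose overall averages differ because of enrollment mix.

def overall_average(pct_disadvantaged, disadv_score, non_disadv_score):
    """Enrollment-weighted average score for a district."""
    return (pct_disadvantaged * disadv_score
            + (1 - pct_disadvantaged) * non_disadv_score)

# In both districts, disadvantaged students average 60 and others average 80.
district_a = overall_average(0.40, 60, 80)  # 40% disadvantaged -> 72.0
district_b = overall_average(0.10, 60, 80)  # 10% disadvantaged -> 78.0
print(district_a, district_b)  # 72.0 78.0: same subgroup performance,
                               # six-point gap in the overall average
```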

The WKCE categorizes student results into four categories:  minimum, basic, proficient and advanced.  I based my comparison on the percentage of students who fell into the advanced category rather than the more common grouping of both proficient and advanced.  For a non-economically-disadvantaged student, achieving a “proficient” ranking isn’t much to brag about, particularly when more than half of the student’s peers may score in the advanced range.  Also, the percentages of non-economically-disadvantaged students who score in either the proficient or advanced categories are frequently so high (above 90%) that comparisons aren’t all that useful. 

The eighth grade WKCE tests students in five subject areas:  reading, language arts, math, science and social studies.  For each district, I calculated the average of the percentages of non-economically-disadvantaged eighth graders who tested in the advanced category in the five subject areas.
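For anyone who wants to replicate the arithmetic, here is a sketch of the per-district calculation (Python; the subject-level numbers are hypothetical placeholders, not actual DPI figures):

```python
# Sketch of the district-level figure described above: the unweighted average,
# across the five WKCE subject areas, of the percentage of
# non-economically-disadvantaged 8th graders scoring "advanced".

SUBJECTS = ["reading", "language arts", "math", "science", "social studies"]

def district_advanced_average(pct_advanced: dict[str, float]) -> float:
    """Unweighted average of the five subject-area 'advanced' percentages."""
    return sum(pct_advanced[s] for s in SUBJECTS) / len(SUBJECTS)

# Placeholder numbers, not real DPI data:
example = {"reading": 55.0, "language arts": 50.0, "math": 65.0,
           "science": 60.0, "social studies": 58.0}
print(round(district_advanced_average(example), 1))  # 57.6
```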

Here are the Dane County school districts ranked from top to bottom on the average percentage of non-economically-disadvantaged 8th graders who scored “advanced” on the most recent WKCE tests:

1.  Verona                     59.3%
2.  Madison                    59.1%
3.  Middleton-Cross Plains     58.3%
4.  Waunakee                   56.1%
5.  Deerfield                  55.9%
6.  Sun Prairie                54.0%
7.  Belleville                 53.1%
8.  DeForest                   53.0%
9.  Stoughton                  49.6%
10. Oregon                     48.2%
11. Monona Grove               47.3%
12. Wisconsin Heights          47.0%
13. McFarland                  43.2%
14. Mt. Horeb                  40.3%
15. Cambridge                  39.0%
16. Marshall                   36.6%

I also looked at ACT scores in order to include high school performance.  (The WKCE is given for the final time to high school sophomores in October, and so misses almost three-fourths of a student’s high school career.)  As far as I can tell, ACT results are not available disaggregated by economic status.  However, for districts that, unlike Monona Grove, do not require all students to take the ACT, I suspect that the students who choose to take the test fall disproportionately into the non-economically-disadvantaged category.

Here are the Dane County school districts ranked from top to bottom on the basis of average composite ACT scores for 2009:

1.  Middleton-Cross Plains     24.6
2.  McFarland                  24.5
3.  Madison                    24.2
4.  Oregon                     24.1
5.  DeForest                   23.9
5.  Waunakee                   23.9
7.  Mt. Horeb                  23.8
8.  Stoughton                  23.4
8.  Verona                     23.4
10. Sun Prairie                23.2
11. Belleville                 23.0
12. Monona Grove               22.4
13. Wisconsin Heights          22.1
14. Cambridge                  22.0
14. Deerfield                  22.0
14. Marshall                   22.0

As the table indicates, the average composite ACT scores for the districts are much more closely bunched than the 8th grade “advanced” percentages listed above.  (For those with an unusual craving for this sort of information, average ACT scores for school districts in southeastern Wisconsin can be found on page 29 of this report.) (Hat tip: SIS)
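To put numbers on that bunching: the advanced percentages span 22.7 points (36.6 to 59.3), while the ACT composites span just 2.6 points (22.0 to 24.6) on an already larger base.  Here is a quick sketch that computes the spread both ways from the figures above:

```python
from statistics import mean, stdev

# Figures copied from the two tables above.
advanced_pct = [59.3, 59.1, 58.3, 56.1, 55.9, 54.0, 53.1, 53.0,
                49.6, 48.2, 47.3, 47.0, 43.2, 40.3, 39.0, 36.6]
act_composite = [24.6, 24.5, 24.2, 24.1, 23.9, 23.9, 23.8, 23.4,
                 23.4, 23.2, 23.0, 22.4, 22.1, 22.0, 22.0, 22.0]

for label, data in [("advanced %", advanced_pct), ("ACT composite", act_composite)]:
    # The coefficient of variation (stdev / mean) puts the two
    # differently-scaled measures on a comparable footing.
    cv = stdev(data) / mean(data)
    print(f"{label}: range {max(data) - min(data):.1f}, CV {cv:.3f}")
```

By the scale-free measure, the ACT results come out roughly three to four times more tightly bunched than the 8th grade advanced percentages.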

Whatever one makes of these tables, they do tend to contradict the assertion in the Kiplinger’s piece that Madison city schools are weak relative to the suburban schools.  On the other hand, I am happy to assume that all the other variables Kiplinger relied upon in finding that Madison is a great city for young families are completely accurate.


6 Responses to Does Kiplinger’s claim of “weak” Madison schools compared to “suburban” schools hold up? Would I be writing about it if it did?

  1. Chris says:

    The problem with most rankings, from what I can figure, is exactly as you point out: they use overall averages, which are generally not apples-to-apples because of significant differences in economic demographics.

    I’ve done very similar, amateur tinkering with the WKCE data http://madschools.blogspot.com/ which you might be interested in looking at. I haven’t gotten the 2010 data in there just yet.

    One thing, though, that I do find concerning about the data is the wild disparity within MMSD between some of the top performing schools and the bottom ones. Take Hamilton MS compared to Sherman MS: the same demographic (in theory) performs more than twice as well at Hamilton as at Sherman.

    My hunch is that simply because kids are not low income doesn’t mean they’re the same demographic, and there are other factors. However, I’m not sure the other factors can explain such huge differences.

  2. Chris says:

    Whoops — bad HTML in the previous post: http://madschools.blogspot.com/

  3. Ed says:

    The Greatschools.org methodology is here: http://www.greatschools.org/find-a-school/defining-your-ideal/ratings.gs?content=2423

    It looks like a simplistic calculation based on state standardized test scores, using an either/or split: above or below proficient.

    It also appears that Kiplinger’s has violated a fundamental rule by using data for something other than its intended purpose. Quoting from the GreatSchools.org site:

    “Can I compare GreatSchools Ratings across different states?

    No, GreatSchools Ratings cannot be compared across states, because they are based on test results and different states use different tests.”

    Last, it is worth noting that, without exception, the funders of GreatSchools.org are charter, voucher, and privatization advocates: http://www.greatschools.org/about/aboutUs.page

    * National Funders: Bill & Melinda Gates Foundation; Robertson Foundation; Walton Family Foundation
    * Milwaukee, WI: The Kern Family Foundation; The Joyce Foundation; The Bradley Foundation
    * Washington, D.C.: NewSchools Venture Fund
    * Newark, NJ: Newark Charter School Fund

  4. I should add that there may be some NAEP scores in the mix, but how they may be mixed isn’t transparent: http://www.greatschools.org/find-a-school/moving/top-cities-2010-methodology.gs?content=2400.

  5. Kristen Nelson says:

    I also want to point out that using WKCE data to isolate and compare specific schools gets complicated. (And the data is very easy to manipulate to get the results you want.) I’m all for using it to show district-wide trends, but it is less useful for comparing two schools.

    Let’s say we wanted to compare the “non-economically disadvantaged” kids at two schools. I’ll pick Marquette and Glendale, and I’ll use 4th graders and reading scores. At Glendale, 100% of the kids are either proficient or advanced! At Marquette, it is only 94.9%. Glendale must be a better school!

    But, no. Not really. Glendale has 6 kids who meet my selected comparison requirement. (Marquette has 59.) Can we really compare those two groups and remain statistically valid? Can we really judge an entire school on the test scores of 6 nine-year-olds? All it would take is one kid having a bad day, or missing breakfast, and whoosh, there goes the proficient-or-advanced percentage for “non-economically disadvantaged” kids at that school.

    My query is here: http://tiny.cc/35mha
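    A rough sketch of the statistical point (Python, using Wilson score intervals; the 56-of-59 count is what the 94.9% figure implies):

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Glendale: 6 of 6 proficient-or-advanced; Marquette: 56 of 59 (about 94.9%).
for name, k, n in [("Glendale", 6, 6), ("Marquette", 56, 59)]:
    lo, hi = wilson_interval(k, n)
    print(f"{name}: {k}/{n} -> 95% CI {lo:.0%} to {hi:.0%}")

# With only 6 students, Glendale's interval runs from roughly 61% to 100%,
# comfortably overlapping Marquette's, so the 5-point gap means little.
```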

  6. Pingback: Open Enrollment: Board Says No to Recommending 3% Cap | Ed Hughes School Blog
