Volume 17, Number 4, January/February 2005
The problems behind the Fraser Institute awards
Many media outlets have carried news that five Vancouver public schools (Magee, King George, Kitsilano, Byng, and U-Hill) have refused a nomination for an award from the Fraser Institute.
Alex Grant, principal of Magee, sent a polite note declining the nomination and stating he had little confidence in the institute’s "ability to conduct a meaningful assessment."
Peter Cowley, the institute’s director of school performance studies, asked for clarification. Here is Alex’s response.
Dear Mr. Cowley,
It was not my intent to enter into a debate regarding your awards program; however, I would be pleased to have Magee recognized for excellence in any aspect of our school program, including academic excellence, if the assessment of excellence were credible.
Your Report Card on Secondary Schools purports to rate and rank schools, yet it ignores entire dimensions of student performance. My criticism of the Fraser Institute ratings is not based solely on the fact that areas such as Music, Art, Drama, Business Education, Physical Education, Special Education, and Technical Studies are completely ignored; I object to the manipulation of a limited data set to produce a rank ordering of schools from 1 to 279 as though such a feat had meaning or validity.
One of the more obvious shortcomings of your data analysis is evident in your rationale for including a rating according to the number of provincially examinable courses taken per student. It is simplistic in the extreme to suggest that, "for most students a decision to take advantage of these courses is a good one and a school that is successful in encouraging students to take these courses shows that it offers practical, well-informed counselling." This is analogous to ranking diets on the basis of their recommended intake of protein; since protein is essential for a healthy diet, those that recommend the greatest daily intake of protein (regardless of age, sex, weight, medical condition, etc.) must be the healthiest!
A hallmark of practical, well-informed counselling is the tailoring of advice to the individual’s talents, interests, and aspirations. Among those students who plan to attend university, three to five provincially examinable courses may be required for admission to a range of programs, although certain specialized programs require or recommend some Grade 12 level courses that are not provincially examinable. Once admission requirements are met, there is no compelling argument for assigning value to a provincially examinable course and no value to a non-examinable course. If Magee students stopped enrolling in Music courses and switched to provincially examinable courses (even if those courses were neither of interest to the students nor required for their post-secondary plans), then Magee’s ranking on this indicator would go up. This is not evidence of a well-considered indicator.
Each of the eight indicators you have chosen has some surface validity; however, by limiting your assessment to those indicators that are easily reduced to a number (and then forcing a rating of 0 to 10 for each, coupled with some dubious statistical methods, including calculating an average examination percentage from a set of weighted mean scores on a diverse set of examinations), you have missed most of what really matters in schools. The main reason that you have no competition in the business of ranking schools is that those who understand schools also understand how complex, contextual, subjective, and in the end, how meaningless an exercise it is.
You ask for suggestions regarding output measures for music and/or art. Your question betrays the flaw at the heart of your ranking/rating system: if it cannot be easily measured or reduced to a number, it will not fit into your system. I do not mean to suggest that it is not possible to distinguish between music or art programs of greater or lesser quality; rather, it is possible but not simple. After spending a bit of time considering how one might rate or compare a choral jazz program to a classical strings program, it is quite natural to question just what is gained and what is lost by reducing an entire school to a number that is reported to one decimal place.
I note that you have plans to include participation in inter-school sports teams as a new indicator in future ratings. Again, you may be taking something with surface validity and reducing it to a meaningless number by ignoring contextual issues such as:
- Are staff members required by contract to coach or sponsor teams?
- Are students required to sign up for one or more sports?
- Is there a strong community-based league in competition with schools for the athletes’ time?
- Does the school have a strong intramural program?
- To what extent do climate, community facilities, cultural, and socio-economic factors limit or encourage participation?
- If some participation is good, is more necessarily better?
In summation, the best suggestion I can offer in regard to improving your Report Card (short of eliminating it) is to stop amalgamating disparate types of data to produce a single rating/ranking for schools. It would be more honest to report average examination scores by subject and weight each of them according to participation rates. The disaggregated data are much more meaningful and provide a clearer picture of student performance. It would also help to clarify that you are not actually rating schools (they are far too complex for your limited data set to capture); you are rating student performance on provincial examinations.
Alex Grant, principal, Magee Secondary School, Vancouver.