OECD (2011) Education at a Glance 2011: OECD Indicators, Paris: OECD.
For those of you interested in educational statistics (and for masochists in general), here’s your early Christmas treat. Any attempt at summarizing this 490-page report is futile, but I’ve picked out some data that may be of interest to readers of this site. Italics denote direct quotes from the report – important because I may have misinterpreted some of the data not in italics, so check with the original!
81% with upper secondary education and 37% with tertiary (post-secondary) degrees across the OECD in 2009
The proportion of those with tertiary qualifications has risen from 13% (for the cohort born 1933-42) to 37% in 2009 [for the 34 countries that are full OECD members].
Korea: from 21st to 1st in tertiary education attainment; Germany has made the least progress; the USA is in the middle; Canada is second in tertiary attainment; China has 12% of all graduates worldwide
The growth rate at the tertiary level has been relatively slow in the United States, where attainment was originally relatively high, and in Germany, which had lower levels of attainment. In contrast, Japan and Korea have made higher education dramatically more accessible. In both countries, among the cohort born 1933-42, only about one in ten had tertiary qualifications by late in their working lives. Among younger Japanese and Koreans, who reached graduation age around the turn of the millennium, most now have tertiary degrees. On this measure, Korea has moved from the 21st to the first rank among 25 OECD countries with comparable data. The United States now shows just over the average proportion of tertiary-level graduates at age 25-34. In Europe, Germany stands out as the country that has made the least progress: it has a population of tertiary graduates only around half the size, relative to its total population, of many of its neighbours.
Canada is just behind Korea in the proportion of 25-34 year-olds with a tertiary qualification (just under 60% – Chart A1.1, p. 30).
While the level of tertiary attainment in China is still low, because of the size of its population, China still holds some 12% of all tertiary graduates, compared with 11% in Japan and 26% in the USA.
How important are OECD indicators?
At one level, indicators are no more than a metric for gauging progress towards goals. Yet increasingly, they are performing a more influential role. Indicators can prompt change by raising national concern over weak educational outcomes compared to international benchmarks; sometimes, they can even encourage stronger countries to consolidate their positions. When indicators build a profile of high-performing education systems, they can also inform the design of improvements for weaker systems.
Other stuff
Tons of it. There is a lot of data on upper secondary and post-secondary graduation rates, on vocational education, and on the relationship between qualifications and earnings.
What does it mean?
It would really help if the OECD produced a simple ‘key facts’ document or brief executive summary highlighting the main, clear points to come from the study. Why am I having to do this? Surely the OECD should be confident in its own data. I suspect, though, that a simple fact sheet would be difficult to produce, given the rather abstruse and complicated methods used to make figures comparable across different countries. That in turn raises the question of why all this is being done, if clear conclusions cannot be drawn from many of the statistics they have generated.
A lot of the data in this report are difficult to understand or interpret (and yes, I did study statistics at university, though I didn’t specialize in it). For instance, from what I can judge, undergraduate completion rates look very low, averaging around 39% for ‘abstract’ tertiary-type A programs (with Canada slightly below average). Are our completion rates really so low? Do barely two in five students graduate? If not, what is this table comparing?
I suspect that these figures reflect the particular way the OECD statisticians have chosen to measure the graduation rate, but it is not at all clear how that is done without spending ages checking the original methodology, and even then I am still not clear what this figure actually represents. (Surely it should show what proportion of students who entered an undergraduate program graduated within four, five, six or more years; but the data aren’t collected that way – see the sketch below.) Comparisons are further complicated by the fact that a bachelor’s degree takes three years in many countries and four or even five in others. Rather than taking general national statistics from survey data and then trying to make them ‘fit’ questions the data were never intended to answer, these questions would be better addressed through specific studies. Canada in particular is a nightmare for such statistical comparisons: we don’t have a national system, so the data have to be collected from each province, which introduces yet more variables and inconsistency. And how meaningful is the distinction between tertiary-type A and type B programs? Who decides?
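My best guess, purely for illustration (the definitions and numbers below are mine, not necessarily the OECD’s): the report’s figure may be a cross-sectional graduation rate rather than a cohort completion rate:

\[
\text{completion rate} = \frac{\text{graduates from an entry cohort}}{\text{entrants in that cohort}},
\qquad
\text{graduation rate} = \frac{\text{graduates in year } t}{\text{population at typical graduation age in year } t}.
\]

On those definitions the two can diverge sharply. If, say, 55% of an age cohort ever enters a type A program, and 70% of those entrants finish, then the graduation rate works out to \(0.55 \times 0.70 \approx 39\%\), even though seven out of ten students who actually start a degree complete it. If something like this is going on, the table is answering a quite different question from the one its label suggests.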
Even more worrying is the report’s comment on how indicators have been driving government behaviour. If I were a minister, I’d want my own statistician at my side before making any decisions based on these data, which really shouldn’t be necessary. For instance, is a degree from Germany the same as one from Slovakia? The OECD indicators suggest that it would be better to have more degrees at a lower standard than fewer degrees at a higher standard. Of course, no one at the OECD would make that argument, but it is strongly implicit in the rankings. Never mind the quality, feel the width.
The attempt to gather cross-national data in a comparable way is commendable, and long-term trends certainly become more discernible, but it does sometimes feel as if the OECD is trying to do the impossible: forcing results from very different systems through the same sausage machine.