Tertiary performance

The results of performance-ranking criteria designed to make tertiary institutions more accountable were released last week.

The material covers successful course completion, completion of qualifications, student progression to higher study and student retention.

The Government spends $2.2 billion directly on tertiary tuition subsidies and $1.8 billion on student loans and allowances.

It is little wonder, therefore, that "accountability" to the taxpayer is necessary.

This latest exercise, nevertheless, raises the usual problems that arise when "blunt instruments" are applied in an effort to ensure public money is well spent.

First is the wide diversity in what various universities and polytechnics do.

How can they be lined up in "league" tables when they are so different? Secondly, these assessments - which from 2012 will be linked with 5% of funding - can distort what institutions do.

All will "play the game" as much as they can, just as they do with performance-based research funding.

Making sure the right boxes are ticked becomes more important than simply doing the job as effectively as possible. Thirdly, such mechanisms are a boost to unwieldy and costly bureaucracy, whether in the Tertiary Education Commission or in the institutions themselves.

Staff are required to produce and analyse information and to make sure the game is played to best advantage.

Those who warn the information should be interpreted with considerable caution are correct.

The commission would have it that the data will help students make choices, improve public accountability and encourage improvements.

Taken at face value, Otago and Auckland universities top the charts, so they would appear to be the best places to attend.

This may well be so, and from a parochial point of view we both think and hope so.

But overall ratings can mask wide variations within universities, particularly those with more than 20,000 students and an extensive array of courses.

What, too, about the student mix? Otago attracts a disproportionate number of students from higher socio-economic groups and these (mostly) young men and women will tend to do better whatever the institution.

This is like the head-start a decile 10 school has over one from a poorer area.

Massey University, at the other end of the tables, with its many part-time extramural and older students, has cried foul.

Many of its students, like those in various polytechnic sub-degree programmes, are there to complete specific courses but not full qualifications.

It should come as little surprise, then, that they fare poorly on the criterion of qualification completion.

What, too, about the incentives created to make it easier to pass courses, thereby lowering standards? Are the measures an inducement not simply to improve quality and support for students, but also to slacken standards at the margins? And what about pressure to introduce or drop programmes solely because of their effect on rankings and funding?

Otago Polytechnic, meanwhile, is able to crow about being the best-performing educational organisation, with 91% of its students completing degrees.

While that seems impressive, and probably is, what that means and why it has occurred could be open to different interpretations.

Low in all the standards is the Invercargill-based Southern Institute of Technology.

Has the free-fees scheme had unintended consequences? Does, in fact, this public information reveal that all is not well and significant improvements are needed? It sounds as though the institute will be in challenging discussions with commission "investment managers", as they are known.

At least the commission recognises that valid reasons can explain why some figures in some places appear deficient.

And at least, despite all the report's limitations, efforts are being made to ensure public money is better spent.