Since the current hot topic is the appointment of a new UM VC, perhaps a few words of friendly advice are relevant. And since the previous VC was (in)famous for bragging that UM was among the top 100 and later top 200 universities in the world, let us revisit those rankings.
I have to thank Dr. Richard Holmes, who sent me a link to his website on university rankings before my prelim exams, and I apologize to him for the delay in publishing this post. He has an excellent article, published in the Asian Journal of University Education, on the shortcomings of the THES ranking system, and I want to highlight some salient points for the benefit of the readers and perhaps for Dr Sharifah Hapsah, if she ever gets to read this blog. (Dr. Holmes is currently teaching at MARA.)
His article notes that the firm that was given the task of compiling the rankings, a certain QS Quacquarelli Symonds, "does not seem to have any specialized knowledge of research and teaching in the natural and social sciences or the humanities". Rather, it is a company that specializes in the promotion of MBA programs and executive recruitment. The fact that it has offices in Washington DC, Paris, Beijing, Singapore, Tokyo and Sydney, Dr. Holmes suggests, can partly explain the bias of the rankings towards universities in certain countries.
This company's ignorance of university education worldwide is shown using two examples. The first is its mistake in coding non-Malays in Malaysian universities as foreigners, which this blog highlighted even before QS admitted to it. The second example that Dr. Holmes gives is that the company listed "Beijing University" as the top university in Asia even though, strictly speaking, there is no such thing as "Beijing University". A simple Google search of "Beijing University" would reveal that it is actually "Peking University" and that there are a number of specialist "Beijing" universities in different fields which are not associated with "Peking University", the premier university in China.
In addition, "QSÂs managing director, Nunzio Quacquarelli, is on record as telling a meeting in Malaysia that the reason for the contrast between Beijing UniversityÂs stellar score on the peer review and its score of zero for citations of research was that Âthey probably published their work in Mandarin but we just couldnÂt find the journals (New Straits Times, 22/11/2005). Had they looked for research from Peking University, which is how researchers describe their affiliation in academic journals, they would have found quite a bit. It looks as though some people in QS were unaware of the universityÂs official name." (Holmes, 2006) This makes me wonder about the ability of QS to conduct a survey of this nature. Did they offer a good 'rate' to the Guardian? Was it a quid pro quo thing? Why not ask a survey firm such as AC Nielson who have offices in many more countries than QS and would presumably be more experienced in conducting surveys of this nature? (My suspicion is that there larger, global survey firms were probably too expensive)
Dr. Holmes brings up the point that the peer review category, which constitutes 40% of the overall score, is the most problematic of all the categories used in the THES ranking. It lacks transparency, especially with regard to the selection and sample size of the participants in the peer review survey.
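To see just how much that 40% weighting dominates, here is a minimal sketch of a THES-style composite score, in Python purely for illustration. The category names and the remaining weightings (recruiter review, faculty-student ratio, citations, international faculty and students) are my assumptions based on the commonly reported 2005 methodology, not QS's actual calculation, and the example numbers are hypothetical.

```python
# A minimal sketch of a THES-style weighted composite score.
# Weightings below are assumptions based on the reported 2005 methodology.
WEIGHTS = {
    "peer_review": 0.40,
    "recruiter_review": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def composite_score(scores):
    """Weighted sum of normalised (0-100) category scores."""
    return sum(WEIGHTS[cat] * scores.get(cat, 0) for cat in WEIGHTS)

# Hypothetical university: strong everywhere except the opaque peer review.
example = {
    "peer_review": 20,
    "recruiter_review": 80,
    "faculty_student_ratio": 80,
    "citations_per_faculty": 80,
    "international_faculty": 80,
    "international_students": 80,
}
print(composite_score(example))  # 56.0
```

In other words, a university scoring 80 on every hard measure but 20 on the peer review survey ends up with a middling 56 overall, which is why the lack of transparency in that one category matters so much.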
Dr. Holmes' criticism probably confirms what other people who have examined the rankings in depth feel: "It is difficult to avoid the suspicion that the peer review was based on convenience sampling, with QS simply asking those that they had come across during their consultancy activities. This would explain the presence in the top 200 of several apparently undistinguished universities from France, Australia and China where the consultants have offices and the comparative scarcity of universities from Eastern Europe, Israel, Taiwan and Canada where they do not."
Dr. Holmes also makes a convincing point that the peer review is not really an international ranking, since academics were asked "to name 'the top universities in the subject areas and the geographical regions in which they have expertise'. In other words Chinese physicists, we can only assume, were not asked to name the best university for physics in the world but to name the best university for, say, nuclear physics in Asia, maybe even just in China. If this is the case, these are not then world rankings."
Dr. Holmes brings up many more criticisms of the THES survey, including the dubious methodology of the recruiter ratings, the shaky theoretical foundations of using measures of international students and faculty, and the bias against the social sciences and humanities in the citation score. I'd encourage anyone who is interested in the THES rankings to read his article in depth, and I'd definitely encourage the new UM VC and the new Minister for Higher Education to read it as well.
Finally, there are two ways in which Malaysian universities can improve their rankings among the top universities in the world. One way is easier; the other takes more effort and will entail painful institutional changes.
The easy way includes taking the following measures, assuming that QS continues to conduct the THES survey using a similar methodology:
1) Have a lot of tie-ups with other universities to offer a multitude of MBA courses. Better yet, have tie-ups with universities that are clients of QS. That is a surefire way to be featured more prominently and positively on QS's radar screen.
2) Track down the academic 'experts' which QS uses for the peer review and offer them 'incentives' to rank UM highly.
3) Hire a bunch of foreign lecturers regardless of their qualifications.
4) Open up places in local varsities to foreign students regardless of their qualifications.
5) Have the local universities direct QS to employers that will rate the local universities favorably. Better yet, go directly to these employers and offer them 'incentives' to rate the local varsities highly.
The second way involves painful institutional changes but will ensure a genuine improvement in the quality of Malaysian universities in the medium to long term, regardless of the methodology used or the consultant employed to compile these rankings:
1) Hire, fire and promote lecturers based on academic work, using objective criteria such as publications in highly acclaimed journals or the publication of widely acknowledged books and research in the field.
2) Make appointments to positions of administrative leadership (VC, deputy VC, heads of departments) based on the ability to improve academic standards and other objective criteria that are linked to academic standards.
3) Based on the above two recommendations, hiring and appointment policies should be race-blind.
4) Create incentives for raising private funds / donations to the local universities so that resources and infrastructure can be improved and better pay can be awarded to distinguished faculty members.
5) Create incentives for members of academia to work with the private sector on research projects, so as to obtain external funding as well as to leverage the expertise available in the private sector.
So, which path do you think is most likely to be taken?
UM should institutionalise the tenure system à la the American system. Tenure should be granted subject to a refereeing process of the lecturer's publications by a set of external referees.
Learn from the mistakes of Charles Sturt University, whereby continuously charging students with no qualification in sight is not the way to go.
http://en.wikipedia.org/wiki/1421_theory
Actually, Dr. Holmes' ranking criteria are also highly subjective, especially categories E, F and G.
Cat C (total number of teachers) should be further sub-divided by the qualifications held by these lecturers, with higher scores given to PhD holders, Fellows, etc.
The number of full-time research-only staff and the total funding of research grants received should also be included.
A small percentage should also be given to academic affiliations, partnerships and groupings like U21, Go8, Russell, etc. So, for example, Melbourne U and UWA, which are members of the Go8, could be given 3 points each (a rough sketch of this point scheme follows below).
Honours and awards such as the Nobel Prize, Fields Medal, etc. should be given some percentage, but not a large proportion. Such awards are highly biased towards Western countries but nevertheless should not be discarded.
These suggestions are more objective, and ready statistics are available. Basically, any ranking criteria should be based on a fair and just system that is not biased towards any organisation, publishing company, etc. Hence my reasoning for saying that cats E, F and G, all based on Y! and Google, are not suitable.
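To make the suggestion concrete, here is a rough sketch of how the sub-divided Cat C and the affiliation bonus might be scored. Every point value, the helper names and the example figures are hypothetical illustrations of my own, not anything specified above.

```python
# A rough sketch of the scoring scheme suggested above; all point values
# and example figures are hypothetical illustrations.
def staff_points(phd_holders, fellows, other_teachers):
    """Sub-divide Cat C by qualification: higher scores for PhD holders and Fellows."""
    return 3 * phd_holders + 2 * fellows + 1 * other_teachers

def affiliation_points(groups):
    """Small fixed bonus per membership in groupings like Go8, U21, Russell."""
    return 3 * len(groups)

# Hypothetical Go8 member with 400 PhD holders, 50 Fellows and 150 other teachers.
total = staff_points(400, 50, 150) + affiliation_points({"Go8"})
print(total)  # 1453
```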
Actually, all this has got me thinking. Why not set up our own rankings system? An open-source system, so to speak, where anyone can suggest their methodologies and a committee then discusses their suitability. Sort of like the RFCs used by the IETF. All of this would go through a structured process run by volunteering academicians of high calibre. A select committee would meet bimonthly to discuss and approve any recommendations. Or else, we could just start really small, with Mr Tony leading the committee, which hopefully can include some notable academicians. The emphasis of such a ranking system should be to create one that is as concise as possible. This includes such subjective areas as peer review and employment prospects, but such criteria should not be given a heavy weightage. Volunteers (students and academicians) from all over the world, or maybe starting with just Asia, can report any discrepancies or faulty data about their universities. Therefore, a problem like Beijing vs Peking University would not arise. The key here is open source, à la Wikipedia.
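If it helps to picture that RFC-style process, here is a toy sketch in Python. The states, field names and the approval step are all my own illustrative assumptions, not a spec for any existing system.

```python
# A toy sketch of the RFC-style workflow described above; the states,
# field names and the review step are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Proposal:
    author: str
    methodology: str                  # e.g. "weight ISI-indexed publications at 30%"
    status: str = "submitted"         # submitted -> under_review -> approved/rejected
    discrepancy_reports: list = field(default_factory=list)

def committee_review(proposal, approve):
    """The select committee meets (say, bimonthly) and settles each proposal."""
    proposal.status = "approved" if approve else "rejected"
    return proposal

p = Proposal("volunteer_academic", "cap peer review weighting at 10%")
print(committee_review(p, approve=True).status)  # approved
```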
I don't think the ranking system that Dr. Holmes suggested is any better than that of THES. It has its own set of discrepancies and, again, it's highly subjective, especially in categories E, F and G.
As the previous commenter wrote, there are other important criteria, which I thought were quite necessary for ranking a university, that were not included: research activities (grants, funding), research staff, honours and awards (at the international level), and a publication score for each university from ISI-indexed journals only (Google is not too bad an option, but you do get dubious search hits which may be inaccurate).
Category G, in particular, is heavily weighted (!) and quite unnecessary. From my own experience, Yahoo! does not provide good search hits when it comes to "research activities", so to speak.
I must add as well that many universities in the ranking were given 0 points due to an apparent lack of information. Why is that so? No explanation at all. If there is such a lack of information and statistics, it's not possible to do any credible ranking.
Furthermore, much of the explanation of the methods was poorly documented, and it does not shed much light on the 'whats', 'hows' and 'whys' of the criteria selection, scoring and nature of the research conducted.
I'm feeling discontented as I think (and am more than sure) my university is misrepresented in its scores (a lot of 0 points were given due to a lack of information in many categories!). We are very active in research work, acquisition of funds and grants, and journal publishing at the international level (a simple Google Scholar search will reveal that).
Why would this ranking system of a certain Dr. Holmes be any better than THES's or even Shanghai Jiao Tong's (which I think is the most accurate ranking thus far)?
I don't know about you, KM, but wouldn't you sit down and give some thought to that ranking criteria?
quote from article:
3) Hire a bunch of foreign lecturers regardless of their qualifications.
4) Open up places in local varsities to foreign students regardless of their qualifications.
UM has been doing no. 3 in the Faculty of Science's chemistry department. My friend is reading chemistry there. The few foreign lecturers they have are basically flower pots --- they are pretty... pretty useless, according to him.
UKM is doing no. 4. Quite a number of Middle East, African and Indonesian students. I may be biased against them, but I'm not sure how good they really are.