Friday, December 15, 2006

It takes a kampung...

I posted my findings on USM's faculty about a week back. Thanks for all your comments. I thought it would be useful to do a follow-up by examining each faculty / school within USM (or any other public university) in greater detail. But such an undertaking is too vast for me or Tony to do by ourselves. Hence, I'm inviting our readers to join us in a little bit of 'investigative' blogging.

I'm asking our readers who are interested to do the following:

1) Pick a faculty, department, or school (e.g. Public Administration, Electrical Engineering, History, Chemistry) within a public university with which you are relatively familiar (either in terms of the subject matter, the faculty, or both)

2) Compile a list of full time teaching faculty within this department (for USM, go to this link, for other universities, you can visit that university's website)

3) Do a systematic search of the publication record of each of the teaching faculty in that department / school. This can get pretty complicated but I can think of a couple of ways to do this. If you have access to some sort of journal database (especially if you're currently in a university setting), you can search these databases for articles by these faculty members. One such example is Thomson's Web of Science, or IEEE for electrical engineers. If you don't have access to such journal databases, you can use Google Scholar for a much less scientific / systematic way of finding articles by certain authors. (The advantage of Google Scholar is that it picks up books as well as articles)

4) Compile some sort of ranking system or calculation method for the articles or books published by these faculty members. For example, you might want to count the number of articles a faculty member has published in high impact journals. (For a list of such journals, you might want to visit this link provided to us by "Your Fellow Anon", one of our readers.) You would also want to evaluate the impact / importance of books written by these faculty members. For example, a translated textbook (from English into Malay) might not have as high an impact as a book which is trying to show or prove a new hypothesis.

I can think of a number of reasons why something like this would come in handy. It would be a good judge of whether a professor or associate professor has a good publication record or if he or she has gotten that title based on other factors. It would be the basis for comparing across the public universities in Malaysia, if we have a large enough sample size. It would give us different methodologies by which to rank different departments. It would give us some indication of whether our younger faculty members are keeping up in the publishing 'contest', so to speak.

5) Whatever the methodology used, it should be one that is systematic and consistent. We're not trying to 'target' any particular faculty member. We're just trying to evaluate the quality of a department as a whole. If we find certain faculty members who have a publication record that is not commensurate with his or her position, we'll highlight this fact. But we won't do a publication search just for one or two people in a department because we don't happen to like them.

6) The reason I need more people to get involved is that different people have different knowledge areas and hence are in better positions to judge what a good publication record is in their area. For example, I would have no idea how to judge whether a professor in the computer science department has a good publication record or not.
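To make step 4 concrete, here is a minimal sketch of one possible scoring scheme: counting, for each faculty member, how many of their articles appeared in journals above some impact-factor cut-off. All names, journals, and numbers below are entirely hypothetical, and the threshold itself is an arbitrary choice that would vary by discipline (as a commenter below also notes).

```python
# Hypothetical (author, journal, journal impact factor) records,
# as might be compiled from a database search in step 3.
publications = [
    ("Dr. A", "Journal of X", 3.1),
    ("Dr. A", "Journal of Y", 0.8),
    ("Dr. B", "Journal of X", 3.1),
    ("Dr. C", "Journal of Z", 1.2),
]

def high_impact_counts(records, threshold=2.0):
    """Return {author: number of articles in journals at or above the threshold}."""
    counts = {}
    for author, _journal, impact in records:
        counts.setdefault(author, 0)  # ensure every author appears, even with 0
        if impact >= threshold:
            counts[author] += 1
    return counts

print(high_impact_counts(publications))
# e.g. {'Dr. A': 1, 'Dr. B': 1, 'Dr. C': 0}
```

This is only one possible yardstick; a fuller method might weight by citation counts or handle books separately, as the post suggests.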

So, if you're interested in joining this informal 'project', please email me at We'll post your findings (with complete attribution to you, of course, unless you wish to remain anonymous) on this blog.

P.S. In case some of you are wondering about the title, it's a spin-off from Hillary Clinton's book, "It Takes a Village". In our case, we need the 'kampung' of our readers to come together to collaborate on this 'project'.


Anonymous said...

Hi KM,
I've always found your blog very well written and interesting, and I'm a frequent reader, though this is my first time commenting.

I never went beyond a Master's degree as I work in industry, but I do keep a continuing interest in the research related to my work area.

I've had some contact with the academic staff at USM, and my opinion is mixed. There are some excellent people there, many so-so, and a fair number of incompetent folks who don't deserve their positions.
I fully agree the bottom line is the quality of the research which is turned out. The wider academic world has a well defined system of peer review on research papers, and it shouldn't be too difficult to have a yardstick put in place for assessing the local academics.

I very much agree with you about the existence of unofficial but very real networking at a postgraduate level. For a PhD or Masters candidate, a recommendation to faculty members from a well regarded previous postgraduate student can be a very valuable addition to their formal academic qualifications. The networking can be very helpful later on in an academic career too.

I tend to disagree with your assessment based on the % of PhDs from elite UK and US universities. No doubt Oxford, Cambridge and Harvard have excellent academic standards. But the sum total of human knowledge is very, very wide, and Oxbridge and Harvard are not necessarily the cream in many fields of knowledge. If you dive deeper into scientific and technical publications, a great deal of cutting edge research comes out of the "provincial" universities you listed e.g. Cardiff, Sheffield, Birmingham and Manchester, and Pittsburgh in the US.

My particular knowledge area is engineering and manufacturing, and I'm well aware a lot of the best research is coming out of universities based in industrial cities, NOT Oxbridge or Harvard.

Kian Ming said...

Thanks, kittykat46, for your insightful remarks. I just want to qualify that I used the overall THES UK rankings and the overall US News and World Report because it was the easiest and most consistent 1st cut method to use. I agree that certain universities such as Sheffield and Birmingham in the UK and Pittsburgh in the US have top notch research going on in fields such as engineering and manufacturing.

To be consistent, I should do a 2nd cut of analysis examining the rankings of universities based on different fields. But this is very time consuming and will probably take me another week, if not more. What I think I will do is to pick a few departments in USM (maybe those with more than 10 faculty members listed) and then do a more in-depth analysis of these selected departments.

But of course, one might criticize the ranking of certain universities by individual fields as well. It's an imperfect system but we have to use some sort of baseline.

Anonymous said...

Hi Kian Ming,

I realised that within a department itself (say of 20 faculty members), there can be a HUGE skew in terms of publications. One professor may have up to 200 publications in peer-reviewed journals, while another colleague has just 10-20 publications, or none.

Another thing about professors at the medical faculty is that they are heavily involved in patient care. They are very brilliant and do all sorts of complicated procedures. They work from 8am till 10pm at night, almost every day, leaving no time for writing research papers. It would be most unfair if we were to categorise such people as "non-deserving" professors. These academicians have different priorities, especially when teaching staff are in short supply.

Anonymous said...

Regarding the following in (4):
"Compile some sort of ranking system or calculation method of the articles or books published by these faculty members. For example, you might want to count the number of articles a faculty member has published in high impact journals."
The high impact factor of journals can be a controversial matter. First of all, it varies across disciplines, so any comparison made should be restricted to a particular discipline. Secondly, it does not necessarily reflect the quality of the journal; for instance, theoretical journals may have low impact factors because the pool of researchers in the particular field is small. The impact factor of a journal also does not measure the impact of the article published: an article in a lower impact journal can garner higher citation counts than an article in a relatively higher impact journal (of course, this refers to articles already in print for quite some time). Nevertheless, the impact factor of a journal (and also the citation counts of a researcher) can be a useful indicator, but should be used with care.

Anonymous said...

page 18 Sunday Star 17 Dec. 06

Thanks to Tony and Kian Ming.