
Two Qatar-based universities are among the top five higher education institutions in the Middle East and North Africa based on the quality of their research, according to a new index that includes branch campuses for the first time.
Texas A&M University at Qatar, which specializes in engineering programs, took the top spot in a snapshot of the Times Higher Education (THE) ranking, while Qatar University (QU) came in fourth place out of nearly 100 universities in the region that were examined.
THE has so far released only the top five institutions and is expected to reveal the full ranking of 30 universities at its MENA Universities Summit, which it will host at QU on Feb. 23-24.
Lebanon also did well, with two of its universities making the cut in the “sneak peek” list. Saudi Arabia’s King Abdulaziz University came in third place.
- Texas A&M at Qatar
- Lebanese American University
- King Abdulaziz University (Saudi Arabia)
- Qatar University
- American University of Beirut
Scoring system
This latest table judged the universities on a single aspect: the excellence of their research, based on Elsevier’s Scopus database. It used a metric the rankings organization calls “field weighted citation impact,” which measures the ratio of citations an institution’s publications actually received against the number they would be expected to receive given the average for their subject fields.
Articles, reviews and conference papers published between 2009 and 2013 across all subject disciplines were examined, although universities had to publish a minimum of 50 papers a year to qualify.
It did not consider the overall volume of research published – a factor that is part of an ongoing debate among academic experts over how universities should be rated.
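To give a rough sense of how a metric like this behaves, here is a minimal sketch of a field-weighted citation ratio. The function, field names, and baseline numbers are invented for illustration; they are not THE’s or Elsevier’s actual code or data.

```python
# Minimal sketch of a field-weighted citation impact (FWCI) style calculation.
# All names and figures below are hypothetical examples, not real Scopus data.

def fwci(institution_papers, field_baselines):
    """institution_papers: list of (field, citations) pairs for one institution.
    field_baselines: dict mapping field -> average citations a paper in that
    field would be expected to receive over the same window."""
    actual = sum(citations for _, citations in institution_papers)
    expected = sum(field_baselines[field] for field, _ in institution_papers)
    return actual / expected  # 1.0 means exactly the field-average performance


# Hypothetical institution: three papers across two fields with different norms.
papers = [("engineering", 12), ("engineering", 4), ("chemistry", 30)]
baselines = {"engineering": 6.0, "chemistry": 15.0}
print(f"{fwci(papers, baselines):.2f}")  # (12+4+30) / (6+6+15) = 46/27, about 1.70
```

A ratio above 1.0 indicates the institution’s papers are cited more than the field average would predict, regardless of how many papers it published, which is why this measure is independent of overall research volume.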
THE has said that the purpose of its upcoming summit is for academic experts to discuss the factors that should be included in an index of universities in the region, and a full ranking is expected to be released next year.
For example, most Western rankings do not consider that some research in the region may only be published in Arabic and thus not garner the same number of citations outside of the region as an English paper.
Other rankings
The early results from THE’s table of top performers in the region are at odds with other recently published rankings, which did not include branch campuses but placed QU further down their tables.
For example, in the region’s first-ever higher education ranking by Washington-based US News & World Report, QU came in 29th place out of a total of 91 MENA schools.
In this index, both the quality and quantity of research were considered, and the company looked at schools that had published 400 or more papers between 2009 and 2013.
Universities were then ranked according to nine weighted indicators, including the number of published papers and how frequently their research is cited in other articles.
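A weighted-indicator ranking of this kind boils down to a weighted sum of normalized scores. The toy sketch below illustrates the idea only; the indicator names and weights are made up, since the article does not list US News’s actual nine indicators or how they are weighted.

```python
# Illustrative weighted-indicator score; indicator names, values, and weights
# are hypothetical and do not reflect US News's real methodology.

indicators = {                      # scores normalized to 0-100 for one school
    "papers_published": 72.0,
    "citations_per_paper": 65.0,
    "international_collaboration": 58.0,
}
weights = {                         # weights chosen arbitrarily, summing to 1.0
    "papers_published": 0.4,
    "citations_per_paper": 0.4,
    "international_collaboration": 0.2,
}

overall = sum(weights[k] * indicators[k] for k in indicators)
print(round(overall, 1))            # 0.4*72 + 0.4*65 + 0.2*58 = 66.4
```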
Meanwhile, the QS Intelligence Unit’s MENA rankings, which were also released at the end of that year, put QU in 16th position overall for the region.
This measured a number of different factors, including academic reputation and employer reputation, in addition to institutions’ research impact as gauged by papers per faculty member and citations per paper.
Thoughts?
I must say the engineering graduates produced by Texas A&M are head and shoulders above QU’s, and I would never hire anyone graduating from a Saudi university. The education there is blinkered and closed-minded; they are not prepared for the real world, well certainly not for employment outside of Saudi.
Re: King Abdulaziz University being #3, you might want to consider this: https://liorpachter.wordpress.com/2014/10/31/to-some-a-citation-is-worth-3-per-year/
They’re buying, not working and earning, their way into high rankings.
This is something that should be celebrated more than the Handball World Cup.
Why?
Might have been cheaper to buy.
kill yourself
Long-term impact on the development of the education system?
Do you really need an answer to this?
I agree
Unfortunately, humanities research is not indexed in Scopus, so research contributed from other disciplines is not factored into these metrics. A comprehensive analysis including all disciplines might change these rankings.
Incorrect, actually. Scopus does index some humanities, though its coverage generally skews toward science and technology. http://www.elsevier.com/online-tools/scopus/content-overview
Ranking universities is useless.
Unless you go to a really bad one and then you might care.
I am for: how many Nobel Prize laureates has the university produced? None? Then it is a bad university.
How many Western universities produced Nobel laureates in their first 5 years or decade of activity?
The article talks about the QF universities, which are foreign and very, very far from their first decade of activity, as is QF in general.
Qatar University is not QF, and far from foreign … but I think I understand what you mean … and agree …
Agreed, but it is 40ish years old and not a newcomer. The other institutions are foreign, not Qatari.
Not all of the Colleges are 40 years old. Engineering and Bus & Econ are established as you mention. http://en.wikipedia.org/wiki/Qatar_University#Colleges_and_Departments
Moreover, QU didn’t really reach the size of a competitive institution until the 2003-2007 period, with a boost in faculty, student enrollment, administrative restructuring, and funding increases.
Good points all and agreed
No. The article talks about the Qatari branches, not the mother institutions. If it is the same thing as you are suggesting, then the Nobel laureates of Texas A&M in the US should be considered laureates of the Qatari branch as well.
The Qatari branches are part of the mother institutions and ultimately answer to them; QF has precious little educational input into them. Many researchers will start research and publish a few articles at the home campus and then carry on publishing on the topic while at the international branch campus. They are just that, branches of foreign institutions, not Qatari institutions, so the stats are misleading.
The Qatari branches are part of the home institution; the reverse does not apply, as they don’t have that level of autonomy. Home institutions would of course get partial accolades for the achievements of their branch campuses; the same cannot be said for branch campuses.
I do not get it. So you agree with me that it does not make sense to expect them to have Nobel laureates after 5 years of service?
Yes and no. I wouldn’t expect it after five years of service, but some of the universities are hundreds of years old, the branch campuses are only 10 or so.
More realistically, I wouldn’t expect it from the branch campuses because they don’t have the faculty who are going to earn them Nobels; by and large, the staff drawn to the campuses (with the possible exception of Georgetown) are not the Nobel sort. Do the mother institutions have the ability to earn Nobels? Yes. The supported foreign branches? No.
What you’re not accounting for is that many of these faculty are young and under a great deal of pressure to publish in good journals if they ever want to move out of here to another school. Now, the negative is that this pressure could adversely impact their teaching, and since they want good evals from students they might make tests easier than they should.
Yes, sadly the pressure that the faculty get to pass students of certain families and passports is well known.
haha butt hurt
The title of this article is misleading. The published ranking measures the size of a university’s research output, not its quality.
As we all know, quantity does not equal quality 😉
It’s not volume of publications: “The top-five snapshot was calculated using the ratio of the citations received by an institution’s publication output between 2009 and 2013 and the total citations that would be expected based on the average of the subject field.”
which is basically a measure of quantity, not quality.
The principle behind citation databases such as Scopus is that only “quality” peer-reviewed publications are indexed in the database in the first place. The methodology assumes that if a publication is highly cited by other publications within the database, that is (generally) also an indication of “quality” rather than just quantity. There are exceptions to the rule, such as the discredited MMR article by Wakefield, which was in The Lancet until they decided it was rubbish. Famously “bad” articles such as this get highly cited too, but you could argue these exceptions prove the rule.
I think AEC is correct. And these are pubs by grad students, post-docs and professors. Maybe rarely a student would get attached as co-author. There is a whole esoteric system behind which journals have impact factors and so on. But everyone in those disciplines knows which ones they are. Believe it or not there are some journals with low reputation such that publishing there can even reflect negatively on you if you have too many of these on your c.v. Those types of gut journals are not in an index like this.
A citation is a qualitative measure. The more of them you get, the more qualitative your work is, in theory obviously.
What? How do you figure that? Citation counts are quantitative, and in no way reflective of the quality of your work. That is where you get into impact factor, with the assumption that publications in rigorous journals are of higher quality and of greater impact than those in lesser journals. As for being ‘qualitative’ – no, not in the least. I think that you are confusing ‘quality’ and ‘qualitative’, very different beasts as they are used in research.
Consider citations as trophies. Not everyone gets cited, and much less in reputable journals. The more of these you get, the more likely it is that you have interesting work. Obviously, this is not totally accurate, but it should be counted as one of the qualitative characteristics in any ranking.
Interesting perhaps, but not necessarily of good quality. A university in Qatar has three advantages. 1) There are fewer universities in Qatar than in Saudi, for example, so maybe 5 research papers in Saudi each come from a different university, while in Qatar the 5 might come from 2 universities. As an example, let’s say as an A&M student I want to research whether groundwater is being contaminated by onshore drilling activities. 2) The lack of existing research here on that topic will get me citations even if the quality of the research is average or even bad, because of the limited data available. 3) The availability of research grants here is probably better than in Lebanon, which helps cover the cost of purchasing or using new technologies on projects.
Other countries on the list probably have advantages as well, but I’m not familiar with other academic institutions in the region. My point is that rating quality is quite complex and possibly subjective, depending on what factors you use to measure and rate.
fail
Saudi universities have been buying inflated article ratings for a while now
https://www.sciencemag.org/content/334/6061/1344?related-urls=yes&legid=sci;334/6061/1344
https://liorpachter.wordpress.com/2014/10/31/to-some-a-citation-is-worth-3-per-year/
The rumor is that QU is moving in that direction with some recent hires.
If I’m not mistaken, the good thing about this kind of rating is that it is based on a less subjective measure. However, looking at the two articles you cite, I can see that even this system could be manipulated, but I suspect that over the long run the practice will have lower impact on this particular index than on the US News one. There’s bound to be some push back from their actual home departments. Of course, the other side of focusing on citation and impact is that it’s not effective as a measure of teaching success or of whether students actually go on to get decent jobs and make meaningful impacts. But these latter soft measures are precisely where the US News style ratings fall down and become susceptible to bullying and bribing the refs. Those ratings systems have lost a great deal of credibility among professionals, if not among high school parents and prospective students. At least this measure indicates that T A&M-Q and QU academics and post-docs are getting published and their pubs are getting cited in journals with high impact ratings. That is indeed something positive.
Re the QU rumor: Source? Or heard from a friend who heard from a friend who heard it from the Bureau of Somebody’s Butt (BSB)? Truly not meant as a personal attack; it’s just that when throwing stuff like that into the mix, we all know lots of things get floated that fly in all different directions with no actual evidence. So ‘grain of salt’ territory.
Rumours are by their nature unsourced. QU was my employer for nearly a decade until recently. That was the water cooler gossip in my department, and, tied in with memos mandating the use of the QU name on all publications, it led to speculation. There was a huge push to raise QU’s publishing profile, some of it, imho, not fully ethical. As you say, hard to prove, but certainly plausible.
Ok. At least I have a sense of the sourcing of the rumor from your description. Thanks.
Looking at all the negative comments, I lost faith in some of the top DN commenters. But then again, I thought, that’s what the colonists always wanted: keep the natives uneducated, fearing that they may take back their jobs.