Wednesday, April 20, 2016

What’s missing in the Indian ranking for varsities?

The all-India ranking of higher education institutions released by the human resource development (HRD) ministry last week is being seen as a big step towards improving the quality of education imparted by Indian universities and making them globally competitive. While 100 institutions each were ranked in the university and engineering categories, 50 each were ranked in the management (research and teaching) and pharmacy (research and teaching) categories.

Prominent names missing from the list
 
One of the surprises was that some prominent institutions across disciplines were missing from the National Institutional Ranking Framework (NIRF). These include the Delhi Institute of Pharmaceutical Sciences and Research, National Law School of India University (Bengaluru), Faculty of Management Studies (Delhi), National Institute of Design (Ahmedabad), Mudra Institute of Communication (Ahmedabad), Indian Institute of Mass Communication (Delhi/Dhenkanal) and School of Planning and Architecture (Delhi and others).

The rankings were arrived at after detailed analysis and validation of the data submitted by more than 3,600 higher educational institutions in the country, classified into six categories. The rankings followed an Indian approach, in which an academic institution was assessed on parameters including teaching-learning; research; collaborative practice and professional performance; graduate outcomes; placements; outreach and inclusive action; and peer group perception. Each of these was subdivided into nearly 20 sub-criteria to assess an institution comprehensively.
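
To see how such a parameter-based framework typically turns sub-criteria into a single rank, here is a minimal sketch in Python. The parameter names echo those listed above, but the weights, sub-criterion scores and aggregation rule are illustrative assumptions, not the official NIRF methodology.

```python
# Minimal sketch of weighted, parameter-based ranking.
# Weights and scores below are hypothetical, not official NIRF values.

ASSUMED_WEIGHTS = {
    "teaching_learning": 0.30,
    "research_and_professional_practice": 0.30,
    "graduate_outcomes_and_placements": 0.20,
    "outreach_and_inclusive_action": 0.10,
    "peer_group_perception": 0.10,
}

def parameter_score(sub_scores):
    """Average the sub-criterion scores (each on a 0-100 scale) for one parameter."""
    return sum(sub_scores) / len(sub_scores)

def composite_score(institution):
    """Weighted sum of parameter scores; higher means a better rank."""
    return sum(
        weight * parameter_score(institution[param])
        for param, weight in ASSUMED_WEIGHTS.items()
    )

# Two hypothetical institutions with a few sub-criterion scores each.
institutions = {
    "Institute A": {
        "teaching_learning": [78, 82, 75],
        "research_and_professional_practice": [65, 70],
        "graduate_outcomes_and_placements": [88, 90],
        "outreach_and_inclusive_action": [60],
        "peer_group_perception": [72],
    },
    "Institute B": {
        "teaching_learning": [85, 80, 83],
        "research_and_professional_practice": [55, 60],
        "graduate_outcomes_and_placements": [75, 78],
        "outreach_and_inclusive_action": [70],
        "peer_group_perception": [68],
    },
}

for rank, name in enumerate(
    sorted(institutions, key=lambda n: composite_score(institutions[n]), reverse=True),
    start=1,
):
    print(rank, name, round(composite_score(institutions[name]), 2))
```

The real framework uses nearly 20 sub-criteria per parameter and its own weightings; the point here is only the shape of the computation: sub-criteria roll up into parameter scores, which are then weighted into one comparable number.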

Citing reasons for prominent names not making the cut, Ashok Thakur, former secretary to the government of India, department of higher education, HRD ministry, says, “This is the first year of NIRF and it is possible that many institutions, including some very good ones, could not participate or could not upload complete information. For example, in the category of universities, though we have more than 740 in the country, only about 250 of them participated.”

Professor Surendra Prasad, who is part of the core committee that devised the ranking framework, says, “NIRF has ranked only those institutions which registered with it for ranking. Those who registered were allowed to submit the data required for the rankings. Those who did not were out of our loop. It was entirely their choice.”

Karthick Sridhar, vice chairman, Indian Centre for Academic Rankings and Excellence Pvt Ltd, however, says NIRF could have been more proactive in reaching out to institutions. “Conducting workshops at regional levels, addressing queries, setting up a dedicated phone and email assistance service, and engaging more closely with stakeholders to educate them on the requirements could have been done better. Many technical universities felt handicapped as they were not aware whether they were to participate as a university, an engineering college or both. The data requirements set forth by NIRF were far too demanding, such as data for the last three years. There was no proper channel of communication between institutions and NIRF. All phone calls made to a particular number at the National Board of Accreditation were either unanswered or the queries redirected to the UGC or the AICTE. No single agency took complete charge and addressed the situation. In the process, many well-known institutions ignored the rankings and hence many not-so-well-known institutions got their chance under the sun.”

Institutions cite their own reasons for not making a mark in the NIRF. “I think institutions like FMS have been clubbed with their parent university, as FMS is not a standalone institute. It is a constituent component of the University of Delhi. So the university has been ranked and not individual faculties and departments,” says ML Singla, dean, Faculty of Management Studies.

Categories not exhaustive
 
Another aspect of the NIRF with scope for improvement is the number of categories under which institutions have been ranked; the current set is not exhaustive.

“In the years to come, the number of categories will have to be increased in order to cater to various types of institutions, as one can only compare apples with apples. For example, apart from subject-wise categories, even within the universities the newer ones want separate parameters for ranking, which, to some extent, is understandable, as their challenges are different from those of the established ones. As far as the overall parameters are concerned, these seem very relevant and adequate. In our country, even an engineering or a dental college can don the mantle of a university to circumvent regulation. The national rankings can highlight such discrepancies and help separate the wheat from the chaff,” says Thakur.

Data from Category B institutions in all domains continued to exhibit major inconsistencies despite NIRF’s best efforts to remove them, so it was decided that no rankings would be announced for Category B institutions this year. Similarly, due to non-representative participation in the domains of architecture and general degree colleges, no rankings were announced in those categories this year.

Data verification a big challenge
 
The general nature of the NIRF rankings also brings into question the verification of data. Prasad says, “This was one of the biggest challenges for us operationally. Data-based objective rankings can only be as good as the quality of the underlying data. Enormous effort was spent on making sure that the data were scrutinised carefully to remove as many inconsistencies as we could spot. We used some automation (statistical tools) but, more importantly, a large number of senior volunteers (without a conflict of interest) for this purpose. Wherever available, we used data from independent sources. Wherever data collected from institutions were used, they went through very strict scrutiny. Wherever we did not have confidence, we desisted from doing a ranking. That is another reason we did not rank all categories.”

Data vetting will be key to ranking institutions in future. “A random sampling method must be in place, and any data that look out of the ordinary must be re-examined. Technology must be employed at the highest level, and the government must seek the support of agencies that have expertise in this area. Physical verification of infrastructure is out of the question in a country that is so large and an education system that is so complex,” adds Sridhar.
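
As a rough illustration of the kind of automated vetting Sridhar describes, the sketch below draws a random sample of submitted records and flags values that deviate sharply from their peers for re-examination. The field name, figures and threshold are illustrative assumptions, not NIRF’s actual procedure.

```python
import random
import statistics

# Illustrative sketch of random-sample data vetting: draw a sample of
# submitted records and flag values far from the median for re-examination.
# Field names, data, and the 3-MAD threshold are illustrative assumptions.

def flag_outliers(records, field, sample_size=5, threshold=3.0):
    values = [r[field] for r in records]
    median = statistics.median(values)
    # Median absolute deviation: a robust spread estimate that a single
    # wildly wrong submission cannot distort the way a mean would be.
    mad = statistics.median(abs(v - median) for v in values) or 1.0
    sample = random.sample(records, min(sample_size, len(records)))
    return [
        r for r in sample
        if abs(r[field] - median) / mad > threshold
    ]

# Example: a faculty-student ratio of 9.0 stands out against peers near 0.07.
submissions = [
    {"institution": "A", "faculty_student_ratio": 0.07},
    {"institution": "B", "faculty_student_ratio": 0.06},
    {"institution": "C", "faculty_student_ratio": 9.0},   # likely a data-entry error
    {"institution": "D", "faculty_student_ratio": 0.08},
]
for record in flag_outliers(submissions, "faculty_student_ratio", sample_size=4):
    print("Re-examine:", record["institution"])
```

In a real vetting pipeline the flagged records would go back to the institution or a reviewer for correction rather than being silently dropped.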

Source | Hindustan Times | 20 April 2016

Regards

Pralhad Jadhav
Senior Manager @ Library
Khaitan & Co

Upcoming Event | National Conference on Future Librarianship: Innovation for Excellence (NCFL 2016) on April 22-23, 2016.

Note | If anybody uses this post for forwarding on social media or for coverage in a newsletter, please give due credit to those who have taken efforts for the same.
