University Ranking

The Experts below are selected from a list of 22,902 Experts worldwide, ranked by the ideXlab platform.

Ding Rui-chan - One of the best experts on this subject based on the ideXlab platform.

  • In-depth Analysis on QS University Rankings System
    Comparative Education Review, 2013
    Co-Authors: Ding Rui-chan
    Abstract:

    This paper comprehensively analyzes the QS World University Rankings, QS University Rankings: Asia, QS University Rankings: Latin America, and QS World University Rankings by Subject with respect to the selection of evaluation objects, the design of the evaluation indicator system, the distribution of weight coefficients, the collection of indicator observations, the construction of the aggregation model, and the processing of data. Based on the QS Ranking methods, Chinese universities should take measures that combine three goals: improving their Rankings, building world-class universities, and performing their basic functions.
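
The aggregation model the abstract refers to is, in QS's case, a weighted sum of normalized indicator scores. Below is a minimal sketch of such a model; the indicator names and weights mirror QS's published methodology of that period (40% academic reputation, 10% employer reputation, 20% faculty/student ratio, 20% citations per faculty, 5% each for international faculty and students), and the example scores are hypothetical, not taken from the paper.

```python
# Illustrative sketch of a weighted-sum aggregate model like the one QS uses.
# Indicator scores are assumed to be pre-normalized to a 0-100 scale; the
# weights mirror QS's published 2013-era methodology and are not from the paper.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def aggregate_score(indicator_scores: dict[str, float],
                    weights: dict[str, float] = QS_WEIGHTS) -> float:
    """Combine normalized indicator scores (0-100) into one composite score."""
    return sum(weights[k] * indicator_scores[k] for k in weights)

# Hypothetical university with normalized scores per indicator:
example = {
    "academic_reputation": 85.0,
    "employer_reputation": 78.0,
    "faculty_student_ratio": 60.0,
    "citations_per_faculty": 72.0,
    "international_faculty": 90.0,
    "international_students": 88.0,
}
print(round(aggregate_score(example), 1))  # -> 77.1
```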

David D Dill - One of the best experts on this subject based on the ideXlab platform.

  • The CHE University Ranking in Germany
    Public Policy for Academic Quality, 2009
    Co-Authors: Maarja Beerkens, David D Dill
    Abstract:

    University Rankings have become widely influential in the last 10 years, both nationally and internationally. As a consumer information tool, Rankings can function as an effective quality assurance mechanism. Most existing University Rankings, however, seem to distort rather than improve the higher education market. The CHE Ranking in Germany is an exception: it is a carefully designed Ranking that minimizes the main conceptual and methodological problems that University Rankings commonly face. The analysis in the chapter concludes that commercially oriented entities alone cannot provide a high-quality University Ranking. Original data collection and verification are costly activities, and there is a strong incentive for commercial providers to rely on easily available statistics. Therefore, even if a commercial venture can be effective in compiling, presenting, and marketing relevant information, the quality of a University Ranking depends on the data collected by public or not-for-profit agencies.

  • Academic Quality League Tables and Public Policy: A Cross-National Analysis of University Ranking Systems
    Higher Education, 2005
    Co-Authors: David D Dill
    Abstract:

    The global expansion of access to higher education has increased demand for information on academic quality and has led to the development of University Ranking systems or league tables in many countries of the world. A recent UNESCO/CEPES conference on higher education indicators concluded that cross-national research on these Ranking systems could make an important contribution to improving the international market for higher education. The comparison and analysis of national University Ranking systems can help address a number of important policy questions. First, is there an emerging international consensus on the measurement of academic quality as reflected in these Ranking systems? Second, what impact are the different Ranking systems having on University and academic behavior in their respective countries? Finally, are there important public interests that are thus far not reflected in these Rankings? If so, is there a needed and appropriate role for public policy in the development and distribution of University Ranking systems and what might that role be? This paper explores these questions through a comparative analysis of University Rankings in Australia, Canada, the UK, and the US.

  • Is There a Global Definition of Academic Quality? A Cross-National Analysis of University Ranking Systems
    2005
    Co-Authors: David D Dill, Maarja Soo
    Abstract:

    The global expansion of access to higher education has increased demand for information on academic quality and has led to the development of University Ranking systems or league tables in many countries of the world. A recent UNESCO/CEPES Conference on higher education indicators concluded that cross-national research on these Ranking systems could make an important contribution to improving the international market for higher education. The comparison and analysis of national University Ranking systems can help address a number of important policy questions. First, is there an emerging global consensus on the measurement of academic quality in these Ranking systems? Second, what impact are the different Ranking systems having on University and academic behavior in their respective countries? Finally, are there important public interests that are thus far not reflected in these Rankings? If so, is there a needed and appropriate role for public policy in the development and distribution of University Ranking systems and what might that role be? This paper explores these questions through a comparative analysis of University Rankings in Australia, Canada, the UK, and the US.

Saeed-ul Hassan - One of the best experts on this subject based on the ideXlab platform.

  • Webometrics: evolution of social media presence of universities
    Scientometrics, 2021
    Co-Authors: Raheem Sarwar, Afifa Zia, Raheel Nawaz, Ayman Fayoumi, Naif Radi Aljohani, Saeed-ul Hassan
    Abstract:

    This paper addresses the task of computing the webometrics University Ranking and investigates whether it correlates with the Rankings produced by prominent University rankers such as the QS World University Ranking, over the period 2005–2016. The webometrics portal, however, provides the required data only for recent years, starting from 2012, which is insufficient for such an investigation. The remaining data can be obtained from the Internet Archive, but existing data extraction tools cannot extract it because of an unusual link structure consisting of the web archive link, year, date, and target links. We therefore developed an Internet Archive scraper and extracted the required data for 2012–2016. After extracting the data, we quantified the webometrics indicators and ranked the universities accordingly. We then measured the relationship between our computed webometrics University Ranking and the original webometrics University Ranking using the Spearman and Pearson correlation coefficients. Our findings show a strong correlation between the two, which indicates that the methodology can be used to compute the webometrics University Ranking for the years in which it is not available, i.e., 2005 to 2011. We computed the webometrics Ranking of the top 30 universities of North America, Europe, and Asia for 2005–2016. Our findings show a positive correlation for North American and European universities but a weak correlation for Asian universities, which may be explained by Asian universities paying less attention to their websites than their North American and European counterparts. The overall results also show that North American and European universities rank higher than Asian universities. To the best of our knowledge, this is the first investigation of its kind.
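
A minimal sketch of the correlation step described above, assuming the two rankings have already been computed: the rank lists below are hypothetical, and SciPy's standard Spearman and Pearson implementations stand in for whatever the authors actually used.

```python
# Comparing a recomputed webometrics ranking against the official one
# with both Spearman and Pearson correlation, as the abstract describes.
from scipy.stats import spearmanr, pearsonr

# Hypothetical ranks of the same ten universities under the two rankings.
recomputed_ranks = [1, 2, 3, 5, 4, 6, 8, 7, 9, 10]
official_ranks   = [1, 3, 2, 4, 5, 6, 7, 8, 10, 9]

rho, rho_p = spearmanr(recomputed_ranks, official_ranks)
r, r_p = pearsonr(recomputed_ranks, official_ranks)

print(f"Spearman rho = {rho:.3f} (p = {rho_p:.4f})")
print(f"Pearson  r   = {r:.3f} (p = {r_p:.4f})")
```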

Juan Antonio Dip - One of the best experts on this subject based on the ideXlab platform.

  • What does U-multirank tell us about knowledge transfer and research?
    Scientometrics, 2021
    Co-Authors: Juan Antonio Dip
    Abstract:

    The economic and social need to spread knowledge between universities and industry has become increasingly evident in recent years. This paper presents a Ranking based partly on research and knowledge transfer indicators from U-multirank data but using data-driven weights. The choice of specific weights and the comparison between ranks remain a sensitive topic. A restricted version of the benefit-of-the-doubt method is implemented to build a new University Ranking with an endogenous weighting scheme, and a novel procedure is presented to compare the principal method with U-multirank. To the best of my knowledge, the U-multirank data set has not previously been used to build alternative Rankings that include the research and knowledge transfer dimensions. A significant result arises from the benefit of the doubt: the highest importance weights are assigned to the co-publications-with-industrial-partners and interdisciplinary-publications indicators. The paper thus partially fills the existing gap on the role of co-publications with industrial partners in University efficiency around the world.
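
The benefit-of-the-doubt method the abstract refers to is a DEA-style linear program solved once per university: each institution receives the indicator weights most favorable to itself, subject to no institution scoring above 1 under those weights. The sketch below illustrates a simple restricted variant; the data matrix and the minimum-weight restriction are illustrative assumptions, not U-multirank's figures or the paper's exact constraints.

```python
# A minimal restricted benefit-of-the-doubt model: each university gets the
# indicator weights that maximize its own composite score, subject to no
# university's score exceeding 1 under those same weights.
import numpy as np
from scipy.optimize import linprog

# Rows: universities; columns: normalized indicators, e.g.
# (co-publications with industry, interdisciplinary publications, citations).
# Hypothetical values for illustration only.
Y = np.array([
    [0.9, 0.7, 0.8],
    [0.6, 0.9, 0.5],
    [0.4, 0.5, 0.9],
])

def bod_score(k: int, data: np.ndarray, w_min: float = 0.05) -> float:
    """Benefit-of-the-doubt score of unit k with a minimal weight restriction."""
    n_units, n_ind = data.shape
    res = linprog(
        c=-data[k],                      # maximize unit k's weighted score
        A_ub=data,                       # every unit's score stays <= 1
        b_ub=np.ones(n_units),
        bounds=[(w_min, None)] * n_ind,  # restricted BoD: weights >= w_min
    )
    return -res.fun

for k in range(len(Y)):
    print(f"university {k}: BoD score = {bod_score(k, Y):.3f}")
```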

Canan Cilingir - One of the best experts on this subject based on the ideXlab platform.

  • A Comparative Analysis of Global and National University Ranking Systems
    Scientometrics, 2015
    Co-Authors: Murat Perit Cakir, Cengiz Acarturk, Oguzhan Alasehir, Canan Cilingir
    Abstract:

    Recent interest in University Rankings has led to the development of several Ranking systems at national and global levels. Global Ranking systems tend to rely on internationally accessible bibliometric databases and reputation surveys to develop league tables at a global level. Given their access to and in-depth knowledge of local institutions, national Ranking systems tend to include a more comprehensive set of indicators. The purpose of this study is to conduct a systematic comparison of national and global University Ranking systems in terms of their indicators, coverage, and Ranking results. Our findings indicate that national Rankings tend to include a larger number of indicators that primarily focus on educational and institutional parameters, whereas global Ranking systems tend to have fewer indicators, mainly focusing on research performance. Rank similarity analysis between national Rankings and global Rankings filtered for each country suggests that, with the exception of a few instances, global Rankings do not strongly predict the national Rankings.
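
The abstract does not specify the rank similarity measure used, so as an illustration only, the sketch below compares a national ranking with a country-filtered global ranking using Kendall's tau, one common rank-similarity measure; the university lists are hypothetical.

```python
# One way to run the rank-similarity check described above: filter a global
# ranking down to one country's universities, re-rank them, and compare the
# result with the national ranking.
from scipy.stats import kendalltau

# National ranking of six universities (best first).
national = ["U1", "U2", "U3", "U4", "U5", "U6"]
# The same six universities in the order they appear in a global ranking.
global_filtered = ["U2", "U1", "U5", "U3", "U6", "U4"]

# Convert the global ordering to ranks aligned with the national order.
national_ranks = list(range(1, len(national) + 1))
global_ranks = [global_filtered.index(u) + 1 for u in national]

tau, p = kendalltau(national_ranks, global_ranks)
print(f"Kendall tau = {tau:.3f} (p = {p:.3f})")
```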