Methodology Updates
The most dramatic year-to-year shifts in the QS World University Rankings and other major rankings often result not from changes at universities but from changes in the ranking systems themselves. When a ranking publisher revises its methodology (altering indicator weights, adding or removing metrics, changing data sources, or adjusting normalisation procedures), institutions can move by dozens or even hundreds of positions without any change in their actual educational offerings.
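To see why reweighting alone can reorder institutions, consider a minimal sketch. The two universities, their indicator scores, and both weight schemes below are entirely hypothetical, only loosely modelled on QS-style categories; the point is that identical institutional data produces a different ordering once the weights move.

```python
# Hypothetical illustration: identical indicator scores, two weight schemes.
old_weights = {"reputation": 0.50, "citations": 0.30, "employment": 0.20}
new_weights = {"reputation": 0.40, "citations": 0.25, "employment": 0.35}

# Per-indicator scores (0-100); these stay fixed across both "years".
universities = {
    "University A": {"reputation": 90, "citations": 85, "employment": 60},
    "University B": {"reputation": 80, "citations": 75, "employment": 95},
}

def composite(scores: dict, weights: dict) -> float:
    """Weighted sum of indicator scores, the core of most overall rankings."""
    return sum(scores[name] * w for name, w in weights.items())

for label, weights in [("old weights", old_weights), ("new weights", new_weights)]:
    order = sorted(universities, key=lambda u: composite(universities[u], weights),
                   reverse=True)
    print(f"{label}: {order}")
# old weights: ['University A', 'University B']
# new weights: ['University B', 'University A']
```

Under the reputation-heavy scheme University A leads; once employment carries more weight, University B overtakes it, with no underlying data changing at all.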
QS's 2023 methodology revision introduced three entirely new indicators (Sustainability, Employment Outcomes, and International Research Network) and reweighted existing ones, cutting Academic Reputation from 40% to 30%. As a result, some universities that had invested in environmental initiatives or strong careers services saw significant gains, while others that had relied on historically strong citation metrics or faculty-student ratios saw unexpected drops. Stanford, for instance, scored near the top of the new Employment Outcomes indicator; other institutions fell despite no deterioration in research quality or teaching.
THE's 2023 methodology revision (WUR 3.0) restructured its research measurement by splitting it into separate "Research Environment" and "Research Quality" pillars, changing how field-weighted citations are calculated and adding a "Research Strength" metric. This produced ranking shifts of 50–200 positions for some institutions.
The lesson for students is clear: before attributing a ranking change to a real improvement or decline in institutional quality, check whether the ranking body changed its methodology that year.
Survey Response Variability
For rankings that weight reputation surveys heavily, particularly QS, where the Academic Reputation indicator alone carries 30% of the overall score (40% before the 2023 revision), year-to-year fluctuations in survey responses can produce significant ranking changes even without any change in the underlying quality being surveyed.
Survey response variability stems from several sources (a simple simulation follows this list):
- Response rate fluctuations: If a country's academic community responds at lower rates in a given year, universities from that country may lose nominations they would otherwise receive.
- Recency effects: Academics who attended a conference hosted by a particular university, read a widely-shared paper from its faculty, or heard a prominent lecture by one of its professors are more likely to nominate it that year — even if nothing fundamentally changed.
- Alumni network effects: Universities that actively organise alumni academic networks or engage with global scholarly communities through conferences and journals generate more nomination flow than equivalent universities that operate more quietly.
- Panel composition changes: QS and THE continuously refresh their respondent panels. As different academics participate in different years, the aggregate preferences reflected in the survey shift.
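The scale of this noise is easy to underestimate. Here is a toy simulation, with entirely made-up numbers, of two universities whose underlying esteem is identical: each expects about 200 nominations per survey cycle, and yearly counts fluctuate around that mean (a Gaussian approximation to Poisson counting noise).

```python
import math
import random

random.seed(1)

# Two hypothetical universities with identical underlying esteem:
# each expects ~200 survey nominations per cycle.
expected_nominations = {"University A": 200, "University B": 200}

for year in range(2019, 2024):
    # Counting noise: sigma = sqrt(mean), the Poisson approximation.
    counts = {u: round(random.gauss(mu, math.sqrt(mu)))
              for u, mu in expected_nominations.items()}
    leader = max(counts, key=counts.get)
    print(year, counts, "-> survey leader:", leader)
# Over a handful of cycles the "leader" typically flips at least once,
# even though nothing about either university has changed.
```

With a standard deviation of roughly 14 nominations on a mean of 200, a lead of 20 nominations can appear or vanish from one cycle to the next purely by chance.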
Data Reporting Changes
Universities submit substantial volumes of data to ranking bodies each year, and how they classify, count, and report that data evolves over time. A university might reclassify certain academic positions (counting research-only staff as "faculty" where it previously did not), redefine "international students" to include foreign nationals who enrolled through domestic admissions channels, or adjust how it attributes collaborative research output.
Sometimes these changes represent genuine improvements in data accuracy; sometimes they represent strategic optimisation for ranking indicators. Research Output reporting, in particular, can be affected by whether a university claims credit for joint papers authored by faculty who hold dual affiliations.
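A hypothetical before-and-after makes the mechanics concrete. Suppose a university starts counting 500 research-only staff as faculty; the headcounts below are invented, but the arithmetic is the whole story:

```python
# Hypothetical headcounts: a reporting-rule change, not a staffing change.
students = 30_000
teaching_faculty = 1_500
research_only_staff = 500   # previously excluded from the "faculty" count

old_ratio = students / teaching_faculty                          # 20.0
new_ratio = students / (teaching_faculty + research_only_staff)  # 15.0

print(f"student-to-faculty ratio: {old_ratio:.1f} -> {new_ratio:.1f}")
# The reported ratio improves by 25% with zero change in teaching capacity.
```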
The Columbia University scandal that came to light in 2022 demonstrated the consequences of systematic data misreporting. Columbia had submitted inaccurate statistics on faculty qualifications and student-to-faculty ratios to U.S. News for years, inflating its domestic ranking. Once the discrepancies were exposed and corrected, it fell from #2 to #18 in a single edition, a dramatic shift that reflects data correction rather than any change in educational quality.
Actual Institutional Changes
Real institutional changes do produce genuine ranking movements, though they typically operate over years rather than months:
- Research investments: A university that significantly increases research expenditure, hires prolific scholars, or establishes a new research institute will see its Citation Impact and Research Output indicators improve over a 3–5 year lag (reflecting the time from research investment to publication to accumulating citations).
- Faculty hiring: Recruiting a Nobel laureate or a cluster of highly cited researchers in key fields directly improves ARWU staff award and highly cited researcher indicators within a year of the hire.
- Internationalisation: A deliberate policy to recruit more international students or faculty will move internationalisation indicators relatively quickly — within 1–2 years.
- Mergers and restructuring: When universities merge, their combined output and student body are recounted, often producing immediate jumps or drops in size-sensitive indicators.
The Denominator Effect
Many ranking indicators are ratios — citations per faculty, PhD graduates per academic staff, research income per staff member. Whenever a university changes its denominator (total faculty count), all its ratio-based scores change, even if the numerator values remain constant.
A university that expands hiring rapidly may see its citations-per-faculty score drop simply because new staff haven't yet published enough to maintain the ratio. Conversely, a university that downsizes its faculty (without losing research output) will see ratio-based scores improve. These denominator effects can cause confusing signals in year-to-year comparisons.
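A minimal numeric sketch (all figures hypothetical) shows both directions of the effect:

```python
# Hypothetical figures illustrating the denominator effect on a
# citations-per-faculty indicator. The numerator is held constant.
citations = 120_000

faculty_now = 2_000
faculty_after_hiring = 2_400      # 20% expansion; new hires not yet cited
faculty_after_downsizing = 1_600  # 20% cut that spares research output

baseline = citations / faculty_now                # 60.0
expanded = citations / faculty_after_hiring       # 50.0
downsized = citations / faculty_after_downsizing  # 75.0

print(f"baseline:  {baseline:.1f} citations per faculty")
print(f"expansion: {expanded:.1f}  (score falls ~17% despite growth)")
print(f"downsize:  {downsized:.1f}  (score rises 25% despite shrinkage)")
```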
Case Studies: Major Ranking Shifts
Several notable ranking shifts illustrate these dynamics:
- Nanyang Technological University (NTU), Singapore: Founded in 1991, NTU rose from outside the top 100 in 2010 to #19 in QS 2023 through deliberate investment in faculty recruitment, research funding, and internationalisation — a genuine improvement reflected across multiple indicators.
- Chinese universities (2010s–2020s): Several Chinese universities, including Shanghai Jiao Tong University and Zhejiang University, rose sharply in ARWU and THE through massive government research investments under Project 985 and, later, the Double First-Class University Plan.
- MIT vs Harvard oscillation: MIT and Harvard have traded the #1 position in QS rankings multiple times. These swaps typically reflect small movements in survey responses rather than meaningful differences in quality between two consistently elite institutions.
Should You Worry About Year-to-Year Changes?
For most students making application decisions, year-to-year ranking fluctuations below a certain threshold are noise rather than signal. A university moving from #45 to #38, or from #120 to #145, is unlikely to reflect any meaningful change in the quality of education it provides. The student experience at an institution doesn't dramatically change in a single year because it gained or lost five reputation survey nominations.
What matters more is sustained performance across multiple years and multiple rankings. If a university consistently ranks in the top 50 of both QS and THE over 5+ years, that consistency is more meaningful than any single year's position. Conversely, a university that shows a consistent downward trend across multiple rankings and multiple years may be experiencing real institutional decline worth investigating.
As a practical rule: treat any single-year change of fewer than 20 positions in overall rankings as within the normal range of methodological noise. Focus on 5-year trends, multiple ranking systems, and subject-specific performance rather than obsessing over annual position changes.
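That rule of thumb is mechanical enough to write down. The sketch below encodes it directly; the 20-position threshold comes from the rule above, while the consistency check and the sample trajectories are illustrative choices, not official guidance:

```python
NOISE_THRESHOLD = 20  # single-year moves below this: methodological noise

def classify(ranks: list[int]) -> str:
    """Classify a multi-year trajectory of overall positions (lower is better)."""
    net = ranks[-1] - ranks[0]
    steps = [b - a for a, b in zip(ranks, ranks[1:])]
    consistent = all(s >= 0 for s in steps) or all(s <= 0 for s in steps)
    if abs(net) < NOISE_THRESHOLD or not consistent:
        return "noise: within normal year-to-year variation"
    return "possible decline worth investigating" if net > 0 else "sustained improvement"

print(classify([45, 38, 44, 41, 47]))    # noise: within normal year-to-year variation
print(classify([60, 75, 90, 110, 130]))  # possible decline worth investigating
print(classify([80, 70, 61, 50, 42]))    # sustained improvement
```

The design choice worth noting: a large net move along a zig-zag path is still classified as noise; only changes that are both large and directionally consistent survive the filter.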