#114 - In the World of Confusions
- Adam Pawel Pietruszewski
- Mar 13
- 4 min read
Why SDG indicators differ across global and regional dashboards
I recently reviewed the Europe Sustainable Development Report 2026 and, to my surprise, found some very disturbing discrepancies with the Global Sustainable Development Report. I realised how easily dashboards that claim methodological alignment can produce different realities.
Looking briefly at Poland, I realised that the obesity indicator tells two entirely different stories. In the Global SDR, the share of the adult population with a BMI above 30 is about 27–28%, whereas in the Europe SDR it is only 18–19%: a massive gap of roughly nine percentage points of the entire population.
I started digging, and the number of surprises quickly became overwhelming. The challenge of competing statistics that arrive at different conclusions is a serious policy problem in today's world; I wrote about this previously in the context of income and wealth inequality.
I was not prepared, however, for such confusion to be created by a single organisation.
After all, both reports are prepared by the SDG Transformation Center, which claims that the European report “builds on the methodology of the global SDG Index” and uses the same conceptual framework for measuring progress on the 17 SDGs.
There are good reasons why a regional report is customised, and I am not going to go through all of them here. The outcome, however, is that those legitimate and logical methodological choices have produced two reports whose rankings contradict each other to a surprising extent.
Contradicting rankings
What countries are doing well when it comes to sustainable development?
In the Europe SDR, Switzerland ranks very high, in 9th place. In the Global SDR, however, Switzerland ranks only 24th among European countries.
Similarly, the Netherlands and Ireland improve their positions substantially in the European ranking, whereas France, Croatia, Spain and the Slovak Republic move in the opposite direction.
| Country | Rank Europe SDR | Rank Global SDR (Europe only) | Change |
| --- | --- | --- | --- |
| Switzerland | 9 | 24 | +15 |
| Netherlands | 12 | 22 | +10 |
| Ireland | 18 | 27 | +9 |
| France | 14 | 5 | −9 |
| Croatia | 19 | 8 | −11 |
| Spain | 22 | 14 | −8 |
| Slovak Republic | 24 | 16 | −8 |
Source: Sustainable Development Report 2025, Europe Sustainable Development Report 2026
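The rank shifts in the table can be reproduced with a few lines of Python. The figures are copied from the table above; the sign convention (global rank minus Europe rank, so a positive value means the country fares better in the European report) is my assumption about how the Change column was computed.

```python
# Rank pairs copied from the table above: (Europe SDR rank, Global SDR rank
# among European countries). Assumed convention: change = global - Europe,
# positive = the country ranks better (closer to 1) in the Europe SDR.
ranks = {
    "Switzerland": (9, 24),
    "Netherlands": (12, 22),
    "Ireland": (18, 27),
    "France": (14, 5),
    "Croatia": (19, 8),
    "Spain": (22, 14),
    "Slovak Republic": (24, 16),
}

changes = {country: glob - eur for country, (eur, glob) in ranks.items()}

# List the biggest winners of the switch to the European methodology first.
for country, change in sorted(changes.items(), key=lambda kv: -kv[1]):
    print(f"{country:16s} {change:+d}")
```

Nothing deep here, but it makes the asymmetry explicit: the same seven countries span a 26-place range of disagreement between the two rankings.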
Source of the differences
Below is a rough reconciliation of the differences in the indicators used in both reports.
The UN global SDG framework defines 231 indicators.
The Global SDR uses 126 indicators aligned with that framework, while the Europe SDR uses 115 indicators, of which only about 54 overlap with the global set.
Within the 61 indicators used only in the Europe SDR, many correspond to UN SDG concepts but use Eurostat datasets or alternative operational definitions, while others are defined specifically for the European context.
| Dataset | Indicators |
| --- | --- |
| UN global SDG framework | 231 indicators |
| Global SDR | 126 indicators |
| Europe SDR | 115 indicators |
| Overlap (global ↔ Europe) | ~54 indicators |
Source: Author’s calculation based on report indicator lists.
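The counts above reconcile with simple set arithmetic. A minimal sketch, using the approximate figures quoted above (the ~54-indicator overlap is an estimate, so the derived numbers are approximate too):

```python
# Indicator counts quoted in the text above (the overlap is approximate).
global_sdr = 126   # indicators in the Global SDR
europe_sdr = 115   # indicators in the Europe SDR
overlap = 54       # indicators shared by both reports (estimate)

europe_only = europe_sdr - overlap            # indicators unique to the Europe SDR
global_only = global_sdr - overlap            # indicators unique to the Global SDR
combined = global_sdr + europe_sdr - overlap  # union of the two indicator sets

print(europe_only, global_only, combined)  # 61 72 187
```

In other words, fewer than half of either report's indicators are shared: 61 indicators appear only in the Europe SDR and 72 only in the Global SDR, out of roughly 187 distinct indicators in total.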
Secondly, the Europe SDR relies heavily on Eurostat and other regional statistical systems as data sources.
Eurostat itself monitors EU progress towards the SDGs using a dedicated indicator set selected on the basis of statistical quality and relevance for EU policy priorities.
The obesity example
The obesity indicator that triggered this investigation illustrates the problem clearly.
In the European report the indicator uses Eurostat self-reported BMI data, which tends to underestimate obesity because respondents typically under-report weight and over-report height.
WHO data used in the global report is largely based on measured surveys combined with statistical modelling, which usually produces higher estimates.
Both approaches have methodological limitations, but the resulting numbers differ substantially.
Several other indicators show similar differences due to different definitions or statistical sources, including the unemployment rate, the gender pay gap, and income inequality (Palma ratio vs EU-SILC indicators). These discrepancies tend to cluster in a few domains: health indicators, labour market outcomes, and inequality measures.
What really drives the ranking differences
The most important driver of ranking differences, however, appears to be indicator selection.
The European version tends to reward:
welfare systems
labour market outcomes
governance quality
infrastructure and innovation.
Countries such as Switzerland and the Netherlands, which may perform less strongly in some global environmental indicators, move up in the European ranking because the balance between environmental and social indicators changes.
Below is a rough estimate of the distribution of indicators by thematic focus.
| Indicator type | Global SDR (approx.) | Europe SDR (approx.) |
| --- | --- | --- |
| Environmental / ecological | ~35–40% | ~30% |
| Social / quality-of-life | ~30% | ~40–45% |
| Economic / infrastructure | ~20% | ~20–25% |
| Institutional / governance | ~10% | ~15–20% |
Source: Author’s calculation based on report indicator lists.
A deeper question
This shift may reflect a broader tension in sustainability debates.
High-income societies focus on social welfare, institutional quality and economic stability. At the same time, sustainability discussions inevitably raise uncomfortable questions about consumption patterns and environmental limits, questions that tend to be avoided.
The European report proudly states:
“From a global perspective, European countries remain the world’s leaders in sustainable development and well-being in 2026, and the European model remains an inspiration for many parts of the world.”
This may well be true — but the conclusion depends strongly on how the indicators are defined and weighted.
Does it even matter?
Imagine an SDG-focused meeting between European countries.
France proudly praises its top global evaluation, while Switzerland is equally proud of its top European ranking.
Misaligned rankings give everyone a convenient way out of a more difficult discussion:
What is the desired model of sustainable development?
That should be the real purpose of such rankings — to identify best practices and operating models that actually work and can inspire others.
In my view, a stronger integration of reporting, ratings and conclusions would help to keep the general public in the loop. Today's world of confusions, with its overlapping dashboards and competing indicators, is too difficult to interpret for anyone who does not wish to spend significant time diving into methodological details.
I publish one short reflection like this each week. If you would like to receive them by email, you can sign up to the newsletter here.

References and Notes
Europe Sustainable Development Report 2026
Sustainable Development Report 2025
#90 - How You Can Prove Anything With the “Right” Statistics