2020
How many white Christmases will there be in 50 years?
Category: Best data-driven reporting (small and large newsrooms)
Country/area: Sweden
Organisation: Newsworthy
Organisation size: Small
Publication date: 24/12/2019

Credit: Jens Finnäs, Clara Guibourg
Project description:
How many white Christmases will there be in 50 years? With climate change and fewer days with sub-zero temperatures, large parts of Sweden look set to have barely any snow at all. Using observed and modelled climate data we were able to show readers what winters look like where they live – today and in the future.
Take Stockholm, for instance: The Swedish capital currently has about one and a half months of snow every winter. In 50 years, scientists predict it will have only between four and 15 days, depending on how quickly the world acts on climate change.
Impact reached:
Climate change is the issue of our times, but it can be difficult to report on it in a way that engages audiences. That’s why we wanted to focus not on abstract temperature anomalies, but instead on the very tangible effects these are having on people’s everyday lives. Using granular climate data we were able to show our audience how things look in their local area, describing a change in snow days that they can already see starting to happen – and revealing future changes, which in many cases are quite dramatic.
Everything was done using publicly available data on current and future snow depth from the Swedish Meteorological and Hydrological Institute (SMHI), which we combined to produce new insights.
This was published as a national version by Aftonbladet (the largest Swedish tabloid) and on our site. The story was also distributed to and published by local newsrooms across the country through our Newsworthy subscription service in 290 local versions (one for each municipality). The local newsrooms were free to use the findings and visualisations.
For us this is one of the most important impacts of this project: producing original data journalism in a small team, and then sharing it with local newsrooms around the country, enabling them to make use of data in a way they otherwise could not have done.
Techniques/technologies used:
The bulk of the data analysis and visualisation was done in R. The climate projections themselves – the forecasts of future chances of snow – we were able to pick up ready-made from the meteorological institute.
We depended on QGIS for some of the geospatial analysis, such as mapping towns to their nearest weather station with snow depth data. One challenge was combining historical snow-depth data tied to specific weather stations with climate projections that are much more granular. We had to determine which specific coordinates best represent a given municipality (which is not always obvious in a geographically large country like Sweden).
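To make the matching step concrete, here is a minimal sketch of the nearest-station lookup in base R (the project itself did this in QGIS). The station and municipality coordinates are illustrative placeholders, not the real SMHI data:

haversine_km <- function(lat1, lon1, lat2, lon2) {
  # Great-circle distance in kilometres between two points
  r <- 6371
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 +
    cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  2 * r * asin(sqrt(pmin(1, a)))
}

stations <- data.frame(
  id  = c("Stockholm", "Kiruna", "Malmö"),
  lat = c(59.34, 67.84, 55.57),
  lon = c(18.05, 20.41, 13.07)
)

municipalities <- data.frame(
  name = c("Uppsala", "Luleå"),
  lat  = c(59.86, 65.58),
  lon  = c(17.64, 22.15)
)

# For each municipality, pick the station with the smallest distance
nearest <- sapply(seq_len(nrow(municipalities)), function(i) {
  d <- haversine_km(municipalities$lat[i], municipalities$lon[i],
                    stations$lat, stations$lon)
  which.min(d)
})
municipalities$station <- stations$id[nearest]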
To produce the local versions of the story we used a homemade “robot writer” that generates text from data: a Node.js-based interface that allows us to write dynamic articles with programmatic logic.
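As an illustration of the idea – the real system is a Node.js application, so this R sketch is ours, with the Stockholm figures taken from the story above – a template plus a little conditional logic turns each municipality’s numbers into readable copy:

write_lede <- function(name, snow_days_now, snow_days_low, snow_days_high) {
  # Programmatic logic: the wording changes with the numbers
  trend <- if (snow_days_high < snow_days_now) {
    "a dramatically shorter snow season"
  } else {
    "a roughly unchanged snow season"
  }
  sprintf(
    paste0("%s currently has about %d days of snow cover each winter. ",
           "In 50 years, projections suggest between %d and %d days, ",
           "depending on how quickly the world acts on climate change: %s."),
    name, snow_days_now, snow_days_low, snow_days_high, trend
  )
}

cat(write_lede("Stockholm", 45, 4, 15))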
What was the hardest part of this project?
One of the hardest parts was not necessarily data-related, but a broader editorial question: how to communicate climate change in an engaging way, and how to convey to readers the uncertainty in the projected data and what the different future pathways mean. Should we emphasize the worst-case scenarios or the more modest ones?
Climate forecasts are often presented in a rather abstract manner. We wanted to make them as tangible as possible: Will there be snow in my home town in the future?
Another challenge was the fact that Newsworthy is no larger than a two-person band. We don’t have the editorial muscle of a large, established newsroom: all research, analysis, text and chart generation, and distribution was done by Clara Guibourg and Jens Finnäs.
What can others learn from this project?
The project serves as an excellent case of how a small team of data journalism experts can reach and empower local reporters at scale. All the data used for the story was public, but not necessarily accessible to ordinary reporters: we had to scrape APIs and parse geospatial data files to get at numbers that, in the end, were rather straightforward to understand.
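Turning an API’s JSON response into a tidy table is a small but typical part of that work. Here is a hedged sketch in R with the jsonlite package; the payload is made up to show the shape of the task, not real SMHI output:

library(jsonlite)

resp <- '{
  "station": "Stockholm-Observatoriekullen",
  "values": [
    {"date": "2019-12-24", "snow_depth_cm": 2},
    {"date": "2019-12-25", "snow_depth_cm": 0}
  ]
}'

obs  <- fromJSON(resp)   # $values simplifies to a data frame
snow <- obs$values
snow$station <- obs$station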
It also serves as a starting point for a broader conversation about how we can produce data journalism on climate change which actually engages people.
From a more technical standpoint, we could also use this project to introduce topics like mapping in R, or doing geospatial analysis in QGIS, including techniques like distance matrices and point-in-polygon analysis.
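As a taster, a point-in-polygon join – assigning each climate grid point to the municipality polygon it falls inside – takes only a few lines of R with the sf package (the file names here are placeholders):

library(sf)

municipalities <- st_read("kommuner.geojson")             # polygons
grid_points    <- st_read("climate_grid_points.geojson")  # points

# Both layers must share a coordinate reference system
grid_points <- st_transform(grid_points, st_crs(municipalities))

# Each point inherits the attributes of the polygon containing it
points_with_muni <- st_join(grid_points, municipalities, join = st_within)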
Project links:
us10.campaign-archive.com/?u=f5c9b898477bcd7a7e64e37d9&id=f2ce185eef