Sweden introduced a historic furlough package to prevent layoffs and help businesses stay afloat during the pandemic. But where the 38 billion crowns ($4.2bn) had gone was initially secret. When the lid was lifted, we had prepared scripts to join the data with other sources, and so were able to show unique figures on where the money had gone. This meant we were able to reveal, for instance, public companies that had received support and paid out dividends, as well as companies that had received support and gone bankrupt soon thereafter.
We knew when and where the data was going to be made public, and so were able to prepare as much as possible in advance to cover this breaking data story. By working with placeholder data and writing scripts ahead of time we were able to publish our first stories on the day, both nationally and in collaboration with Sveriges Radio's local stations around the country.
Our data analysis was used for a week’s coverage, with new angles daily in both the local and national stories. About 35 stories were published in total.
Working together with Sveriges Radio's local newsrooms to combine the power of data analysis with local journalism and human stories is an important goal for our data team, so one of the most important impacts for us was how well the collaboration with reporters from Sveriges Radio's local stations worked. The local reporters were able to find case studies around Sweden that brought the data to life and showed the people behind the figures.
We joined open data on corona furlough support to a number of other data sources in order to produce unique figures, for instance on dividends, revenue and bankruptcies.
Our team used a combination of Python and R to scrape, download and clean all the required data. The data analysis and visualisation were done in R, and the local reports distributed to all reporters working on the project were produced using R Markdown.
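As a rough illustration of the kind of join described above, the sketch below matches furlough-support records against company registry data on a shared organisation number, then flags companies that both received support and paid dividends. All file, column and company names here are hypothetical, not the team's actual data.

```python
import pandas as pd

# Illustrative furlough-support records (org_number and amounts invented)
support = pd.DataFrame({
    "org_number": ["5561234567", "5569876543"],
    "support_sek": [1_200_000, 450_000],
})

# Illustrative company registry data with dividends and bankruptcy status
registry = pd.DataFrame({
    "org_number": ["5561234567", "5569876543"],
    "dividends_sek": [2_000_000, 0],
    "bankrupt": [False, True],
})

# Join the two sources on the shared organisation number
merged = support.merge(registry, on="org_number", how="left")

# Flag companies that both received support and paid out dividends
flagged = merged[(merged["support_sek"] > 0) & (merged["dividends_sek"] > 0)]
print(flagged)
```

Keeping a join like this in a pre-written script is what makes it possible to produce findings within hours of a data release: only the input files change on the day.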
What was the hardest part of this project?
The biggest challenge for us was preparing to cover a breaking news data story, producing data analysis not only for one national story, but for 25 local versions for Sveriges Radio's stations across the country.
Preparing scripts ahead of time and thoroughly exploring additional data sources and potential angles with local reporters meant we were able to share our first data analysis with all the reporters on the morning the data was released.
The other challenge for us was how best to share a large quantity of data with reporters who weren’t used to working with data.
We’re a new team and this was only the second major project we’d ever done in collaboration with local newsrooms. We knew we wanted to steer clear of sharing intimidating Excel spreadsheets with them, and settled on programmatically producing 25 locally relevant reports in R Markdown. We asked local reporters for feedback on these reports ahead of the release, and made changes to ensure they were useful and easy to understand on the day.
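The team generated its reports with R Markdown; the sketch below is a rough Python analogue of the same idea, namely one script that loops over regions and renders a small, locally relevant report from the full dataset. The region names, column names and template are all illustrative.

```python
import pandas as pd

# Invented example data: one row per supported company, with its region
data = pd.DataFrame({
    "region": ["Stockholm", "Skåne", "Stockholm"],
    "company": ["A AB", "B AB", "C AB"],
    "support_sek": [1_200_000, 450_000, 300_000],
})

# A minimal report template; R Markdown would also render charts and prose
TEMPLATE = """# Furlough support in {region}

{n} companies in {region} received a total of {total:,} SEK.

{table}
"""

# Render one Markdown report per region
for region, local in data.groupby("region"):
    report = TEMPLATE.format(
        region=region,
        n=len(local),
        total=int(local["support_sek"].sum()),
        table=local.to_string(index=False),
    )
    with open(f"report_{region}.md", "w", encoding="utf-8") as f:
        f.write(report)
```

The point of the pattern is that 25 reports cost barely more effort than one: the template and the loop are written once, and each newsroom receives only the slice of data relevant to it, wrapped in explanatory text.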
The benefit of these reports was that they contained not only locally relevant data in easy-to-use tables and charts, but also all the additional text needed to understand the figures.
What can others learn from this project?
The most important thing we learned from this project was a good method for data journalists to collaborate with local newsrooms, and the best way for us to distribute data to reporters not necessarily familiar with Excel or numbers. We think our learnings could serve as inspiration to other newsrooms in the same position: from using R Markdown to produce locally relevant reports on the data, to collaborating early and through many channels of communication, including several Q&A sessions and an active group chat with all reporters involved.
Another lesson for us has been the value of preparing scripts in advance in order to cover a breaking news data story. We planned ahead, discussed with reporters which angles and additional data sources to pursue, and prepared scripts that would run the new data against these other sources, so that we quickly had unique stories on the day.