Follow the Money — Exploring Campaign Finance in the 2022 US Midterms

Entry type: Single project

Country/area: United States

Publishing organisation: N/A. This was an independently published story posted directly to GitHub. While I am a first-year computer science student at Reed College and a newly elected editor of the Reed College Quest, the campus publication, I was unable to have this story published by the Quest at the time it was written.

Organisation size: Small

Publication date: 2022-11-21

Language: English

Authors: Declan Bradley


Inspired to pursue journalism by his mother and grandmother, Declan Bradley began writing for his high school paper at age 13 and never looked back. After winning several awards from the National Scholastic Press Association for his work in high school, including the organization’s first-ever [Innovation Pacemaker Award](https://studentpress.org/nspa/2022-innovation-pacemakers/), Declan became interested in data journalism after working with Washington Post Engineering Director Jeremy Bowers at an Associated Collegiate Press workshop in Washington, D.C. Declan now hopes to pursue a career in data journalism and is excited to explore the new frontiers of reporting made possible by the digital revolution.

Project description:

Follow the Money is an interactive data journalism story analyzing major campaign donations to candidates for congressional office in the 2022 US midterms, using source data scraped from opensecrets.org. Readers are invited to explore the complex funding networks that underpin US politics through a JavaScript-based web application that visualizes the sums exchanged by nearly 5,000 candidates, corporations, and PACs during the 2022 election cycle.

Impact reached:

I’ll start with honesty: this project’s metrics would not be impressive. I created it out of passion and because I believed in the importance of the material, but I’m only 18. As a first-year computer science student at Reed College, I simply don’t have the platform to take it much farther than the limited reach of a teenager’s social media accounts. But I’ll tell you what I hope the impact of this project could be.

Growing up in the United States during the years when Donald Trump sat in the Oval Office shook me in ways I still don’t fully understand. Even years later, I viscerally remember sitting with my mother to watch one of the former President’s few press briefings in the White House, and hearing her say, quite softly, “I can’t believe I’m watching this,” with a kind of despair I wasn’t used to hearing from my parents. Looking back with the benefit of hindsight, I think that she, like many Jewish people in the US during that time, felt powerless. When I began researching the 2022 midterms as a student journalist, it was that sense of powerlessness that stuck with me: the fear that a system, a nation, meant to serve its people could still be incomprehensible at times. My desire to change that was what led me to this story, an effort to process thousands of campaign finance records — public, but so scattered and time-consuming to research as to be out of reach of most voters — and make them accessible to all.

Knowledge is power. That is the idea that has always guided me in my journalism work, and I hope that this story, and others like it, can serve to give that knowledge, that power, back to the people.

Techniques/technologies used:

The most important tool I used in the creation of this project was undoubtedly [opensecrets.org](https://opensecrets.org/), a nonprofit journalism site that collects and publishes data on US elections. I scraped the source data for the story from the Open Secrets website and public API using the R programming language, and processed the resulting data in RStudio. A GitHub repository containing a copy of those R scripts, and the complete dataset I ultimately used, can be found under project link 3.

I imported the resulting data on transactions during the 2022 midterms into Gephi, a piece of software that specializes in processing large network graphs, and used it to arrange the data points for representatives and donors with the force-directed Fruchterman–Reingold layout algorithm. Notably, I provided Gephi with no information about the political alignment of any data point, so any appearance of partisan arrangement in the final graph is purely a result of the efficiency calculations the algorithm made based on financial connections. For more information on this process, see the GitHub repository for the standalone interactive.

Gephi generated a network graph in the GEXF file format, which I then transformed into an interactive web application using the React and sigma.js JavaScript libraries and the Materialize CSS library. This standalone application is available under project link 2. Finally, I wrote a cover story based on the data using simple HTML and CSS and published it to GitHub Pages, which serves as the primary link above.
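Gephi performs the layout internally, but the intuition behind a force-directed algorithm like Fruchterman–Reingold can be sketched in a few dozen lines: every pair of nodes repels, every edge attracts, and a cooling "temperature" caps movement so the graph settles. The JavaScript below is a minimal, illustrative sketch of that idea — not the project's code, and not Gephi's implementation; the function name, node IDs, and constants are invented for illustration.

```javascript
// Minimal Fruchterman–Reingold-style force-directed layout sketch.
// nodes: array of string IDs; edges: array of [idA, idB] pairs.
function layout(nodes, edges, { width = 100, height = 100, iterations = 50 } = {}) {
  const k = Math.sqrt((width * height) / nodes.length); // ideal edge length
  // Deterministic initial placement: spread nodes on a circle.
  const pos = nodes.map((id, i) => ({
    id,
    x: width / 2 + (width / 4) * Math.cos((2 * Math.PI * i) / nodes.length),
    y: height / 2 + (height / 4) * Math.sin((2 * Math.PI * i) / nodes.length),
  }));
  const index = new Map(pos.map((p, i) => [p.id, i]));
  let t = width / 10; // "temperature": max displacement per iteration

  for (let iter = 0; iter < iterations; iter++) {
    const disp = pos.map(() => ({ x: 0, y: 0 }));
    // Repulsive force between every pair of nodes: f = k^2 / distance.
    for (let i = 0; i < pos.length; i++) {
      for (let j = i + 1; j < pos.length; j++) {
        const dx = pos[i].x - pos[j].x, dy = pos[i].y - pos[j].y;
        const d = Math.max(Math.hypot(dx, dy), 0.01);
        const f = (k * k) / d;
        disp[i].x += (dx / d) * f; disp[i].y += (dy / d) * f;
        disp[j].x -= (dx / d) * f; disp[j].y -= (dy / d) * f;
      }
    }
    // Attractive force along each edge: f = distance^2 / k.
    for (const [a, b] of edges) {
      const i = index.get(a), j = index.get(b);
      const dx = pos[i].x - pos[j].x, dy = pos[i].y - pos[j].y;
      const d = Math.max(Math.hypot(dx, dy), 0.01);
      const f = (d * d) / k;
      disp[i].x -= (dx / d) * f; disp[i].y -= (dy / d) * f;
      disp[j].x += (dx / d) * f; disp[j].y += (dy / d) * f;
    }
    // Move each node, capped by the temperature, then cool.
    for (let i = 0; i < pos.length; i++) {
      const d = Math.max(Math.hypot(disp[i].x, disp[i].y), 0.01);
      pos[i].x += (disp[i].x / d) * Math.min(d, t);
      pos[i].y += (disp[i].y / d) * Math.min(d, t);
    }
    t *= 0.95;
  }
  return pos;
}
```

Because the forces depend only on which nodes are connected, a layout like this can produce visually meaningful clusters — such as the apparent partisan grouping in the final graph — without ever being told anything about the nodes beyond their financial connections.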

Context about the project:

In an article for Vox published in December 2018, which I read at the time and look back on as a major influence on my high school journalism, David Roberts wrote that “journalism in the late 20th and early 21st century was constrained on the supply side, and that shaped many of the professional practices and social norms around it. … The internet changed all that. There are no longer supply constraints — it is trivially cheap and easy to publish something on the web — and there are virtually no constraints left on the supply of information. Libraries are online. Government records are online. Every public figure’s every move is blogged or tweeted. … [As a result,] there’s more need for explanation. Because they were supply constrained, newspapers and newspaper journalists focused on what was new, what just happened, the incremental development. But lots of times, readers had no way of making sense of those developments or contextualizing them. They were getting the leaves, but they’d never gotten the trunk.”

Even though it’s been years since I first read Roberts’ argument, I can’t imagine placing this story in any other context than that. The data I used was free and open — completely so. In fact, it was published by an organization whose sole mission is to obtain such data and make it as widely available as possible. There were no highly placed sources, no clandestine meetings with Deep Throat in a parking garage (much to the disappointment of my middle-school self). In a way I got lucky: I was born in an era in which I don’t need the connections, access, or experience of a professional journalist to start writing the stories I’m passionate about. But even with a world of data at my fingertips, when I started researching campaign finance early last year I was stumped. The numbers were opaque, the language even more so, and the data tables for individual representatives were complex and hard to place in any larger context. I was only seeing the leaves.
But I believed this story was important, so I fought to understand it, and when I started to see a pattern emerging — a way of visualizing the flow of money behind power as a network of lines and connections — I taught myself the skills I needed to tell the story in a way that would let other people see it too.

What can other journalists learn from this project?

In his excellent book _Breaking News: The Remaking of Journalism and Why It Matters Now_, former _Guardian_ editor-in-chief Alan Rusbridger lays out ten guiding principles of what he calls “post-print open journalism.” His first, and the one I find most interesting, is that open journalism “encourages participation. It invites and/or allows a response.” While I hesitate to say that my own work as a student has much to offer professional journalists, this project has been guided since its beginning by that passage in Rusbridger’s book. The final story presented above was written as explainer journalism on campaign finance, geared toward a college audience, but my own personal sense of the project’s core value lies almost entirely in the interactive web application itself. When I first presented an early version of that application to other student journalists at the Associated Collegiate Press Digital D.C. workshop in July, I was pleasantly surprised by how quickly the presentation format broke down as the audience called out the names of local representatives for me to look up and eagerly asked me to focus on particularly interesting subnetworks and connections. That’s the vision I have for this project: that individual voters could use it, and other visualizations like it, to research representatives and corporations of interest to them and engage in their own investigative journalism. So if there were one idea I wanted other journalists to take away from this project, it would be Rusbridger’s principle that the journalist doesn’t always shape the story. In some cases, readers’ personal use of the tools that we build can be as valuable as the story itself.

Project links: