On Election Night we analyzed and presented the results of each of the approximately 100,000 voting tables throughout the country. We first matched the polling-place data with the voting-table data and geolocated each one. Then we scraped the results from the official forms and published them so users could look up their own table and check the original handwritten document, signed by the table authorities and political party monitors.
This is a relevant news application that empowers users to find their own narrative within the larger dataset.
In Argentina, the designated authority of each of the 100,000 voting tables prepares a form containing the results. The form is then scanned and sent to the counting centers, where the forms are loaded by hand for the vote count.
We saw an opportunity to use technology to transfer control to voters and political party monitors, letting them see, analyze, and compare results in detail. Our interactive data visualization contributed to the transparency of the whole election result, as we could reach and show the detail of every polling station, including the original form completed by hand and signed by the table authorities.
This news application was part of the whole elections package and got more than 245,000 views, providing our users with the tools they needed to sift through the datasets and discover what was most relevant to them.
The backend used Python 3.6 as the main programming language, a PostgreSQL database, and AWS cloud services.
The first step was to relate the database of schools to the database of associated voting tables, which were compiled from different sources. We geolocated the schools using databases from previous elections, comparing the name and address of each school across both databases, and then used the Google Cloud geocoding service to locate or verify the remaining ones. We were thus able to locate about 15,000 schools throughout the country.
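The name-and-address comparison between the two databases can be sketched with simple fuzzy string matching. This is an illustrative sketch, not the production code: the field names (`name`, `address`) and the 0.85 threshold are assumptions.

```python
from difflib import SequenceMatcher


def similarity(a, b):
    """Return a ratio in [0, 1] between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def match_school(school, candidates, threshold=0.85):
    """Return the candidate whose name and address best match `school`.

    `school` and each candidate are dicts with "name" and "address" keys
    (a hypothetical schema for illustration). Returns None when no
    candidate scores above `threshold`.
    """
    best, best_score = None, 0.0
    for cand in candidates:
        score = (similarity(school["name"], cand["name"])
                 + similarity(school["address"], cand["address"])) / 2
        if score > best_score:
            best, best_score = cand, score
    return best if best_score >= threshold else None


school = {"name": "Escuela N 12", "address": "Av. Rivadavia 1200"}
candidates = [
    {"name": "ESCUELA Nº 12", "address": "Av. Rivadavia 1200, CABA"},
    {"name": "Escuela N 21", "address": "Calle Falsa 123"},
]
print(match_school(school, candidates))
```

Records that fall below the threshold are exactly the ones you would then send to a geocoding service (such as Google's) for verification.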
The next step was to collect all the data on election day. We made this process as abstract and customizable as possible because we didn't know how the government would present the results for each school on its website, or how it would upload the PDF files of the telegrams.
Two days after the elections, the results were published. We modified the scraper slightly, began saving the results in a database, and created links to download the telegram PDF files.
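One way to keep a scraper "abstract/customizable" like this is to inject the fetch and parse steps, so only the parser has to change once the real format is known. A minimal sketch, with stub functions standing in for the real site and its (then unknown) format:

```python
import json


def collect_results(table_ids, fetch, parse):
    """Fetch and parse the result page for each voting-table ID.

    `fetch` takes a table ID and returns the raw page (an HTTP GET in
    production); `parse` turns that raw text into a dict. Both are
    swappable, so the pipeline survives format changes.
    """
    results = []
    for table_id in table_ids:
        raw = fetch(table_id)
        record = parse(raw)
        record["table_id"] = table_id
        results.append(record)
    return results


# Stub data standing in for the government site (hypothetical format):
fake_pages = {"0001": '{"votes": {"Party A": 120, "Party B": 95}}'}
results = collect_results(
    ["0001"],
    fetch=lambda tid: fake_pages[tid],
    parse=json.loads,
)
print(results)
```

When the published format turned out to differ from expectations, only the `parse` callable would need rewriting, which matches the "modified a bit" experience described above.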
The last step was to create a JSON file with all the vote information received from each school's voting tables, plus links to their telegrams, so that the frontend could be fed and process the data.
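The shape of that feed might look like the following sketch. The field names (`school`, `lat`, `lon`, `tables`, `telegram_url`) are illustrative assumptions, not the production schema:

```python
import json

# One school entry with its voting tables and telegram links
# (hypothetical schema for illustration).
schools = [
    {
        "school": "Escuela N 12",
        "lat": -34.6037,
        "lon": -58.3816,
        "tables": [
            {
                "table_id": "0001",
                "votes": {"Party A": 120, "Party B": 95},
                "telegram_url": "https://example.org/telegrams/0001.pdf",
            }
        ],
    }
]

# Write the feed the frontend will consume; ensure_ascii=False keeps
# accented school names readable in the file.
with open("results.json", "w", encoding="utf-8") as f:
    json.dump(schools, f, ensure_ascii=False, indent=2)
```

A single static JSON file like this is cheap to serve from a CDN, which matters when an election-night audience arrives all at once.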
The frontend consumes this JSON, which contains the votes cast at each school along with links to download the telegram PDF files. The technologies used were Vue.js, LeafletJS, and Mapbox for the map.
What was the hardest part of this project?
The hardest part was discovering that the government had changed the ID numbers linking polling stations to schools. We had to restart the process and revise all the documentation again. The geolocation data was also corrupted, so we downloaded the schools database again and re-joined the schools and polling stations using SQL. The tables formed a relational database: we had to link every classroom (polling station) to its school and then geolocate them on the map.
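The re-join described above is an ordinary relational join. A minimal sketch using SQLite in place of the project's PostgreSQL, with illustrative table and column names:

```python
import sqlite3

# In-memory stand-in for the production PostgreSQL database;
# schema and names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE schools (
        school_id INTEGER PRIMARY KEY,
        name TEXT, lat REAL, lon REAL
    );
    CREATE TABLE polling_stations (
        station_id TEXT PRIMARY KEY,
        school_id INTEGER REFERENCES schools(school_id)
    );
    INSERT INTO schools VALUES (1, 'Escuela N 12', -34.6037, -58.3816);
    INSERT INTO polling_stations VALUES ('0001', 1), ('0002', 1);
""")

# Link each classroom (polling station) to its school and coordinates.
rows = conn.execute("""
    SELECT ps.station_id, s.name, s.lat, s.lon
    FROM polling_stations AS ps
    JOIN schools AS s ON s.school_id = ps.school_id
    ORDER BY ps.station_id
""").fetchall()
print(rows)
```

Each joined row carries the coordinates needed to place the polling station on the Leaflet/Mapbox map.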
What can others learn from this project?
Never take for granted that your work is half done just because you have databases from past elections.
The population is not static.
Every election (every 2 years in Argentina) you have to refine your data.
We were finally able to polish our databases so we could relate reporting from the 2019 primaries to the 2017 primaries.