I am 27 years old and have four years of experience as a data journalist in Brazil. During that time, I have produced several journalistic investigations and taught others techniques for analyzing and scraping data. Throughout my career, I have always tried to teach the little I knew, which also made me learn a lot. I am frequently invited to give free lectures and courses, which I do with pleasure. Through Colaboradados, a platform I created, I write tutorials and teach programming techniques to people for free. I believe I deserve this award also because I represent a minority in my country: female data journalists. In Brazil, the field is predominantly male, and it is not uncommon for my abilities to be questioned simply because I am a woman. Winning this award would be a recognition of my work not only as a data journalist, but as someone who strives at all times to make data journalism accessible to everyone.
Description of portfolio:
I have been a data journalist since 2018, with a primary focus on public administration data. I am currently a producer at GloboNews, a major Brazilian television news company, where I focus on producing stories built on cross-referencing datasets, mainly those made available by public bodies.
I am also the director and co-founder of Colaboradados, a collaborative outlet covering government transparency and data journalism in Brazil. In the project, I developed several ways of approaching these issues and making them more accessible to civil society, and the project was cited as a reference for open data in the Open Data Project of the CGU (Comptroller General of the Union), the federal government agency responsible for defending public assets, promoting transparency, and fighting corruption.
In the project, we publish reports on the state of open data in Brazil. In one of these reports, we showed how government websites of cities in the state of Alagoas were mining cryptocurrency on users' computers. In another, we investigated how several transparency portals of Brazilian municipalities had flaws that allowed anyone to add, modify, and even delete public data. This second report prompted the Brazilian Public Prosecutor's Office (Ministério Público) to begin mapping other sites with the same vulnerability elsewhere in Brazil.
I was also responsible for creating and programming a bot that monitors when government portals become unavailable to the public. The bot, known as @colabora_bot on Twitter, notifies its followers whenever these portals are inaccessible and urges society to demand that public bodies keep them running. The bot's approach to monitoring government portals has drawn international attention, and the project was a finalist in the Sigma Awards, the world's largest data journalism award.
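The core idea behind a portal-availability bot like this can be sketched in a few lines. This is a minimal illustration, not @colabora_bot's actual code: the portal URL and the `build_alert` helper are hypothetical, and a real bot would post the alert via the Twitter API rather than printing it.

```python
import urllib.request
import urllib.error

# Hypothetical list of government transparency portals to watch.
PORTALS = [
    "https://example.gov.br/transparencia",
]

def is_available(url, timeout=5):
    """Return True if the portal answers with a successful HTTP status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False

def build_alert(url):
    """Compose the notification text a monitoring bot might post."""
    return f"Portal fora do ar: {url}"

if __name__ == "__main__":
    for url in PORTALS:
        if not is_available(url):
            # A real bot would publish this message to Twitter on a schedule.
            print(build_alert(url))
```

In practice, such a bot runs this check on a schedule (e.g. via cron), keeps track of each portal's last known state, and only posts when the state changes, to avoid flooding followers with repeated alerts.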
Through my work with Colaboradados, I have been closely tracking, since February 2019, how public data is made available to society. From file formats that do not work to basic information that is simply missing, public administration has many problems in releasing public data, which harms not only journalists' work but also society's ability to monitor the government.
I also spent time at UOL, in 2019, another major Brazilian news outlet. There, my mission was twofold: to produce data-driven stories and to help the rest of the newsroom understand what data journalism was. At UOL, I wrote pieces such as an analysis showing how the Oscars have always favored men and a report on the faces of the first coronavirus victims in Brazil, highlighting the stories behind the pandemic's numbers.
Finally, I worked as a fact-checker at the agency Aos Fatos. There, I spearheaded a journalistic investigation into the state of the infrastructure works the Brazilian government had promised, four years after the 2014 World Cup in the country. In the middle of this investigation, the government took the relevant data offline in an attempt to censor our report. Using data scraping and analysis, we continued the investigation and denounced the censorship. Later, I managed to get the data restored to the public.