2021 Shortlist

New Indonesian law on wildlife crime

Country/area: Indonesia

Organisation: Earth Journalism Network, Oxpeckers.org, Haluan.co

Organisation size: Small

Publication date: 11 Sep 2020

Credit: Rezza Aji Pratama (writer and researcher), Wan Ulfa Nur Zuhra (data visualization), Fiona Macleod

Project description:

This project examined dozens of court verdicts to see how suspects in wildlife crimes were punished. Because the Conservation Act of 1990 is outdated, I found that most verdicts were too lenient and failed to create a deterrent effect. One case, however, resulted in an unusually severe sentence of up to four years in jail and a fine of Rp1 billion (US$68,000). That verdict, the harshest punishment ever handed down for wildlife crime in Indonesia, was issued by the Pekanbaru Court against four members of a tiger-cub trafficking syndicate operating between Malaysia and Indonesia.

Impact reached:

The project had various impacts on stakeholders. The Pekanbaru Court's use of the Quarantine Act of 2019 inspired other prosecutors to charge wildlife criminals under the same act. In July 2020, for example, the Tanjung Karang Court sentenced two smugglers of a hundred exotic birds under the Quarantine Act. Over the course of the year, use of the new act became more common in the fight against wildlife crime.

Moreover, the project added pressure on the government and the House of Representatives to consider revising the Conservation Act of 1990. Although the bill had been included in the 2015-2019 National Legislation Programme, it was continually hampered and was finally withdrawn in May 2019. Experts say reform of the Conservation Act of 1990 is urgently needed to create a deterrent effect against wildlife crime in the country. The lenient verdicts uncovered by this project, a consequence of the outdated law, became a reminder to stakeholders to prioritise reform of the Act.


Techniques/technologies used:

The main data sources for this project were court verdicts, which are publicly accessible. To collect the data, I scraped the website of the Supreme Court and transferred the results into a spreadsheet. I made a long-list of the court verdicts and categorised them into a clean, structured dataset.
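The submission does not describe the scraping code itself, so the following is only a minimal sketch of the kind of pass involved: parsing a results listing into (case number, court) rows. The HTML markup, class names, and case numbers here are invented for illustration; the real Supreme Court pages are structured differently.

```python
from html.parser import HTMLParser

# Invented sample markup standing in for a scraped search-results page.
SAMPLE = """
<table>
  <tr><td class="case">123/Pid.B/LH/2019/PN Pbr</td><td class="court">PN Pekanbaru</td></tr>
  <tr><td class="case">45/Pid.Sus/2020/PN Tjk</td><td class="court">PN Tanjung Karang</td></tr>
</table>
"""

class CaseListParser(HTMLParser):
    """Collect (case_number, court) tuples from labelled table cells."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished (case_number, court) tuples
        self._current = []    # cells seen so far in the current row
        self._field = None    # class of the <td> we are inside, if any

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._field = dict(attrs).get("class")

    def handle_data(self, data):
        if self._field in ("case", "court") and data.strip():
            self._current.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "tr" and self._current:
            self.rows.append(tuple(self._current))
            self._current = []

parser = CaseListParser()
parser.feed(SAMPLE)
print(parser.rows)  # rows ready to write out as spreadsheet lines
```

Each tuple can then be appended to a CSV file, which is the long-list the project describes.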

After recording the details of each case, I examined the sentencing section and began analysing the data. This technique helped me find the case of the Irawan Shia syndicate, which received the highest sentence from the panel of judges. I was finally able to identify the main reason why the four members of the syndicate were fined ten times more than in other, similar cases. I also turned the spreadsheet data into visual graphics with Flourish Studio.
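A comparison like the one described above can be sketched in a few lines once the verdicts sit in a structured table. The case names and all figures except the Irawan Shia sentence (four years, Rp1 billion) are made up for illustration:

```python
# Illustrative rows; only the Irawan Shia figures come from the story itself.
cases = [
    {"case": "Case A", "jail_months": 8, "fine_rp": 50_000_000},
    {"case": "Case B", "jail_months": 12, "fine_rp": 100_000_000},
    {"case": "Irawan Shia syndicate", "jail_months": 48, "fine_rp": 1_000_000_000},
]

# The harshest verdict: longest jail term, then largest fine as a tiebreaker.
harshest = max(cases, key=lambda c: (c["jail_months"], c["fine_rp"]))

# How many times larger is its fine than the median fine in the sample?
median_fine = sorted(c["fine_rp"] for c in cases)[len(cases) // 2]
ratio = harshest["fine_rp"] / median_fine

print(harshest["case"], ratio)
```

With real data the same comparison surfaces outliers such as the Pekanbaru verdict immediately, which is what made the ten-fold fine stand out.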

To enrich the story, I also used StoryMap JS to give readers a better understanding of the Irawan Shia syndicate. The scrolling map traced the syndicate's journey from its origin in Malaysia across to Rupat Island, before its members were finally arrested by the police in Pekanbaru.


What was the hardest part of this project?

The main obstacle was access to the data. Although the data was available on the Supreme Court website, it was uncategorised and not very user-friendly. I had to enter a keyword and then select the cases related to wildlife crime. I found this exhausting, since thousands of unrelated cases kept appearing for each keyword and had to be eliminated by hand.
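Part of that elimination can be automated with a simple keyword filter over the case titles. The keywords and titles below are hypothetical examples, not the actual search terms used:

```python
# Hypothetical wildlife-related keywords: satwa (wildlife), harimau (tiger),
# trenggiling (pangolin). Titles are invented for illustration.
KEYWORDS = ("satwa", "harimau", "trenggiling")

titles = [
    "perdagangan kulit harimau sumatera",   # tiger skin trade -> keep
    "pencurian kendaraan bermotor",         # motor vehicle theft -> drop
    "penyelundupan sisik trenggiling",      # pangolin scale smuggling -> keep
]

# Keep only titles containing at least one wildlife keyword.
wildlife = [t for t in titles if any(k in t for k in KEYWORDS)]
print(wildlife)
```

A pass like this narrows the long-list before the slower manual review of each verdict.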

Once I found a wildlife case, I had to download the trial record as a PDF. The next step was to extract the important information from the PDF, which ran to dozens of pages, and put it into a spreadsheet. I then had to repeat this collection process for dozens of other cases. There are in fact hundreds of wildlife cases on the Supreme Court website, but because of the deadline and limited human resources, I was only able to collect 50 of them.
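The step of pulling the key figures out of each verdict can be sketched as a pattern match over the extracted text. The snippet of verdict wording below is a simplified stand-in; real Indonesian court PDFs are far longer and less regular, and would first need text extraction with a PDF library before a pass like this:

```python
import re

# Simplified stand-in for text extracted from a verdict PDF:
# "sentences to 4 (four) years in prison and a fine of Rp1,000,000,000".
verdict_text = """
Menjatuhkan pidana penjara selama 4 (empat) tahun
dan denda sebesar Rp1.000.000.000
"""

jail = re.search(r"pidana penjara selama (\d+)", verdict_text)
fine = re.search(r"denda sebesar Rp([\d.]+)", verdict_text)

row = {
    "jail_years": int(jail.group(1)),
    # Indonesian numbers use "." as the thousands separator.
    "fine_rp": int(fine.group(1).replace(".", "")),
}
print(row)
```

Each extracted `row` then becomes one line of the spreadsheet, so the manual step shrinks to checking the matches rather than retyping every figure.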


What can others learn from this project?

As far as I know, I was the first journalist in the country to scrape the Supreme Court website, collect dozens of trial records, and produce stories based on that data. Some journalists might download a single case document from the website; however, working at this scale and building my own database in a spreadsheet is not yet common journalistic practice in the country.

This technique has been used by some non-profit organisations in their research. However, NGOs usually publish only the results, which are not always compatible with journalistic needs.

By doing it myself, I had raw data that helped me build specific angles for my stories. Journalists can apply this technique to many kinds of issues; one of my colleagues, for example, used a similar approach to uncover corruption verdicts. I believe that by gathering and building our own databases, journalists gain a better understanding of every issue.


Project links: