The story analyzes the efficiency of aid to Syrian refugees around the world. Analyzing the OECD's QWIDS database for 2011–18, I found that donors backed away from granting resettlement to Syrian refugees fleeing active fighting – a permanent solution for a temporary problem – and instead pumped billions into short-term emergency aid for those displaced and in camps: a temporary solution for a permanent problem. A multiple of that amount was spent protecting borders against illegal immigration.
Usually the impact of stories that discuss a global problem is not quickly or clearly visible, but I hope that my story has changed the way decision-makers around the world deal with refugee issues.
I used Excel for data analysis, which allowed me to sort the data and compare the amounts spent on different aid sectors. I also used Excel charts to display the data visually, which helped me understand trends in the money flows and the interrelationships among the figures. Excel's PivotTable tool was especially helpful for answering some questions, while formulas were better suited to others.
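The same kind of pivot-table question can be reproduced in Python with pandas. This is only a minimal sketch: the column names (`year`, `sector`, `amount`) and the figures are hypothetical placeholders, not the actual QWIDS data.

```python
import pandas as pd

# Fabricated example rows standing in for the real aid dataset.
data = pd.DataFrame({
    "year":   [2017, 2017, 2018, 2018],
    "sector": ["emergency", "resettlement", "emergency", "resettlement"],
    "amount": [120.0, 5.0, 150.0, 4.0],
})

# Total amount per sector per year, analogous to an Excel PivotTable
# with "sector" as rows, "year" as columns, and sum of "amount" as values.
pivot = data.pivot_table(index="sector", columns="year",
                         values="amount", aggfunc="sum")
print(pivot)
```

The resulting table makes the comparison between aid sectors across years immediate, much like the Excel charts described above.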
I used Tabula to scrape data from PDF files into tables that can be handled in Excel; it is especially useful for extracting the data tables published in studies.
To create interactive visualizations, I used free online tools such as Flourish and Datawrapper.
What was the hardest part of this project?
As a Syrian and a refugee in Jordan, I wanted to discuss this issue from the refugee's own perspective and to raise the problems we face in our lives as refugees, while at the same time remaining as professional and unbiased as possible.
Since this story links issues that seem unrelated at first glance, the hardest part of the work was obtaining enough raw data on each angle of the story for analysis. Then the technical problems came to the fore: the dataset I obtained on aid to Syria was massive and broken down into sub-items, and I faced the same situation with the aid to Turkey. So I needed a reliable method to combine all those files without a single error, and that method had to be able to run on my ten-year-old laptop.
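One lightweight way to combine many such files reliably is to script the merge rather than copy-paste by hand. Below is a minimal sketch in Python with pandas; the file pattern (`aid_*.csv`) and columns (`sector`, `amount`) are hypothetical placeholders, not the actual dataset. Tagging each row with its source file makes any error traceable back to the file it came from.

```python
import glob
import os
import tempfile

import pandas as pd

def combine_files(folder: str, pattern: str = "aid_*.csv") -> pd.DataFrame:
    """Read every matching file in the folder and stack the rows,
    recording each row's source file for later error-checking."""
    frames = []
    for path in sorted(glob.glob(os.path.join(folder, pattern))):
        df = pd.read_csv(path)
        df["source_file"] = os.path.basename(path)
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

# Tiny demo with two fabricated files in a temporary folder:
tmp = tempfile.mkdtemp()
pd.DataFrame({"sector": ["camps"], "amount": [10]}).to_csv(
    os.path.join(tmp, "aid_2017.csv"), index=False)
pd.DataFrame({"sector": ["resettlement"], "amount": [2]}).to_csv(
    os.path.join(tmp, "aid_2018.csv"), index=False)

combined = combine_files(tmp)
print(len(combined))  # one row per input file in this demo
```

Reading files one at a time and concatenating at the end also keeps memory use modest, which matters on older hardware.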
I spent two weeks on experiments and tests, during which I learned to convert standard Excel files to the binary format, which makes them about half the size and lighter on the computer. I also learned to break the big problem down into smaller parts and test each part separately to save time.
I tested dozens of plugins, applications, and scripts that work with Excel or Google Sheets to carry out specific tasks and speed up the work.
What can others learn from this project?
Frankly, I can't say on behalf of other journalists what they might learn from this project, but I can say what I learned from it:
Dealing with big data is not a problem: there are always tools and solutions, you just need to find them. Look for them online and ask colleagues who have done similar work.
Think broadly and try to see the bigger picture; your hypothesis may be hiding there precisely because it's so big.
Don't tackle all of your problems at once. Find a way to break the big problem into smaller ones; once you've solved the smaller problems, the big one is gone.
Don't let side problems overwhelm or confuse you; note them down and deal with them individually, step by step.
Take a break for a day or two whenever you finish a large chunk of the work. This will refresh your brain and help you notice what you missed in the previous part. And of course, once you're all done, don't rush to publish the story before re-checking it with a fresh mind.