Imagine swiping open your phone to find a new message. It's from Yeshi, a young Ethiopian girl who is going to take you on a journey to collect water. It will take two and a half hours, the average time that millions of young African women spend trekking to water sources. Along the way you will see pictures of the bright yellow jerry cans she carries and videos of her dramatic, arid surroundings, hear clips of the music in her village, and receive messages updating you on how she fares in the oppressive heat. You can send messages and pictures back, and Yeshi marvels at how different your worlds are.

Yeshi, however, is not real. She is a Facebook Messenger chatbot, launched by Charity:Water and Lokai primarily to raise awareness of the Ethiopian water crisis. Her wider aim is to instil an unprecedented degree of empathy in her users. Chatting with Yeshi is the closest you will get to walking miles alongside a young girl tasked with providing water for her family instead of receiving an education. While you are sheltered from the stifling heat and dust that the real girls Yeshi represents actually face, this form of AI immerses you in a story taking place every day, thousands of miles away. Her story could not be further from your own, yet it unfolds right in your palm.

Yeshi also represents how data innovation is weaving itself into the solving of complex global issues and, fortunately, breaking away from the archaic practices of international development. Shock ads fetishising hungry children in faraway countries have no place in how we communicate global issues. Technologies such as AI can tell these stories with compassion, not condescension.

As such, data can do more for development. Countries such as China, Nigeria, and India have exploding populations and overburdened national infrastructures, yet they leapfrog the West technologically in a number of ways. Data-driven innovation is therefore poised to create tsunami-scale social change in contexts where standard methods of development simply cannot scale.

Take UK start-up Babylon, which partnered with the Government of Rwanda in 2016 to launch the 'babyl' app. Any Rwandan with a mobile phone can manage their prescriptions, have video consultations, and access general health information. With 73% of the Rwandan population owning a phone and no prior national health coverage, it is no surprise that 231,000 people signed up in the first four months. Suddenly, national infrastructure such as a healthcare system can be devolved to millions of people. These technologies therefore have a democratising impact. Babylon's own website states its aim as being 'to put an accessible and affordable health service in the hands of every person on earth.'

However, there is a dilemma: the countries with the most to gain from data-driven innovation typically have the weakest means of ensuring that this data is not abused. India, for example, has no code of ethics for AI or data science, and only a minority of its citizens have access to justice in any case, yet huge amounts of sensitive data are being used in development projects all across the country.

In the case of babyl, all clinical records are stored and available to patients whenever they need them, including notes, images, videos, and audio of consultations. Who carries the onus of ensuring that this private information is not harvested unethically? Who is stopping this personal information from being sold on to pharmaceutical companies capitalising on emerging markets, or fed back to the government for political purposes? While Babylon may have altruistic company policies, what if a less empathetic equivalent were to abuse the data rights of those living in the countries it claims to be helping?

The private sector continues to define what data ethics are, and does so on its own terms. In 2016 Avanade found that 84% of multinational firms say they will invest in digital ethics over the next five years, yet it is unlikely these firms will follow their codes as meticulously in Bangladesh as they would in the USA if they know legal accountability is minimal. Technological advances such as AI demand a constant stream of personal data in order to innovate and better deliver their development aims. The private companies leading these initiatives are therefore granted the personal information of thousands of people, with no safeguards in place to ensure that this information is used ethically.

It would be patronising, however, to hold up the Western handling of data rights abuses as an example to be proud of. The Cambridge Analytica scandal left people divided on how much they actually ought to care about their personal data being used by private firms, while the US congressional hearings of Mark Zuckerberg made prize meme material. Although imperfect, the hearings provided some degree of accountability, and Facebook's fine this week from the UK's data watchdog reminds us that someone, somewhere, is paying attention on our behalf. What can be said, however, for countries hosting global private firms that use intimate data in their development projects but ultimately play by their own rules?

Ethical handling of our data should be a right. Progress for the sake of progress, without strong safeguards in place, could leave the fate of our information in the hands of the data companies themselves.