
The advent of ChatGPT in 2022 brought the potency of Artificial Intelligence to the public eye. However, the critical role AI-powered technology plays in humanitarian assistance, and its associated ethical considerations, are missing from the popular discourse.

The Emergency Telecommunications Cluster (ETC) Chatbot, launched in 2021, provides life-saving information to affected populations in emergencies and connects them to resources through personalized two-way communication.

The ETC Chatbot was born out of a World Food Programme (WFP) initiative, supported by the WFP Innovation Accelerator and Google for Startups, to scale 10 high-potential solutions to end hunger worldwide.

The ETC Chatbot improves upon existing chatbots through cognitive capabilities powered by generative AI and Large Language Models (LLMs).

Recent breakthroughs in Natural Language Processing (NLP) and Machine Learning (ML) allow the Chatbot to constantly improve itself through data collection and processing. As a result, conversations flow more smoothly and the Chatbot can handle more sophisticated topics, including personalized coordination between aid organizations and the affected communities. 

Automation does not inherently lead to better outcomes. Chatbots must therefore be designed for a purpose, informed by the issue at hand and by the relevant political and social context. The computer scientists who develop a bot should strive to engage both humanitarian actors and the population the bot is intended to serve in order to account for specific communication needs.

Since its initial launch, the chatbot has been used in Libya, Iraq, Ecuador and Ukraine. Each time, the chatbot takes on a new name and adopts functions that reflect local circumstances, from war to natural disasters.

In Libya, the chatbot was nicknamed “Mila” (“to chat” in Greek) and offered services in both English and Arabic. Besides helping Libyans access lifesaving humanitarian information, it also provided valuable insights into users’ needs, which humanitarian programs could use to improve their operations. Seeing Mila’s success, the Libyan Ministry of Health later adapted the bot into an official informational tool as part of its COVID-19 response.

Within the first year of Russia’s war on Ukraine, 19,000 people used the ETC-based “vBezpetsi” (“Safe Spaces”) chatbot to access aggregated information on cash assistance, protection, health, shelter, education and job opportunities from different humanitarian partners across national borders. These included organizations from Poland, Hungary, Slovakia, Romania, Moldova, the Czech Republic and Bulgaria.

Although they have proven useful, it is unlikely that these chatbots will fully replace human personnel in emergency responses. The International Federation of Red Cross and Red Crescent Societies (IFRC) identifies chatbots as a “complementary component of a larger ecosystem” that should be tailored to align with the priorities and communication practices of the affected communities.

While the ETC Chatbot has demonstrated great potential, it has limitations. For example, the program depends on affected populations having internet access and may not always meet user expectations in emergency situations.

Empirical evidence shows that chatbot users still expect to converse with a real human being to work through more complex issues and receive personalized responses. The clear expectation for high-quality communication, particularly in urgent scenarios, highlights the need for accessible and abundant human resources in responding to emergencies. Chatbots help by freeing humanitarian workers from routine questions so they can focus on more urgent tasks.

Liv McAuslan (SFS ’25), who is interested in humanitarian work and has worked at USAID and the UN Refugee Agency, believes in the tremendous benefits technology can bring to humanitarian efforts. She highlights, however, that these tools require financial and logistical support to succeed.

“The high cost of technological developments hinders innovation in a time where funding is already so limited for humanitarian organizations. This is where I see a gaping need, but also an extreme opportunity to catalyze private sector partnerships in the humanitarian field,” she said. 

Beyond pragmatic considerations, practitioners and policymakers have to grapple with more complex and nebulous ethical questions. 

The use of the ETC Chatbot could exacerbate “linguistic injustice.” Since the bots are trained on specific information and data, they are better equipped to respond in the most commonly used languages. Less common dialects, indigenous languages and minority languages are often underrepresented in the training data and the bots’ algorithms, leaving the chatbots less capable of serving, and less inclusive of, speakers of those languages.

Ethical concerns over AI-powered chatbots are also exacerbated by the “black box” effect, which describes the difficulty of understanding the technically opaque inner workings of the algorithms that process data inputs. Our limited knowledge of how these systems process information raises concerns over potential biases, inaccuracies and gaps in the output they generate. This lack of clarity and transparency often makes it difficult to detect discriminatory messages and mitigate negative social impacts.

Other ethical challenges that AI-powered technologies present include data privacy violations and potential abuse by malicious actors seeking to exploit marginalized and silenced communities, such as through disinformation.

“When human lives are at stake, there needs to be strict protections in place to prevent misuse and weaponization of AI,” said McAuslan.

While many hidden risks and considerable limitations exist, the ETC Chatbot is a powerful example of how innovation can be strategically used to complement human efforts in delivering critical services and assistance to the people most in need.

Sharon Huang is a sophomore in the School of Foreign Service majoring in Business and Global Affairs. Driven by an interest in international development and refugee and migration issues, she aims to uplift marginal communities through innovative problem-solving informed by data analysis. On campus, she writes about campus affairs and global events at The Georgetown Review and The Caravel and serves as the Director of Events at BridgeUSA. She is a 2023 Global Health Institute student fellow and a 2024-2025 Laidlaw Scholar.