Transport systems are often unsafe for women and girls. One response to this problem has been the development of AI-powered machine learning applications intended to keep women safe. To ensure the safety of women in public spaces, government agencies, nongovernmental organizations, and women’s rights advocates have launched technology-based initiatives, particularly women’s safety apps. These initiatives seek to give women a sense of security amid threats of sexual harassment and violence in streets and on public transport, while acknowledging women’s vulnerability in public spaces. From SOS alerts, emergency contacts, and location sharing to safe/unsafe navigation alerts and women-only ride-sharing services, these personal safety apps are designed on the premise that a smart digital solution can help women feel safer.

Unfortunately, these AIs, while supposedly helpful to potential victims, are in practice victim-blaming. AI developers and designers build machine learning apps on the assumption that existing transport systems are unsafe for women and that potential victims should be prepared to face violence or harassment. Some apps give women the option to change their routes and routines, whether by avoiding unsafe places or by riding women-only transportation. The assumption is that women are the ones who should avoid unsafe places or vehicles where perpetrators may be present. The data sets compiled for machine learning presuppose that someone must first be exposed to danger before safer routes can be predicted.
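To make this last point concrete, consider a minimal sketch of how such a route-safety predictor might work. The data set, feature choices, and model below are illustrative assumptions, not taken from any actual app; the point is only that the training labels can exist solely because incidents have already happened to someone.

```python
# Minimal illustrative sketch (hypothetical data and features): a route-safety
# classifier can only be trained on records of incidents that already occurred.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row describes a past trip segment:
# [hour_of_day, is_poorly_lit, foot_traffic_level]
past_segments = np.array([
    [22, 1, 0],   # late night, poorly lit, empty street
    [ 8, 0, 2],   # morning, well lit, busy street
    [23, 1, 1],
    [18, 0, 2],
    [ 2, 1, 0],
    [12, 0, 1],
])

# A label exists only because a harassment or violence incident was reported (1)
# or not (0) on that segment, i.e., someone was already exposed to the danger
# the app claims to predict.
incident_reported = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(past_segments, incident_reported)

# Scoring a new route segment: the "safer route" recommendation is derived
# entirely from other women's prior victimization data.
new_segment = np.array([[21, 1, 0]])
print("Predicted risk:", model.predict_proba(new_segment)[0, 1])
```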

The majority of personal safety apps focus on intervening at the time of a criminal event or after it (Maxwell et al. 2020, p. 7). Women would already have been harassed or exposed to danger before the apps could actually protect them! By implication, women are continually conditioned to avoid going out at night, walking in dark alleys, or riding in male-driven taxis. Rather than being empowered to make their own commuting or transportation choices, women are made to adjust to would-be offenders. Women are held responsible for their own safety, expected to be careful, and thereby blamed for not being careful. Victims may even be blamed for not having or using these safety apps!

While the apps may ease women’s fear, they normalize sexual violence. They also put the burden on women to protect themselves. This contributes to a victim-blaming culture wherein “the victim(s) of a crime or an accident is held responsible—in whole or in part—for the crimes that have been committed against them” (Canadian Resource Centre for Victims of Crime 2021), and in the case of the apps, the crimes that will be committed against them. This is similar to other victim-blaming cases wherein women are blamed for harassment because of how they dress, or because of their body shape or personality. This type of victim-blaming culture also prevents women from sticking to their usual travel routes and routines. Some machine learning platforms even segregate women commuters and drivers. Such programs are not sustainable assurances of safety or protection from harassment; segregation also deepens gender divides and reinforces gender inequality, making long-term equality between men and women harder to achieve. Women’s safety apps with tracking features further normalize the idea that others have a right to know where a woman is at all times. This may also enable another form of violence against women, in which abusers stalk women and restrict their movement and mobility.

Existing AIs do not address the fundamental issue of gender inequality that causes violence and harassment, the deeply ingrained attitudes toward the treatment of women, the notion that women need protection from men, or harmful gender norms of masculinity. Existing women’s safety apps do not truly empower women to reclaim their mobility. They are victim-blaming AIs that expect women to adjust to situations and take it upon themselves to install apps to protect themselves against would-be perpetrators. AI-driven women’s safety apps also normalize the idea that women are unsafe in public. They do not tackle the underlying issue of perpetrators’ violence against women. Rather than empowering women to take full control of their mobility, these apps normalize violence and reinforce victim-blaming mentalities.

For future AI interventions to work, the challenge is to rethink the motivational models behind them. Since existing apps do not substantially reduce women’s vulnerability to victimization while in transit, the underlying frameworks need to be redirected. If victim-blaming thinking and the normalization of violence are to be revised, responsibility for the safety of transit systems and the burden of protection should rest not on potential victims but on the institutions that are supposed to protect them. Taking this revision further, it is the perpetrators and potential abusers, not the women in transit, who should be the ones in fear, since they are the ones violating women’s rights!

AI-driven apps can empower women if they shift the burden onto potential perpetrators. Such apps should focus on enabling women to gain control over their own safety, putting the transiting female community in a proactive stance by seeking ways to deter would-be perpetrators of violence rather than participating in victim-blaming.