Women's safety is a pressing concern in many countries. The threat is not limited to walking the streets alone at night or driving alone; physical assault is only part of the picture. Technology, too, can play the villain: the archetypal stalker.
The coronavirus pandemic has made matters worse. According to UN Women, online harassment of women and girls rose globally in the past year, often at the hands of abusive partners or ex-partners stuck at home in front of a screen during lockdowns.
Victims are left constantly walking on eggshells; the prospect of being shamed on social media is both nightmarish and humiliating.
As worldwide restrictions push more people online, digital gender-based abuse is likely to worsen: the internet has become an absolute necessity, and there is no escaping it.
Nowhere is this more apparent than in South Asia, where authorities have introduced harsher penalties and expanded surveillance networks, including facial recognition systems, to prevent such crimes.
The problem is that there is no guarantee the data police collect from victims will not be used against them.
In Pakistan, police launched a mobile safety app after a gang rape. In India's Lucknow city, police said earlier this year that they would install cameras with emotion recognition technology to spot women being harassed.
But there is no evidence that these technologies help reduce crime. In the absence of data protection laws, alarm bells are ringing among privacy experts and women's rights activists, who warn that surveillance is expanding and can hurt women even more.
There is a strong chance that, rather than being led to safety, women are walking into dangerous territory. If the technology is misused, it could expose them to further harassment, including from the police themselves.
All this in a country where a rape is reported every 15 minutes, and where crimes against women nearly doubled to more than 405,000 cases in 2019, from about 203,000 in 2009.
Lucknow is one of eight cities implementing a Safe City project that aims to create a “safe, secure and empowering environment” for women in public places.
But the project – alongside the 100 Smart Cities programme, which relies on technology to improve services – is reportedly being used to sharply expand surveillance. By disproportionately targeting women, authorities are creating new problems in a society where women are already constantly tracked in their homes.
Worldwide, the rise of cloud computing and artificial intelligence technologies has popularised the use of facial recognition for a range of applications from tracking criminals to admitting concert-goers.
But technology and privacy experts say the benefits are unclear and that such systems invade people's privacy. Without data protection laws, there is little clarity on how the data is stored, who can access it and for what purpose.
It is ironic that the very technology meant to protect women may end up endangering them. Rather than empowering women, it promotes the idea that they need to be watched for their own safety. Yet across Asia, it is being widely deployed.
The downsides are already visible. In 2019, images of couples travelling in vehicles, captured by cameras in Islamabad, were leaked; the same year, women at Balochistan University said officials blackmailed and harassed them using images from campus CCTV cameras.
Following last year's gang rape on a highway monitored by CCTV cameras, the Punjab Police launched a mobile safety app that collects the user's personal information when she sends an emergency alert to the police.
That includes access to phone contacts and media files, leaving women vulnerable to further harassment, say privacy rights groups.