

Each time technology fails, more women are put at risk

Apple’s group FaceTime glitch meant users could access microphones and cameras of other iPhones. It’s another danger victims of domestic abuse will have to face, says Emily Baker

By Emily Baker

Apple is currently dealing with a crisis, thanks to the newly introduced group-chat feature of FaceTime. Yesterday, iPhone users began to notice a glitch that allowed them to secretly listen in via other people’s microphones and, in some cases, watch live footage from their cameras without the other person’s permission. While Apple has disabled group FaceTime while it works on a fix (which it says will be available later this week), it’s a worrying reminder of just how insecure the technology we use every day can be, and how vulnerable it can make us. It’s even more troubling for the millions of women at risk of domestic abuse.

Charities and campaigners for women’s rights have warned time and time again of the dangers technology can pose to women, especially when it comes to stalking. Research by Women’s Aid shows 18% of women have experienced stalking in some capacity since they were 16 years old, and almost a third of respondents’ abusers had used spyware or GPS location tech to aid their stalking. This research was carried out in 2014 – five years ago now – and the technology in our pockets has come on in leaps and bounds since then.

Take our willingness to welcome the bodiless, omniscient assistant Alexa into our homes, for example. Sure, she’s brilliant for letting us know what the weather is like outside and she’s more than happy to order our shopping straight from Amazon, but the programmed entity in the corner of the living room also has no conscience. Home assistants and connected technology will do anything they are told to, as Ferial Nijem found out when she realised her ex-boyfriend was using their home assistant and its connection to her “smart home” to terrorise her. He would turn on loud music during the night, turn the lights on and off, open the blinds and then shut them again, and use the home’s security cameras for surveillance. “If anybody would walk into this situation, they would think they were walking into a horror movie,” said Nijem.

Apple’s group FaceTime glitch hands abusers a host of tools and cannot be taken lightly. The most apparent use of access to a victim’s microphone or camera is surveillance – to find out where they are, what they’re doing, who they’re with. But another, more insidious yet just as dangerous use is to collect information that can be turned against the victim, to bully them and to gaslight them. While Apple may have caught the bug early and taken steps to quash the problem quickly, the fact that the feature was released into the world without these aspects of the technology being triple-checked shows how far down the priority list safety is for tech companies.

It’s another sign of our digital lives and our real, tangible lives merging, with our “digital footprints” – the information we give out online, whether publicly or privately – having consequences beyond the parameters of the internet. Internet safety expert Jennifer Perry’s guide for victims of digital stalking, published in conjunction with Women’s Aid, tells those who have been stalked to limit their so-called digital footprint as much as they can. But that now-outdated advice can only go so far when technology has advanced enough to activate features without us even knowing. What’s more, the government’s recent update to domestic-abuse laws only touches upon the issue of online abuse, instead choosing to push the issue into the online harms white paper, which isn’t due for presentation until later in the year.

But we cannot afford to wait. As technology progresses, so does the ability of abusers to exploit that advancement, and it should not be down to women and other victims of domestic abuse to limit their own usage. Tech companies have to take responsibility for their wrongs, and they have to do more to protect those most vulnerable.

