
Editor’s Note: Welcome to my weekly column, Virtual Case Notes, in which I interview industry experts for their take on the latest cybersecurity situation. Each week I will take a look at a new case from the evolving realm of digital crime and digital forensics. For previous editions, please type “Virtual Case Notes” into the search bar at the top of the site.

On July 2, 2017, police in Bernalillo County, New Mexico responded to a late-night 911 call. Upon arrival, they found a bruised woman and her young child outside—the woman said she’d been hit in the face with a handgun and that her life had been threatened by her partner, who had barricaded himself inside their home. After six hours of negotiations and assistance from a police dog, 28-year-old Eduardo Barros was arrested and ultimately charged with aggravated battery, aggravated assault and possession of a firearm by a felon.

What may sound like a distressing but, unfortunately, not uncommon domestic violence situation became a more intriguing story when police reported the supposed source of the 911 call—according to the Bernalillo County Sheriff’s Office in their official press release on the incident, the call was triggered by the smart home device and voice-activated assistant known as “Alexa.”

“During the altercation Eduardo asked the victim, ‘Did you call the sheriff?’ This question, based on the victim’s statements, prompted a smart home device known as ‘Alexa’ to contact law enforcement,” the release states. “In the 911 recording the victim can be heard yelling, ‘Alexa, call 911.’”

Since these details were released, the plausibility of Alexa—which was developed by Amazon and comes integrated with the company’s “Echo” smart home device—calling 911 has been widely disputed. Alexa and the Echo don’t have full calling capabilities, according to the Associated Press, and neither does another popular smart home device, the Google Home.

Accurate or not, the story has generated some interest in the potential for smart devices to intervene in domestic violence situations.

“We see both risks to safety and strategies for good use out of most technology,” said Erica Olsen, director of the Safety Net Project at the National Network to End Domestic Violence, in an interview with Forensic Magazine. Safety Net, which has been in place since 2002, is a project that focuses on “all things technology as it impacts survivors of abuse and harassment/stalking,” Olsen said.

“We look at the ways that abusers misuse technology as a tactic of abuse, often to monitor and stalk or harass and control a victim,” Olsen explained. “We also look at the ways that survivors can strategically use technology to increase their privacy and their safety.”

As smart technology advances, and devices such as smartphones, smart home assistants, wearables and smart home security systems become increasingly common, both the potential risks and the potential benefits for domestic abuse survivors evolve.

One product of the wide proliferation of smart technology is the creation of mobile apps specifically meant to intervene or collect evidence in domestic abuse situations. One of the best known apps of this kind is the Aspire News app, created in 2013 by Robin McGraw, founder of the survivor advocacy foundation When Georgia Smiled and wife of television personality Dr. Phil McGraw.

The app was designed to be disguised as a regular news app from which users could discreetly record audio during a domestic abuse situation as well as secretly reach out to contacts for help. However, Olsen told me the app no longer seems to be available on the iOS App Store, and when downloaded on an Android device, shows only a blank screen with none of the tools and resources it is supposed to offer.

“Unfortunately, it’s one of many apps that we saw kind of come on the market a few years ago when this was a brand new idea, and then kind of lose momentum and interest. And they just have not been updated or maintained,” Olsen said. She noted this lack of maintenance, among other issues, as a major problem with many apps that promote themselves as being able to intervene in domestic violence. “Technology that is meant for people’s safety, we have got to make sure that we are doing it for the long haul. That these technologies are never going to be something that we can just put out on people’s phones and not need it to be maintained or updated.”

Another major problem with apps like Aspire News, she added, is that they can give survivors false hope by not making their limitations clear.

“My biggest concern is usually that they are not as transparent in their limitations to the user,” she said. “I remember seeing one a couple years ago (…) that said it was an emergency safety app, that you can contact police and emergency from the click of a button. And I remember it saying on the website, ‘It’s like having a police officer in your back pocket!’ and I was thinking, ‘No, it’s absolutely not like that!’

“We found that most of the emergency ones are not helpful. I’ll be pretty blunt about that,” she added. She said it’s much more effective, in an emergency situation, to dial 911, as most smartphones allow users to bypass their security locks to do so.

Olsen also noted that the increasing use of spyware by abusers to track activity on their targets' devices can render the disguise strategy useless, as the abuser can see exactly what the victim is doing on the app.

On its website, the Safety Net Project has compiled an App Safety Center which provides more information about app safety for survivors, as well as considerations for app developers on how they can best help survivors.

Beyond mobile app development, Olsen said the developers of smart devices themselves—and other smart systems—can improve the situation for survivors by being mindful that both perpetrators and victims of domestic violence will inevitably be in the user base of such widespread technology. She used smart home security systems as one example.

“A lot of these technologies are kind of being developed with the assumption that your home is the safest place. That the people who live with you in your home would never misuse these technologies, so by default there’s all access, and sharing, and control that’s given to the people that are in your home,” she said. She explained that many smart home security systems have one main account holder with total control over the system, and that the process for removing that user’s access is often not designed for situations where such access can be dangerous.

“What happens if somebody (who) sets this account up has access to the locks of the front door, can turn off the security system, can turn on surveillance cameras and see what’s happening in or outside of the home? What happens when that account holder is not supposed to be in the home anymore? Even if there’s an order of protection in place kicking that person out? But they still have access based on this device and the system,” she said.

On the flip side, smart home security can be a tool for survivors to protect themselves from stalkers, harassers and former abusers, and generate evidence for a criminal case.

“The smart home security systems can give people either a wonderful peace of mind, or information, evidence (and) documentation of abuse and stalking,” she said.

As for voice recognition technology and intelligent personal assistants like Alexa, they could be used to help survivors even without the ability to directly call 911. These assistants can be an avenue for survivors to seek out and pinpoint resources to aid them in their situation. But that avenue must be opened up by developers, who must stay aware of how important such resources can be.

“Siri was all over the news a few years ago because it didn’t respond if you asked for a rape crisis phone number, or suicide hotline, and they made those changes,” Olsen said. “There (is) some thought about intervening in crisis situations, but I think even before we go there, there’s just the fundamental ‘How are people using these technologies?’ (…) If this is what they’re using daily, they’re going to be using it to try to get that information and that hotline. ‘Am I being abused? What is abuse?’ People are going to ask those questions of these devices.”
