This story is part of WWDC 2022, CNET's full coverage from and about Apple's annual developer conference.
What’s happening
Apple has announced a new Safety Check feature to help potential victims in violent relationships.
Why it matters
This is the latest example of the technology industry grappling with difficult personal-technology problems that have no clear or simple answers.
What’s next
Apple is working with victim and survivor advocacy organizations to identify other features that could help people in crisis.
Among the long-sought and popular new features Apple plans to introduce on the iPhone this fall, such as the ability to unsend iMessage texts and emails, as well as a tool to find and remove duplicate photos, is one that isn't just a convenience: its use can mean life or death.
On Monday, Apple announced Safety Check, a new feature in iPhone and iPad settings designed to help victims of domestic violence. The feature, which arrives this fall with iOS 16, is meant to help someone quickly sever ties with a potential abuser. Safety Check does this by helping a person quickly see who they're automatically sharing sensitive information with, such as their location or photos. In an emergency, it also lets a person easily and quickly disable access to that information, and stop sharing it, on any device other than the one in their hands.
Notably, the feature also includes a highlighted button in the upper right corner of the screen, labeled Quick Exit. As the name implies, it's designed to help a potential victim quickly hide that they were viewing Safety Check, in case an abuser doesn't allow them privacy. If the abuser reopens the Settings app, where Safety Check lives, it will open to the default general settings page, effectively covering the victim's tracks.
"Many people share passwords and access to their devices with a partner," Katie Skinner, Apple's privacy engineering manager, said at the company's WWDC event on Monday. "However, in abusive relationships, this can threaten personal safety and make it harder for victims to get help."
Safety Check and its careful design are part of a larger effort by technology companies to stop their products from being used as tools for abuse. It's also the latest sign of Apple's willingness to build technology that addresses sensitive topics. And though the company says it's earnest in its approach, it has come under criticism for some of its moves. Last year, the company announced efforts to detect child exploitation imagery on some of its phones, tablets and computers, a move critics worried could undermine Apple's commitment to privacy.
Still, victim advocates say Apple is one of the few large companies working publicly on these issues. And while many technology giants, including Microsoft, Facebook, Twitter and Google, have built and deployed systems designed to detect abusive images and behavior on their sites, they have struggled to create tools that stop abuse while it's happening.
The abuse, unfortunately, has worsened. A November 2020 survey of domestic violence practitioners found that 99.3% had clients who had experienced "technology-facilitated stalking and abuse," according to the Women's Services Network, which produced the report with Curtin University in Australia. The organizations also found that reports of GPS tracking of victims had jumped by more than 244% since they last conducted the survey, in 2015.
Amid all this, technology companies like Apple are increasingly working with victims' organizations to understand how their tools can be misused by perpetrators, and how they can help potential victims. The result is features such as Safety Check's Quick Exit button, which advocates say is a sign that Apple is building these features in a way they call "trauma-informed."
“Most people can’t comprehend the sense of urgency” of many victims, said Renee Williams, executive director of the National Center for Victims of Crime. “Apple was very receptive.”
Apple says more than a billion iPhones are used worldwide.
Tough questions
Some of the technology industry's biggest victories have come from identifying abusers. In 2009, Microsoft helped create image detection software called PhotoDNA, which is now used by social networks and websites around the world to identify images of child abuse when they're posted online. Since then, similar programs have been developed to help identify manipulated videos of celebrities, videos of terrorist recruitment, livestreams of mass shootings and other material big tech companies try to keep off their platforms.
As technology has become more prevalent in our lives, these efforts have become increasingly important. And unlike adding new video technology or increasing computer performance, these social questions don’t always have clear answers.
In 2021, Apple made one of its first public moves into victim-focused technology when it announced new features for its iMessage service designed to analyze messages sent to users identified as children, to determine whether the attachments contained nudity. If its system flagged an image, it would blur the attachment and warn the recipient, checking that they wanted to see it. Apple's service would also point children to resources that could help if they were being victimized through the service.
At the time, Apple said it had built the message-scanning technology with privacy in mind. But activists worried that Apple's system was also designed to alert a designated parent if their child chose to view a flagged attachment anyway. That, some critics said, could trigger abuse by a potentially dangerous parent.
Apple's separate effort to detect potential child abuse imagery synced to its photo service from iPhones, iPads and Macs was criticized by security experts, who worried the system itself might be abused.
Still, victim advocates acknowledge that Apple is one of the few companies working on tools designed to support potential victims of abuse while it's happening. Microsoft and Google didn't respond to requests for comment on whether they plan to introduce Safety Check-like features to help victims who may be using Windows and Xbox software on PCs and video game consoles, or Android mobile software on phones and tablets.

Last year, Apple introduced a child safety system in iMessage.
Learning, but a lot needs to be done
The technology industry has been working with victim organizations for more than a decade, looking for ways to build safety thinking into its products. Advocates say much has changed in the last few years as safety teams have grown within the technology giants, in some cases staffed by people from the nonprofit world who had worked on the issues the tech industry is now taking on.
Last year, Apple began consulting some victims' rights advocates on Safety Check, seeking input and ideas on how best to build the system.
"We're starting to see recognition that there's a corporate or social responsibility to ensure your apps can't easily be abused," said Karen Bentley, Wesnet's CEO. She said the problem is especially difficult because, as technology has evolved to become easier to use, it's also become easier to use as a tool of abuse.
That's part of why she calls Apple's Safety Check "brilliant": it can quickly and easily separate someone's digital information and communications from their abuser. "If you're experiencing domestic violence, you're probably experiencing some of that violence through technology," she said.
And though Safety Check has moved from idea to test software, and will be widely available in the fall as part of the iOS 16 software update for iPhones and iPads, Apple said it plans to do more work on these issues.
Unfortunately, Safety Check doesn't address the ways abusers can track people using devices they don't own, for example by slipping one of Apple's $29 AirTag trackers into a victim's coat pocket or car to snoop on them. Safety Check also isn't designed for phones set up under child accounts, for people under the age of 13, though the feature is still in testing and could change.
"Unfortunately, abusers are persistent, and they constantly update their tactics," said Erica Olsen, director of the Safety Net project, a program of the National Network to End Domestic Violence that trains companies, community groups and governments on how to improve safety and privacy for victims. "There will always be more work to do in this space."
Apple said it's expanding training for employees who interact with customers, including salespeople in its stores, so they know how features like Safety Check work and can teach them when appropriate. The company has also created guidelines to help its support staff identify and assist potential victims.
In one case, for example, AppleCare teams are learning to listen for when an iPhone owner calls expressing concern that they don't have control over their own device or iCloud account. In another, AppleCare staff can walk someone through removing their Apple ID from a family group.
Apple also updated its Personal Safety User Guide in January, instructing people on how to reset and regain control of an iCloud account that may have been compromised or used as a tool for abuse.
Craig Federighi, Apple's head of software engineering, said the company will continue to expand its personal safety features as part of its broader commitment to its customers. "Protecting you and your privacy is, and will always be, at the center of what we do," he said.