Apple recently rolled out its iOS 17 update for iPhones, which includes a new feature that is raising alarm bells for police departments across the country. Police are warning parents that the iPhone’s new NameDrop feature, which allows the exchange of images and contact information wirelessly with a nearby iPhone or Apple Watch, could put children in harm's way.
The new feature is turned on automatically after updating an iPhone to iOS 17. It allows iPhones (and certain Apple Watches) to exchange information from the contact card attached to the device, which can include personal information such as the user's image, phone number and email address, when two devices are brought within centimeters of each other.
Police departments across the nation, including those in Chester, Pa., Oakland, Calif., and Greenville County, S.C., have issued warnings reminding parents to shut off the setting, as it can potentially put children at risk of having their privacy violated.
“If you have an iPhone and have done the recent iOS 17 update, they have set a new feature called NameDrop defaulted to ON. This feature allows the sharing of your contact info just by bringing your phones close together. To shut this off go to Settings, General, AirDrop, Bringing Devices Together. Change to OFF. PARENTS: Don’t forget to change these settings after the update on your children’s phones, also, to help keep them safe as well!” wrote the Chester Police Department in a Facebook post.
Apple features have come under fire for putting children’s safety at risk in the past. In an attempt to strengthen safety for children using its devices, Apple released expanded child safety protections in 2021, which include a Communication Safety feature that scans for nudity in images sent or received in Messages. “Children will be warned when they receive or attempt to send images or videos containing nudity in Messages,” said Apple on its website.
The feature later faced backlash as privacy concerns arose over the technology scanning images on users' devices, with users citing fears that it could be used for purposes other than child safety.
“When people hear that Apple is ‘searching’ for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and ’1984,’” Ryan O’Leary, then-research manager of privacy and legal technology at market research firm IDC, told CNN in 2021. “This is a very nuanced issue and one that on its face can seem quite scary or intrusive. It is very easy for this to be sensationalized from a layperson’s perspective.”