FROM THE FTC: Out of the mouths of babes? FTC says Amazon kept kids’ Alexa voice data forever – even after parents ordered deletion

https://www.ftc.gov/business-guidance/blog/2023/05/out-mouths-babes-ftc-says-amazon-kept-kids-alexa-voice-data-forever-even-after-parents-ordered?utm_source=govdelivery

Business Blog

Out of the mouths of babes? FTC says Amazon kept kids’ Alexa voice data forever – even after parents ordered deletion

By Lesley Fair | May 31, 2023


“Stop it!” Moms and Dads may have to repeat that instruction to their kids, but when parents said it to Amazon in an effort to get the company to delete children’s voice data obtained through its Alexa voice assistant, Amazon should have honored those requests immediately. But according to a complaint filed by the Department of Justice on the FTC’s behalf, Amazon responded by deleting files in some databases while maintaining them elsewhere – meaning the information was available for Amazon to use for its own purposes. The lawsuit alleges Amazon violated the Children’s Online Privacy Protection Act Rule by flouting parents’ deletion requests, retaining kids’ voice recordings indefinitely, and not giving parents the straight story about its data deletion practices. Amazon also allegedly violated the FTC Act by falsely representing that Alexa app users could delete their geolocation information and voice recordings and by engaging in unfair privacy practices related to deletion, retention, and employee access to data. The $25 million settlement with Amazon sends a clear message about the consequences of putting profits over privacy.

In addition to the massive amount of other information Amazon collects about customers, the company has found a direct route into millions of consumers’ homes through its Alexa-powered devices, which respond to users’ spoken requests. First, some background about how consumers of all ages interact with the Alexa technology. When Alexa devices detect someone saying the “wake” word, Alexa begins recording what it hears in two formats: an audio file and a text transcript. Alexa then responds by performing the requested tasks.
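
To make that two-format point concrete, here is a minimal, purely illustrative Python sketch of how a voice assistant might capture each request as both an audio file and a text transcript linked to a user profile. The class and function names are hypothetical and a stubbed transcriber stands in for a real speech-to-text service; this is not Amazon’s code.

    # Illustrative sketch (not Amazon's implementation): capturing a request
    # in the two formats the complaint describes -- an audio file and a text
    # transcript -- both tied to the user's profile by a persistent identifier.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class VoiceRequest:
        profile_id: str          # persistent identifier linking the request to an account
        audio: bytes             # raw audio captured after the wake word
        transcript: str          # text produced by speech-to-text
        captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def handle_utterance(profile_id: str, audio: bytes, speech_to_text) -> VoiceRequest:
        """Record the utterance in both formats once the wake word is detected."""
        transcript = speech_to_text(audio)   # hypothetical transcription step
        return VoiceRequest(profile_id, audio, transcript)

    # Example: a stubbed transcriber stands in for a real speech-to-text service.
    request = handle_utterance("profile-123", b"\x00\x01", lambda _audio: "play music")
    print(request.transcript, request.captured_at)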

Given Alexa’s access to so much highly personal data, privacy is an important consideration for many Alexa users. Not surprisingly, Amazon made privacy a centerpiece of its marketing. For example, on Amazon.com’s Alexa Privacy Hub, the company claimed “Alexa and Echo devices are designed to protect your privacy” and “Alexa and Echo devices provide transparency and control.”

Users also can interact through the Alexa app, which can collect their geolocation information. Amazon repeatedly assured Alexa app users that they could delete their geolocation information. But even when consumers clicked on the appropriate delete buttons, the FTC says Amazon deleted the data from certain locations but retained data elsewhere, in direct contravention of its privacy promise. Amazon first discovered the problem in early 2018, but the FTC says it wasn’t until September 2019 that Amazon finally took some corrective action. “Some” being the operative word here because as the complaint alleges, due to faulty fixes and process fiascos, it wasn’t until early 2022 that Amazon finally got the problem fully under control.

Amazon made similar promises that Alexa users could “view, hear, and delete [their] voice recordings . . . at any time.” But as the complaint explains, once again Amazon deleted the files in certain locations, but retained transcripts of the recordings elsewhere in a form the company could use for product development. Adding insult to privacy injury, for at least a year, Amazon gave 30,000 employees access to Alexa users’ voice recordings – even though many of those staffers had no business need for the files.

Now to how the FTC says Amazon’s misrepresentations and compliance failures resulted in violations of COPPA. Given Alexa’s presence in consumers’ homes, and Amazon’s sale of child-centric products like Echo Dot Kids Edition, FreeTime on Alexa, and FreeTime Unlimited on Alexa, many users of the technology are under 13. In fact, more than 800,000 children interact directly with Alexa using their own Amazon profile, which links to their parent’s profile and contains the child’s name, birth date, and gender. As with adults, Amazon saved children’s voice recordings as audio and text files and used persistent identifiers to connect those files to the child’s Amazon profile.

The complaint alleges three ways in which Amazon’s practices usurped parents’ rights under COPPA to maintain control over their kids’ personal data. First, Amazon programmed Alexa to keep kids’ recordings forever, a practice that violated Section 312.10 of the COPPA Rule. That provision mandates that companies may keep kids’ data “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.” After that, according to the plain language of the Rule, they “must” securely delete it.

Second, Section 312.6 of COPPA gives parents “the opportunity at any time . . . to direct the operator to delete the child’s personal information.” But according to the complaint, children’s voice recordings were subject to the same ineffective deletion procedures as adults’ recordings. So even when parents told Amazon to delete those files, Amazon retained transcripts stored with persistent identifiers linked to the child’s account.

Another fundamental COPPA privacy protection is Section 312.4’s requirement that companies “provide notice and obtain verifiable parental consent prior to collecting, using, or disclosing personal information from children.” By failing to follow through with its stated practices, Amazon didn’t give parents complete and truthful notice about their ability to have their children’s personal information deleted – falling short of what COPPA requires. 

In addition to imposing a $25 million civil penalty, the proposed settlement prohibits Amazon from making misrepresentations about geolocation and voice information. The company also can’t use geolocation, voice information, and children’s information for the creation or improvement of any data product after customers have directed Amazon to delete that data. What’s more, Amazon must delete kids’ inactive Alexa accounts, notify users about the FTC-DOJ lawsuit, and implement a privacy program specifically focused on the company’s use of geolocation information.

Here are some pointers other businesses can take from the action against Amazon.

COPPA compliance demands continuous vigilance.  For companies covered by COPPA, compliance isn’t a one-and-done, check-the-box set of technicalities. Rather, the Rule creates substantive rights for parents and mandates ongoing legal responsibilities for businesses. Just one example is the requirement that companies clearly explain to parents up front their right to have their kids’ information deleted and the accompanying obligation to honor those requests. But even if parents don’t exercise their deletion rights, companies can’t hold on to the data “just because.” Once the purpose for which it was collected has passed, companies must securely delete it. And both for specific parental requests and COPPA’s general data deletion requirement, make sure you also delete relevant files stored in back-ups, separate databases, or other locations.
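
As a rough illustration of that last point, the sketch below shows a deletion request fanning out to every place the data lives – the primary database, backups, and datasets derived for product development – rather than stopping at the first one. The store names and toy in-memory records are hypothetical; this is not any company’s actual system.

    # Illustrative sketch: honoring a deletion request means removing the data
    # everywhere it lives, not just in the primary store.
    from typing import Iterable

    class DataStore:
        """Toy in-memory store standing in for a database, backup, or training corpus."""
        def __init__(self, name: str):
            self.name = name
            self.records: dict[str, str] = {}   # record key -> owning child ID

        def delete_for_child(self, child_id: str) -> int:
            doomed = [key for key, owner in self.records.items() if owner == child_id]
            for key in doomed:
                del self.records[key]
            return len(doomed)

    def honor_deletion_request(child_id: str, stores: Iterable[DataStore]) -> None:
        """Delete the child's data from every store, not just the primary one."""
        for store in stores:
            removed = store.delete_for_child(child_id)
            print(f"{store.name}: removed {removed} record(s) for {child_id}")

    # Example: the same request reaches the primary DB, backups, and derived datasets.
    primary, backups, training = DataStore("primary"), DataStore("backups"), DataStore("training-set")
    primary.records["rec-1"] = "child-42"
    training.records["rec-1-transcript"] = "child-42"
    honor_deletion_request("child-42", [primary, backups, training])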

Voice recordings are biometric data deserving of scrupulous care.  As the FTC’s May 2023 Policy Statement on Biometric Information establishes, voice recordings – and transcripts of recordings – fall within the category of data derived from biometric sources that “raise significant consumer privacy and data security concerns.” Now factor in biometric data about kids and those concerns are elevated to the nth degree.

Don’t secretly use customers’ personal information – especially data about kids – to feed your algorithms.  Through this law enforcement action and the proposed settlement with Ring, the FTC is sending an unmistakable message to companies developing artificial intelligence: You can’t placate consumers with promises of privacy when you intend to use their data for other purposes. This principle takes on added significance in the context of kids. Children’s speech patterns are markedly different from adults’, so Alexa’s voice recordings gave Amazon a valuable data set for training the Alexa algorithm and furthering Amazon’s commercial interest in developing new products. That’s just one reason why Amazon’s hollow promise to honor parents’ deletion requests was particularly troubling.

__________________________________________________

https://consumer.ftc.gov/consumer-alerts/2023/05/ftc-says-amazon-didnt-protect-alexa-users-or-childrens-privacy?utm_source=govdelivery

Consumer Alert

FTC says: Amazon didn’t protect Alexa users’ or children’s privacy

By Jim Kreidler, Consumer Education Specialist | May 31, 2023



When a company makes a promise, you expect them to keep their word, right? According to the FTC, Amazon promised to delete children’s personal information and Alexa users’ voice and geolocation information but broke the law instead. In an announcement today, the FTC and the Department of Justice said Amazon violated children’s privacy law by keeping kids’ Alexa voice recordings forever to feed its voice-related algorithms.

Amazon promised parents they could “manage their content and devices” and delete their children’s voice recordings, but according to the FTC, the company didn’t always fulfill parents’ deletion requests, kept children’s sensitive voice data indefinitely, and put people’s data at risk of harm from unnecessary access. The FTC says Amazon also failed to fulfill Alexa users’ requests to delete their voice and geolocation information. Instead, Amazon kept using that information for its algorithms.

The proposed order would, among other things, require Amazon to pay $25 million, change its deletion practices, and implement strong privacy safeguards.

This case — like the Ring case — shows that the FTC continues to take action against companies that don’t safeguard people’s personal information, especially when it comes to sensitive biometric data, like voices or videos (as in the Ring case).

Here are some steps to take to protect your family’s personal information:

  • Learn how to protect your kids’ privacy. As a parent, you have control over the personal information companies collect online from your kids under 13. Any site attempting to collect personal information from your child has to get your consent, and it has to honor your choices about how that information is used.
  • Find out if you have the right to tell a company to delete your data. Some state laws give you that right. Learn more at the U.S. State Privacy Legislation Tracker from the International Association of Privacy Professionals.
  • Check if you can customize your privacy settings. If a device or an app doesn’t need the info it collects, such as your location, turn off that feature. If the device or app does need it, consider limiting access to only when the device or app is in use.

Learn more about protecting your privacy online and on apps at ftc.gov/yourprivacy.

___________________________________________________

https://consumer.ftc.gov/consumer-alerts/2023/05/rings-privacy-failures-led-spying-and-harassment-through-home-security-cameras?utm_source=govdelivery

Consumer Alert

Ring’s privacy failures led to spying and harassment through home security cameras

By Alvaro Puig, Consumer Education Specialist | May 31, 2023


[Image: The front door of a home with a video doorbell and another home security camera visible in a bedroom.]

Before the days of internet-connected devices, the worst that could come from a ring of your doorbell was often an unexpected sales pitch, or neighborhood pranksters ringing and running. But as video doorbells become ubiquitous, a different kind of trouble could come knocking: people spying on you or hacking into your private videos.

That was the case with the home security company Ring. The FTC says Ring’s poor privacy practices and lax security let employees spy on customers through their cameras, including those in their bedrooms or bathrooms, and made customers’ videos, including videos of kids, vulnerable to online attackers. Hackers exploited those vulnerabilities and harassed, insulted, and propositioned children and teens through their Ring cameras. Some hackers even live streamed customers’ videos.

In a settlement, Ring agreed to establish a privacy and security program and delete the videos it shouldn’t have — in addition to paying $5.8 million to affected customers. For more on the settlement, see the FTC’s business blog.

As this case and others show, the FTC is taking action against companies that don’t protect your private information, especially sensitive biometric information, like videos and voice recordings, that are tied to who you are.

If you have video cameras at home, here’s what else to know and do.

To learn more, check out your guide to protecting your privacy online.

_____________________________________________________

https://www.ftc.gov/business-guidance/blog/2023/05/not-home-alone-ftc-says-rings-lax-practices-led-disturbing-violations-users-privacy-security?utm_source=govdelivery

Business Blog

Not home alone: FTC says Ring’s lax practices led to disturbing violations of users’ privacy and security

By Lesley Fair | May 31, 2023


Many consumers who use video doorbell and security cameras want to detect intruders invading the privacy of their homes. Consumers who installed Ring may be surprised to learn that according to a proposed FTC settlement, one “intruder” that was invading their privacy was Ring itself. The FTC says Ring gave its employees and hundreds of Ukraine-based third-party contractors up-close-and-personal video access into customers’ bedrooms, their kids’ bedrooms, and other highly personal spaces – including the ability to download, view, and share those videos at will. And that’s not all Ring was up to. In addition to a $5.8 million financial settlement, the proposed order in the case contains provisions at the intersection of artificial intelligence, biometric data, and personal privacy. It’s an instructive bookend to another major biometric privacy case the FTC announced today: Amazon Alexa.

Ring sells internet-connected, video-enabled cameras and doorbells used by millions of Americans to secure their homes. Ring rang that security bell throughout an extensive marketing campaign, pitching its products as “Small in size. Big on peace of mind.” But the FTC says that despite the company’s claims that it took reasonable steps to ensure that Ring cameras were a secure means for consumers to monitor private areas of their homes, the company exhibited a fast-and-loose approach to customers’ highly sensitive information.

Although Ring’s employee handbook prohibited the misuse of data, the disturbing conduct of some staffers suggests that the admonition wasn’t worth the paper it was printed on. Rather than limit access to customer videos to those who needed it to perform essential job functions – for example, to help a consumer troubleshoot a problem with their system – Ring gave “free range” access to employees and contractors. Is there any doubt where that slipshod policy and lax controls would inevitably lead? During a three-month period in 2017, a Ring employee viewed thousands of videos of female users in their bedrooms and bathrooms, including videos of Ring’s own employees. Rather than detecting what the employee was up to through its own technical controls, Ring only learned about the episode after a female employee reported it. That’s just one example of what the FTC calls Ring’s “dangerously overbroad access and lax attitude toward privacy and security.”

Although Ring changed some of its practices in 2018, the FTC says those measures didn’t solve the problem. You’ll want to read the complaint for details, but even after those modifications, the FTC cites examples of an unauthorized “tunnel” that allowed a Ukraine-based contractor to access customer videos, an incident where a Ring employee gave information about a customer’s videos to the customer’s ex-husband, and a report from a whistleblower that a former employee had provided Ring devices to numerous individuals, accessed their videos without their knowledge or consent, and then took the videos with him when he left the company.

But the threats to consumers’ personal privacy didn’t just come from inside Ring. The complaint charges that until January 2020, Ring failed to address the risks posed by two well-known forms of online attack: “brute force” (an automated process of password guessing) and “credential stuffing” (taking usernames and passwords stolen during other breaches and using them to access Ring). The FTC says Ring’s security failures ultimately resulted in more than 55,000 U.S. customers experiencing serious account compromises.
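
For readers wondering what addressing those risks can look like in practice, here is a minimal, purely illustrative sketch of one common mitigation: throttling repeated failed logins so automated password guessing trips a lockout. The thresholds and names are hypothetical and are not drawn from the complaint or from Ring’s systems.

    # Purely illustrative sketch of one standard defense against automated
    # password guessing: lock an account after repeated failed login attempts
    # within a short window. Names and thresholds here are hypothetical.
    import time
    from collections import defaultdict, deque

    MAX_FAILURES = 5          # lockout threshold (illustrative)
    WINDOW_SECONDS = 300      # look-back window for counting failures

    _failures: dict[str, deque] = defaultdict(deque)

    def record_failed_login(account: str, now: float | None = None) -> bool:
        """Record a failed attempt; return True if the account should be locked."""
        now = time.time() if now is None else now
        attempts = _failures[account]
        attempts.append(now)
        # Drop attempts that fall outside the look-back window.
        while attempts and now - attempts[0] > WINDOW_SECONDS:
            attempts.popleft()
        return len(attempts) >= MAX_FAILURES

    # Example: the sixth rapid failure trips the lockout.
    locked = False
    for i in range(6):
        locked = record_failed_login("user@example.com", now=1000.0 + i)
    print("locked:", locked)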

How serious was the invasion of consumers’ privacy? In many cases, bad actors took advantage of the camera’s two-way communication functionality to harass and threaten people whose rooms were monitored by Ring cameras, including kids and older adults. Describing their experiences as terrifying and traumatizing, customers reported numerous instances of menacing behavior emanating from voices invading the sanctity of their homes via Ring:

  • An 87-year-old woman in an assisted living facility was sexually propositioned and physically threatened;
  • Several kids were the object of hackers’ racist slurs;
  • A teenager was sexually propositioned;
  • Hackers cursed at women in the privacy of their bedrooms;
  • A hacker threatened a family with physical harm if they didn’t pay a ransom in Bitcoin; and
  • A hacker told a customer they had killed the person’s mother and issued the bone-chilling warning “Tonight you die.”

One Ring employee put it this way: “Unwittingly, we aid and abet those [hackers] who breached the data by not having any mitigations in place.”

Creepy employees and sinister hackers weren’t the only ones who wrested control of personal data from consumers. According to the complaint, without getting consumers’ affirmative express consent, Ring exploited their videos to develop image recognition algorithms – putting potential profit over privacy. Hiding its conduct in a dense block of legalese, Ring simply told people it might use their content for product improvement and development and then extrapolated purported “consent” from a check mark where consumers acknowledged they agreed to Ring’s Terms of Service and Privacy Policy.

The complaint alleges Ring violated the FTC Act by misrepresenting that the company took reasonable steps to secure home security cameras from unauthorized access, unfairly allowed employees and contractors to access video recordings of intimate spaces within customers’ homes without customers’ knowledge or consent, and unfairly failed to use reasonable security measures to protect customers’ sensitive video data from unauthorized access.

In addition to the $5.8 million required payment for consumers, the proposed order includes some provisions common in FTC settlements and important new provisions that merit careful attention not only from the tech sector but from any company using consumer data (and, let’s face it, that’s just about everyone). You’ll want to read the order carefully, but here are some highlights. The order prohibits Ring from making misrepresentations about the extent to which the company or its contractors can access customers’ videos, payment information, and authentication credentials. In addition, for the period when Ring had inadequate procedures for getting consumers’ consent, the company must delete all videos used for research and development and all data – including models and algorithms – derived from those videos. 

Ring also must implement a comprehensive privacy and security program that strictly limits “human review” of customers’ videos to certain narrow circumstances – for example, to comply with the law or to investigate illegal activity – or if the company has consumers’ express informed consent. The company also must up its security game with specific requirements of multi-factor authentication, encryption, vulnerability testing, and employee training.

In addition to notifying customers about the lax video access practices cited in the complaint, in the future, Ring must notify the FTC of any security incident that triggers notification under other laws and any privacy incident involving unauthorized access to videos of ten or more Ring accounts.

What can other companies take from the proposed settlement in this case?

Who’s in charge of consumers’ data? Consumers.  Consumers – not companies – should be in control of who accesses their sensitive data. Furthermore, decades of FTC precedent establish that businesses can’t use hard-to-find and harder-to-understand “disclosures” and pro forma check boxes to fabricate “consent.”

Develop algorithms responsibly.  According to the FTC, Ring accessed customers’ videos to develop algorithms without their consent. If you’ve entered the AI arena, the FTC’s message is clear: Consumers’ personal information isn’t grist that companies may use at will. The FTC will hold businesses accountable for how they obtain and use the consumer data that powers their algorithms. In addition, companies that rely on human review to tag the data used to train machine learning algorithms must get consumers’ affirmative express consent and put safeguards in place to protect the privacy and security of that information.

The FTC is “biometrically” opposed to the misuse of highly sensitive categories of personal information.  Biometric data – whether in the form of fingerprints, iris scans, videos, or something else – warrants the highest level of protection. If you haven’t read the FTC’s May 2023 Policy Statement on Biometric Information, make it next on your reading list.

The FTC works to keep consumers safe at home.  If there is one place people should be free from prying eyes, it’s at home. And if there’s one group that merits particular protection at home, it’s children. Now imagine the terror experienced by people – including youngsters – who were threatened in their beds by someone who gained access through a device bought for protection. The FTC’s action against Ring demonstrates the risks to privacy and security that a company’s cavalier data practices can inflict on consumers and kids.
