Behind The Screen
Tech News

Google AI flagged parents’ accounts for potential abuse over nude photos of their sick kids

August 22, 2022

A concerned father says that after using his Android smartphone to take photos of an infection on his toddler's groin, Google flagged the images as child sexual abuse material (CSAM), according to a report from The New York Times. The company closed his accounts and filed a report with the National Center for Missing and Exploited Children (NCMEC), spurring a police investigation and highlighting the difficulty of telling the difference between potential abuse and an innocent photo once it becomes part of a user's digital library, whether on their personal device or in cloud storage.

Concerns about the consequences of blurring the lines between what should and shouldn't be considered private were aired last year when Apple announced its Child Safety plan. As part of the plan, Apple would locally scan images on Apple devices before they were uploaded to iCloud and then match them against the NCMEC's hashed database of known CSAM. If enough matches were found, a human moderator would then review the content and lock the user's account if it contained CSAM.

The accounts were taken down due to content that "might be illegal"

The Electronic Frontier Foundation (EFF), a nonprofit digital rights group, slammed Apple's plan, saying it could "open a backdoor to your private life" and that it represented "a decrease in privacy for all iCloud Photos users, not an improvement."

Apple eventually put the stored-image scanning portion on hold, but with the launch of iOS 15.2, it proceeded with an optional feature for child accounts included in a family sharing plan. If parents opt in, then on a child's account, the Messages app "analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages." If it detects nudity, it blurs the image, displays a warning for the child, and presents them with resources intended to help with safety online.


The main incident highlighted by The New York Times took place in February 2021, when some doctors' offices were still closed due to the COVID-19 pandemic. As noted by the Times, Mark (whose last name was not revealed) noticed swelling in his child's genital region and, at the request of a nurse, sent photos of the issue ahead of a video consultation. The doctor ended up prescribing antibiotics that cured the infection.

According to the NYT, Mark received a notification from Google just two days after taking the photos, stating that his accounts had been locked due to "harmful content" that was "a severe violation of Google's policies and might be illegal."

Like many internet companies, including Facebook, Twitter, and Reddit, Google has used hash matching with Microsoft's PhotoDNA to scan uploaded images for matches with known CSAM. In 2012, it led to the arrest of a man who was a registered sex offender and had used Gmail to send images of a young girl.

In 2018, Google announced the launch of its Content Safety API, an AI toolkit that can "proactively identify never-before-seen CSAM imagery so it can be reviewed and, if confirmed as CSAM, removed and reported as quickly as possible." It uses the tool for its own services and, along with a video-targeting CSAI Match hash-matching solution developed by YouTube engineers, offers it for use by others as well.

From Google's "Fighting abuse on our own platforms and services":

We identify and report CSAM with trained specialist teams and cutting-edge technology, including machine learning classifiers and hash-matching technology, which creates a "hash", or unique digital fingerprint, for an image or a video so it can be compared with hashes of known CSAM. When we find CSAM, we report it to the National Center for Missing and Exploited Children (NCMEC), which liaises with law enforcement agencies around the world.
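The hash-matching idea Google describes can be sketched in a few lines. This is an illustrative toy, not Google's implementation: it uses SHA-256, an exact-match cryptographic hash, where production systems such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, and the "known" fingerprint here is just the hash of a placeholder byte string.

```python
import hashlib

# Hypothetical known-hash database. In real systems this holds fingerprints
# of verified CSAM supplied by NCMEC; here it is the SHA-256 of b"hello world".
KNOWN_HASHES = {
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a unique digital fingerprint for an image.

    SHA-256 is an exact-match stand-in: flipping a single byte changes the
    whole hash. PhotoDNA-style perceptual hashes instead survive common
    transformations like compression and cropping.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

With an exact hash, `is_known_match(b"hello world")` is true but `is_known_match(b"hello world!")` is false, which is why perceptual hashing, and for never-before-seen imagery, ML classifiers like Google's Content Safety API, are needed in practice.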

A Google spokesperson told the Times that Google only scans users' personal images when a user takes "affirmative action," which can apparently include backing their pictures up to Google Photos. When Google flags exploitative images, the Times notes, Google is required by federal law to report the potential offender to the CyberTipLine at the NCMEC. In 2021, Google reported 621,583 cases of CSAM to the NCMEC's CyberTipLine, while the NCMEC alerted the authorities to 4,260 potential victims, a list that the NYT says includes Mark's son.


Mark ended up losing access to his emails, contacts, photos, and even his phone number, as he used Google Fi's mobile service, the Times reports. Mark immediately tried appealing Google's decision, but Google denied his request. The San Francisco Police Department, where Mark lives, opened an investigation into him in December 2021 and got ahold of all the information he stored with Google. The investigator on the case ultimately found that the incident "did not meet the elements of a crime and that no crime occurred," the NYT notes.

"Child sexual abuse material (CSAM) is abhorrent and we're committed to preventing the spread of it on our platforms," Google spokesperson Christa Muldoon said in an emailed statement to The Verge. "We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms. Additionally, our team of child safety experts reviews flagged content for accuracy and consults with pediatricians to help ensure we're able to identify instances where users may be seeking medical advice."

While protecting children from abuse is undeniably important, critics argue that the practice of scanning a user's photos unreasonably encroaches on their privacy. Jon Callas, a director of technology projects at the EFF, called Google's practices "intrusive" in a statement to the NYT. "This is precisely the nightmare that we're all concerned about," Callas told the NYT. "They're going to scan my family album, and then I'm going to get into trouble."



© 2026 behindthescreen.fr - All rights reserved.

Type above and press Enter to search. Press Esc to cancel.