
How Google scans your Gmail for child porn


#1 +techbeck

techbeck

    It's not that I am lazy, it's that I just don't care

  • 19,652 posts
  • Joined: 20-January 05

Posted 05 August 2014 - 11:48

John Henry Skillern was arrested last Thursday for the possession of child pornography. The 41-year-old restaurant worker was allegedly sending indecent images of children to a friend, but while it was Houston police that obtained a search warrant for Skillern's tablet and computer and placed him in custody, it was Google that tipped them off to his illegal activities. Skillern was using Gmail to send images of child sexual abuse, each of which had been given a unique digital fingerprint. When those images were sent via Google's email service, they were identified by the company's automated systems, allowing it to pass Skillern's details on to the police via the National Center for Missing and Exploited Children (NCMEC).

 

David Drummond, Google's chief lawyer, outlined his company's automated tagging system last year in The Daily Telegraph. Drummond explained that Google has used the technology since 2008, building up a database that notifies the company when known child porn images are found through its search engine or in the inboxes of its 400 million Gmail users. When Google does identify that such images are being shared, as they allegedly were by Skillern, it informs law enforcement officials.

 

Other companies have access to similar photo tagging technology, including Microsoft, whose PhotoDNA software can also detect flagged images of abuse. PhotoDNA can calculate a mathematical hash for an image of child sexual abuse that allows it to recognize photos automatically even if they have been altered. The tech is now used by both Twitter and Facebook, after Microsoft donated it to the NCMEC in 2009. Videos, too, have become the focus of such digital fingerprinting programs. Google has its own Video ID software for detecting footage of child sexual abuse, and British company Friend MTS donated its Expose F1 detection program to the International Centre for Missing & Exploited Children (ICMEC) earlier this year.
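PhotoDNA itself is proprietary, but the core idea described above, a "mathematical hash" that survives small alterations to the image, can be illustrated with a much simpler perceptual hash (an "average hash"). Everything below is a toy sketch of the technique, not PhotoDNA's actual algorithm:

```python
def average_hash(pixels):
    """Compute a simple 64-bit perceptual 'average hash' of an 8x8
    grayscale image (a list of 8 rows of 8 ints, 0-255). Each bit is 1
    if the pixel is brighter than the image's mean, so small edits
    (brightness shifts, mild noise) flip only a few bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two hashes; a small distance means
    'probably the same picture, slightly altered'."""
    return bin(a ^ b).count("1")

# A toy 8x8 gradient image and a slightly brightened copy of it.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
altered = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(altered)
assert hamming(h1, h2) <= 8  # near-duplicate despite the edit
```

A cryptographic hash like SHA-256 would change completely after a one-pixel edit; a perceptual hash like this is what lets systems recognize a flagged photo "even if they have been altered."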

 

More....

http://www.theverge....-for-child-porn




#2 GotBored

GotBored

    Brain Trust

  • 1,444 posts
  • Joined: 24-June 13
  • OS: Windows
  • Phone: Nokia 3110

Posted 05 August 2014 - 12:01

That is great news; every service should use something similar to track down pedophiles.



#3 +RedReddington

RedReddington

    member_id=28229

  • 9,945 posts
  • Joined: 14-May 03

Posted 05 August 2014 - 12:03

You know what, I am really surprised this has made the headlines it has. I'd be surprised if this doesn't happen every day.



#4 +Anarkii

Anarkii

    Member N° 1,455

  • 5,547 posts
  • Joined: 02-October 01
  • Location: Sydney, Australia
  • OS: Windows 10 Technical Preview 9879
  • Phone: iPhone 6+, iOS 8.1

Posted 05 August 2014 - 12:03

As long as they are scanning just for CP, and not reading your personal emails or looking at private photos you might be sharing with loved ones, then this is great news, and I hope more tech like this is widely used to catch the scum.



#5 Nagisan

Nagisan

    Neowinian Senior

  • 5,242 posts
  • Joined: 02-June 06

Posted 05 August 2014 - 12:31

The sad thing is this only goes after people who redistribute CP; it does nothing against those who create the content (as it only detects known images). In other words, it's only going after those who indulge in viewing the content, not those who are directly abusing children by creating it. Sadly, there's really no way an automated system could properly identify new content, at least not yet.



#6 +fusi0n

fusi0n

    Don't call it a come back

  • 3,939 posts
  • Joined: 08-July 04
  • OS: OSX\Windows 8.1\Ubuntu 14.04
  • Phone: iPhone 6 Plus

Posted 05 August 2014 - 12:32

This is great news, however... how many false positives get flagged and reviewed? Hopefully none. Hopefully they did their due diligence and it's a non-issue.



#7 Skiver

Skiver

    Neowinian Senior

  • 3,797 posts
  • Joined: 10-October 05
  • Location: UK, Reading

Posted 05 August 2014 - 12:45

I've got an Android phone, so my whole life is pretty much seen by Google anyway. Scan away :)

Agreed with Nagisan though, unfortunately this is only catching the consumers and not creators. 



#8 OP +techbeck

techbeck

    It's not that I am lazy, it's that I just don't care

  • 19,652 posts
  • Joined: 20-January 05

Posted 05 August 2014 - 12:49

Agreed with Nagisan though, unfortunately this is only catching the consumers and not creators. 

 

A lot of the time, they are one and the same. Creators exchanging with other creators.



#9 Nagisan

Nagisan

    Neowinian Senior

  • 5,242 posts
  • Joined: 02-June 06

Posted 05 August 2014 - 13:00

A lot of the time, they are one and the same. Creators exchanging with other creators.

I honestly doubt this system is catching many of those. It says it uses a database of known content. If someone is creating their own content, none of it is known to the database yet, meaning if two creators make their own stuff and trade it with each other, the system won't detect either one, as both are unknown to the system.
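The known-database limitation described here can be sketched in a few lines. This is a hypothetical illustration of the lookup logic only; real systems match robust perceptual fingerprints, not plain SHA-256 digests, but the weakness is the same, only content already in the database can match:

```python
import hashlib

# Hypothetical database of fingerprints of *known* flagged images.
known_fingerprints = {hashlib.sha256(b"known-image-bytes").hexdigest()}

def is_flagged(image_bytes):
    """Flag only images whose fingerprint is already in the database;
    brand-new content produces an unknown fingerprint and is missed."""
    return hashlib.sha256(image_bytes).hexdigest() in known_fingerprints

assert is_flagged(b"known-image-bytes") is True
assert is_flagged(b"freshly-created-image") is False  # new content: not caught
```

Which is exactly the point: two creators trading freshly made material never touch a known fingerprint, so the automated match never fires.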



#10 Hum

Hum

    totally wAcKed

  • 63,572 posts
  • Joined: 05-October 03
  • Location: Odder Space
  • OS: Windows XP, 7

Posted 06 August 2014 - 08:37

I don't feel remotely safer because Google found someone sharing photos.