20 posts in this topic

Microsoft tip leads to child porn arrest in Pennsylvania

 

A tip-off from Microsoft has led to the arrest of a man in Pennsylvania who has been charged with receiving and sharing child abuse images.

Microsoft flagged the matter after discovering that an image involving a young girl had allegedly been saved to the man's OneDrive cloud storage account.

According to court documents, the man was subsequently detected trying to send two illegal pictures via one of Microsoft's live.com email accounts.

Police arrested him on 31 July.

Source and more


Good work by Microsoft. (Y)

the man was subsequently detected trying to send two illegal pictures via one of Microsoft's live.com email accounts.

 

Edward Snowden would be disappointed in him on all sorts of different levels.


I think Google and Facebook use the same or similar technology to find child sex predators.


I think Google and Facebook use the same or similar technology to find child sex predators.

Google certainly does. Or at least, it does with regard to child sex abuse.


Neowin's Google article: "Google takes down sex offender by going through his personal e-mail"

Neowin's future Microsoft article: "Microsoft, unlike Google, had a hunch said individual was storing kiddie action pics.  Doesn't scan like Google, NOPE!  They personally go through e-mails, we don't.  Scroogle everyone... Scroogle! That campaign wasn't just for stupid people... Okay?!! Please don't leave... :("

 

I can see it now.


...I can see it now.

I can't speak for the reporters, or say whether this will be reported by Neowin. But let's stick to the topic rather than using it to bash others, shall we?


Edward Snowden would be disappointed in him on all sorts of different levels.

 

You would think that scum like this would try to protect themselves. It's no secret that PhotoDNA is used by major tech companies to stop exactly this group from doing harm.


Before people start shouting about how Microsoft is going through your OneDrive, it's an automatic process. They use an algorithm to identify inappropriate pictures. Besides, it would be ridiculous for them to have people manually going through over 250 million OneDrive accounts.
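As a rough illustration of how an automated check like that can work, here's a minimal sketch that compares each uploaded file's hash against a set of known-bad hashes. PhotoDNA itself is a proprietary perceptual hash; the exact SHA-256 lookup, the `KNOWN_HASHES` set, and `is_flagged` below are purely illustrative stand-ins, not Microsoft's actual pipeline:

```python
import hashlib

# Illustrative stand-in for a database of hashes of known illegal
# images (real systems use PhotoDNA's perceptual hashes, not exact
# cryptographic digests).
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", used here as a dummy entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(file_bytes: bytes) -> bool:
    """Return True if the file's digest matches a known-bad entry."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_flagged(b"test"))         # True: digest is in the set
print(is_flagged(b"holiday pic"))  # False: no match
```

Note that an exact digest like this only matches byte-identical copies; that limitation is exactly why the real systems use perceptual hashing instead, a point that comes up later in the thread.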


You would think that scum like this would try to protect themselves. It's no secret that PhotoDNA is used by major tech companies to stop exactly this group from doing harm.

With the way technology moves I'm sure there are hundreds of people just like this guy, except they know how to protect themselves. This type of technology is only going to catch the criminals that are too careless to protect themselves.


Before people start shouting about how Microsoft is going through your OneDrive, it's an automatic process. They use an algorithm to identify inappropriate pictures. Besides, it would be ridiculous for them to have people manually going through over 250 million OneDrive accounts.

 

The screening is an automatic process, no doubt. However, they probably have people that go through - or give the appropriate legal departments permission to access - the accounts of those people who have been flagged by the process.


Though I'm glad justice was served, I can only see services like this being used less in favour of more encrypted, third-party solutions. Which in the end means fewer people will get caught.

 

I just wonder what happens when a family is caught sharing pictures of their children in the bathtub over these services....


Probably a pretty safe bet that guy also wasn't using something like TrueCrypt :laugh:


I just wonder what happens when a family is caught sharing pictures of their children in the bathtub over these services....

 

Good question.


The screening is an automatic process, no doubt. However, they probably have people that go through - or give the appropriate legal departments permission to access - the accounts of those people who have been flagged by the process.

 

Yes, that's the correct way to verify the presence of child pornography. You can't convict somebody without any kind of verification.

 

It's very possible that they could simply inform law enforcement as soon as it's flagged by the automated process, then allow law enforcement to get warrants based on that information and do the actual verification. It's not necessary for Google or Microsoft employees to view the content themselves to do the verification.

 

In any case, this is completely different from Google also scanning all emails for advertising keywords to target advertising at you. Microsoft doesn't make a profit off of scanning your emails in this manner. They simply scan image attachments in order to comply with federal law protecting children. (I only mention this point because of this http://www.neowin.net/forum/topic/1224727-microsoft-tip-leads-to-child-porn-arrest-in-pennsylvania/#entry596523473)


I just wonder what happens when a family is caught sharing pictures of their children in the bathtub over these services....

If it works anything like the Google process that was described, it checks photos against a database of known images, so it wouldn't see a one-of-a-kind image (that is, one you took yourself) as child porn and therefore wouldn't flag it. It could theoretically produce false positives on images that have a fingerprint similar to one in the database but are not child porn (or even on a personal image, such as your child in a bath, that could be classified as such). It would probably fall to the police to determine whether the image is actually illegal, though the person would likely be banned from the service that flagged it.

 

 

As for what happens when something is flagged: in the recent Google news about a similar scenario, someone mentioned that when an image is flagged it is forwarded, along with the account owner's personal details, to a service that manually checks the reports. If it is confirmed as child porn, that service then forwards all the appropriate material to the police local to the person who was flagged. Or something along those lines, at least.
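The fingerprint matching described above can be sketched with a toy "average hash": each image reduces to a bit string, and two images "match" when their bit strings differ in at most a few positions, which is why a re-encoded copy of a known image matches while a one-of-a-kind photo doesn't. Everything here (the 8-pixel "images", the threshold value) is a simplified assumption for illustration; PhotoDNA's real algorithm is proprietary and far more robust:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

# Hash of a hypothetical known image (8 grayscale pixel values).
known = average_hash([10, 200, 30, 220, 15, 210, 25, 205])
# A slightly re-encoded copy: pixel values shift a little, but the
# bright/dark pattern survives, so the hash is nearly identical.
copy = average_hash([12, 198, 33, 217, 14, 211, 22, 208])
# An unrelated photo produces a very different bit pattern.
other = average_hash([200, 10, 220, 30, 210, 15, 205, 25])

THRESHOLD = 1  # max differing bits that still count as a match
print(hamming(known, copy) <= THRESHOLD)   # True: copy matches
print(hamming(known, other) <= THRESHOLD)  # False: no match
```

This also shows where the false positives mentioned above come from: any image whose bit pattern happens to land within the threshold of a database entry gets flagged, even if it's actually innocent.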


Google uses a program to auto-scan emails for child porn. 

:( :/ :s :crazy: :angry: :shiftyninja: (N)

 

A few days later.....

 

Microsoft uses a similar process to get someone in trouble for child porn.   

:) :D :laugh: :woot: :ike: :drool: (Y)

 

Anyway, no big deal.  These scans are designed to search for specific info, and a person isn't sitting down and looking at every single file you have stored/sent.  It just never ceases to amaze me the number of morons out there who would do something like this.

Funny... two different authors, though, so at least there is that.


Neowin's ...

 

I can see it now.

Well, Neowin needs to emulate Fox News's spin methodology sometimes.


As Nick has asked before, can this thread stay on topic rather than being used as an excuse to bash Neowin's reporters?

 

This is the last warning.

