Man arrested for child erotica stored on his SkyDrive

Nearly a year ago, we reported on a user called WingsofFury and his disagreement with Microsoft over the type of data stored in his SkyDrive. This prompted Microsoft to clarify its policies surrounding your data and privacy.

Regardless, Microsoft will continue to monitor its users' content, up to a point at least, and will act in whatever way it deems necessary to see any and all illegal activity utilising its services stopped.

Well, that's exactly what has happened to Florida resident David Stuart. During what were most likely routine security checks on users' data, Stuart's account was flagged as suspicious by Microsoft's internal security after more than 3,000 images of "child erotica" were found to have been uploaded.

The Tampa Bay Times has said:

On March 5, Microsoft, headquartered in Redmond, Wash., reported suspicious activity on David Stuart's SkyDrive account to the National Center for Missing and Exploited Children in Alexandria, Va. The organization's "Cybertip reports" were then forwarded to the Central Florida Internet Crimes Against Children task force, which notified Largo police.

While not out-and-out pornographic, the images did contain young children wearing adult clothes and make-up. The sites the images originated from have been shut down. The uploads to SkyDrive were traced back to an IP address registered to Stuart's mobile home.

While Stuart has been jailed, with bail set at $75,000, his wife Sharon is standing by her husband. She has said:

He's a very loving father and a good provider to us. It's not true … Whatever happens, I'm staying with him. I'm looking forward to our third anniversary in November.

All the major technology giants – Microsoft, Google, Apple, Facebook, Twitter, etc. – are required by federal law to report any suspected child pornography stored on their storage services and social media platforms.

Source: Tampa Bay Times via WinBeta | Image courtesy of Microsoft

83 Comments

Before I start, let me say this: I'm not defending the guy who got caught with child pornography in any way. It is obvious that the data we put on MS SkyDrive, Google Drive and other cloud services based in the US is being looked at. Couldn't this problem be solved if people used cloud services based in countries that don't kiss the US's butt and don't share anything with the US (like China, for example)?

Microsoft just ruined this guy's family life. Child porn is one thing; this isn't that at all. FL wants to make everyone a convicted felon. Pinellas County is the worst one in all of Florida.

Blow Florida off the map.

The guy didn't take the pictures; he saved them from some website somewhere, and that's a crime? If he had taken the pictures himself, that would be a problem.

I am going to save a bunch of pictures of a deadly arsenal on my SkyDrive and see if the feds think they will find guns at my house. This is just crazy.

This is what I don't understand on some of the comments here.

Someone will say to encrypt before uploading, that companies have access to your files, and so on. You know what? If you don't have anything to hide, and don't upload anything that is considered illegal, then your account won't be flagged, nor will anyone from any of these companies search your account.

Now do something illegal that will flag your account, of course, someone will look into it, and provide information to authorities.

It is just common sense folks.

RommelS said,
This is what I don't understand on some of the comments here.

Someone will say to encrypt before uploading, that companies have access to your files, and so on. You know what? If you don't have anything to hide, and don't upload anything that is considered illegal, then your account won't be flagged, nor will anyone from any of these companies search your account.

Now do something illegal that will flag your account, of course, someone will look into it, and provide information to authorities.

It is just common sense folks.

I agree. But kids dressed in adult clothes and wearing makeup isn't pornographic or erotic. It is something kids do. Again, as another poster above asked, what about parents on the show Toddlers & Tiaras, with babes wearing adult clothes and makeup?

Porno is usually naked pictures of kids, not kids fully dressed.
Obviously this guy didn't think it was illegal.

TechieXP said,

I agree. But kids dressed in adult clothes and wearing makeup isn't pornographic or erotic. It is something kids do. Again, as another poster above asked, what about parents on the show Toddlers & Tiaras, with babes wearing adult clothes and makeup?

I have to agree that something doesn't add up here between the description of these photos and CP. Now, the guy had 3,000 pictures of these images, so he's likely to be quite the fan of whatever he was looking at.

I suspect law enforcement and the media are being too circumspect in describing this content for public consumption to actually give us a sense of why this is CP.

As you and others have mentioned, if these pictures do match the general description we've been given, we're seeing 10x worse in childhood beauty pageants on TV.

Fortunately, a jury of his peers will actually get to evaluate these issues directly.

TechieXP said,

I agree. But kids dressed in adult clothes and wearing makeup isn't pornographic or erotic. It is something kids do. Again, as another poster above asked, what about parents on the show Toddlers & Tiaras, with babes wearing adult clothes and makeup?

Porno is usually naked pictures of kids, not kids fully dressed.
Obviously this guy didn't think it was illegal.

I am not relating to that particular part of the article - more of a generalization.

I am not going to judge the guy, nor Microsoft for doing what they did. However, it does feel weird that a guy has 3,000 of these photos, so something is missing here (as Excalpius said), but that is as far as I am going to speculate. I'll wait for the jury's verdict.

So kids wearing adult clothes and makeup is now considered pornographic material? What about commercials that portray children the exact same way?

TechieXP said,
So kids wearing adult clothes and makeup is now considered pornographic material? What about commercials that portray children the exact same way?

Add to that the trend of parents taking pictures of their children dressed like adults (be they actors, celebrities, singers, you name it) or trying to replicate some famous piece of art... =/

I'm glad the dude got busted.

That said, you are foolish not to encrypt important files before uploading them to any public cloud storage service. I advise this to all my customers. They need not only a tiered storage policy but also a tiered security policy for any public cloud initiatives.
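
The encrypt-before-upload advice can be sketched as below. This is a toy construction for illustration only: it derives a keystream from SHA-256 in counter mode and XORs it into the file, which is unauthenticated, so a real deployment should instead use a vetted authenticated cipher (e.g. AES-GCM) from a maintained cryptography library. Every name and value in the sketch is made up.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive `length` pseudorandom bytes from SHA-256 in counter mode."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the file with the keystream; prepend the random nonce."""
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and XOR the keystream back out."""
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

key = secrets.token_bytes(32)  # stays on the client, never uploaded
blob = encrypt(key, b"contents of some private file")
assert decrypt(key, blob) == b"contents of some private file"
```

The point of the sketch is the workflow, not the cipher: the provider only ever receives the nonce plus ciphertext, so any server-side scanning sees what looks like random bytes.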

Wasn't Microsoft working with law enforcement on child erotica/pornography? Microsoft was developing auto-recognition software so that law enforcement doesn't have to manually search through all those horrible photos. And since they scan 'the internet' with it, it makes sense they use it on their SkyDrive and Azure too.

This article says nothing. I can upload anything to SkyDrive. If I make it public and some people press the "Report" link, it's my problem; but that is not MS inspecting SkyDrive files.

I don't know why MS (as a private, for-profit company) spends its own resources to do that.

Monitoring files is usually an expensive task, and it is not likely to help attract more customers. So it costs money and gives nothing in exchange.

I would consider a limit on bandwidth overuse fairer.

PS: unless MS is receiving monetary compensation for it (maybe related to the Patriot Act).

Brony said,
I don't know why MS (as a private, for-profit company) spends its own resources to do that.

Monitoring files is usually an expensive task, and it is not likely to help attract more customers. So it costs money and gives nothing in exchange.

I would consider a limit on bandwidth overuse fairer.

PS: unless MS is receiving monetary compensation for it (maybe related to the Patriot Act).


Fail comment fails.

It's really simple - Microsoft are storing the files on their servers. Therefore they are obligated to ensure they're not storing anything illegal. Otherwise they are either breaking the law, or assisting someone else to break the law.

It doesn't cost much - they simply have a background, low-priority job that runs a sophisticated image scanner against image content on the server and passes it to a human operator if suspect content is found.

It's not about attracting customers, it's about being within the law. This should be blindingly obvious.

That's what bugs me. ISPs were able to take an 'it's not our job to police traffic' stance, but cloud providers cannot, since the data resides with them.

If I rent a storage locker, I have an expectation of privacy. I simply don't understand why this doesn't apply to virtual ones.

protocol7 said,
Guess I better delete my Toddlers & Tiaras episodes off my Skydrive fast.

Yeah, why don't the people associated with THAT show get arrested? 5-year-old girls looking like prostitutes? I'm sure that's completely fine. /s

"young children wearing adult clothes and make-up" is not child erotica, it is just creepy.

I wonder what other data Microsoft is sniffing around for - funny pictures of drunk people because they did not consent to having them taken? A picture of your naked ex because you two are no longer together? A photoshopped picture of Mitt Romney? An erotic novel? Your code, because that for loop looks quite similar to the one used in Oracle software?

One day you will wake up and realize your account was banned because they could not find a license for song X that you used in your video for 20 seconds without acquiring rights for it.

If you are concerned about putting sensitive stuff on SkyDrive, encrypt it. Geez. What's laughable is people trusting some smaller file-storage companies to keep their data safe. I would trust Microsoft's security over any of those any day of the week. After all, I would take my files getting scanned by a computer over my data being stolen and posted all over the internet.

I uploaded porn to SkyDrive on August 19th, 2012 as a test to see how long it would stay there. Just checked; it's still there. Guessing they only delete/detect if you share from there.

Brony said,
It is funny because MS says that they can detect any porn with a precision of 99%.

Maybe he is in to some effed up stuff!

** I jest, i jest **

Just to make this clear: if you possess a JCPenney or Sears catalog from the '80s, '90s or early 2000s, you possess "child erotica" by the definition so many use nowadays. Underwear ads kids were actually in, swimsuit ads, etc., etc. Let's start raiding everyone's houses to seize this illegal material!

neufuse said,
Just to make this clear: if you possess a JCPenney or Sears catalog from the '80s, '90s or early 2000s, you possess "child erotica" by the definition so many use nowadays. Underwear ads kids were actually in, swimsuit ads, etc., etc. Let's start raiding everyone's houses to seize this illegal material!
I think you might be right. What's the world coming to when you're not even sure what's considered kiddie porn anymore?

It's called *local community standards* - that is pretty much the standard used everywhere on the planet; it's far from unique to the United States. The standards differ not just from country to country, but even within a country. Look just at the Scandinavian nations - their standards are not everywhere "looser" than those of the "puritan" United States (or United Kingdom, for that matter).

neufuse said,
Just to make this clear: if you possess a JCPenney or Sears catalog from the '80s, '90s or early 2000s, you possess "child erotica" by the definition so many use nowadays. Underwear ads kids were actually in, swimsuit ads, etc., etc. Let's start raiding everyone's houses to seize this illegal material!

There are still pictures of kids and teens wearing two-piece swimsuits in the Sears catalog these days. By the logic of this piece of news (if it is reported correctly), we should all be arrested ASAP.

You can also see a nude young girl in a very good German movie I saw some time ago called Nowhere in Africa (Nirgendwo in Afrika, http://www.imdb.com/title/tt0161860/). I'm not sure if I'm in danger of being arrested for having seen this movie. I'm scared.

LaP said,
By the logic of this piece of news (if it is reported correctly), we should all be arrested ASAP.

Obviously whatever was in these images was worse than some kids running around in swimsuits, or NCMEC and the police wouldn't have moved forward with an arrest in the first place.

Max Norris said,

Obviously whatever was in these images was worse than some kids running around in swimsuits, or NCMEC and the police wouldn't have moved forward with an arrest in the first place.

Definitely. I just don't see someone getting arrested with a $75,000 bail for images that "did contain young children wearing adult clothes and make-up".

There's surely a piece of information missing from the news.

neufuse said,
Just to make this clear: if you possess a JCPenney or Sears catalog from the '80s, '90s or early 2000s, you possess "child erotica" by the definition so many use nowadays. Underwear ads kids were actually in, swimsuit ads, etc., etc. Let's start raiding everyone's houses to seize this illegal material!

Silly, isn't it? I mean, this guy may well have had some dodgy stuff, but that description, out of context, doesn't sound all that bad.

It's either terribly out of context or it's the same sort of logic that bans parents from bringing a camera to school sports days.

I'm not condoning the fact that this guy was keeping and collecting "child erotica", that's messed up... But what exactly makes it illegal? I always thought that in order for images to be considered illegal or pornographic, the content needed to be of a subject performing some sexual act or posed in a sexual nature.

I'm guessing they suspect he has the real thing somewhere?

That would be an interesting job, going through everyone's stuff to verify it's not illegal! I know they have computers that scan everything, but for things like pictures, a lot of stuff must still need to be verified by a person.

The software is very efficient at finding 'questionable content'. It's then sent for a human to verify and review before any action is taken.

Facebook uses the same software.

Buttus said,
That would be an interesting job, going through everyone's stuff to verify it's not illegal! I know they have computers that scan everything, but for things like pictures, a lot of stuff must still need to be verified by a person.

Nah, they probably use known child-porn hashes ("digital fingerprints") and compare them to the stored files on upload, so that you're tagged if you do it. This makes the system 100% accurate, with 0% false positives. Well, at least this is how I'd do it.

This system would only work for well-circulated child porn with known hash values, but I guess you can assume that a pedophile with a few hundred child-porn images/videos has at least a few that are already known to the FBI.

Calculating a hash value for an uploaded file is fast and simple to do, and storing child-porn hash values is storage-efficient. Microsoft would also get around the problem of having to store actual child porn on its servers to compare against, which might be problematic from a legal perspective, not to mention the ethical problem of having staff actually view it.
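
The exact-hash scheme described above can be sketched in a few lines. The blocklist entry here is just the SHA-256 digest of the harmless string "foo", standing in for a vetted fingerprint set distributed by a clearinghouse; nothing in the sketch reflects Microsoft's actual implementation.

```python
import hashlib

# Stand-in blocklist: in reality providers receive vetted digests of known
# illegal files; they never store the files themselves. This entry is simply
# sha256(b"foo"), used as a harmless placeholder.
BLOCKLIST = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def flag_upload(data: bytes) -> bool:
    """Hash the uploaded bytes and flag the account for review on a match."""
    return hashlib.sha256(data).hexdigest() in BLOCKLIST

print(flag_upload(b"foo"))              # True: digest is on the blocklist
print(flag_upload(b"vacation photo"))   # False: unknown content passes
```

This is also why the "0% false positives" point holds only for byte-identical copies: recompressing or resizing an image changes every bit of its cryptographic hash.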

Stuff like this ****es me off. Microsoft is pushing really hard to make SkyDrive a second hard drive located in the cloud, but if you put something objectionable on it, you're f****d. Where do you draw the line?

Per the article "the images did contain young children wearing adult clothes and make-up", so now the guy who may have never done anything wrong is in jail, and his name is all over the internet. 10 years from now, where will the line be? Any images of an ex who doesn't want you to have her nude pics anymore? That's a scary thought.

So basically you're upset that 1) he stored something he clearly shouldn't have, 2) he didn't bother to read the terms of service before storing something that could get him into legal trouble, 3) Microsoft was legally required to report it, 4) NCMEC felt it was serious enough to forward to local law enforcement, and 5) local authorities agreed and arrested him? Yeah, I can clearly see how Microsoft is in the wrong here. /s

It's really bull**** and I hope they realize it soon (or, more importantly, figure out a better way to minimize their liability, which is really the driver of this invasion).

That so many care so little for privacy is stunning.

Dashel said,
That so many care so little for privacy is stunning.

Rule #1 for keeping something private: don't post it to a third-party server on the internet, especially when they tell you up front that they're specifically looking for this sort of thing in the first place, and that's ignoring the fact that they're required to report it by law. Any expectation of privacy has gone down the toilet.

I'm more surprised at how little a lot of people seem to care about material that apparently is close enough to child pornography that it merited NCMEC's notice and an arrest warrant; it makes me start to question this community. It obviously goes beyond "innocent" pictures, and that many of them? It was a bigger problem in the making; glad it was stopped before it got worse.


Your expectation of privacy has gone down the toilet. Its location is irrelevant in a 'cloud'-powered world, and if you don't get that, we are in bigger trouble. The only 'rub' of the issue is the provision for 'sharing/publishing', which is where the legal rollercoaster goes off the rails.

"Apparently close enough" is a red herring and a bull****, arbitrary claim. As is SkyDrive's EULA in general. Those are 'my' camera pics, of whatever the **** I want. It's silly to tell customers to use their cloud to back up their data, then get in their Kool-Aid over what is allowed.

It should be our black box with all the usual protections we would have if our property was in any other public storage locale.

Dashel said,
Apparently close enough is a red herring and a bull****, arbitrary claim.

It's not an arbitrary claim, it's common sense. If they were innocent pictures, the law enforcement authorities involved wouldn't have started prosecution in the first place; at worst, maybe file deletion for a TOS violation. Microsoft just did what it is legally obligated to do; it was ultimately the responsibility of the local authorities to decide what to do with it. People are getting bent out of shape like somebody at Microsoft fired off an arrest warrant in Word and used their secret police force to arrest the guy.

Dashel said,
Those are 'my' camera pics, of whatever the **** I want. It's silly to tell customers to use their cloud to back up their data, then get in their Kool-Aid over what is allowed.

Sure is. The servers, however, are not. You want privacy, keep your camera pics to yourself; it's not a hard concept to grasp. Nothing wrong with using it for backups, but if you're going to store what is apparently illegal material, well, that's just idiotic.

Dashel said,
It should be our black box with all the usual protections we would have if our property was in any other public storage locale.

You're absolutely right, unless of course you agreed to something completely different by the terms of service. Don't like the terms, don't use it, but don't cry foul and claim ignorance later when it blows up in your face.

Dashel said,
Apathetic millennials **** me off.

How are you getting apathetic out of this? I care enough to side with the protection of children versus a potential pedophile in the making. I care enough to point out foolish and unrealistic expectations of privacy on a public service, especially when illegal material is involved and the user can't even be bothered to read the rules first. But you're right... it's Neowin; I should adopt that "derp, it's Microsoft, screw common sense, anybody who disagrees has a Kool-Aid problem" mentality.

"That so many care so little for privacy is stunning."

Glad to see you have the privacy of these little kids foremost in your priorities. What a hero. Sordid mind but solid citizen.

The guy just wanted to make sure his data was backed up. Also, I think when it comes to stuff like this, they might actually want to look at his computer to make sure the data was uploaded from his machine, given that cloud storage can be accessed from anywhere as long as you have valid login credentials.

Didn't somebody on Neowin post they had porn in their Skydrive just to see how long until they got banned? You might want to take that **** down...just to be safe...

On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

Tyler R. said,
Didn't somebody on Neowin post they had porn in their Skydrive just to see how long until they got banned? You might want to take that **** down...just to be safe...

On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

Microsoft doesn't mine data. They do, however, monitor their services, just like any other company would.

Tyler R. said,
On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

It is ok as long as they are looking for illegal files...

Any company has access to your data no matter what service it is. Anyone who thinks otherwise is delusional.

@Dot Matrix - Call it what you want, man. No matter how you look at it, they are still accessing your data. In this case, I support Microsoft helping to take this guy down, but at the same time, it proves they are willing to crack open your folder at a moment's notice and show everything to the FBI/police.

@Techbeck - "Any company has access to your data no matter what service it is. Anyone who thinks otherwise is delusional."

QFT. Can I have an Amen, people?

Tyler R. said,
Didn't somebody on Neowin post they had porn in their Skydrive just to see how long until they got banned? You might want to take that **** down...just to be safe...

On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

Someone won't go to jail over regular porn... it's because in this case it was child erotica.

techbeck said,

It is ok as long as they are looking for illegal files...

No, it is not OK.

Because it means that every file could be accessed by MS or, more specifically, by an MS employee.
And it is funny, since pedophiles and other criminals will most likely store their photos encrypted.

techbeck said,

Any company has access to your data no matter what service it is. Anyone who thinks otherwise is delusional.

I use boxcryptor for my information (source code and projects)

Tyler R. said,
Didn't somebody on Neowin post they had porn in their Skydrive just to see how long until they got banned? You might want to take that **** down...just to be safe...

On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

I uploaded porn to SkyDrive on August 19th, 2012; it's still there. Guessing they only delete/detect if you share from there.

techbeck said,

It is ok as long as they are looking for illegal files...

Any company has access to your data no matter what service it is. Anyone who thinks otherwise is delusional.

Yup, if it's on the internet or on servers in any way, it's no longer private.

Tyler R. said,
@Dot Matrix - Call it what you want, man. No matter how you look at it, they are still accessing your data. In this case, I support Microsoft helping to take this guy down, but at the same time, it proves they are willing to crack open your folder at a moment's notice and show everything to the FBI/police.

@Techbeck - "Any company has access to your data no matter what service it is. Anyone who thinks otherwise is delusional."

QFT. Can I have an Amen, people?

No, I'm sorry, but they don't. Not if you're using a proper service where you have control of the encryption.

Dalek said,

No, I'm sorry, but they don't. Not if you're using a proper service where you have control of the encryption.

Any company has access to the data you store on their services. Whether or not they look at it, who knows, but they do have access to it. And the way they search for files like this is similar to scanning for spam or keywords for ads.

Brony said,

No, it is not OK.

Because it means that every file could be accessed by MS or, more specifically, by an MS employee.
And it is funny, since pedophiles and other criminals will most likely store their photos encrypted.

I use boxcryptor for my information (source code and projects)


Encrypt your files before uploading them? Any storage/cloud provider can in theory access your data; Microsoft and Google are not alone in this, and the same goes for Dropbox, Mega, RapidShare and the others.

I can confirm that they will eventually ban you. I have been banned, and I still have the emails here. The first time was an i386 folder and the second time was erotica images. I had kept them both on SkyDrive for a while, so I guess it takes them time to scan. Anyway, the recovery process is very difficult and the customer service was uncooperative. The thing that really ticks me off is that it took away my whole account, including mail and app purchases. Good thing that wasn't my MSDN account.

Edit: I never shared them either. Agree with greenwizard88 below - SkyDrive is so not a good backup service.

Tyler R. said,
On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

I'm pretty sure that Microsoft just uses the NCMEC's Hash Sharing project to detect the majority of CP. If the customer has one image that matches then the account can be flagged for manual review.

Poof said,

I'm pretty sure that Microsoft just uses the NCMEC's Hash Sharing project to detect the majority of CP. If the customer has one image that matches then the account can be flagged for manual review.

Exactly. It would raise a number of legal problems if they actually viewed the photos. Microsoft would then need to have actual child porn on its servers to compare against, as well as staff to do it, and that's... well, let's just say it's not workable from a legal perspective.

They are not just likely to use hashes: they must use hashes. Microsoft is just a private business; they aren't the FBI.

The good news here, then, is also that MS isn't going through your files looking for stuff, just calculating hash values, something servers all around the world do constantly regardless of whether humans are intended to look at the content. Server software often wants hashes merely to differentiate files.

For photos they use PhotoDNA, an automated system that scans photos looking for porn. When they find it, they escalate it to a person to validate whether or not it's of an underage person (or another type of illegal porn).
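
PhotoDNA's actual algorithm is proprietary, but the family it belongs to, perceptual hashing, can be illustrated with a toy "average hash": unlike a cryptographic hash, small edits to an image only flip a few bits, so near-duplicates sit at a small Hamming distance. The 8x8 grids below stand in for downscaled grayscale images so the sketch needs no imaging library; none of this is Microsoft's real code.

```python
def average_hash(pixels):
    """64-bit perceptual hash of an 8x8 grayscale grid: 1 where pixel > mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count differing bits; small distances mean visually similar images."""
    return bin(a ^ b).count("1")

# A smooth gradient standing in for a downscaled photo...
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
# ...the same photo with one pixel blown out (e.g. a re-encoding artifact)...
tweaked = [row[:] for row in img]
tweaked[0][0] = 255
# ...and an unrelated pattern.
noise = [[(r * c * 37) % 256 for c in range(8)] for r in range(8)]

print(hamming(average_hash(img), average_hash(tweaked)))  # small: near-duplicate
print(hamming(average_hash(img), average_hash(noise)))    # large: different image
```

A production pipeline would then treat a sufficiently small distance to any known-bad hash as a hit and route the account to a human reviewer, exactly the escalation step described above.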

Tyler R. said,
On topic, I thought Microsoft didn't spy on you. Truth is, every company does it. It's not just Google.

Skydroogled

techbeck said,

Any company has access to the data you store on their services. Whether or not they look at it, who knows, but they do have access to it. And the way they search for files like this is similar to scanning for spam or keywords for ads.

No they don't.

I have a completely encrypted cloud system which only I have access to.

Dot Matrix said,

Microsoft doesn't mine data. They do, however, monitor their services, just like any other company would.
Microsoft doesn't mine data? I ran an Apache web server and put up a file to test. I used Internet Explorer to download it, then checked my log. I traced the IP because the link was not given to anyone else, and the IP traced back to Microsoft. The server was on my own computer. Microsoft took my data without my permission just because I used IE to test the download. So don't even BS me about Microsoft not mining data.

Krome said,
Microsoft took my data without my permission just because I used IE to test the download.

You mean the SmartScreen phishing and malware filter that's completely optional and can be disabled? (And even asked about in advance?)

Krome said,
Microsoft doesn't mine data? I ran an Apache web server and put up a file to test. I used Internet Explorer to download it, then checked my log. I traced the IP because the link was not given to anyone else, and the IP traced back to Microsoft. The server was on my own computer. Microsoft took my data without my permission just because I used IE to test the download. So don't even BS me about Microsoft not mining data.

Did you do a packet trace? Did you analyze them to see what they were carrying? Just like any other browser, IE runs services that are hosted on Microsoft servers. Having traced an IP to Microsoft means nothing. They also collect usage stats if you are signed up for it.

Max Norris said,

You mean the SmartScreen phishing and malware filter that's completely optional and can be disabled? (And even asked about in advance?)
Optional, yet it's turned on by default. Way to call it optional.

Krome said,
Optional, yet it's turned on by default. Way to call it optional.

Of course it's on by default; it wouldn't be doing much turned off, now would it?

Krome said,
Optional, yet it's turned on by default. Way to call it optional.

Considering it flat out asks you before you even use the thing, yes, that's a pretty good definition of optional.

Dalek said,

No they don't.

I have a completely encrypted cloud system which only I have access to.


Then it's not really a cloud system, is it? It's a VPN solution.

Major Plonquer said,
Microsoft doesn't sell what it finds to advertisers. Google does.

No, they just take harmless pictures and turn someone in with very little evidence. Nothing was pornographic in those pics.

techbeck said,
No, they just take harmless pictures and turn someone in with very little evidence. Nothing was pornographic in those pics.

Aside from being required by law to do so (Google is too), apparently there was something in those 3,000 images of kids that was anything but harmless, or the police wouldn't have acted on it. As much as the trolls like to spin it otherwise, that's completely out of MS's hands.

Max Norris said,

Aside from being required by law to do so (Google is too), apparently there was something in those 3,000 images of kids that was anything but harmless, or the police wouldn't have acted on it. As much as the trolls like to spin it otherwise, that's completely out of MS's hands.

Shall have to see how this story develops I guess.

Major Plonquer said,
Microsoft doesn't sell what it finds to advertisers. Google does.

Both Google and Microsoft run advertising networks that offer targeted ads to their customers.

How do you think MS gets the info for ad targeting? Magic?