Microsoft uses Bing and SkyDrive to extract text from images

Sometimes when people are traveling with their smartphone, they take photos of things like shopping labels, signs and other objects with lots of text so they can read them later. Today, Microsoft announced that a collaboration between its Bing and SkyDrive teams has resulted in a new SkyDrive feature that will make reading text in images much easier.

The technology that was used by the Bing and SkyDrive teams is called optical character recognition, or OCR. In a post on the official SkyDrive blog, Microsoft explains how this will work, stating, "With this new SkyDrive release, our OCR tool will automatically run on your camera roll photos so you can instantly see the extracted text whenever you view your photos on"

As you can see in the above photo, showing the description of an art piece, the text inside that label can now be read more clearly in the photo's properties pane. At the moment, the feature works only with English, Portuguese, Spanish, French, and German, but it does support most image formats. Microsoft says this is just the first salvo in a series of new SkyDrive features that will be designed to make photos stored on the service "smarter."

Source: Microsoft | Image via Microsoft


Just logged in now and I don't have the OCR function. Do you have to create a folder called Camera Roll or should it just appear in any picture?

Sure, OCR is nothing new, but it's always pretty much been its own thing--extract text from some source and that's the end of it. It's the seamless *integration* here that makes it worth a mention.

Side-note--personally, I keep forgetting that OneNote can search for text embedded in images, so I always find it pretty neat to have the screenshots saved in my notebooks come back as part of my search results.

It is a mistake in the article; Microsoft has simply said they've refined their own OCR tech to work with SkyDrive.

In fairness to Microsoft, the number of updates they have been pushing out to Outlook, SkyDrive and the free Office web apps is very impressive. They have come a LONG way over the past 6-12 months; it seems like they are bolstering the service with new features on a near-weekly basis.

Whilst I was only fairly impressed with it at launch, I have now moved over to using Outlook, SkyDrive and the free Office web apps exclusively; they fit my needs exactly.

Kudos to Microsoft for the sheer amount of time and effort they are clearly putting into improvements.

The technology that was created by the Bing and SkyDrive teams is called optical character recognition, or OCR.

ROFL. OCR is ages old. Microsoft seems to have refined it for their own contextual purposes. They didn't create it from scratch or anything. It's an innovation, not an invention.

Whenever Apple adds a new feature, it's called revolutionary by the Silicon Valley tech media. When Microsoft adds a similar feature first, it's ho-hum and play another video of Ballmer being a monkey boy.

Today, in partnership with the Bing team, we're excited to release the first of several features that will make your SkyDrive photos smarter by using OCR (optical character recognition) to extract the text from photos in your camera roll when you view them on

Microsoft's blog doesn't say that they created it. It clearly states that they are using OCR. I think it's a mistake in this article.
Anyway, I don't care about that, but I am really happy that they added this feature.

OCR is nothing new. It's cool that it works with Bing/SkyDrive... but OCR has been around for a while. I'll have to give it a go later on.

This looks great! I noticed this after taking a couple of pictures on my phone, which were then uploaded to my SkyDrive... The pictures were taken at the best quality available (8.7MP) on my Nokia Lumia 920, so quality was going to be good either way. The text just looked clearer, if that makes any sense.