Rise of the machine: Microsoft's Azure platform will change the world

How many times have you tried to figure out what you are having for dinner only to become disgruntled and order a pizza? It happens to all of us from time to time, but that's often because we run out of time to prepare for dinner or, frankly, we forget what's in the fridge.

But what if we didn't have to remember what's in the fridge, or even what type of meal to cook? What if our food could tell us what to make?

We often hear about Azure and how it is Microsoft's new show pony, but what is it? In the simplest terms, it's a massive network of many smaller computers, all linked together, with more storage than you can possibly imagine.

The cool thing here is that getting access to this massive system is quite easy and cheap, and you can use it on-demand too. The power of the Azure platform is immense; it is orders of magnitude more powerful than the device you are using to read this post. Microsoft also claims to have the largest cloud infrastructure in the world, topping both Google and Amazon.

At Microsoft's Worldwide Partner Conference earlier this month in Washington, D.C., we had a chance to talk with some of the company's strategic partners who are using the cloud to work on big data problems, from medical diagnoses to predictive web analytics that measure changes in SEO performance based on keyword modifications.

But why now? Why are Azure and the cloud coming to fruition this year, as opposed to last year or last decade?

Machine learning will know what's for dinner before you do

Microsoft has been working on Azure for many years, and Satya Nadella, who was recently promoted to CEO, was a key figure in the development of the cloud platform, which is likely one of the key reasons he was chosen to lead the company. His appointment and his vision for a cloud-first future are about taking what was built in the recent past and pushing it into full use by the masses, by making it easy for corporations to access and ensuring it is supported by a wide set of developers. Beyond that, Microsoft is making huge strides in making machine learning accessible to everyone, not only those with huge checkbooks.

That's all great, but how is this helping me decide what to have for dinner? A few years ago, machine learning was a nebulous concept: those with a Ph.D. in statistics or mathematics would sit around and try to create fancy algorithms that could be used for predictive purposes. Like many aspects of technology, this process was at first expensive and complex.

Nadella was fundamental to the creation of Azure

But as time went on, computing power increased and machine learning was democratized; it is quickly becoming a commodity that anyone can buy. In fact, Microsoft is already headed down this path and has announced initiatives for Azure that will soon let you buy machine-learning algorithms from a marketplace.

Imagine having a data set, going to a marketplace, picking out a machine-learning algorithm, applying it to your data and reaping the rewards of decades of mathematical research without having to leave your desk. It's happening today, and it will soon help you choose your next meal.
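As a sketch of that workflow, here is a toy nearest-neighbour classifier applied to a made-up data set. The data, labels and choice of algorithm are purely illustrative; Azure's actual marketplace API is not shown here.

```python
from math import dist  # Euclidean distance, available since Python 3.8

def nearest_neighbor_predict(labels, point):
    """Classify `point` with the label of its closest training example (1-NN)."""
    closest = min(labels, key=lambda p: dist(p, point))
    return labels[closest]

# Toy data set: (prep minutes, ingredient count) -> meal type.
# Entirely made-up numbers, standing in for "a data set you already have".
labels = {
    (10, 3): "salad",
    (20, 5): "stir-fry",
    (45, 8): "roast",
    (15, 4): "pasta",
}

print(nearest_neighbor_predict(labels, (18, 5)))  # closest point is (20, 5) -> "stir-fry"
```

The point is not the algorithm itself but the division of labour: the algorithm is a commodity you pick off a shelf, and only the data is yours.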

Big data will deliver better experiences with less work 

If you don't quite understand how machine learning will change the world and, more importantly, your habits, think about what you are having for dinner tonight.

In the not-so-distant future, Cortana could help you make your dinner. Imagine that your shopping habits are monitored in your own personal environment: based on your itemized receipts, the cloud knows what food you bought at the grocery store.

Three days later, you are trying to figure out what to eat. Using average shelf-life data pulled in from external sources, the cloud knows what's still fresh. It also knows you bought charcoal last week, and a quick check of the weather says it is sunny: it's time to grill out.

Based on your purchases, Cortana could use machine learning to put together an idea for a delicious meal for you, complete with instructions on how to cook it, how many people it will feed, how long to cook it and, based on your purchasing and dining habits, how to spice it perfectly for you. She could even recommend the perfect wine to go along with it.

Independently, knowing you bought steaks and that it is sunny outside offers no value. But when you combine that data intelligently, you can create experiences using machine learning and big data that will change your habits. Instead of ordering a pizza, Cortana could tell you all the possible meal combinations you could eat tonight. Imagine saying "Cortana, I have 20 minutes to cook, what can I make tonight?" These are the types of intelligent answers that machine learning can provide us.
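The combination of signals described above can be sketched as a few hand-written rules. A real system would learn these rules from data rather than hard-code them; the pantry items, weather check and fallback here are illustrative, not a real Cortana API.

```python
def suggest_dinner(pantry, weather, minutes_available):
    """Combine individually useless signals into a dinner suggestion."""
    # Steaks + charcoal + sunshine only mean something together.
    if "steak" in pantry and "charcoal" in pantry and weather == "sunny":
        if minutes_available >= 30:
            return "grill the steaks outside"
    if "pasta" in pantry and minutes_available >= 20:
        return "make pasta"
    return "order a pizza"  # the fallback the article opens with

print(suggest_dinner({"steak", "charcoal"}, "sunny", 45))  # grill the steaks outside
print(suggest_dinner({"pasta"}, "rainy", 25))              # make pasta
print(suggest_dinner(set(), "rainy", 10))                  # order a pizza
```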

This is one very simple example, but the uses for this type of data interrogation are unlimited across a wide array of applications. On the London Underground, machine learning was used to better understand, based on the vibration of wheels on escalators, when they would fail. Knowing a wheel was about to fail meant preventative maintenance could be done ahead of time, when the tube was less busy.
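Here is a minimal sketch of the escalator idea, assuming the simplest possible approach: flag a wheel whose recent vibration readings become unusually erratic. The window size, threshold and readings are made-up values for illustration; the article does not describe the actual Underground system.

```python
from statistics import pstdev  # population standard deviation

def wheel_needs_service(vibration, window=5, threshold=0.5):
    """Flag a wheel whose recent vibration readings grow unusually erratic."""
    if len(vibration) < window:
        return False  # not enough data to judge
    recent = vibration[-window:]
    return pstdev(recent) > threshold

healthy = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0]  # steady readings
failing = [1.0, 1.1, 0.9, 1.4, 0.5, 1.8, 0.2]    # increasingly erratic

print(wheel_needs_service(healthy))  # False
print(wheel_needs_service(failing))  # True
```

A production system would learn the failure signature from labelled history instead of using a fixed threshold, but the shape of the decision, turning a raw sensor stream into "service this wheel tonight", is the same.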

Machine learning is at its best when it removes computing from your workflow. By knowing what you can have for dinner, you don't have to go searching the web for recipes; the information will already be available to you in an easy-to-use format. This reduces your time in front of the computer and lets you get back to doing something besides research, because big data has already answered your questions before you asked them.

So, will machine learning change our habits? Yes. Is this some far-off and distant dream? Hardly. Microsoft and its partners are bringing machine learning and big data processing down from the macro process to the micro decision. The funny thing about this entire process is that you won't instantly notice it; it will just gradually happen. But one day, you'll look back and realize how much less time you spend searching and asking, and how much more time you spend doing and creating.

This is just the beginning. 

Comments

Time to reflect that the term "Artificial Intelligence" was coined by Donald Michie while he was working with Alan Turing at Bletchley Park during WW2. Wikipedia incorrectly reports that the term was coined ten years later in 1955 by John McCarthy. Like the first electronic computer, Colossus, this is another example of British ingenuity under wartime duress where the Americans later took credit. After WW2, Churchill ordered all the work done at Bletchley, including the 12 Colossus machines, be destroyed.

Michie went on to set up the world-class Department of Machine Intelligence and Perception at Edinburgh University, still one of the world's leaders in this field. Like Turing, Flowers, Tutte and many other forgotten heroes at Bletchley, Michie kept his mouth shut and passed away in 2007.

Back in 1985, Michie showed a product called ExpertEase that, using rule induction, did much the same thing as Microsoft's latest offering, running on a simple PC. PCs were a little slower in those days, so it never caught on.

So one of the first apps that the first guys who built the first computer thought up was Machine Learning. I don't think they used it much for ordering pizza but it came in handy for cracking Tunny (Hitler's secret Lorenz code), shortening WW2 by several years and likely saving millions of lives.

The pizza, of course, was invented by Italians. They were trying to invent the helicopter at the time.


I agree, they did some fantastic work at Bletchley Park.
Forgotten heroes indeed. I never even heard about all of this stuff till I was years out of school, and only did then because I took an interest in WW2 myself. Brilliant British education - ask anyone who Tim Berners-Lee is... Very few people will be able to tell you.

Another violation of privacy - in a fun way. Cortana creates your profile for government access. Time to Linux? :p

Very interesting article about a company that is yet to discover how to cool down its overheating Lumia 930 flagship phones!

I think the CEO and his team should focus on delivering simple and working devices rather than talking about a Cortana that has yet to learn the Queen's English.

Could it be possible that instead of contemplating the exciting scenario that "in the not so distant future, Cortana could help you make your dinner," airline passengers may be stopped from carrying Lumia 930 phones onto planes, as their overheating phones can cause accidents!

I think the CEO should stop worrying about how Cortana could predict a train wheel coming off, and focus on sending fire brigades to the unlucky souls who purchased the Lumia 930.

Wow... Thanks Microsoft for tackling the bigger issue, pizza. Meanwhile Google, NASA, and IBM are actually creating machines that will actually change the world.

Kalint said,
Wow... Thanks Microsoft for tackling the bigger issue, pizza. Meanwhile Google, NASA, and IBM are actually creating machines that will actually change the world.

The only thing Google is doing is finding new ways to spam the world with ads using those machines.

Max Norris said,
Good job completely missing the context. (And extra credit, NASA uses Azure too.)

Well, that's good; NASA employees should know what to eat.

Here comes the fear again!

Machine learning is the basis behind all (and I mean exactly that) AI projects - going back to when IBM was a typewriter and calculating-machine company. (Remember, the CONCEPTS behind AI appeared first in science fiction of the pre-World War II period - from writers such as the late Dr. Isaac Asimov and Robert Heinlein, along with no less than Arthur C. Clarke - note that NONE of the three were idiotic!) In the near term, machine learning reduces the cost of computational logic to where it becomes affordable on an individual (not notionally national) basis - it's what I referred to in another thread as "mainframe-quality compute power without the mainframe"; do you see why IBM is scared YKW-less?

The only reason for us to fear machines surpassing us is if we let them - by basically trying to put machine advancement into a cage. Have we forgotten the proverb "What one brain can do, a better brain can, and will UNdo."? Why wouldn't - or couldn't - it apply to machines? Then there is the very real problem that if we create machines with the qualities of humans, trying to cage them would be an immorality on the order of human slavery - try and chew on that, if you dare.

hmmm... The prequel to "Terminator 3: Rise of the Machines" is "Azure: Machine Learning"

and this leads perfectly into:

"When humans blocked the machines' access to solar energy, the machines instead turned to harvesting the humans' bioelectricity as a substitute power source, while keeping them trapped in "the Matrix", a shared simulation of the world as it was in 1999, in which Neo has been living since birth. Morpheus and his crew belong to a group of rebels who hack into the Matrix and "unplug" enslaved humans, recruiting them as rebels. The rebels' understanding of the true nature of the simulated reality allows them to bend its physical laws, granting them seemingly superhuman abilities. Neo is warned that fatal injuries within the Matrix will also kill one's physical body, and that the Agents he encountered are powerful sentient programs that eliminate threats to the system. Neo's skill during virtual combat training lends credence to Morpheus's belief that Neo is "the One", a man prophesied to end the war between humans and machines."

Thanks Nadella :(

These were precisely my first thoughts at "machine learning".
It's only a few small steps now, but soon this sort of thing becomes handy for those in the defence industry, and before we know it, a supercomputer is creating almighty hell.
I'm just glad I'm at an age where I'm relatively young in the grand scheme of computing power, and hopefully won't be around to witness what the movies foretell.
Sorry if that sounds a bit morbid.

Good read about machine learning in general; I wonder how long until machine learning is localized to smartphones and laptops, so that if we disconnect, we don't lose the features?

Interesting perspective, even though I have no idea how to use machine learning... If Cortana can make the decision my wife can never make, there will be less fighting in our house :D