
Microsoft will halt sale of emotion-reading tech and limit access to face recognition tools

Microsoft has confirmed that it will pull back software that judges a person’s emotional state by processing their image. The company will also restrict access to its facial recognition technology.

Following in Google’s footsteps, Microsoft is halting the sale of emotion-reading technologies. The company will also limit “unrestricted” access to facial recognition technology. Existing customers will have just one year before losing access to Azure Face, a set of Artificial Intelligence (AI) tools that attempt to infer emotion, gender, age, smile, facial hair, hair, and makeup. Speaking about the development, Sarah Bird, principal group product manager at Microsoft's Azure AI unit, said:

These efforts raised important questions about privacy, the lack of consensus on a definition of 'emotions,' and the inability to generalize the linkage between facial expression and emotional state across use cases, regions, and demographics.
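For illustration, here is a minimal sketch of the kind of attribute-inference request Azure Face offered, using the azure-cognitiveservices-vision-face Python SDK. The endpoint, subscription key, and image URL are placeholders, not details from the article:

```python
# Minimal sketch (not from the article) of an Azure Face detection call
# that requests the attribute inferences Microsoft is now retiring.
from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("<your-subscription-key>"),  # placeholder key
)

# Ask the service to infer the attributes named in the article.
faces = face_client.face.detect_with_url(
    url="https://example.com/photo.jpg",  # placeholder image URL
    return_face_attributes=[
        FaceAttributeType.emotion,
        FaceAttributeType.gender,
        FaceAttributeType.age,
        FaceAttributeType.smile,
        FaceAttributeType.facial_hair,
        FaceAttributeType.hair,
        FaceAttributeType.makeup,
    ],
)

for face in faces:
    # Each detected face carries per-emotion confidence scores
    # (anger, happiness, sadness, and so on).
    print(face.face_attributes.emotion.as_dict())
```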

Microsoft has reportedly been reviewing whether emotion recognition systems are rooted in sound science. It is not immediately clear what conclusion the company reached. However, it is possible that Microsoft hasn’t been able to perfect the algorithms that guess a person’s emotional state from an image. The company could also be positioning itself ahead of new rules and regulations governing the use of such tools.


Apart from halting the sale of emotion-reading tech, Microsoft is also ending unrestricted access to its facial recognition technologies. The company has indicated that customers using these technologies must obtain prior approval. Microsoft’s customers are presumably already bound by contractual obligations, but it is not clear whether the company is imposing additional restrictions or merely asking customers to sign a disclaimer absolving Microsoft of any legal penalties arising from misuse.

For the time being, Microsoft has merely asked its clients “to avoid situations that infringe on privacy or in which the technology might struggle”. One obviously legally questionable use would be identifying minors; notably, Microsoft isn’t specifically banning such uses.

Microsoft is also putting some restrictions on its Custom Neural Voice feature, which lets customers create AI voices based on recordings of real people.
