In the past year, Microsoft has introduced quite a few machine learning-based offerings that leverage its Azure Cognitive Services APIs. These include Azure Form Recognizer and a Garage project dubbed 'Read My World', among others.
On the first day of its Build 2020 developer conference today, Microsoft announced new features for Azure Cognitive Services. The tech giant also released enhancements to Cognitive Search, as well as to the Azure Bot Service and Framework.
For starters, a new apprentice mode has been added to the Personalizer API, allowing it to learn in real time alongside existing solutions. In essence, this helps it bypass the learning stage that usually precedes deployment. Moreover, the containers that help deploy Cognitive Services from the cloud to the edge now offer version 3.0 of Language Understanding and Text Analytics sentiment analysis. The Language Understanding service itself now has a revamped labeling experience for easier comprehension of language structures.
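To illustrate how the containerized service is addressed, here is a minimal sketch that builds a request for the v3.0 sentiment analysis REST endpoint such a container exposes. The localhost address is an assumption (it presumes a Text Analytics container already running locally on port 5000), and the helper name is hypothetical.

```python
import json

# Assumed local endpoint of a running Text Analytics container;
# the path below follows the v3.0 sentiment analysis REST API.
CONTAINER_ENDPOINT = "http://localhost:5000"

def build_sentiment_request(documents, endpoint=CONTAINER_ENDPOINT):
    """Build the URL and JSON body for a v3.0 sentiment analysis call."""
    url = f"{endpoint}/text/analytics/v3.0/sentiment"
    body = {
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(documents, start=1)
        ]
    }
    return url, json.dumps(body)

# The resulting URL and body would be POSTed to the container
# with a Content-Type of application/json.
url, body = build_sentiment_request(["The Build announcements look great."])
print(url)
```

In a real deployment, the same request shape works against both the cloud endpoint and the edge container, which is the point of the container offering.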
Meanwhile, the Speech service has also been expanded with 27 new locales for Speech to Text, and 11 new locales with 15 new voices for Neural Text to Speech. For the former, transcription accuracy has improved through a 30% reduction in word error rate, while the latter now boasts a 50% reduction in pronunciation errors across 13 locales. And finally, Role-based Access Control (RBAC) has now been added to the QnA Maker service as well.
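As a rough sketch of how one of the newly supported locales would be selected, the snippet below builds a request URL for the Speech to Text REST API's short-audio recognition endpoint. The region and locale values are placeholders for illustration, not details from the announcement.

```python
from urllib.parse import urlencode

def speech_to_text_url(region, locale):
    """Build the short-audio recognition URL for an Azure region and spoken locale."""
    base = (f"https://{region}.stt.speech.microsoft.com"
            "/speech/recognition/conversation/cognitiveservices/v1")
    # The 'language' query parameter is where a locale such as one of the
    # 27 new Speech to Text locales would be specified.
    return f"{base}?{urlencode({'language': locale, 'format': 'detailed'})}"

# Placeholder region and locale for demonstration purposes.
url = speech_to_text_url("westus", "en-IN")
print(url)
```

The audio payload itself would be sent in the request body along with a subscription key header, which is omitted here.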
With regard to Cognitive Search, the cloud search service is now more firmly integrated with Azure Machine Learning. There is also a new feature termed Custom Search Ranking, which lets customers integrate their own ranking systems with Cognitive Search for domain-specific search results. Both of these capabilities are currently in preview.
The Azure Bot Framework SDK now offers an enhanced Teams experience, and its Composer tool has been made generally available as well. Organizations that use Power Virtual Agents to build bots, meanwhile, can take advantage of bot skill reusability, which helps developers extend their agents and enhance their bot development efforts.
Microsoft has also made the adaptive dialogs feature generally available; using it, bots capable of switching contexts midway through a conversation can be built more easily. As far as the Azure Bot Service is concerned, new capabilities have been added to simplify the development of bots that combine AI with human agents, such as those in customer service platforms like LivePerson.
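The context-switching idea behind adaptive dialogs can be illustrated with a toy dialog stack; note that this is a hypothetical sketch of the concept only, not the Bot Framework SDK's actual adaptive dialog API, and all of the names below are made up.

```python
class DialogStack:
    """Toy model of interruptible, resumable conversation contexts."""

    def __init__(self):
        self._stack = []

    def begin(self, dialog_name):
        # Start a conversational context, e.g. booking a flight.
        self._stack.append(dialog_name)

    def interrupt(self, dialog_name):
        # A mid-conversation topic change pushes a new context on top,
        # preserving the one underneath it.
        self._stack.append(dialog_name)

    def resume(self):
        # Ending the interruption resumes whatever dialog was active before.
        self._stack.pop()
        return self._stack[-1] if self._stack else None

stack = DialogStack()
stack.begin("book_flight")
stack.interrupt("check_weather")   # user switches context mid-conversation
resumed = stack.resume()           # back to the original flight booking
print(resumed)  # → book_flight
```

The real feature handles this declaratively, with triggers deciding when an interruption applies, but the stack-with-resume behavior above captures the gist of why such bots are easier to build.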