|Microsoft Expands Cognitive Services APIs|
|Written by Sue Gee|
|Wednesday, 17 May 2017|
At last week's BUILD 2017 Microsoft announced more intelligent APIs and launched its Cognitive Services Labs, which will enable developers to experiment with AI services that are still in early development. One of these, Project Prague for gesture control and interaction, is already in private preview. What is more, AI is becoming truly useful and usable.
Statistics revealed at BUILD indicate a high level of interest in Microsoft Cognitive Services - more than 568,000 developers from more than 60 countries are using them.
Last month we reported on the 25 artificial intelligence APIs already part of Microsoft Cognitive Services and the fact that three of them, the Face API, Computer Vision API and Content Moderator API, had reached general availability. Now the number of APIs has risen to 29 and Microsoft has launched Cognitive Services Labs to let developers get involved in AI projects that are in the early stages of development.
According to the Cognitive Services team:
The Labs give developers an early look at the exciting emerging Cognitive Services technologies. You can try out and provide feedback before these technologies become generally available. This is basically a playground to explore this research and get a sense for what may be coming.
The first five of these REST APIs are in open preview and with a Microsoft account you can subscribe to them and start using them. They are currently free with a generous transaction allowance.
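The subscription model is the same across the Cognitive Services REST APIs: you pass the key from your subscription page in the Ocp-Apim-Subscription-Key request header and POST a JSON body. Below is a minimal sketch of assembling such a request in Python; the endpoint URL and image URL are illustrative placeholders, not the exact preview endpoints, so substitute the values shown on your own subscription page.

```python
import json
import urllib.request

def build_analyze_request(endpoint, subscription_key, image_url):
    """Build (but do not send) a Cognitive Services style REST request.

    The subscription key travels in the Ocp-Apim-Subscription-Key header,
    and the payload is a small JSON document pointing at the image.
    """
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        method="POST",
    )

# Illustrative endpoint; replace with the URL from your subscription page.
req = build_analyze_request(
    "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze",
    "your-subscription-key",
    "https://example.org/photo.jpg",
)
```

Sending the request is then a single `urllib.request.urlopen(req)` call, with the result coming back as JSON.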
The final one, Project Prague, described as "a cutting-edge, easy-to-use SDK that creates more intuitive and natural experiences by allowing users to control and interact with technologies through hand gestures", is currently in limited beta. It is open only to a small number of participants, who will also need to be equipped with an Intel RealSense SR300 camera. Indeed the instructions for applying for the beta end with:
Bonus points for sending a selfie with your camera doing the Vulcan Salute.
The new APIs include Bing Custom Search, Custom Vision Service and Custom Decision Service, which enable users to bring their own data to Microsoft-provided algorithms to create services more tailored to their specific needs. Each of them is in open preview and you can currently try it for free.
The Custom Vision Service (codenamed IRIS) is likely to be popular as it allows you to build classifiers to distinguish between images of different things. So instead of just playing with classifiers other people have constructed you can build your own.
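A classifier built this way returns, for each image you submit, a list of your own tags ranked by probability, and your code picks the winner. The snippet below sketches that last step against a hypothetical JSON response; the field names ("Predictions", "Tag", "Probability") are assumptions for illustration, not the service's documented schema.

```python
import json

# Hypothetical response shape for a custom image classifier;
# field names are assumptions for illustration.
sample_response = json.dumps({
    "Predictions": [
        {"Tag": "dog", "Probability": 0.91},
        {"Tag": "cat", "Probability": 0.07},
        {"Tag": "fox", "Probability": 0.02},
    ]
})

def top_tag(response_text, threshold=0.5):
    """Return the most probable tag, or None if nothing clears the threshold."""
    predictions = json.loads(response_text)["Predictions"]
    best = max(predictions, key=lambda p: p["Probability"])
    return best["Tag"] if best["Probability"] >= threshold else None

print(top_tag(sample_response))  # dog
```

Applying a confidence threshold like this is the usual guard against acting on a low-certainty classification.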
The fourth addition, Video Indexer, formerly Video Breakdown, a Microsoft Garage project, uses AI technologies such as face recognition, text recognition and speech-to-text to automatically extract metadata from videos. This really is remarkably useful, as explained in this video:
The other really useful new AI-powered solution unveiled at BUILD is still a Microsoft Garage project. Presentation Translator is an Office add-in for PowerPoint that enables presenters to display translated subtitles in real time. As you speak, in any one of 10 supported speech languages, it can display subtitles directly on your PowerPoint presentation in any one of more than 60 supported text languages. In addition, up to 100 audience members in the room can follow along in their own language using their own phone, tablet, or computer. Now that's what I call useful. Like other Microsoft translation products, you need to fill in a form to request it, providing information on how you would use it.
At Build, Microsoft expands its Cognitive Services collection of intelligent APIs
Microsoft Cognitive Services APIs Released
Cortana Skills Kit Now In Developer Preview
Microsoft Cognitive Toolkit Version 2.0 Beta
Conversation and Cognition at Build 2016
Microsoft's Project Oxford AI APIs For The REST Of Us
|Last Updated ( Wednesday, 17 May 2017 )|