The rise of machine learning has produced an explosion of APIs to make your applications more intelligent. Come see them in action!
On Tuesday, Sept. 20, machine learning expert Jennifer Marsman will introduce Microsoft Cognitive Services APIs. You’ll learn about the more than 20 Cognitive Services APIs that provide object recognition, face detection and identification, emotion recognition, OCR, computer vision, video services, speech and speaker recognition, language understanding, text analytics, sentiment analysis, knowledge exploration, search services and more.
You’ll see powerful demos, experience how simple these APIs are to call, and get ideas for adding this functionality to your own applications.
What are cognitive services?
Cognitive Services are a set of APIs that provide cognitive functionality like object recognition, emotion detection, facial identification, speech understanding, sentiment analysis, text analytics and more. Microsoft Research has applied powerful machine learning algorithms to these common scenarios and made the results available through a simple API call. For more information, see http://microsoft.com/cognitive.
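To give a flavor of what "a simple API call" looks like, here is a minimal sketch of a sentiment analysis request to the Text Analytics service. The endpoint URL, region, and request-body shape are assumptions drawn from the service's REST conventions, so check the current documentation for your subscription before using them.

```python
import json

# ASSUMPTION: endpoint and region are illustrative; substitute your own.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

def build_sentiment_request(texts):
    """Build the headers and JSON body for a sentiment analysis call."""
    headers = {
        "Content-Type": "application/json",
        # Placeholder key; every Cognitive Services call is authenticated
        # with your subscription key in this header.
        "Ocp-Apim-Subscription-Key": "<your-subscription-key>",
    }
    body = {
        "documents": [
            {"id": str(i + 1), "language": "en", "text": text}
            for i, text in enumerate(texts)
        ]
    }
    return headers, json.dumps(body)

headers, payload = build_sentiment_request(["I loved the demo!"])
# POST `payload` to ENDPOINT with `headers` (via urllib.request or requests);
# the service responds with a sentiment score from 0 (negative) to 1 (positive)
# for each document id.
```

That is the entire integration surface: one authenticated POST with a small JSON body, no model training or machine learning expertise required.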
Why should developers be interested in cognitive services?
Previously, you needed advanced knowledge of machine learning to perform tasks like object recognition, facial identification or text analytics. With Microsoft Cognitive Services, these powerful machine learning models are available through a simple API call. For example, you can send a picture to our face detection API, and we will return the age, gender, head pose, smile and facial hair information, the facial bounding box, and 27 landmarks for each face in the image.
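The face detection call described above can be sketched as follows. The endpoint and the parameter names (`returnFaceAttributes`, `returnFaceLandmarks`, `returnFaceId`) follow the Face API's REST interface, but treat them as assumptions and verify them against the current reference for your region.

```python
from urllib.parse import urlencode

# ASSUMPTION: endpoint and region are illustrative; substitute your own.
FACE_DETECT_ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"

def build_face_detect_url(
    attributes=("age", "gender", "headPose", "smile", "facialHair")
):
    """Build the detect URL asking for face attributes and landmarks."""
    params = {
        "returnFaceAttributes": ",".join(attributes),
        "returnFaceLandmarks": "true",  # the 27 landmark points per face
        "returnFaceId": "true",         # include a per-face ID in the response
    }
    return FACE_DETECT_ENDPOINT + "?" + urlencode(params)

url = build_face_detect_url()
# POST the image bytes to `url` with your Ocp-Apim-Subscription-Key header;
# the response is a JSON array with one entry per detected face, each
# carrying the requested attributes, the bounding box, and the landmarks.
```

Requesting an extra attribute is just another entry in the query string, which is what makes experimenting with the service so quick.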
Can you provide an example use case of how/why an application would use cognitive services?
In retail or at a trade show, a company could use emotion detection to see how people are reacting to their products. Facial identification could be used to find missing children quickly at an amusement park. The facial APIs could determine the male/female ratio at a nightclub, as well as identify VIP guests or banned guests. The language understanding service can allow automated support bots to understand natural language. The object recognition capabilities can enable a blind person to read a menu in a restaurant or have their surroundings described to them.
We’re excited about all the possibilities, and we think you will be, too! Be sure to join us on Tuesday when Jennifer digs deeper into Microsoft Cognitive Services.