Azure Cognitive Services is relatively new functionality within Azure, exposing a set of APIs that can do some truly amazing things.
Before I dive into the SQL and DLL code needed to make use of Cognitive Services, let's take a second to understand what I am talking about. Imagine this:
- A customer walks up to an electronic kiosk in a shopping centre and says “Hi Cortana, I want to book a holiday, I really need a break from this bad weather. Any ideas on where I should go?”
- The kiosk is run by a “bot” capable of conversing in 30 languages and which has been trained on understanding context and intent.
- It recognises you as a 35-year-old male who looks and sounds unhappy, and because it recognises your face and voice it remembers talking to you last week in another shopping centre, when you asked for directions to a Surf Shop clothing store.
- It also recognises a beach image on your T-Shirt, making note of the link between your previously asked directions and your clothing.
- As you spoke in English, it replies in English: “Good to see you again. Now, would you consider a beach holiday to Bali or Thailand?” Why beach? Well, the recommendations engine has determined that's where 35-year-old surfer types go when the local weather is bad!
- You negotiate a package using natural language, and close out the conversation.
- The “bot” visualises your increased sentiment from the initial baseline and says “I’m glad I could make your day better! Enjoy your flight next week!”
It may sound futuristic, but this is exactly what Cognitive Services APIs can do right now – and in my example I have only used 6 of the 21 Azure Cognitive Services! Microsoft Research has built these powerful Azure ML models and wrapped each of them up into a single, simple, consumable, publicly available API.
Some other amazing deployments for Cognitive Services…
- At a trade show, or even a window display at a shopping centre, a company could use emotion detection to see how people are reacting to their products.
- Facial recognition could be used to find missing children quickly at an amusement park.
- The APIs can determine the male:female ratio and ages of patrons at a nightclub, and identify VIPs or banned guests.
- The object recognition capabilities can enable a blind person to read a menu in a restaurant or have their surroundings described to them.
These are just some of the scenarios possible as described by Jennifer Marsman (Microsoft Principal Software Development Engineer). For those interested in this you can attend the Microsoft Data Science Summit on 26-27 Sep in Atlanta. See this info link – https://blogs.technet.microsoft.com/machinelearning/2016/09/07/artificial-intelligence-made-easy-cognitive-services-at-the-microsoft-data-science-summit/
For those not familiar with Azure Cognitive Services APIs, check out this link which has online demos you can try – https://www.microsoft.com/cognitive-services/en-us/apis
For those not familiar with the Azure Bot Framework, check out this link – https://dev.botframework.com/
Anyway – despite all this, for this post today we’ll just focus on something pretty simple – making use of the Text Analytics API right within SQL Server 2016.
So let’s get to scoring some sentiment!
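Before jumping into the SQL side, it helps to see the shape of the request the Text Analytics sentiment endpoint expects. Here is a minimal Python sketch of the payload as it stood in the v2.0 API – the endpoint region and subscription key below are placeholders you would replace with your own values from the Azure portal (from SQL Server 2016 you would typically send this same payload via a CLR assembly):

```python
import json

# Placeholder values -- substitute your own region and key from the Azure portal.
API_KEY = "<your-text-analytics-key>"
ENDPOINT = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

def build_sentiment_request(texts, language="en"):
    """Build the URL, headers, and JSON body for a Text Analytics
    sentiment call. Each document needs a unique string id; the
    service returns a score between 0 (negative) and 1 (positive)."""
    documents = [
        {"id": str(i + 1), "language": language, "text": t}
        for i, t in enumerate(texts)
    ]
    headers = {
        "Ocp-Apim-Subscription-Key": API_KEY,
        "Content-Type": "application/json",
    }
    body = json.dumps({"documents": documents})
    return ENDPOINT, headers, body

# The request itself would be POSTed with urllib.request (or from
# T-SQL via a CLR procedure); here we just inspect the payload.
url, headers, body = build_sentiment_request(
    ["I really need a break from this bad weather."]
)
payload = json.loads(body)
```

The key point is that the service is just JSON over HTTPS – which is exactly why it can be reached from almost anywhere, SQL Server included.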
PASS 2015 continues (and finishes up today!) in Seattle.
It’s been an amazing conference this year, with a few things really hitting home:
- Amazing technology announcements around SQL 2016 CTP3
- Incredible advances in almost every component in Azure Data Services
- Full and seamless SQL/Azure ecosystem integration – and by that I mean both On-Prem and/or within the Azure Cloud. The story of either On-Prem or Azure Cloud is compelling enough individually; however, the Hybrid story is now a reality for SQL and enables dynamic and flexible architectures well beyond what competitors can offer.
- BUT what astounds me the most is actually the pace of change – barely a day goes by where I don’t receive a new service or feature update related to SQL 2016 CTP3 or Azure.
- I don’t recall a time (in recent memory) where the step changes have come so thick and fast – it’s certainly changed from when I started as a DBA on Rdb/VMS back in 1994, when patches arrived by mail on tape cartridge! 🙂
- (As a quick aside, a chief designer on Rdb was Jim Gray – the same Jim Gray who joined Microsoft to lead the SQL Server architecture to stardom soon after Oracle bought the Rdb product from DEC around 1994.)
Enough reminiscing already – moving along – today I attended 5 back-to-back sessions, and again I cannot blog about all of them in the time I have (or want to spend), but the one which stood out the most was Azure SQL Data Warehouse and Integration with the Azure Ecosystem by Drew DiPalma of Microsoft.
This session focused specifically on the Azure ecosystem surrounding the Azure SQL Data Warehouse (SQL DW) and how it can seamlessly interact with other Azure components to create different operational solutions. To me this was very compelling, not necessarily due to the SQL DW technology itself (which I know well already as the on-prem APS appliance), but more so because it showed just how easily all parts of Azure can happily work together.
PASS 2015 continues in Seattle, and today was my session at 10:45am on Using Azure Machine Learning (ML) to Predict Seattle House Prices. The background and info on my session is here http://www.sqlpass.org/summit/2015/Sessions/Details.aspx?sid=7794
Overall I was pretty happy with how it went - and I think everyone who attended had a lot of fun with some of the games and tests I injected into the presentation. Everyone had a chance to be a Real Estate Agent :) - and at the same time learn some great methods around performing Azure ML Regression Predictive Analytics.
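For those who missed it, the core idea behind regression predictive analytics is simpler than it sounds. Azure ML's regression modules do far more than this, but a minimal sketch of ordinary least squares on toy house data (the areas and prices below are made-up illustration values, not Seattle data from my session) shows the essence:

```python
# Toy data: floor area (sqm) vs sale price (USD). Made up for illustration.
AREAS = [70.0, 95.0, 120.0, 150.0, 200.0]
PRICES = [350_000, 450_000, 540_000, 660_000, 850_000]

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with a single feature:
    slope = covariance(x, y) / variance(x), intercept from the means."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

a, b = fit_line(AREAS, PRICES)
predicted = a * 110 + b   # predicted price for a 110 sqm house
```

In Azure ML you would drag a Linear Regression module onto the canvas instead of writing this by hand, but the trained model is doing exactly this kind of fit, just with many more features.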
BUT – moving right along – I also attended 3 other sessions today. Again I cannot blog about all of them in the time I have, but the one which made me think the most about technology implementations and how they can improve lives was Understanding Real World Big Data Scenarios by Nishant Thacker of Microsoft.
It wasn’t about use cases for big data (that horse has already bolted), but more around really innovative and interesting ways the ecosystem of Azure technologies could be deployed to solve complex business problems – or, more simply, ways to make our lives better!
So PASS officially kicked off this morning leading into the next 3 days of back to back sessions.
You could certainly tell that the keynote was on… I mean the dining room was pumping…!
Oh that’s right, everyone is at the keynote!
So the Keynote session was hosted by Joseph Sirosh, Group Vice President, Data Group.
The big tell for the keynote was undoubtedly SQL Server 2016 CTP3 and just what’s packed to the rafters within the software. If you want to learn more about that then I recommend stepping across to this link here http://blogs.technet.com/b/dataplatforminsider/archive/2015/10/28/sql-server-2016-everything-built-in.aspx
Key Takeaways from the Keynote:
- SQL 2016 is a major release that solidifies Microsoft’s position of having a firm foot in both the On-Prem and In-Cloud data platform camps.
- “The future is both earth and sky!”
- The release offers much On-Prem capability, like Polybase (to APS), R integration (advanced analytics), Always Encrypted, and SSAS/SSRS improvements.
- The release also provides the ability to seamlessly integrate from On-Prem to the Azure Cloud – and/or back – like Polybase (to HDInsight) and Stretch Database. SQL also already has the capability to use Azure VMs for SQL AAG solutions and Azure backups.
- An interesting takeaway – the human genome is approximately 1.5 Gigabytes in size, or about 2 CDs worth of storage space. How small do you feel now?
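That genome factoid checks out on the back of an envelope. The assumptions below are mine, not the speaker's: roughly 3.1 billion base pairs in the haploid human genome, stored naively at 4 bits per base, against a standard 700 MB CD:

```python
# Back-of-envelope check of the keynote factoid.
# Assumptions (mine): ~3.1e9 base pairs, naive 4-bit-per-base encoding.
BASE_PAIRS = 3.1e9
BITS_PER_BASE = 4
CD_BYTES = 700 * 1024**2        # a standard 700 MB CD

genome_bytes = BASE_PAIRS * BITS_PER_BASE / 8
genome_gb = genome_bytes / 1e9  # ~1.55 GB
cds = genome_bytes / CD_BYTES   # ~2.1 CDs
print(f"{genome_gb:.2f} GB, about {cds:.1f} CDs")
```

A tighter 2-bit-per-base encoding (A/C/G/T only) would halve that to roughly 775 MB – one CD – so the "2 CDs" figure assumes the looser encoding.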
I then attended 4 sessions, but today there is really only time to blog about this one – for me it was the most impressive in regard to capability and just how far it’s come!
The session was SQL Server in Azure Virtual Machines – Features and Best Practices, presented by Luis Vargas, a Senior Program Manager Lead in the SQL Server team.
PASS 2015 has kicked off in Seattle – well, the precons have anyway, on Mon & Tue. The actual conference runs Wed–Fri!
I attended a precon session today called Optimize “All Data” with a Modern Data Warehouse Solution held by Bradley Ball and Josh Luedeman of Pragmatic Works.
The session focused on modernising the corporate data warehouse through Data Lifecycle Optimisation.
What does that mean?
Well – it means focusing on a defined set of critical technology and business areas around the corporate data warehouse, and strategically implementing a managed approach to improving it via the introduction of new technologies and processes. Specifically, the session looked at 6 areas around the corporate data warehouse to consider in your approach to modernisation:
- Architecture and Configuration
- Availability and Continuity
- Maintenance and Optimisation
- Enterprise BI
- Big Data Architecture and Deployment
- Business and Predictive Analytics