SQL Saturday 769 Melbourne & 771 Sydney

For those not aware, there are some excellent local SQL events coming up here in Melbourne and Sydney:

  • SQL Saturday 769 (Sat 30 Jun 2018) Melbourne and SQL Saturday 771 (Sat 07 Jul 2018) Sydney.  For those looking for some great free local SQL / Azure / BI learning, you simply cannot go past a SQL Saturday anywhere in the world, and this one right here in Melbourne will again be no exception.  There is a lineup of fantastic local speakers, including Microsoft and MVPs, as well as international speakers too.
  • SQL Saturday Pre-Con Training (Fri 29 Jun 2018) Melbourne.  Leading up to the main event are 3 pre-con training events covering some very interesting topics around SQL Performance Analysis, Azure SQL Cloud Migrations and Data Science using Azure.

 

SQL Saturday 769 – Melbourne

SQL Saturday is an excellent free learning resource for all things SQL Server – all costs are covered by donations and sponsorships.  Some of the excellent sponsors this year are Microsoft, Wardy IT, SQLBI, and PASS.

Some of the session focus areas include SQL 2017/19 (many deep dives across almost all facets!), SQL DB/DW in Azure, CosmosDB, Azure Machine Learning, R, Data Lakes, BI, DAX, …and so much more!

The event is being held at Northcote Town Hall (189 High Street, Northcote, VIC 3070).

For those wanting to come along, please head to the SQL Saturday website and register to attend.

 

SQL Saturday 769 Pre-Con Training Options

There are also 3 pre-con training sessions held the day before, on Fri 29 Jun 2018.  Definitely worth a look in…

Session: Building Streaming Data Pipelines in Azure…

For those attending – I am presenting a pretty fun session on Building Streaming ETL Pipelines Using Azure Cloud Services.  

Session Details here – http://www.sqlsaturday.com/769/Sessions/Details.aspx?sid=78630

We’ll talk through the various shapes, speeds and sizes of data sources available to modern businesses today, and discuss the various PaaS streaming methods available to ingest data at scale, all using the Azure Cloud Platform.  I also have a few pretty fun demos which aim to show how all the Azure services can tie together to perform ETL/ELT!

Feel free to pop in to have a chat!

 

I hope to see you all in Melbourne at SQL Saturday!


Disclaimer: all content on Mr. Fox SQL blog is subject to the disclaimer found here


Using Elastic Query to Support SQL Spatial in Azure SQL DW

[read this post on Mr. Fox SQL blog]

Recently we had a requirement to perform SQL Spatial functions on data that was stored in Azure SQL DW.  Seems simple enough, as spatial has been in SQL for many years, but unfortunately SQL Spatial functions are not natively supported in Azure SQL DW (yet)!

If interested – this is the link to the Azure Feedback feature request to make this available in Azure SQL DW – https://feedback.azure.com/forums/307516-sql-data-warehouse/suggestions/10508991-support-for-spatial-data-type

AND SO, to use spatial data in Azure SQL DW we need to look at alternative methods.  Luckily a relatively new feature in Azure SQL DB, Elastic Query to Azure SQL DW, now gives us the ability to perform these SQL Spatial functions on data within Azure SQL DW via a very simple method!

So the purpose of this blog is to show how to perform native SQL Spatial functions on data within Azure SQL DW.
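As a taste of the approach, here is a minimal sketch of the Elastic Query pattern: an external table defined in Azure SQL DB that points at a table in Azure SQL DW, queried with a native spatial function.  All server, database, table and column names here are illustrative assumptions, not the actual objects from the full post.

```sql
-- In the Azure SQL DB "head" database, where SQL Spatial is natively supported
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';

CREATE DATABASE SCOPED CREDENTIAL SqlDwCredential
WITH IDENTITY = '<sql-login>', SECRET = '<password>';

-- External data source pointing at the Azure SQL DW database
CREATE EXTERNAL DATA SOURCE SqlDwSource
WITH
(
    TYPE = RDBMS,
    LOCATION = '<yourserver>.database.windows.net',
    DATABASE_NAME = '<YourSqlDw>',
    CREDENTIAL = SqlDwCredential
);

-- External table mirroring the schema of the remote SQL DW table
CREATE EXTERNAL TABLE dbo.SiteLocations
(
    SiteId    INT NOT NULL,
    Latitude  FLOAT NOT NULL,
    Longitude FLOAT NOT NULL
)
WITH (DATA_SOURCE = SqlDwSource);

-- The native SQL Spatial function runs in Azure SQL DB over the SQL DW data
SELECT  SiteId,
        geography::Point(Latitude, Longitude, 4326)
            .STDistance(geography::Point(-37.8136, 144.9631, 4326)) / 1000.0 AS KmFromMelbourne
FROM    dbo.SiteLocations;
```

The spatial work happens in the SQL DB head database, while Elastic Query quietly fetches the rows from SQL DW underneath.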

Continue reading

Azure Cognitive Services APIs with SQL Integration Services Packages (SSIS)

[read this post on Mr. Fox SQL blog]

I had a recent requirement to integrate multi-language support into a SQL DW via a SQL SSIS ETL solution.  Specifically, the SQL DW platform only supported English text for all Dimension tables, but the business was expanding internationally, so there was a need to include other language translations of the Dimensional attributes.

We wanted to do this without having to manually translate the English text attributes that already exist, or new ones that are added or modified over time.  We wanted an automated method that simply “worked”.

Enter Azure Cognitive Services Translator Text API service!

So the purpose of this blog is to outline the code/pattern we used to integrate the Azure Cognitive Services API into SQL SSIS ETL packages.

Continue reading

Database Backup Options for SQL on Azure IaaS

[read this post on Mr. Fox SQL blog]

Recently I had a requirement to collate and briefly compare some of the various methods to perform SQL Server backup for databases deployed onto Azure IaaS machines.  The purpose was to provide a few options to cater for the different types (OLTP, DW, etc) and sizes (small to big) of databases that could be deployed there.

Up front, I am NOT saying that these are the ONLY options to perform standard SQL backups!  I am sure there are others; however, the options below are both supported and well documented, which is pretty important when it comes to something as critical as backups.

So the purpose of this blog is to provide a quick and brief list of various SQL backup methods!
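As one example of the kind of option covered, here is a minimal sketch of SQL Server backup to URL against Azure Blob Storage, a common choice for SQL on Azure IaaS.  The storage account, container, database name and SAS token are placeholder assumptions.

```sql
-- Credential named after the container URL, secured with a SAS token (no leading '?')
CREATE CREDENTIAL [https://<storageaccount>.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '<sas-token>';

-- Standard backup to URL, compressed and checksummed
BACKUP DATABASE [MyDatabase]
TO URL = 'https://<storageaccount>.blob.core.windows.net/sqlbackups/MyDatabase_Full.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;
```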

Continue reading

Tuning Throughput from Azure Event Hub to Azure Stream Analytics

[read this post on Mr. Fox SQL blog]

Recently I had a requirement to load streaming JSON data to provide a data feed for near real-time reporting.  The solution streamed data into an “Ingress” Azure Event Hub, shredded the JSON via Azure Stream Analytics and then pushed subsets of the data as micro-batches (1 sec) into an “Egress” Azure Event Hub (for loading into a stage table in Azure SQL DW).

In Event Hubs and Stream Analytics there are only a few performance levers to help tune a solution like this, or said another way, doing nothing with these levers can affect your ongoing performance!

So this blog is to show the performance differences when using different Azure Event Hub partition configurations and the Azure Stream Analytics PARTITION BY clause.
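To set the scene, here is a minimal sketch of an Azure Stream Analytics query that reads from a partitioned Event Hub input and uses PARTITION BY so the job can process each Event Hub partition in parallel.  The input/output aliases and windowing are illustrative assumptions, not the exact job from the post.

```sql
-- Azure Stream Analytics query (the ASA query language is SQL-like)
SELECT
    PartitionId,
    System.Timestamp() AS WindowEnd,
    COUNT(*)           AS EventCount
INTO
    [egress-eventhub]
FROM
    [ingress-eventhub] PARTITION BY PartitionId
GROUP BY
    PartitionId,
    TumblingWindow(second, 1)
```

The post then compares how varying the Event Hub partition count, and including or omitting the PARTITION BY clause, changes the throughput you actually get.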

Continue reading

Microsoft Ignite US 2017 – Major Azure Announcements

[read this post on Mr. Fox SQL blog]

Microsoft Ignite is probably the biggest technical event that Microsoft hosts yearly, with many major announcements across the entire solutions portfolio, and this year (2017) was certainly no exception!

This year it was held in Orlando, FL over 5 days (25 – 29 Sep) and was attended by more than 30,000 people across the two major events of Ignite and Envision.  The event covers all areas of Microsoft solutions including Azure, Office, Power BI, SQL Server, Windows, Dynamics, etc, etc and is a world of technical goodness!

The announcements across the Azure Cloud space in particular are significant and very exciting, and provide a strong lead as to the direction Microsoft are taking their technologies today – and in the very near future.

I have prepared a summary deck of what I think are the major announcements specifically across the Azure Infrastructure and Data space which are important to be aware of.  There are of course even more announcements than this across the other solution areas I mentioned above that I haven’t covered in this deck.

You can download the Azure Data & Infra announcements deck, Ignite US 2017 Announcements, from the [MY PRESENTATIONS] page here on my blog site – https://mrfoxsql.files.wordpress.com/2017/10/azure_igniteus2017_whatsnew_v0-21.pdf

 

In addition to all the technical goodness, Satya Nadella also released a new book called “Hit Refresh” which outlines the inside story of Microsoft’s own digital transformation.

Hit Refresh is about individual change, about the transformation happening inside of Microsoft and the technology that will soon impact all of our lives—the arrival of the most exciting and disruptive wave of technology humankind has experienced: artificial intelligence, mixed reality, and quantum computing.

You can read about it here – and also grab a copy if interested to learn more – https://news.microsoft.com/hitrefresh/

 

Happy reading!

…AND of course, as I always say, please review and validate this yourself as your required outcomes may vary!


Disclaimer: all content on Mr. Fox SQL blog is subject to the disclaimer found here

Query Azure CosmosDB from a SQL Server Linked Server

[read this post on Mr. Fox SQL blog]

Recently I had a requirement to combine data that I already had in SQL Server (2016) with JSON document data already stored in Azure CosmosDB.  Both databases were operational and continuously accepting data, so I didn’t want to go to the trouble of building a delta load between them; instead I just wanted to be able to query across them directly on demand.

And so – the purpose of this article is to outline the method to connect direct to Azure CosmosDB from SQL Server using a SQL Linked Server.

Finally … SQL & NoSQL … together at last!

For those interested to learn more about Azure CosmosDB, check out my previous blog post here – https://mrfoxsql.wordpress.com/2016/05/11/azure-documentdb-preparing-loading-querying-data/

Or the official documentation here – https://docs.microsoft.com/en-us/azure/cosmos-db/

And so right up front – this solution only works for SQL Server on VM/IaaS – and is not supported for Azure SQL DB (ASDB) – mainly as ASDB doesn’t support SQL Linked Servers! (Damn, they say!)
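As a rough sketch of the pattern (not the exact setup from the full post), the linked server is typically created over an ODBC DSN configured with the Azure Cosmos DB ODBC driver, then queried with OPENQUERY.  The linked server name, DSN, account key and collection/column names below are all placeholder assumptions, and the pass-through query form depends on how the driver maps your JSON documents.

```sql
-- Linked server over an ODBC DSN ('CosmosDbDSN') built with the Cosmos DB ODBC driver
EXEC master.dbo.sp_addlinkedserver
     @server     = N'COSMOSDB',
     @srvproduct = N'',
     @provider   = N'MSDASQL',
     @datasrc    = N'CosmosDbDSN';

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'COSMOSDB',
     @useself     = N'FALSE',
     @rmtuser     = NULL,
     @rmtpassword = N'<cosmosdb-account-key>';

-- Join local SQL Server rows with documents surfaced from the Cosmos DB collection
SELECT  s.CustomerId,
        s.OrderTotal,
        c.LoyaltyTier
FROM    dbo.Sales AS s
JOIN    OPENQUERY(COSMOSDB, 'SELECT CustomerId, LoyaltyTier FROM Customers') AS c
        ON s.CustomerId = c.CustomerId;
```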

Continue reading

Streaming Reporting: SQL Change Data Capture (CDC) to Power BI

[read this post on Mr. Fox SQL blog]

Following on from my previous post about redirecting SQL CDC changes to Azure Event Hub, I have had a few people ask for details/options to stream SQL data into the Power BI API.

Specifically – they were looking for an easy method to leverage the ADD ROWS functionality of the Power BI API so they could push real-time data into a Power BI service dataset.

This method provides the ability to update the Power BI dataset with new rows every few seconds, instead of a Power BI report having to rely on either the Direct Connect or scheduled data refresh capability, which can be very limiting.

If interested in how the SQL CDC and Event Hubs work together, then read here from my previous post – https://mrfoxsql.wordpress.com/2017/07/12/streaming-etl-send-sql-change-data-capture-cdc-to-azure-event-hub/

The purpose of this post is to quickly show how to extend and explore pushing new SQL data rows via Azure Stream Analytics into Power BI.
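To give a flavour of the shape of it, here is one minimal way a Stream Analytics query could roll up the CDC change events into small time windows and push them into a Power BI output (which adds rows to a streaming dataset via the Power BI API).  The input/output aliases and the operation field name are illustrative assumptions; the full post may push the rows in a different shape.

```sql
-- Azure Stream Analytics query with a Power BI dataset as the output
SELECT
    System.Timestamp() AS WindowEnd,
    operation,                    -- e.g. Insert / Update / Delete carried in the CDC payload
    COUNT(*)           AS ChangeCount
INTO
    [powerbi-cdc-output]
FROM
    [eventhub-cdc-input]
GROUP BY
    operation,
    TumblingWindow(second, 5)
```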

And so, let’s get into some CDC to Power BI streaming action!

Continue reading

Streaming ETL: SQL Change Data Capture (CDC) to Azure Event Hub

[read this post on Mr. Fox SQL blog]

I had a recent requirement to capture and stream real-time data changes on several SQL database tables from an on-prem SQL Server to Azure for downstream processing.

Specifically we needed to create a streaming ETL solution that …

  1. Captured intermediate DML operations on tables in an on-prem SQL database
  2. Transmitted the data securely and in real time into Azure
  3. Stored the delta changes as TXT files in Azure Data Lake Store (ADLS)
  4. Visualised the real-time change telemetry on a Power BI dashboard (specifically the number of Inserts, Updates and Deletes over time).

The first part was easy; SQL has a feature called Change Data Capture (CDC) which does an amazing job of tracking DML changes to separate system tables.  If you don’t know about CDC then see here – https://docs.microsoft.com/en-us/sql/relational-databases/track-changes/about-change-data-capture-sql-server
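For anyone who hasn’t used it, this is roughly what the CDC side looks like; a minimal sketch using an illustrative dbo.Orders table rather than the actual tables from the solution.

```sql
-- Enable CDC on the database and on a sample source table
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;

-- Read all captured changes for the current LSN window
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT  __$operation,     -- 1 = delete, 2 = insert, 3/4 = update (before/after image)
        __$start_lsn,
        *
FROM    cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```

Change rows like these are what the solution turns into JSON messages for the Event Hub, as described below.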

The second part wasn’t easy, and after some searching I came across this blog post by Spyros Sakellariadis which gave me inspiration and starter code for my streaming ETL solution.  Excellent post.  See here – https://azure.microsoft.com/en-us/resources/samples/event-hubs-dotnet-import-from-sql/

And so, the final architecture looks something like this…

The solution picks up the SQL data changes from the CDC change tracking system tables, creates JSON messages from the change rows, and then posts the messages to an Azure Event Hub.  Once landed in the Event Hub, an Azure Stream Analytics (ASA) job distributes the changes to the multiple outputs.

What I found pretty cool was that I could transmit SQL delta changes from source to target in as little as 5 seconds end to end!

And so, let’s get into some CDC to Event Hub data streaming action!

Continue reading

Making Phone Calls from Azure Event Hub Messages

[read this post on Mr. Fox SQL blog]

Recently I did a presentation at our local SQL Server User Group (SSUG) on Managing Streaming Data Pipelines Using Azure Data Services, and wanted to build a compelling Azure demo that worked with simple streaming data and, under certain event conditions, would trigger an outbound phone call.

If interested the presentation deck is here – SSUG Melbourne – Building Streaming Data Pipelines Using Azure Cloud Services

The solution had several key components and stages outlined in the architecture below.

 

  1. A mobile phone app which generates JSON events with the X, Y, Z location of the device and G (g-force) detected in the device during movement.
  2. An Azure IoT Hub (AIH) which accepts the JSON events posted from the mobile device
  3. An Azure Stream Analytics (ASA) job that queries the IoT Hub and routes the event data to several outputs, including…
    • Azure Blob Storage (for archive of all event data)
    • Power BI (for a live dashboard of all event data)
    • Azure SQL Database (ASDB) (for tabular storage of all event data)
    • Azure Event Hub + Azure Function (AF) (for queuing events which have a G Force reading greater than 3 and then triggering a phone call back to the original device from which the event originated)

The entire demo solution is actually really interesting (tsk, of course!) and I will blog about other parts of this presentation at some point later.  However, the part of the demo that received the most interest was the external phone call integration with Azure Functions.

To be clear up front – Azure itself does not have native phone capability – so to make outbound phone calls I leverage an external “Twilio” API from within an Azure Function, and “Twilio” connects the outbound call.

And so, let’s see the Twilio phone setup and C# Function code in action!

Continue reading