Azure Service Logging in the Modern Data Warehouse


The “modern data platform” architecture is becoming increasingly popular as organisations shift towards identifying, collecting and centralising their data assets and embracing a “data-driven culture”.

Microsoft Azure has a suite of best-of-breed, PaaS-based services which can be plugged together by organisations wishing to create large-scale Data Lake / Data Warehouse platforms to host their critical corporate data.

When working with customers going down the Modern Data Platform path, I often hear very similar questions:

  • What is the most suitable and scalable architecture for my use case?
  • How should I logically structure my Data Lake or Data Warehouse?
  • What is the most efficient ETL/ELT tool to use?
  • How do I manage batch and streaming data simultaneously?
  • …etc

While these are all very valid questions, sorry, but that’s not what this blog is about! (One for another blog, perhaps?)

In my view, what often doesn’t get enough attention up front are the critical aspects of monitoring, auditing and availability. Thankfully, these are generally not too difficult to plug in at any point in the delivery cycle, but as with most things in the cloud, there are just so many different options to consider!

So the purpose of this blog is to focus on the key areas of Azure service monitoring and auditing for the Azure Modern Data Platform architecture.
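
As a small illustration of how this kind of logging can be plugged in after the fact, the sketch below uses the Azure SDK for Python to create a diagnostic setting that routes a resource’s platform logs and metrics to a Log Analytics workspace. The resource IDs, the setting name and the log category are placeholders chosen for illustration only; the exact log categories available depend on which Azure service you are monitoring.

```python
# Minimal sketch: route an Azure resource's platform logs and metrics to a
# Log Analytics workspace via a diagnostic setting.
# Requires the azure-identity and azure-mgmt-monitor packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource,
    LogSettings,
    MetricSettings,
)

subscription_id = "<subscription-id>"                            # placeholder
resource_id = "<full-resource-id-of-the-paas-service>"           # placeholder
workspace_id = "<full-resource-id-of-log-analytics-workspace>"   # placeholder

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

client.diagnostic_settings.create_or_update(
    resource_uri=resource_id,
    name="send-to-log-analytics",  # illustrative setting name
    parameters=DiagnosticSettingsResource(
        workspace_id=workspace_id,
        # Log categories vary per service; "PipelineRuns" is an Azure Data
        # Factory category and is used here purely as an example.
        logs=[LogSettings(category="PipelineRuns", enabled=True)],
        metrics=[MetricSettings(category="AllMetrics", enabled=True)],
    ),
)
```

The same pattern (or its equivalent in ARM/Bicep templates or the Azure CLI) can be repeated for each PaaS service in the platform, which is what makes it practical to add this kind of logging relatively late in a delivery.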
