Our Top Picks, Insights and Key Learnings from the Power BI Summit 2023 – Part Three

Author: Willem Herbst

With this being my first time attending the Power BI Summit, I was unsure exactly what to expect. The agenda was packed with interesting sessions and discussions, and it quickly became apparent that I would struggle to catch all my selected sessions live. The number of speakers and the sheer amount of content was slightly overwhelming! (Fortunately, as with most conventions nowadays, you can watch everything afterwards in your own time.) One valuable session that stood out to me was ‘DataOps 101 – A Better Way to Develop and Deliver Data Analytics’, which I would like to share in this article.

DataOps 101 – A Better Way to Develop and Deliver Data Analytics

The topic of DataOps has gained quite a bit of traction over the past few years, with most data teams iterating on and implementing their own understanding of what DataOps is beyond “just DevOps for data”. As consultants, we are posed this question frequently: “How do we organise our delivery teams and adopt better practices to streamline our data workflows in the BI space?” Most delivery teams are simply looking for a way to improve their delivery of BI products. In this session, presenter John Kerski accurately summarised the risks and issues of BI delivery with the following slide:

DataOps can be adopted as an approach to mitigate these risks and issues. It grew out of three existing practices: DevOps, Agile, and Lean Manufacturing. While I was aware of the DevOps and Agile manifestos, I was happy to learn that there is also a DataOps Manifesto, which consists of eighteen (18) principles. Five (5) of these principles were highlighted during the session:

1. Make it reproducible

Reproducible results are required; therefore, we version everything: data, low-level hardware and software configurations, and the code and configuration specific to each tool in the toolchain. All BI tools struggle with native version control (never mind source control). However, several impressive third-party tools are available to bridge this gap. One of these is pbi-tools, developed to bring source control to the Power BI developer community. The tool is free and open source (MIT License) on GitHub.

2. Quality is paramount

Analytic pipelines should be built on a foundation capable of automated detection of abnormalities and security issues in code, configuration, and data, and should provide continuous feedback to operators so errors are avoided. This step is often overlooked when developers build BI/data solutions. We are accustomed to testing integration points and pipelines but rarely build tests for the quality of the data content itself. To quote John here, “When you find an error, build a test”, and “tests are your safety net”. How often do we develop scripts with tests for data quality? This is something I might put more emphasis on with my clients going forward.
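To make the “when you find an error, build a test” idea concrete, here is a minimal sketch of what a data-quality test might look like. The check names and the sample sales rows are my own illustrations, not from the session; the point is simply that each past data error becomes a repeatable check that runs before the data reaches a report.

```python
# Minimal data-quality checks: each one encodes an error we found once
# and never want to ship to users again. Names and data are illustrative.

def check_no_nulls(rows, column):
    """Return the indexes of rows missing a value in the given column."""
    return [i for i, row in enumerate(rows) if row.get(column) in (None, "")]

def check_in_range(rows, column, low, high):
    """Return the indexes of rows whose numeric value is outside [low, high]."""
    return [i for i, row in enumerate(rows)
            if not (low <= float(row[column]) <= high)]

rows = [
    {"order_id": "1001", "amount": "250.00"},
    {"order_id": "1002", "amount": "-40.00"},   # negative amount: data error
    {"order_id": "",     "amount": "99.99"},    # missing key: data error
]

# An empty result means the check passed; non-empty results point at bad rows.
assert check_no_nulls(rows, "order_id") == [2]
assert check_in_range(rows, "amount", 0, 10_000) == [1]
```

In practice these checks would run automatically in the pipeline, failing the load (or alerting the team) rather than letting bad rows flow downstream.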

3. Orchestrate

The beginning-to-end orchestration of data, tools, code, environments, and the analytic team’s work is a key driver of analytic success. Given the nature of BI workflows, we must remember that we have two primary orchestration points: the data pipelines and the visual analytics sitting downstream. Power BI has started addressing this with the recent introduction of Deployment Pipelines and their integration with the Azure DevOps environment.

4. Monitor quality and performance

Our goal is to continuously monitor performance, security, and quality measures to detect unexpected variation and generate operational statistics. In my experience, the quickest way to lose your users’ trust is for them to find errors in a solution before the delivery team does. Everyone accepts that we all make mistakes occasionally, but we would like to catch those errors before they reach our users/customers. After all, we are in the business of data, so we should be using data (about our data) to drive our operations.
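As a sketch of what “using data about our data” can mean, the snippet below flags a daily load whose row count deviates sharply from recent history, so the delivery team hears about it before the users do. The z-score threshold and the sample row counts are assumptions of mine, not from the session.

```python
# Illustrative operational-statistics monitor: compare today's row count
# against recent history and flag large deviations. Thresholds are assumed.
from statistics import mean, stdev

def is_anomalous(history, today, z_threshold=3.0):
    """True if today's count is more than z_threshold std devs from the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold

history = [10_120, 9_980, 10_250, 10_040, 10_110]  # recent daily row counts

print(is_anomalous(history, 10_090))  # a typical day -> False
print(is_anomalous(history, 2_300))   # a sharp drop worth alerting on -> True
```

A real implementation would feed counts from the pipeline logs and raise an alert instead of printing, but the principle is the same: operational statistics turn monitoring into data the team can act on.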

5. Reduce heroism

As the pace and breadth of the need for analytic insights increase, analytic teams should strive to reduce heroism and create sustainable, scalable teams and processes. Reliance on a single “hero” to deliver data solutions is unfortunately common, especially in smaller organisations. Not only does it put an unhealthy amount of stress and pressure on that individual, but it also poses the risk that delivery stalls in their absence.

I appreciated this session because of its focus on the people and process aspects, rather than the technology aspect that usually dominates discussions of Business Intelligence. It reminded me again of the importance of designing and articulating foundational frameworks when planning to implement a platform like Power BI.

 

Want to hear more about the Power BI Summit?

Read Part One of this three-part series

Read Part Two of this three-part series

 
