A New Episode of the Series Bad Practices In Power BI

I have been in this situation a hundred times. A client meeting. I am invited as the Power BI expert to discuss the client’s needs and explain how Power BI is the right tool. And it’s always the same.
“Meet Gil, our data visualization expert and a Microsoft MVP”, announces my colleague, who owns the relationship with the client and oversees a dozen other non-analytics projects with this account. I don’t expect him to recognize the subtle nuances here. But then the client speaks. “We really like Power BI as a visualization tool and would like to replace our current XYZ tools”. The discussion goes on, but the misconception is almost always the same. Power BI is still considered a front-end data visualization tool. As a result, depending on the phase of the project, it may already be too late to use Power BI beyond the front-end visualization gig. How can you harness Power BI’s true powers when an entire organization has already developed a data warehouse with complex ETL and DevOps processes to provide the data for a visualization layer?
When so much time and money are already invested in the traditional development of backend business logic, how can you fight the cognitive dissonance and tell your client or project lead that Power BI can be better utilized? Did you know that Power BI can pave the way in any analytics project, reducing ETL development time, increasing data integrity, improving QA effectiveness and enabling an agile development culture where your end-users can start gaining insight and providing feedback from the first sprint, and continuously in each sprint thereafter?
So, today, in a new episode of “Bad Practices in Power BI”, we will focus on one of the worst bad practices of all – treating Power BI as a visualization tool.
If you are a project lead or data architect who gets to decide how to implement a new BI solution, your bad practice starts during the planning phase, when you choose to involve the Power BI folks at the end of the project. Your Power BI experts should start on day 1.

IT architects, decision-makers and many Power BI practitioners are completely unaware of the data transformation (Power Query, dataflows, M) and modeling (Tabular, DAX) capabilities of Power BI and their key role in creating a game-changing data culture that can truly combine self-service and enterprise analytics. For most of them, Power BI is a Tableau-compete tool with very attractive interactive visualizations and pie charts that can be presented to their executives. As a result, they approach Power BI practitioners in the later stages of the project, and in many cases they hire Power BI practitioners who lack skills in data preparation and modeling.

But we cannot blame IT and data architects for the lack of awareness. The lack of awareness starts with Power BI practitioners themselves. Unaware of Power BI’s potential to turn them into full-stack BI demigods, these practitioners keep doing what they know best: manual data crunching and the development of mesmerizing pixel-perfect visualizations.
Are you such a practitioner? You almost always rely on others to provide the data in the desired format. You may have stumbled upon the Power Query Editor and know how to remove columns and perform basic transformations, but nothing beyond that.
You focus on real relationships with your stakeholders and ignore model relationships. “Bring me the data in a single flat table”, you often say to your data pal. “Don’t you need it in a Star Schema?”, he may ask (and you are lucky to have him if he knows that a Star Schema can help you), but you will probably reply, “A Star Wars schema? I prefer my lightsaber visualizations”.
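For readers who have only heard the joke, here is a minimal sketch of what a star schema buys you in the model: one central fact table related to small dimension tables. The table and column names below (Sales, Product, 'Date') are hypothetical placeholders, not taken from any real model discussed here:

```dax
-- Hypothetical star schema relationships:
--   Sales[ProductKey] -> Product[ProductKey]   (fact to dimension)
--   Sales[DateKey]    -> 'Date'[DateKey]       (fact to dimension)

-- One explicit measure defined once on the fact table...
Total Sales = SUM ( Sales[Amount] )

-- ...can be sliced by Product[Category] or 'Date'[Year] with no extra
-- formulas, and reused inside other measures:
Sales LY =
CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```

With a single flat table, by contrast, you tend to duplicate this kind of logic in every report.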
You may know that there is a DAX language with Excel-like functions such as SUM and AVERAGE, but you always prefer to drag and drop numerical columns to the values section of your visuals (when you do it you create implicit measures – a bad practice by itself – but you may not be ready to appreciate this nuance at this point in your journey with Power BI). Instead, you will frown and redirect me to the beautiful and colorful reports you have made.
You are proud of the visualizations you have created. Your boss is also very proud. As you read these words, he may be showing off your visuals to his boss and stakeholders and may even take credit for your work, but you don’t care. Your job is secured for a decade. Life is beautiful. So are your reports.

I agree that Power Query and modeling may be perceived as topics that should be practiced by data engineers and not by data visualization experts. They are not meant for everyone. So, if you are a data visualization expert, it is OK to decide to keep your focus on visualizations and avoid learning these tools. But know this – without a strong data modeler on your team, your success may be short-lived.

Do you know what the most important traits of a good modeler are? I suspect they are perseverance and humility. Why? Because unlike other programming languages, knowing the syntax of DAX and M is not sufficient to master them. It takes practice and a long period of trial and error. During this journey, many practitioners reach a certain level and are not aware of the new pitfalls that await them. As they make a bigger impact in their organization, they become mission-critical in their projects. Filled with a sense of accomplishment, they may stop their learning journey, knowing nothing about the opportunities that still await them.

This is where we start seeing Power BI reports that are overly complicated. At the start, the heroes behind these reports are admired by their team members and managers. With so many sophisticated calculated columns and measures in DAX that are barely readable, the authors of these reports become the envy of their peers. But over time, these Power BI reports fail to live up to their promise and eventually become unsustainable. If you are the developer of an overly complicated report and you refuse to acknowledge what you don’t know, your short-term success will be defined by how fast you move on to a new project. If you are never assessed by the sustainability or performance of your work, you will never realize you are applying the next bad practice.

You can see that there is a recurring theme here – the lack of awareness that Power BI is a full-stack BI platform, and the short-term focus on visualization requirements as the only driver of success. This is, in my mind, the root cause of many bad practices in Power BI.
So, how did we get here? I think it’s all related to one important element in Power BI – its ability to enable any user to start fast. Power BI was initially designed to be a self-service analytics tool with a very low barrier to entry for beginners. If you have experience with PowerPoint and PivotTables in Excel, you will find it easy to drag and drop fields into visuals and start delivering amazing reports. If you know Excel formulas, you may think you know enough DAX to create calculated columns and simple measures. So, why bother to do more?
Another source of the lack of awareness of Power BI’s true data preparation and modeling capabilities lies in the fact that other BI tools simply lack them. Yes, as a BI developer in IT you may be able to do some ETL and modeling in other tools. But in Power BI you can do it using a unique technology that allows your work to be easily migrated across different products such as Excel, Power BI, Power BI dataflows, SQL Server Analysis Services, PowerApps, Azure Data Factory, and SSIS.
Now, if you were to ask me what single missing behavior in Power BI could prevent users from treating it as a data visualization tool, I would give you a simple answer. It is the default Load button, which should be replaced by Transform Data to expose new users to the data preparation elements of Power BI.

Microsoft recently changed the name of the button from Edit to Transform Data to help new users discover the opportunity they have and launch the Power Query Editor, enabling them to start preparing their data with a few clicks in the UI. I hope this change will increase the use of Power Query, but I would go beyond it and be more aggressive here. I would make Transform Data the default button and Load the secondary one. I would even rename Transform Data to “Transform now, my data is messy” and Load to “Skip data transformation”.
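To make the Transform Data path concrete, here is a minimal Power Query (M) sketch of the kind of preparation that button opens up. The file path, sheet name and column names below are hypothetical placeholders:

```m
let
    // Read a worksheet from a (hypothetical) Excel workbook
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],

    // Use the first row as column headers
    Promoted = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),

    // Set proper data types instead of leaving everything as text
    Typed = Table.TransformColumnTypes(
        Promoted,
        {{"OrderDate", type date}, {"Amount", type number}}
    ),

    // Drop columns the model does not need
    Result = Table.RemoveColumns(Typed, {"InternalNotes"})
in
    Result
```

Every step here can also be produced by clicks in the Power Query Editor UI; the M code is simply what those clicks generate behind the scenes.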
Please share your experience in the comments below. Did you encounter difficult situations following the misconception of Power BI as a data visualization tool? Do you think Power Query Editor should be more visible in the UI?
In one of the next episodes of “Bad Practices in Power BI”, we will demonstrate the overarching bad practice of “not knowing what you don’t know” through a sequence of badly implemented import steps of a simple Excel workbook. Stay tuned, and download the poster that summarizes this article here.
Thanks Gil for sharing this.
Just curious to hear your thoughts on why you would consider this a bad practice:
“but you always prefer to drag and drop numerical columns to the values section of your visuals (When you do it you create implicit measures – and this is a bad practice by itself –”
Cheers
Thank you Alaeddin. The main reason for calling it a bad practice is that your DAX becomes more sustainable with explicit measures, which are reusable. In addition, the Analyze in Excel add-in cannot detect implicit measures, so you end up with plain columns that cannot be dragged and dropped to values in PivotTables.
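To illustrate the difference with hypothetical names: dragging the numeric column Sales[Amount] into a visual generates an implicit measure that no other calculation can reference, while an explicit measure is a reusable building block:

```dax
-- Explicit measure: reusable, and visible to Analyze in Excel
Total Amount = SUM ( Sales[Amount] )

-- Other explicit measures can build on it
Average Order Value =
DIVIDE ( [Total Amount], DISTINCTCOUNT ( Sales[OrderID] ) )
```

Defining [Total Amount] once means a later change to its logic propagates to every measure and visual that uses it.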
What an inspirational piece! I have come up against the same problems in a job search in the UK, where Power BI is not enough. Recruiters want the whole DW/SQL/SSRS/SSIS/SSAS stack when this is all taken care of within the Power BI M query engine! Having spent considerable time doing the ETL processes all in Power BI, I have not had the need to develop skills in these areas. My data lake is SQL, SharePoint, Excel files, Dynamics 365; my data marts are Power BI datasets. One set of skills, one place to look when debugging and one set of ‘curated data models’! Connectivity to other data and Power BI is all I have needed!
Thank you Steve
Excellent post! Can’t agree more. I have been waiting for a post like this for a long time.
Indeed, I think many people from I.T. have no idea what Power BI is capable of (at least that is the case in my company). Normally they are the ones who keep saying Power BI is a visualization tool.
I saw a Power BI report created by an internal SQL specialist, and I couldn’t have expected the fat table behind the scenes instead of a star schema…
One other thing I try to / want to change is the term “report”. I intentionally state Tool instead of Report in most cases. Come on, are we still expecting a deck of static reports using Power BI? But you know, it’s not about the tool. It’s about the people. It’s behavioral change. It’s a long road. Keep fighting!
Looking forward to your next post.
Thank you MF for sharing.
Good article! I also don’t like when someone thinks that I mostly create visualizations as a Power BI developer. I like to call it a data modeling tool to implicate the complexity.
Thank you Gil for this very interesting and inspiring post. I totally agree with you that Power BI was designed to be a self-service analytics tool for non-BI practitioners. One of the features that totally illustrates that is “Auto Date/Time”, which is set to “Enabled” by default, and in my opinion, anyone who has not disabled this feature still has a long way to go.
Thank you Mehdi. Happy 2020!
Really great article; it highlights that the emphasis in Power BI is foremost on good data modelling, unlike Tableau, which has front-end capabilities to overcome the lack of a sound data structure.
Thank you Chris
For me Power BI is first and foremost a modeling tool. With a great model you can allow your clients to create the reports themselves, but it starts with a well-structured model, where all your measures are kept in a separate folder, not within the fact tables themselves.
You have said it all, Gil. I always say there are two levels of power in Power BI. The first is the ability for anyone to start off and create some beautiful (but always wrong) visualizations. The second, the real one for me, is the ability to transform, model, and use DAX to create advanced reports.
(Copy paste from my reply on Twitter)
I do not agree at all with the sentence “…when an entire organization already has developed a data warehouse with complex ETL and DevOps processes …”. What’s wrong with that?
1. Power BI doesn’t have DevOps (currently).
2. Power BI can’t track history. A DWH can.
3. Complex ETL doesn’t belong in Power Query (maybe in data flows, but not in PBI Desktop).
My advice is still building a DWH. I’d say every time, but that might not be realistic. Almost every time. Okay, rant over.
The rest of the article is spot on. I’ve seen numerous occasions where a strong model saved a lot of headaches, especially in the DAX.
Thank you Koen for sharing. There is nothing wrong with building a modern data warehouse, but there were two implicit points to make here. 1- Power BI developers could help build that data warehouse by using Power BI as a prototyping tool. You start with it and then replace the relevant elements.
2- In some projects, you already have a data warehouse, but additional data and ETL layers are developed to serve the new analytics scenarios. This is where you have more options for what should be developed in Power BI and what should be developed outside of it. But I always prefer to start with Power BI to be agile, and to replace the necessary elements over time.
Very good!
Excellent article Gil. I have just returned from delivering a 3 day Power BI and DAX course to finance and IT professionals at a very large international NGO. We only delivered the training after spending nearly 6 months getting their data model right.
If you get the model right for Power BI, you don’t need complex DAX formulae. 80% of your work should be modelling, 20% visualisations.
Within the modelling time you should spend time and effort on governance, which is critical to the successful roll-out of Power BI in a large organisation. The allocation of roles and responsibilities is so important. You can’t just have anyone preparing and publishing data without proper training, oversight and management control.
Thanks again for a great article!
Angus
Oxford Power BI
Thank you Angus. I am curious to learn what ROI you observed for your course’s audience.
Good stuff Gil, you should set up a Power BI Ideas entry – “Power Query default should be Transform, not Load” – if there isn’t one already.
Thank you Win
One of the better articles on Power BI in a long time.
It’s never the visualisation but always the underlying data.
Sooner or later visuals are debunked if the data is not optimal. Modelling and transformation are truly the heroes of Power BI.
Good read
Thank you Anderson
The “drag and drop” elements of Power BI are heavily promoted… 5 minutes to wow… however, the work that goes into creating a durable model, and then ensuring that work gets results, seems to be hit and miss.
When I hear someone say, “yeah, we tried Power BI, it just never got off the ground”, I just imagine them downloading the product to try it, without being aware of its full capability or the investment required.
I wasn’t wowed by drag and drop; I was challenged with learning DAX and modeling… I need to be careful not to fall into over-complexity.
Looking forward to seeing the follow-up on “knowing what you don’t know”. Thanks Gil!
Great article … For Power BI projects I always budget 20% front end / 80% data model and Power Query (depending on whether there is a DWH star schema, Power Query can be faster). My wish for 2020 is support in Power Query to save / export datasets back to the data source … so SSIS is not needed anymore (I know there is the R ODBC export, but it should be supported natively in Power Query)!
Happy PowerBI for 2020 !
Thank you Marco
Brilliantly put, Gil! Proper star schema modelling, data transformation and DAX construction are overlooked in favor of quick viz, causing subsequent restrictions including erroneous aggregations (especially syncing sub-totals with line items in tables/matrices) and overall project cramp.
As you agree with @Ko_Ver, the DWH setup is ideal, although not all SMEs can afford that before the project pays off, so sometimes I build SQL backend flattened OLAP tables if there is no DWH. Power Query / M can be used to do the same thing, as you note – very handy, and hopefully more integrated with Power Automate etc. in the future!
Great stuff Gil! Thanks.
– I now have the courage to respond to postings stating that Power BI is an ETL tool. At least it has the ET parts. It doesn’t “L”oad out to other destinations, does it? I also see postings saying they need ETL application experience and listing tools without including Power BI. Then they list some “visualization tools” you need experience on and those often do include Power BI.
– Articles like “Top 10 ETL Tools….” never include Power BI.
– Microsoft needs to get their marketing team on this. I did several searches for “…Power BI an ETL tool?” and the like, but found nothing related until about the fourth search, when I found your fine article. I’ll keep watching!
Paul
Thank you Paul for the feedback and thoughts. I totally agree. Microsoft invested a lot of effort in Power Query and introduced its ETL in dataflows in Power BI (to be loaded into the model or into CDM folders on Azure Data Lake Gen2) and recently in Azure Data Factory and Power Platform dataflows (where data is loaded to CDS or CDM folders on Azure Data Lake). In the early days of dataflows, they used the term “self-service ETL”. I am sure the ETL part will get more traction from IT this year.
I know Koen mentioned all the great properties of having a data warehouse. But, if I’m not mistaken, I recall seeing in the last 6 months other Power BI commentators talking about Power BI dataflows and using the Power Platform as a whole as a way to create a data warehouse. Which in my mind means Power BI could become a tool for creating a data warehouse.
Your choice of names for the Transform and Load buttons is perfect. I completely agree with your summary, as someone who is making the journey with the help of Power Query Academy and SQLBI. I would like to say I have made the journey, but it’s a long, enjoyable, continuously evolving experience in learning Power BI.
Thank you Gil for this article.
I have to say, though, that besides the fact that prototyping is good with Power BI, I don’t fully agree that it should be used as a full BI solution when it comes to an enterprise-level BI system.
It really depends on the client’s needs, but I think that in an enterprise-level BI architecture with complex BI processes, Power BI should be used as a visualization tool only, with all intelligence in SSAS cubes and SQL data warehouses.
For ad-hoc and specific personal analytical reports, it does make sense to use it as a full all-in-one tool (Excel-style). But for mass reporting and read-only standard analytical reports, data should be prepared beforehand.
The main reasons for why I say this are:
1) Reports containing tables with multiple steps and big amounts of data in Power Query become very slow.
2) More importantly, the sustainability of such an architecture is not great, as it removes the modular design of the BI system: when an error occurs, it can block the whole report, whereas in a modular design a measure, for example, can be incorrect while the other data remains available.
So to conclude, I totally agree that Power BI isn’t being used to its full potential but shouldn’t become the only tool used in a BI system.
Hi Rami, thank you for the feedback. I agree with you that it should not replace a full enterprise architecture, but Power Query (and especially dataflows) can be incorporated as a low-code ETL that can spearhead any new development. You can move faster into an end-to-end solution at “beta”-level maturity, perhaps with only a subset of your data (due to the slower processing), and gradually move pieces into enterprise-grade ETL. When you are done, you gain two key advantages: 1- The Power BI ETL helped you deliver the dashboards faster. 2- The dataflows can stay as your data quality infrastructure, even after you have moved everything into an enterprise-grade ETL.
I disagree with some of your considerations. If the ETL is well done by a strong IT backend team, as is usually achieved in big companies, and from there the data is available in a DW, there is no problem with Power BI being responsible only for showing and analyzing the data. There’s no point in Power BI having to do everything just so that people value it. If a company’s IT is strong and contributes a properly done ETL, you save time.
I don’t think we are in disagreement. The question is what happens when the ETL is weak, or when the ETL is strong but overwhelmed by many change requests from the business. This is where it makes sense to have a low-code ETL that is included as an early-stage ETL before IT takes over and migrates the logic.
Power BI tries too hard to do everything, but can barely do two things at once.
Its desktop app is slow and freezes often.
It is incapable of linking more than 3 tables (while doing anything else)
It tries to provide an interface for editing in the website, but disables measures there, so it has only basic capability.
DB + “Power Query” + DAX = Get ready for bad data.
“Show items with no data” – Shouldn’t be required on tables, but even that is flakey.
RLS – Doesn’t even work as expected, it hides row
Thank you Tom for sharing. I have a different experience working with Power BI. You may be working with a specific data source that triggers these issues (is it import mode?), or perhaps a low-spec PC for Power BI Desktop?