Converting a Flat Table to a Good Data Model in Power Query | by Salvatore Cagliari | Dec, 2023



When loading a wide Excel table into Power BI, we end up with a suboptimal data model. What can we do to create a good data model? What even is a “good” data model? Let’s dig into it.



Just another day at the office: a client called me to fix something in his Power BI report. I looked at the data and found one wide table with 30 or 40 columns, sometimes more.

I asked the natural question: “What is the source for this table?”
The answer was: “Excel; what else?”

“Of course,” I thought.

My next question: “Can we build a good data model out of it?”

My client: “Why?”

And here we are.

Ideally, I would import the source file into a relational database and convert the data into a proper data model there.
Unfortunately, only in exceptional cases are my clients willing to pay for something that, at first glance, doesn’t benefit them.

But why do I want a good data model? A flat table works well, doesn’t it?

Again, at first glance, yes.
There is no problem with a few thousand rows.

But as soon as the amount of data increases, problems can build up.

Here is the short version of an answer to the “Why?” question:

There are many reasons why I want a “good data model”.

The two most important are efficiency and usability.

When we separate the different types of data into separate tables and connect them with relationships, Power BI can work more efficiently.

Moreover, it is possible to reduce the size of the Power BI file by removing redundancies.
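To see where the redundancy comes from, here is a minimal sketch in pandas (the real work would happen in Power Query, and the table and column names are hypothetical): a flat sales table repeats every customer attribute on every row, while a dimension table stores each customer once and the fact table keeps only a small key.

```python
import pandas as pd

# Hypothetical flat table: customer attributes repeated on every sales row
flat = pd.DataFrame({
    "OrderID":      [1, 2, 3, 4],
    "CustomerName": ["Contoso", "Contoso", "Adventure Works", "Adventure Works"],
    "CustomerCity": ["Zurich", "Zurich", "Bern", "Bern"],
    "Amount":       [100.0, 250.0, 80.0, 120.0],
})

# Dimension table: one row per distinct customer, plus a surrogate key
dim_customer = (
    flat[["CustomerName", "CustomerCity"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["CustomerKey"] = dim_customer.index + 1

# Fact table: keep only the surrogate key and the measures
fact_sales = flat.merge(dim_customer, on=["CustomerName", "CustomerCity"])[
    ["OrderID", "CustomerKey", "Amount"]
]

print(dim_customer)  # 2 rows instead of 4 repeated customer entries
print(fact_sales)    # 4 rows, but only a key column plus the numbers
```

The text columns, which compress poorly, now appear once per customer instead of once per sale; the fact table carries only numbers. This is exactly the trade Power BI’s engine is built for.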

I strongly recommend reading the SQLBI article on this topic. You can find the link in the References section below to get a thorough answer to this question.
