Many companies struggle with scattered data: information sitting in different systems, countries, and formats. This makes decision-making slow, reports unreliable, and prevents companies from getting real value from AI solutions.
Jaakko Mattila, a data management advisor at Data Design, has worked on many projects where companies build a shared foundation for using data. A nearly year-long change project with a global, publicly traded company showed how systematically improving data management can lead to better-quality decisions and more transparency in business operations.
Who are you and what do you do?
I am Jaakko Mattila, a data management advisor at Data Design. I work on client projects, helping companies use their business data more effectively and improve decision-making with data.
You have extensive experience in data management projects. Is there one that stands out to you?
One of the most memorable was a nearly year-long development project for a global, publicly traded company. The goal was to improve the organization's ability to use data for business leadership. The project covered the whole journey, from planning and problem definition to putting the change into practice.
Why did this project stand out?
In many consulting projects, things stop at the planning stage, and the client is left to make the change happen on their own. In this case, I was involved in the entire journey, from definition to implementation. Together with the client, we defined the most important data processes and roles and put a new operating model into use. This improved the reliability and usability of data for AI and reporting.
What was the project about?
The company had expanded quickly through acquisitions in multiple countries, but data management hadn't kept up with the growth. Units in different countries had their own ways of handling data, and there were no shared rules for things like product information management, reporting, or financial tracking.
In practice, it was hard to track product-specific profitability, and management reports couldn't be fully trusted. At an operational level, this meant double work, fixing errors, and a lack of trust in the data, which decreased the entire organization's efficiency.
Additionally, the company wanted to use AI features, but the poor quality and inconsistency of the data weakened the results of the AI models, so they didn't deliver the expected benefits.
What solution did you decide on?
The solution was to standardize data management practices and establish shared data standards. After identifying the data areas critical to the business, we decided to focus on those first. A central data team was set up to develop and implement shared practices across the different countries.
The team created a data governance model that defined roles, responsibilities, and processes for handling data. At the same time, they created a shared list of development tasks to improve data quality and usability in a controlled and measurable way.
How was the change received within the organization?
As with many changes, the initial phase focused on understanding the organization's challenges and how they affected daily work. Once the situation became clear and the first results started to show, with more accurate reporting and improved data reliability, the organization's attitude quickly turned positive and committed.
Communicating successes widely and reporting them to management strengthened trust in the project. The change was seen as producing real results and improving operations permanently.
What challenges came up in the project?
Because it was a global project, the biggest challenges were the different working methods, cultures, and goals across countries. Leading the change was key; we had to get people on board and help them understand why shared ways of working were necessary.
It wasn't just about the technical execution but mainly about involving people and bringing teams together. One of the most important tasks was to build trust and a shared understanding of why common operating models are needed and how they benefit everyone.
Other interesting observations?
At the beginning, the problems seemed huge: data quality, system differences, and scattered processes appeared overwhelming. But by digging deeper, we found that a small number of problems caused most of the consequences.
A key realization was that we didn't have to solve everything at once. Instead, we should focus on the things with the biggest impact. When we found the "20% of problems that caused 80% of the harm," things started to resolve quickly.
Could the same solution concept be applied elsewhere?
Absolutely. The most important lesson here was that making the change happen is the crucial factor. There are many good plans and roadmaps in the world, but real value only comes when they are put into practice and people are included in the change.
What was the best part?
The best part was seeing the change happen concretely: things didn't just look better on paper, they actually changed in practice. At the same time, we could use metrics to show that data quality improved, reporting became more precise, and business decision-making got easier.
It was also rewarding to see how the client was able to report the project's results to their management and concretely show how the change improved the business. That made the project truly meaningful from the perspective of both data and the entire organization's way of working.