
Data quality at the heart of infrastructure modernisation

All major IT trends, whether digital transformation, cloud, AI, IoT, blockchain or cybersecurity, share one thing: data.

And all share the same requirement: modernisation of the infrastructures supporting them. However, not all data provide value or lend themselves to analysis, to say nothing of the time wasted simply making certain data usable.

This mass of information has to be processed before we can even begin to consider efficiency and decision-making.

Data is at the heart of business today and helps it grow. Some even call data the new “black gold”. However, its value also depends on our ability to actually use it. A significant share of technology investments (in storage, BI, Big Data and analytics, AI, including Machine Learning and Deep Learning, IoT, etc.) goes toward acquiring the means to exploit these data and increase their value. More strategically still, they need to support decision-making.

For there’s a flip side to this reality: not every piece of data can be used. Sixty-six percent of IT departments (ISD)(1) report that a mere half of their company’s resources is digitised! Worse, much of the data collected is unusable, and such data can pollute the work and analyses that rely on it, jeopardising automation and decision-making.

Too much time spent on cleaning data

What do data managers, analysts and data scientists spend a good part of their time on? We would expect the answer to be data modelling and analysis. In reality, these tasks represent just 12% and 14% of their time respectively. The bulk of their work is dedicated to cleaning, preparing and managing data (29%), production and troubleshooting (20%), and integration, ETL (Extract, Transform, Load) and constructing pipelines (18%)(2).
Studies have also shown that 84% of companies have already used erroneous data(3), with 43% of them reporting that using unreliable data has had a direct impact on sales. Massive data fragmentation disrupts the work of IT teams, which spend between 10% and 50% of their time managing data and secondary applications in 73% of French companies(4).
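The cleaning and preparation work that consumes nearly a third of these teams' time often comes down to mundane steps such as deduplicating records and discarding incomplete ones. As a minimal sketch (the field names "id" and "email" are purely illustrative, not taken from any of the cited studies):

```python
def clean_records(records, required=("id", "email")):
    """Drop records missing a required field, then deduplicate on id."""
    seen = set()
    cleaned = []
    for rec in records:
        # Reject records with missing or empty required fields.
        if any(not rec.get(field) for field in required):
            continue
        # Keep only the first occurrence of each id.
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        cleaned.append(rec)
    return cleaned

raw = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "a@example.com"},   # duplicate
    {"id": 2, "email": ""},                # missing field
    {"id": 3, "email": "c@example.com"},
]
print(clean_records(raw))
```

Real pipelines add many more rules (format validation, referential checks, freshness), which is precisely why this stage dominates practitioners' schedules.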

Data quality is essential

Using incorrect data eats into professionals’ productive time, slows down analytical processes and weakens results. Companies need to proactively clean up unnecessary data at the source (in backups, for example) and establish review and approval processes. They also need risk analysis that detects personal and sensitive data in unstructured data stores.
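The detection of personal and sensitive data in unstructured content mentioned above can be approached, in its simplest form, with pattern matching. A minimal sketch, assuming only two illustrative PII categories (email addresses and French-style phone numbers); production tools use far richer rule sets and often machine learning:

```python
import re

# Hypothetical patterns for two common PII categories.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b0\d(?:[ .-]?\d{2}){4}\b"),  # e.g. 06 12 34 56 78
}

def find_pii(text):
    """Return a dict mapping PII type to the matches found in free text."""
    hits = {}
    for kind, pattern in PII_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[kind] = found
    return hits

sample = "Contact: jane.doe@example.com, tel 06 12 34 56 78"
print(find_pii(sample))
```

Flagging such matches in backups and shared drives is one concrete way to feed the review and approval processes the paragraph describes.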
To help business units and IT departments improve and accelerate analysis and decision-making, we need to recognise that the answer to these problems lies in the quality of the data used. Companies have to redirect the resources and spending devoted to data management, automation, selection and cleaning towards more strategic IT actions that deliver qualified, up-to-date data, which can then have a positive impact on productivity and results.

Notes:

1 - Report “The Future of Enterprise Data: Democratized and Optimized”, ASG Technologies.
2 - Study “The Definitive Data Operation Report”, Nexla.
3 - Study “Les secrets du Tag Management”, Netvigie.
4 - Vanson Bourne study for Cohesity.