Corporate Finance, DeFi, Blockchain, Web3 News

Data quality management is vital for business process improvements

By Gary Allemann, MD at Master Data Management


Data quality management has in the past been viewed as the sole domain of the IT department. However, a recent Forrester Research report titled “Trends in Data Quality and Business Process Alignment” shows that quality data is vital for supporting business process improvements. Process transformation and optimisation are worthless if core process improvement is not founded on trusted, high-quality data. As a result, managing data quality has become a matter of strategic organisational importance, requiring collaboration between business and IT stakeholders.

The Forrester report highlights that data quality initiatives have graduated from tactical projects to more strategic initiatives with senior sponsorship and prioritisation. However, as data quality maturity levels and internal expertise remain low in most companies, investments must be made in data quality in order to positively impact operational processes, customer relationship management and other core functions.

The move towards improved data quality management is a growing trend within large organisations around the world, for reasons that depend on the nature of the organisation. In highly regulated industries such as financial services, the main driver is often the need for compliance and risk management. In the commercial sector, on the other hand, improved business performance and efficiency, coupled with reduced costs, are common drivers for data quality. Whatever the industry an organisation operates in, there are several common underlying needs that result in investments in data quality management systems and business process alignment. These include increasing revenue through improved direct marketing and account management; reducing costs by improving operational efficiency; and mitigating and controlling both regulatory and financial risk.

However, in order for data quality management to be effective, the ever-present disconnect between IT and business, in this case between data and business process transformation, needs to be addressed. It is important to remember that there is a symbiotic relationship between data quality and business processes, and each must work with the other in order to deliver. Business processes must have trusted, quality data in order to be effective, and data quality initiatives will fail to deliver value if the data is not supporting business processes. Data quality impacts the entire organisation, and is critical for optimising not only business processes, but business decisions and customer interactions as well.

One of the biggest challenges organisations face is how to correctly scope and prioritise data quality investments and resources. In order to do this, it is vital to identify the areas within an organisation that are most affected by data quality. These differ depending on the individual organisation. According to the Forrester report, 55% of respondents shared that data quality was a key dependency for improved operational efficiency. Other areas identified as being heavily impacted by poor data quality include: the customer experience, customer relationship management and product management.

Once areas that are impacted have been identified, and the data quality process has been scoped and prioritised, the next challenge is to develop a multi-dimensional people, process and technology data management approach to address individual business challenges. There are a number of different steps that can be taken to achieve this, depending on the nature of the organisation and its specific needs.

Firstly, business process and data management initiatives should be aligned, since business process transformation and optimisation efforts require trusted data, and data cannot provide value if it is not delivered in context for business users. Secondly, it is vital to formalise an enterprise data governance programme to define data quality standards and processes, because it is impossible to achieve the desired levels of data quality if you cannot define what data quality means to the business. The data governance programme will lay out the policies, business rules and standards that need to be embraced across the data lifecycle, from capture to retirement. As part of this it is also important to define roles, responsibilities and processes to mitigate data quality issues, as well as to appoint responsible persons. These best practices will stand the organisation in good stead for ensuring the success of data quality initiatives.
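To make the idea of “defining what data quality means to the business” concrete, the governance rules described above can be sketched in code as a set of field-level standards, each pairing a validation check with a plain-language description. This is a minimal illustration, not any particular product: the record model, field names and rules are all hypothetical.

```python
import re

# Illustrative governance rules: each field maps to a validation
# pattern and a human-readable statement of the standard it enforces.
RULES = {
    "email": (re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
              "must be a valid email address"),
    "country": (re.compile(r"^[A-Z]{2}$"),
                "must be an ISO 3166-1 alpha-2 code"),
}

def check_record(record):
    """Return a list of rule violations for one record (a dict)."""
    violations = []
    for field, (pattern, standard) in RULES.items():
        value = record.get(field, "")
        if not pattern.match(value):
            violations.append(f"{field}: {standard}")
    return violations
```

A clean record yields an empty list, while `check_record({"email": "bad", "country": "South Africa"})` reports both fields, giving data stewards an auditable link between each failure and the written standard it breaches.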

The next step is to deploy data quality cleansing and validation capabilities in the form of data quality software. While best practices are important as enablers for achieving high levels of data quality in the future, the data that is currently contained within existing systems must also be addressed. These tools enable organisations to automate and implement batch and transactional data cleansing and validation capabilities. This in turn ensures that critical data is standardised, cleansed, validated, verified, matched and enriched based on the previously defined rules and standards of the organisation.
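The standardisation and matching capabilities described above can be sketched as follows, assuming simple dict-based customer records; the field names and normalisation choices are illustrative, not those of any specific data quality tool.

```python
import re

def standardise(record):
    """Normalise a raw record: trim whitespace, uppercase country
    codes, and strip non-digit characters from phone numbers."""
    out = {k: v.strip() if isinstance(v, str) else v
           for k, v in record.items()}
    if "country" in out:
        out["country"] = out["country"].upper()
    if "phone" in out:
        out["phone"] = re.sub(r"\D", "", out["phone"])
    return out

def deduplicate(records, key_fields=("email",)):
    """Match records on the chosen key fields (case-insensitively),
    keeping only the first occurrence of each key."""
    seen, unique = set(), []
    for rec in records:
        key = tuple(rec.get(f, "").lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

In a real deployment these steps would run in batch over existing systems and transactionally at the point of capture, with the matching keys and cleansing rules drawn from the governance programme rather than hard-coded.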

Following on from this, master data management capabilities augment data quality management systems. Where data quality software enforces defined, centralised business rules to ensure high quality data, master data management enables organisations to deploy a centralised master data repository that delivers a single trusted view of various master data. Master data management takes data quality to the next level by enabling those responsible for data quality to directly mitigate issues, manage data hierarchies and, if necessary, override exceptions to data standards and policies. It also enables data to be made available for analytics engines and systems, to drive business value from data.
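The “single trusted view” idea can be illustrated with a simple survivorship rule: for each field, the master record takes the value from the most trusted source system that actually has one. The source names and precedence order below are hypothetical examples, not a reference to any particular MDM product.

```python
def build_golden_record(source_records, precedence):
    """Merge one entity's records from several systems into a single
    master ('golden') record. For each field, the most trusted source
    with a non-empty value wins; less trusted sources fill the gaps."""
    golden = {}
    # Iterate from least to most trusted, so that values from more
    # trusted systems overwrite earlier ones.
    for source in reversed(precedence):
        for field, value in source_records.get(source, {}).items():
            if value:
                golden[field] = value
    return golden
```

For example, with precedence `["crm", "erp", "web"]`, a blank phone number in the CRM does not erase the ERP's value, while the CRM's fuller customer name survives as the trusted one.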

The final step in the data quality implementation process is to roll out data quality monitoring capabilities. This will enable organisations to determine whether business value and return on investment are being achieved to justify resource investments, and to measure the success of data quality improvements in operational data, business performance indicators and programme-level metrics.
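A monitoring capability of this kind might, at its simplest, track per-field completeness over time so that improvements can be quantified. The sketch below assumes the same dict-based record model as earlier; the metric chosen (fraction of records with a non-empty value) is one illustrative programme-level measure among many.

```python
def quality_metrics(records, required_fields):
    """Compute per-field completeness: the fraction of records that
    have a non-empty value for each required field, rounded to 2 dp."""
    total = len(records)
    metrics = {}
    for field in required_fields:
        filled = sum(1 for r in records if r.get(field))
        metrics[field] = round(filled / total, 2) if total else 0.0
    return metrics
```

Run periodically against operational data, figures like these give a baseline before a cleansing initiative and a trend line afterwards, which is exactly the evidence needed to justify continued investment.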

Data quality and management have emerged in recent years as strategic organisational competencies, which require that the traditional gap between business and IT be addressed. Data needs to be useful to the business in order to deliver value and return on investment. Ultimately, poor data quality acts as a bottleneck when it comes to business process improvements, which makes addressing data quality problems an item of high importance on the business agenda. Forrester Research's recommendation is that data management professionals should assist in the change management process by educating their partners, in business and in IT, on best practices, trends and methodologies for building data governance and data quality competencies in house.

For more information on how to make data quality work for your organisation, download the full Forrester report: www.trilliumsoftware.com/success/_landing_pages/Forrester-TAP-2012

Master Data Management
Gary Allemann
Senior Consultant
masterdata.co.za

Friday, 13 April 2012



