Modern data architecture as a strategic lever in the competitive landscape

Data has become the lifeblood of businesses, and managing it well to extract the most value grows ever more important as organisations seek to remain competitive. This Insights paper addresses the importance of investing in the processes, practices and technologies that maximise the value of data in an enterprise.

Board members and C-suite executives rank the "Inability to utilise data analytics and 'big data' to achieve market intelligence and increase productivity and efficiency" among the top risk issues for their organisations over the next decade.
Executive Perspectives on Top Risks for 2023 and 2032, Protiviti and NC State University ERM Initiative, December 2022

Defining a target state conceptual architecture with prescribed design patterns:

It is critical to understand the organisation’s current data providers or producers and its data consumers when defining a target state conceptual architecture that features prescribed design patterns. A good architecture will account for functional and non-functional needs for the platform.

Functional needs include data accessibility methods, data quality and timeliness standards, data privacy, access controls, change management and stakeholder-specific data requirements. Non-functional needs include availability, scalability, performance and other infrastructure capabilities such as backup/recovery and disaster recovery.

A conceptual architecture describes the various high-level components of an enterprise data ecosystem, taking into account the business model, goals and domain design. At times those components correspond to specific capabilities, such as a data warehouse or an analytic application.

However, a conceptual design is chiefly concerned with mapping the purpose and logic of how the system functions in support of business goals. A common construction or relationship of these components is termed a design pattern; design patterns help architects compare systems or replicate the setup of systems known to have certain advantages.

Historically, the difficulty of understanding and working with underlying data infrastructure kept business leaders and key stakeholders from playing a more interactive role in the data system design process. Modern architectures, however, make it possible to gather business requirements first and let them drive the design.

It is critical to have buy-in across the organisation. Without feedback from senior stakeholders down to everyday, front-line users, it becomes impossible for data-driven programs to realise their full strategic potential. It is crucial to have input and feedback from all these voices informing the context and objectives of an enterprise data solution.

Traditional architecture patterns focused mainly on an IT-centric view. Modern data architecture patterns are growing in adoption across the industry, a shift that can be attributed to the emergence of architecture models that support higher levels of federation, such as the data mesh.

In this domain-based data architecture model, domain-managed and governed data products, rather than raw data sets, are the primary inputs into the process. These data products must be ready to use when they are published on the data platform; the responsibility lies with the domain owners to ensure each product is fully governed and of optimal quality.

To create consistency across all the business data domains, an overarching central data governance process drives how the data products are published through data standards, catalogs, lineage, policies, common and shared infrastructure, and tools.
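One way to picture how a central governance process can gate publication is a contract check that every data product must pass before it lands on the platform. This is a minimal sketch; the contract fields, rules and names below are illustrative assumptions, not a reference to any specific catalog or governance tool.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal data-product contract (illustrative fields only).
@dataclass
class DataProductContract:
    name: str
    owner_domain: str
    schema_fields: set[str] = field(default_factory=set)
    pii_fields: set[str] = field(default_factory=set)
    freshness_sla_hours: int = 24

def validate_for_publication(contract: DataProductContract) -> list[str]:
    """Return a list of governance violations; an empty list means publishable."""
    violations = []
    if not contract.owner_domain:
        violations.append("data product must declare an owning domain")
    if not contract.schema_fields:
        violations.append("schema must be registered before publication")
    if not contract.pii_fields <= contract.schema_fields:
        violations.append("PII tags reference fields missing from the schema")
    if contract.freshness_sla_hours <= 0:
        violations.append("freshness SLA must be a positive number of hours")
    return violations

contract = DataProductContract(
    name="retail_customer_360",
    owner_domain="consumer-banking",
    schema_fields={"customer_id", "email", "segment"},
    pii_fields={"email"},
)
print(validate_for_publication(contract))  # [] -> ready to publish
```

In practice these checks would draw on the shared catalog, lineage and policy stores rather than an in-memory record, but the principle is the same: domains own the product, the central process owns the bar it must clear.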

Automation is a critical component of modern data architecture, enabling agile onboarding of new data sets and publishing of data products. Getting data products to market faster through an automated factory model results in quicker monetisation and speedier compliance with ever-growing regulatory requirements. Metadata-driven approaches to moving and transforming data through the architecture are best practice for achieving optimal automation and agility in releasing data products.
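The metadata-driven approach can be sketched as follows: each source is described by a configuration record, and one generic function handles any of them, so onboarding a new data set means adding metadata rather than writing new pipeline code. The config keys, table names and watermark column are illustrative assumptions.

```python
# Each entry describes a source; the pipeline code itself stays generic.
PIPELINE_METADATA = [
    {"source": "crm.accounts", "target": "lake.accounts",
     "columns": ["id", "name", "region"], "load_type": "incremental"},
    {"source": "erp.invoices", "target": "lake.invoices",
     "columns": ["invoice_id", "amount", "issued_at"], "load_type": "full"},
]

def build_extract_sql(meta: dict) -> str:
    """Generate the extract query from metadata instead of hand-coding it."""
    cols = ", ".join(meta["columns"])
    sql = f"SELECT {cols} FROM {meta['source']}"
    if meta["load_type"] == "incremental":
        # Hypothetical watermark column for change-data capture.
        sql += " WHERE updated_at > :last_watermark"
    return sql

for meta in PIPELINE_METADATA:
    print(meta["target"], "<-", build_extract_sql(meta))
```

Real implementations push this idea much further (schema mapping, quality rules and scheduling all held as metadata), but the agility gain is the same: the factory changes only when its configuration does.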

DevOps automation is also a key factor in both architecture agility and nimbleness in supporting business needs. Traditional operating models where data engineering and data operations worked in silos to create data products are now obsolete; the modern data stack operates much like a software application development stack, providing the data engineer greater ability to develop, test and deploy data pipelines in an agile manner. Modern data architecture must allow for integration of native or third-party tools and technologies that enable continuous DataOps to ensure faster delivery of data products and analytics.
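The "data stack as a software stack" idea can be made concrete with a deploy-blocking quality gate, the same kind of automated check a CI pipeline runs on code, applied to a pipeline's output before promotion. The sample rows, check names and thresholds below are illustrative fixtures, not a specific DataOps tool's API.

```python
# Sample pipeline output used as a test fixture.
rows = [
    {"customer_id": "C1", "balance": 120.0},
    {"customer_id": "C2", "balance": 0.0},
    {"customer_id": "C3", "balance": 87.5},
]

def quality_gate(rows: list[dict]) -> dict:
    """Run deploy-blocking checks; a CI job would fail on any 'failed' entry."""
    checks = {
        "non_empty": len(rows) > 0,
        "no_null_keys": all(r.get("customer_id") for r in rows),
        "no_negative_balances": all(r["balance"] >= 0 for r in rows),
    }
    return {name: ("passed" if ok else "failed") for name, ok in checks.items()}

results = quality_gate(rows)
assert all(v == "passed" for v in results.values()), results
```

Wiring such checks into the same develop-test-deploy loop engineers already use for application code is what gives the data engineer the agility described above.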

With global, country-specific, state and local regulations, such as GDPR, CCPA/CPRA, CPA (Colorado), VCDPA (Virginia), UCPA (Utah) and CTDPA (Connecticut) (see Bloomberg Law's State Level Comparison Charts of Data Privacy Laws in the U.S.), data privacy and security should be a major focus for organisations. The modern data architecture needs to handle this myriad of data security and privacy requirements. On the data security front, the architecture should be able to adhere to various audit and compliance frameworks and regulations such as SOC 1, SOC 2, HITRUST, FedRAMP, HIPAA and others.

The ability to model data obfuscation rules around PII and PHI is a critical requirement to protect sensitive data. For public companies, it is also imperative to comply with SOX requirements like access control/information security, change management, incident management, physical and logical security, and backup/recovery. Establishing data clean rooms in an organisation enables data sharing and monetisation of data products in a compliant manner.
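Two common obfuscation rules for PII can be sketched briefly: deterministic pseudonymisation (hashing, so records remain joinable) for identifiers, and masking for display fields. The field names, salt handling and truncation length are assumptions for illustration; a production system would manage keys in a secrets manager and choose techniques per regulatory requirement.

```python
import hashlib

# Illustrative only: a real salt/key would live in a secrets manager and rotate.
SALT = b"rotate-me-and-store-me-in-a-secrets-manager"

def pseudonymise(value: str) -> str:
    """Deterministically hash an identifier so records remain joinable."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

def mask_email(email: str) -> str:
    """Keep only the first character of the local part plus the domain."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

record = {"ssn": "123-45-6789", "email": "jane.doe@example.com"}
safe = {"ssn": pseudonymise(record["ssn"]), "email": mask_email(record["email"])}
print(safe["email"])  # j***@example.com
```

Modelling these rules as reusable functions (or platform policies) rather than per-pipeline logic is what lets the architecture apply them consistently wherever PII and PHI flow.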

Modern data architecture powers a myriad of business use cases today, such as fraud alerts on personal bank account activity and IoT (telematics)-based transportation improvements.

These are some real world examples of how well thought out modern data solutions are used to develop and deploy business-driven data products that positively impact businesses and communities.

Discovering the risks of not establishing a foundational architecture and common set of capabilities:

To reduce risk, organisations should establish key foundational principles and embedded control frameworks. Risk can come in many forms, such as security breaches, ineffective controls, and data proliferation. It is important to note that the federation of responsibilities in the delivery model does not mean federation of the architecture, software and technical design.

A well-architected solution embeds controls and governance to allow proper oversight and shared investments that are key to avoiding rogue processes and data proliferation. Data security at a minimum requires an operational model that ensures governance and central oversight.

Ensuring organisational models guide process and establish key roles:

There is an element of art in designing the components of a model, and evaluating a concrete situation illustrates it well. Take, for example, a consumer bank. Many banks are organised by lines of business, each with its own organisation supporting its operations, systems and business goals. Given that, we would want to understand how the conceptual and domain design should align with each line of business as both a data producer and a data consumer.

As data producers, the lines of business are responsible for providing data back to enterprise functions such as risk, finance and compliance. Given this federated responsibility, a collaborative model should be used to agree on the architecture, based on each organisation's skill sets, roles and overall data management operating model and principles. Investing in leadership to create the optimal organisational model, with clear roles and responsibilities, is necessary.

When the chief data officer (CDO) role first came about, it was a key IT role and was accountable for architecture and infrastructure. Yet, over the past decade or so, it has evolved into a more business-aligned role that is responsible for creating and governing the roadmap of data needs and goals.

With that, it is also important to have an equally senior and accountable role in technology to ensure IT standards, controls, common investments, and such are enabled. We challenge companies to consider this key role and ensure it has appropriate weight in the organisation to enable these foundational capabilities. The concept of a “CTO of data” has evolved and is gaining some traction across organisations.

DataOps is an emerging discipline, based on agile software engineering and DevOps, that encapsulates many data management best practices and helps produce better-quality data and a greater volume of data analytics products. DataOps can enable companies to deliver data products faster and stay ahead of their competition.
The DataOps Manifesto, "The 18 DataOps Principles"

Putting the puzzle pieces together with well-organised data operations

Once the architecture and design patterns are defined, the organisational model reviewed with each line of business or data producer, the data usage needs understood, and roles and responsibilities agreed upon, it is time to establish the operating model.

The key tenet of an operating model is ensuring all key stakeholders have a way to interact (typically through stakeholder committees, data governance forums and/or executive committees). Defining the goals and outcomes of those interactions is key, as is defining decision-making protocols.

Budget planning and aligned roadmaps are critical exercises, and a regular cadence with the teams is needed to ensure consistent alignment to those goals. Financial management and transparency, with agreed-upon cost allocations, are also more of a focus in a modern architecture, so the ability to accurately report and allocate costs must be designed and planned from the start.

Lastly, involving key stakeholders from compliance and cyber security functions early on is essential to ensure alignment with policies and requirements, as well as to prevent risk.

Data operations also includes the technical aspects of change management, access management and various other controls, which can be automated as part of DataOps frameworks. That model and design should be established up front so that the IT and code operating model has proper roles and responsibilities.
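Automating such controls often means expressing desired access as configuration held in version control and reconciling it against the current state, so every grant or revoke flows through the same review and deploy process as application code. The roles, datasets and statement format below are made up for the sketch.

```python
# Desired role grants, as they might be stored in version control.
DESIRED_GRANTS = {
    "analyst": {"lake.sales_summary"},
    "risk_engineer": {"lake.sales_summary", "lake.credit_events"},
}

def plan_changes(current: dict[str, set], desired: dict[str, set]) -> list[str]:
    """Diff current grants against the desired state, like a dry run."""
    actions = []
    for role, datasets in desired.items():
        for ds in sorted(datasets - current.get(role, set())):
            actions.append(f"GRANT {ds} TO {role}")
        for ds in sorted(current.get(role, set()) - datasets):
            actions.append(f"REVOKE {ds} FROM {role}")
    return actions

current = {"analyst": {"lake.sales_summary", "lake.legacy_table"}}
for action in plan_changes(current, DESIRED_GRANTS):
    print(action)
```

The plan/apply split shown here is the same pattern infrastructure-as-code tools use, which is what makes access changes reviewable, auditable and reversible.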

Organisations are harnessing the power of data to improve processes, drive new business opportunities and increase competitive advantage. We provide services to design, source, transform and analyse data to empower your business by modernising your enterprise data architecture. Using our combination of strategic vision, proven expertise and practical experience, we will collaborate with you to enable the development of a cutting edge and pragmatic data architecture.

Our capabilities to enable a modern data architecture include:

  • Developing a data strategy and roadmap tailored to your organisation’s specific needs and growth objectives inclusive of architecture, organisational planning and data operations
  • Establishing a data governance framework with aligned data management policies, tooling and operations
  • Creating best-practices-based, streaming or batch ETL/ELT frameworks on a variety of cloud platforms to ensure your data is flowing properly
  • Providing high-performing storage designs and implementations for data lakes and data warehouses supporting both operational and analytical data workloads
  • Establishing policy authoring and design standards that govern data lake and data warehouse implementations
  • Launching a data security and privacy program that incorporates the appropriate data backup and recovery strategies, methodologies and testing models
  • Designing a master data management strategy that will carry your organisation well into the future
  • Delivering analytics and reporting capabilities to enable self-service reporting, real time events, data discovery and more