Mastering data architecture to enable digital transformation
Whether focused on finance, commodities or real estate, organizations are seeking to transform their infrastructures, processes, reporting and customer interactions through digital technologies.
Risk mitigation, the impact of hiring millennials, internal forecasting and reporting, investor and client demands, along with the desire to capture efficiencies while minimizing headcount, are crucial factors driving the adoption and testing of digital technologies. However, facing challenges on several levels, many of these digital endeavors fail to get off the ground or achieve what they set out to accomplish.
A big driver of this phenomenon is the ability, or lack thereof, to master data management. Without the proper information models in place, organizations struggle to define or develop an information architecture (centralized or distributed) that allows for more effective management, storage, reporting and reconciliation of data. What may be even more problematic is articulating an enterprise-level plan for making progress in these areas.
Unlocking the Power of Information
The information model is not simply data management. It is the gas powering the operating model’s engine, enabling organizations to more effectively communicate and reach their specific goals.
When an information model is properly established, it provides organizations with two distinct benefits. For one, it defines how, where, in what format and for what usage an organization manages and stores data from a technical and systemic point of view, as well as how that translates into human understanding and utilization. That means it is not just solving how internal computers store data, but how internal staff and end consumers access, visualize and comprehend data.
Historically, organizations have relied on static information models that tell the enterprise where the information lies, how it is structured and where it lives (i.e., a system of record). Today, a dynamic information model is more desired because it offers greater business value.
A dynamic information model shows where the organization generates data, both inside and outside the company, how data moves through integration layers, and where data is disseminated across the enterprise. If a trade is initiated with basic information, different parts of the organization need to understand who initiated the trade, where it is going, whether there were any updates to calculations that affect reporting accuracy, and whether the information is easily accessible for future auditing.
Five Steps to Create Your Target Information Model (TIM)
TIM elements should start with high-level information objects. Ask yourself what information is vital to your business. Prioritize those objects, then iterate a second and third time, considering whether the top three categories need reprioritizing. You should end up with items such as customer, product, account, order, invoice and so on.
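The first-pass catalog described above can be sketched as a simple prioritized list. In this illustrative Python sketch, the object names and priority scores are assumptions, not prescriptions:

```python
# A minimal sketch of a first-pass information-object catalog.
# Object names and priority scores (1 = highest) are illustrative assumptions.
candidate_objects = {
    "customer": 1,
    "product": 2,
    "account": 3,
    "order": 4,
    "invoice": 5,
}

def top_objects(catalog, n=3):
    """Return the n highest-priority information objects for review."""
    ranked = sorted(catalog.items(), key=lambda kv: kv[1])
    return [name for name, _ in ranked][:n]

# Each iteration, review the shortlist and decide whether to reprioritize.
shortlist = top_objects(candidate_objects)
```

Keeping the catalog this small at the start makes the reprioritization iterations cheap to repeat.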
Mismanaged or poorly implemented information models not only fail to provide value, they also slow down the overall system or process in place. Latency in information delivery can be almost as risky as inaccurate or missing information. The right information must be available, unnecessary data must be purged, and cross-system taxonomy translations must be actively maintained to eliminate duplicate information created in parallel. With standards in place, organizations can achieve cross-firm optimization while provisioning existing data as a value-add to business areas at a lower cost.
The second benefit of an information model relates to permissions to view, write over and read the information. The information model should determine who has to provide which information to whom, and when, and whether that information predicates someone else's availability or use of it. This process includes performing verifications such as technical reconciliations, which ensure certain batch processes are completed, and operational reconciliations, which confirm the accuracy of data elements.
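As a hedged sketch of the two verification types just described, the checks might look like the following; all job names, field names and record values are hypothetical:

```python
# Illustrative sketch of the two reconciliation types described above.
# Job names, statuses and field names are assumptions for the example.

def technical_reconciliation(batch_log, required_jobs):
    """Confirm that every required batch process reported completion."""
    completed = {job for job, status in batch_log.items() if status == "done"}
    return set(required_jobs) <= completed

def operational_reconciliation(source_record, target_record, fields):
    """Return the data elements that do not match between two systems."""
    return [f for f in fields if source_record.get(f) != target_record.get(f)]

batch_log = {"load_trades": "done", "calc_pnl": "done", "publish_report": "failed"}
batches_ok = technical_reconciliation(batch_log, ["load_trades", "calc_pnl"])

mismatches = operational_reconciliation(
    {"notional": 1_000_000, "currency": "USD"},
    {"notional": 1_000_000, "currency": "EUR"},
    ["notional", "currency"],
)
```

The design point is that the two checks answer different questions: the technical check asks "did the process run?", the operational check asks "is the data right?".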
The information model further informs what each area does and how they all relate to each other. Without it, organizations can run into several issues, including a simple lack of awareness of whether a specific piece of data is captured or available, whether the person accessing the information can understand it, whether people know where to access the data, or whether those accessing the information should actually have permission to do so. Inefficient data access becomes a major issue for organizations that are overhauling their data taxonomies and information models to meet new regulations (e.g., MiFID II or FRTB), or for those relying on real-time sensor data to augment control center operations such as power plants, pipelines or air traffic.
Aligning with a Business Strategy Framework
A target information model (TIM) defines an organization’s data architecture by considering it from different viewpoints, such as:
- Operations—How, where and when data is used and propagated and by which staff members
- Governance—Which staff members should have access to specific data and who is responsible for input, verification and oversight
- Content—Data that must be included, tested, checked, scrubbed and transformed
- Quality—Accuracy and minimum viability for usage and requirements for validity
- Breadth and Depth—The scope of information objects covered and the level of detail maintained for each
At this stage, there is no need to go any further than determining common identifiers. Stick with information objects that are “shared” across two or more business functions.
Don’t believe it when you hear that a “product” for one part of the business is completely different from a “product” in another. Although they may not share the same form or identifiers, it is essential to identify the common semantic attributes in the TIM. Your analytics and AI efforts would not be the same without this step. It will require deep business understanding, change management skills and patience.
To address the underlying data management, storage and communication issues, it is important to embrace an overall business strategy framework as a first step. The framework is a set of defined architectural principles that enable change, transformation and growth. These principles include the target operating model (TOM), which illustrates how a business is organized and performs tasks; the target architecture model (TAM), or the systemic architecture that underpins the technology and systems a firm uses to complete tasks; and the TIM, which shows how a firm stores, manages, comprehends and views the flows of data across the organization.
Not meant to exist in isolation, the TIM prescribes guidelines and principles for an overarching data architecture, and is interconnected with the TOM and TAM. Without these principles defined, and without a clear line of sight into the current-state operating model, it is difficult to create the information model. Without a TIM, an organization cannot define its TAM.
A joint process across business lines can create an information hub and eliminate fragmented systems and processes. The more organizations can master these principles and models, the more they can optimize the speed and sharing of information without sacrificing security, privacy or controls.
- Links to Target Architecture Model (TAM)
For each information object, one or more applications or data stores will be identified as its system of record (SOR). This provides an essential originating node for data lineage, enabling visibility to data access by leveraging the TAM-prescribed integration methods (real-time, near real-time or batch).
Philosophical debates will arise around a single, central SOR versus multiple, distributed SORs. Stay at the conceptual level and remember that the “T” in TIM stands for “target”. Define your desired state and trust that your road-mapping exercise will give you a path.
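One way to keep the SOR discussion at the conceptual level is a plain mapping from each information object to its designated system of record, which then serves as the originating node for lineage tracing. The system names in this sketch are hypothetical:

```python
# Hypothetical object-to-SOR mapping; all system names are illustrative.
system_of_record = {
    "customer": "crm_core",
    "trade": "trade_capture_eu",
    "invoice": "billing_ledger",
}

def lineage_origin(information_object):
    """Return the originating node for data-lineage tracing.

    Raises KeyError if no SOR has been designated, which in practice
    flags a gap in the target information model.
    """
    if information_object not in system_of_record:
        raise KeyError(f"No system of record defined for '{information_object}'")
    return system_of_record[information_object]
```

An object with no entry is itself a finding: it means the target state has not yet decided where that data authoritatively lives.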
Beyond internal reporting and external regulatory pressures, there are other challenges affecting data flows.
One big pain point for financial institutions and commodities companies is clients wanting to go digital. Clients want instant access to information, portfolio performance or home energy consumption on their smartphones, on a website and in an email. They want to understand what the information means to them both now and in the future, making the manner in which this information is displayed a vital component. But how can a company deliver information that is both timely and personally meaningful if many of its current processes for distributing insights are batch-driven? How can an institution deliver information that is predictive when it is unsure of the state of the data it relies on?
Another key challenge is simply the sheer breadth of operations of banks, asset managers, and energy and commodities companies. Most of these organizations span many different lines of business often with multiple subsidiaries that carry out similar work across geographies. In order to support that structure, organizations typically have canonical data models backing each business line or subsidiary. For example, if an organization has one ledger or system, they may have different views of that system to support different areas of the company (e.g., institutional, retail, mortgage lending) across geographies.
Similarly, a commodities trading firm may use multiple trade capture systems for various commodities, such as power and gas in one system, crude oil in another and financial interest rates in a third. Or it may have separate trading systems for North American and European power yet require a comprehensive view of global exposure. How can an organization compile that information for internal analytics (e.g., assets, liabilities, P&L), compliance and auditing if there is no unified or common view? This becomes an issue of translation and aggregation, and it can leave the firm with a distorted picture of its overall risk.
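A common remedy is translating each capture system's records into one canonical shape before aggregating exposure. The following is a minimal sketch, assuming made-up source formats and values already converted to a common currency:

```python
# Sketch: normalize trades from two hypothetical capture systems into a
# canonical shape, then aggregate exposure by commodity. Field names and
# values are assumptions; conversion to a common currency is assumed done.

def from_na_power(row):
    """Translate a record from a hypothetical North American power system."""
    return {"commodity": "power", "region": "NA", "exposure": row["usd_value"]}

def from_eu_power(row):
    """Translate a record from a hypothetical European power system."""
    return {"commodity": "power", "region": "EU", "exposure": row["usd_equiv"]}

def global_exposure(trades):
    """Aggregate canonical trades into exposure totals per commodity."""
    totals = {}
    for t in trades:
        totals[t["commodity"]] = totals.get(t["commodity"], 0.0) + t["exposure"]
    return totals

trades = [from_na_power({"usd_value": 120.0}), from_eu_power({"usd_equiv": 80.0})]
exposure = global_exposure(trades)
```

The translation functions are where the cross-system taxonomy maintenance described earlier actually happens; the aggregation itself is trivial once the canonical view exists.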
- Links to Target Operating Model (TOM)
For each information object, one or more individuals will be identified as its “data stewards”, providing an essential link for data governance activities such as standards and quality metrics. From the process perspective, business processes use information objects as inputs, as elements of calculation or synthesis, and as outputs, creating information flows.
Be pragmatic and you will find that, most likely, there are business people already behaving as data owners even though they may not be officially called “data stewards”. Creating the TIM is often a good catalyst to establish solid and sustainable data governance.
Furthermore, in areas like over-the-counter (OTC) derivatives, securities clearing and collateral, industry bodies such as SWIFT and FpML are introducing new data taxonomies, languages and standards. This requires organizations not only to normalize their current proprietary information models but also to adhere to industry-wide standards. It also requires that organizations define how and what information should be shared, and when, to avoid sharing with the wrong person, at the wrong time or with the wrong data.
Defining a TIM requires an understanding of the organization’s specific business needs and challenges. Whether data within the organization is changing frequently, or that data is trapped within multiple underlying vendor systems or off-the-shelf solutions, there are options that make sense depending on the organization’s specific structure and business requirements. These factors will ultimately determine how to implement the information model from both a technical and business requirements perspective to establish a common view of information across the enterprise.
- Dynamic View
Combine elements from the TAM (applications and data stores) and the TOM (users and business functions) to create an information flow. This provides a visual representation of how information moves throughout the enterprise (i.e., where data originates, how it is moved and transformed, and where it is consumed). The common challenge here is shortcomings in the previous steps, such as cutting corners when defining information links.
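The dynamic view can be prototyped as a small directed graph whose nodes come from the TAM (applications) and whose edges are information flows; the node names below are hypothetical:

```python
# Hypothetical information-flow graph: edges run from producer to consumer.
# Node names are illustrative stand-ins for TAM applications/data stores.
flows = {
    "trade_capture": ["integration_layer"],
    "integration_layer": ["risk_engine", "reporting"],
    "risk_engine": ["reporting"],
    "reporting": [],
}

def downstream(node, graph):
    """All nodes reachable from `node`, i.e. everywhere its data is consumed."""
    seen, stack = set(), [node]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Walking the graph from an information object's system of record answers the lineage question directly: every node returned by `downstream` is a place that inherits the quality, latency and permission characteristics of the origin.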
Joshua Satten is a director of the Fintech practice at Sapient Global Markets, a business management consulting firm in Boston.
Nicolas Papadakos is a director at Sapient Global Markets, a business and technology consulting firm.