Ongoing U.S. policy shifts and severe market volatility are forcing a rapid reassessment of risks across businesses of all sizes. Major changes to trade policy and tariffs, immigration policy and regulatory enforcement have jolted business leaders into midcycle reassessments, bringing previously lower-priority risks to the forefront.
The need for sudden adjustments to risk reporting – and in some cases, financial planning, operational parameters and risk limits – has renewed focus on an old problem: Most ad hoc risk analyses are difficult to produce quickly.
Fortunately, data analytics now offers effective tools to solve this problem – tools your organization may already have in-house.
Boards of directors and senior management will frequently ask about hypothetical scenarios not covered in ongoing management information (MI) packages. In some cases, financial institutions may also face ad hoc questions from regulatory examiners during reviews.
Answers to such situational questions can drive midcycle strategic changes, including adjustments to resources and capital. These questions often can’t be answered as quickly or specifically as the senior team would like.
Developing a confident, data-supported response may require new quantitative metrics or models that underpin updated risk reporting or key risk indicators (KRIs) tailored to evolving risks.
Below are three practical steps businesses and their risk managers can take to refine their organization’s data and risk management infrastructure.
Step 1: Identify the business lines and risk exposures most vulnerable to emerging shocks.
Ask risk management staff to identify the most important emerging risks related to recent policy shifts that are not captured in current risk inventories, and identify the business lines where those emerging risks are most prevalent.
Although certain risks, like tariffs, will be broadly relevant across most organizations, your firm or a counterparty that you are assessing may have other idiosyncratic exposures to export controls, immigration policy and other public policy initiatives. If your organization has an enterprise risk function, that group would be suitable for this evaluation. If not, apply whatever process your organization uses to assess top-down risks facing the enterprise in strategic and financial planning.
While specific risk factors will vary by organization, themes we have commonly observed this year include:
- Geographic/Country Risk: When analyzing country-level exposures, consider those countries most exposed to tariff and trade policy first. Check your counterparties’ transaction histories if you have remote oversight rights (e.g., import standby letters of credit, incoming wires, or payments to international suppliers or customers). Look for potential vulnerabilities to tariff policy. How rich is your data and metadata on your customers/counterparties and third parties? Is this level of analysis possible?
- Supply Chain Risk: Start with key commodity supply chains. Many commodities and other manufacturing inputs are produced in emerging-market economies where tariffs can have an outsized impact given their relatively small GDPs and purchasing power. Many financial institutions and corporates do not maintain detailed supply-chain data on customers, borrowers or third parties.
- Forex Risk: Next, consider your company’s foreign exchange exposure and the implications of large currency moves. At the very least, hedging costs will rise. It’s likely that some exposures are not hedged, particularly with less liquid emerging market currencies where triangle hedging is required. Hopefully, you and your counterparties already have the risk management capabilities required to quantify these exposures. If so, look for KRIs to quickly identify and aggregate risk, and work backwards to define the data requirements to produce them.
- Credit Risk: If you are a bank or a corporate that extends credit, evaluate your counterparties’ exposure in impacted regions and look for signals of heightened default risk, such as standby letters of credit in transaction histories. Wires and payments to international suppliers or customers may also help identify borrowers that have exposure to countries impacted by trade/tariff policy. Define data requirements and reporting that identifies the joint condition of heightened fundamental credit stress combined with exposure to changing trade policy.
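The joint condition described in the credit-risk bullet – heightened fundamental credit stress combined with exposure to changing trade policy – can be sketched in a few lines of code. This is a hypothetical illustration only: the country list, probability-of-default threshold and transaction fields below are assumptions for the sketch, not the data model of any actual institution.

```python
# Illustrative sketch: flag counterparties showing BOTH fundamental credit
# stress AND exposure to tariff-impacted countries, inferred from transaction
# histories. All names, fields and thresholds are hypothetical assumptions.

TARIFF_IMPACTED = {"CN", "MX", "VN"}  # example country codes, not a policy view
PD_STRESS_THRESHOLD = 0.05            # illustrative probability-of-default cutoff

def trade_exposed(transactions):
    """True if any wire, payment or standby LC touches an impacted country."""
    return any(t["country"] in TARIFF_IMPACTED
               and t["type"] in {"wire", "payment", "standby_lc"}
               for t in transactions)

def joint_risk_flags(counterparties):
    """Return counterparties meeting the joint condition (stress + exposure)."""
    return [c["name"] for c in counterparties
            if c["pd_estimate"] >= PD_STRESS_THRESHOLD
            and trade_exposed(c["transactions"])]

counterparties = [
    {"name": "Alpha Mfg", "pd_estimate": 0.08,
     "transactions": [{"type": "wire", "country": "CN"}]},
    {"name": "Beta Corp", "pd_estimate": 0.02,
     "transactions": [{"type": "payment", "country": "MX"}]},
    {"name": "Gamma Ltd", "pd_estimate": 0.09,
     "transactions": [{"type": "payment", "country": "DE"}]},
]

print(joint_risk_flags(counterparties))  # -> ['Alpha Mfg']
```

The same pattern scales to a data-platform query: the KRI is simply the count (or exposure-weighted sum) of counterparties returned by the joint filter, produced each reporting cycle.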
Step 2: Assess whether current risk and finance data can produce the necessary ad hoc management information.
A strong partnership between risk management and IT leadership is essential. Understand the data challenges involved in producing the ad hoc metrics, stress tests and reports needed for ongoing monitoring of the emerging risks identified above.
The following are assessment areas to develop your plan and enhance the data infrastructure necessary to deliver better-quality ad hoc MI and capture emerging risks:
- Data Availability: Now that you have identified specific emerging risks, get the lay of the land. Identify the data sets already in-house or readily available via vendor subscription to address key emerging risks. Next, consider how confident you can be that they are accurate. Talk to IT/data teams to understand the analytics platform(s) that are used to store risk and business data. Many common modern data platforms offer out-of-the-box solutions that enable data lineage, data cataloging and data governance – all enhance data visibility and availability. These features often do not require new software implementations and can be offered without reworking existing vendor contracts.
- Data Frequency and Timeliness: Understand how often data sources are refreshed. Does the data’s frequency align with reporting needs going forward? Most analytics platforms now offer off-the-shelf solutions to these common problems. For any given data set, it is easy to identify the last updated date. Many modern database platforms have “time travel” capabilities – users can easily see what a data set looked like in snapshots from five minutes, five days or five years ago. Emerging risk analysis might require point-in-time, historical time-series or change analysis for the recent past. Operating nimbly across time series is a key enabler for solving many of the risk analysis problems described herein. And within a time series, some variables, such as forex rates and supply chain data, will require much higher-frequency intervals than others to address your needs.
- Data Accuracy: Although table stakes for any analysis, data accuracy can be time-consuming to demonstrate. Ask your IT/data teams if they have an existing data catalog – an easy-to-use, non-technical tool that allows the business to do metadata searches on data sets of pre-vetted, accurate data. The catalog will provide the data’s location, source, steward, change history and quality rating. Data catalogs are now offered within existing licensing of most leading enterprise analytics tools. If your organization is already leveraging these capabilities, you are ahead of the game. If not, these products offer a combination of AI-driven and workflow-based solutions that can quickly help organizations better understand what data they have, streamlining the most time-sensitive parts of risk analysis: data wrangling, validation and cleansing.
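As a toy illustration of the catalog search and freshness checks described above, the sketch below models a data catalog as a small in-memory list. Every entry, field name and threshold is a made-up assumption for the sketch; commercial catalogs expose the same ideas (tags, stewards, quality ratings, refresh timestamps) through their own search and lineage interfaces.

```python
# Hypothetical sketch of a data catalog supporting metadata search and
# freshness checks. All entries and field names are illustrative assumptions.
from datetime import datetime, timedelta, timezone

CATALOG = [
    {"name": "fx_rates_daily", "steward": "Treasury Data Team",
     "tags": {"forex", "market"}, "quality": "gold",
     "last_updated": datetime(2025, 6, 2, tzinfo=timezone.utc)},
    {"name": "counterparty_wires", "steward": "Payments Ops",
     "tags": {"credit", "trade"}, "quality": "silver",
     "last_updated": datetime(2025, 5, 1, tzinfo=timezone.utc)},
]

def search_catalog(tag, acceptable_quality=("gold",)):
    """Return pre-vetted data sets carrying a tag at an acceptable quality tier."""
    return [d for d in CATALOG
            if tag in d["tags"] and d["quality"] in acceptable_quality]

def stale(dataset, as_of, max_age_days):
    """True if the data set's last refresh cannot support the reporting need."""
    return as_of - dataset["last_updated"] > timedelta(days=max_age_days)

as_of = datetime(2025, 6, 3, tzinfo=timezone.utc)
print([d["name"] for d in search_catalog("forex")])  # -> ['fx_rates_daily']
print(stale(CATALOG[1], as_of, max_age_days=7))      # -> True (over a week old)
```

The point of the sketch is the workflow, not the implementation: a risk analyst searches by business term, gets back only vetted data sets with a named steward, and can check immediately whether their refresh cadence matches the reporting need.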
Step 3: Identify the first few use cases to apply the refined infrastructure to quickly turn around strategic analyses with confidence.
Apply a prioritized view of risk management concerns to identify where enhanced data capabilities can make the most immediate impact. For example, if management needs better visibility into emerging risks, leverage your organization’s data catalog and analytics tools to create new metrics for monitoring. The value of data infrastructure refinements will soon become clear to a wider audience as challenging requirements are addressed.
As the risk team becomes more familiar with the enhanced data ecosystem, they will discover new ways to leverage available tools and data sets in a self-sufficient manner, decreasing reliance on IT while shortening time to insight. The key is to create an environment where risk managers understand the tools at their disposal and the quality of available data. This knowledge allows them to confidently propose innovative solutions and respond more effectively to ad hoc requests.
Conclusion
Often, organizations have a sense of the kinds of risk questions they would like to answer more quickly and the tools they want to build to achieve operational readiness. With the right approach and a proven methodology, risk managers can turn latent data infrastructure into a more valuable asset to drive midcycle analyses, thereby enabling strategic adjustments that could enhance risk-adjusted returns for their organizations.
Tactical next steps to get to your target state include:
- Identifying emerging risks most likely to impact the business and the data required to support analysis;
- Determining which data sets exist in-house or can be accessed easily, and assessing their availability, timeliness and quality; and
- Leveraging data catalogs and analytics tools to accelerate analysis and reporting via the democratization of data.
Enhancements to data infrastructure, collecting richer metadata and building risk-focused data components will help organizations more quickly answer critical questions around risk and strategy – especially in an era where priorities can shift on short notice.
Paul Feldman is a Senior Director at FTI Consulting. He has more than 20 years of experience in implementing and validating analytics-driven solutions to problems facing financial institutions involving risk, finance, capital, strategy and regulatory compliance.
Bryce Snape is a Senior Managing Director in FTI Consulting’s Data & Analytics practice who focuses on providing advanced data analytics services. He leads projects related to data governance, data migration, data management, data validation, data retention, and IT and regulatory compliance. This includes regulatory requirements such as SCRA, CCAR, DFAST, CMMC and GDPR/CPRA. He has presented directly to regulatory agencies (e.g. CFPB, DOJ, OCC) about data methodologies and data analyses.
Max Cantin is a Senior Director in FTI Consulting’s Data & Analytics practice, and has more than 10 years of experience specializing in enterprise architecture, data warehousing, reporting automation, risk analytics and data governance. He develops cross-functional solutions within complex organizations, having led strategic analytics transformations for clients across industries.