Better practices for compliance management

The main compliance challenges

We know that businesses and government entities alike struggle to manage compliance requirements. Many have put up with challenges for so long—often with limited resources—that they no longer see how problematic the situation has become.

FIVE COMPLIANCE CHALLENGES YOU MIGHT BE DEALING WITH

01 COMPLIANCE SILOS
It’s not uncommon that, over time, separate activities, roles, and teams develop to address different compliance requirements. There’s often a lack of integration and communication among these teams or individuals. The result is duplicated efforts—and the creation of multiple clumsy and inefficient systems. This is then perpetuated as compliance processes change in response to regulations, mergers and acquisitions, or other internal business re-structuring.

02 NO SINGLE VIEW OF COMPLIANCE ASSURANCE
Siloed compliance systems also make it hard for senior management to get an overview of current compliance activities and perform timely risk assessments. If you can’t get a clear view of compliance risks, then chances are good that a damaging risk will slip under the radar, go unaddressed, or simply be ignored.

03 COBBLED TOGETHER, HOME-GROWN SYSTEMS
Using generalized software, like Excel spreadsheets and Word documents, in addition to shared folders and file systems, might have made sense at one point. But, as requirements become more complex, these systems become more frustrating, inefficient, and risky. Compiling hundreds or thousands of spreadsheets to support compliance management and regulatory reporting is a logistical nightmare (not to mention time-consuming). Spreadsheets are also prone to error and limited because they don’t provide audit trails or activity logs.

04 OLD SOFTWARE, NOT DESIGNED TO KEEP UP WITH FREQUENT CHANGES
You could be struggling with older compliance software products that aren’t designed to deal with constant change. These can be increasingly expensive to upgrade, not the most user-friendly, and difficult to maintain.

05 NOT USING AUTOMATED MONITORING
Many compliance teams are losing out by not using analytics and data automation. Instead, they rely heavily on sample testing to determine if compliance controls and processes are working, so huge amounts of activity data are never actually checked.

Transform your compliance management process

Good news! There are some practical steps you can take to transform compliance processes and systems so that they become way more efficient and far less expensive and painful.

It’s all about optimizing the interactions of people, processes, and technology around regulatory compliance requirements across the entire organization.

It might not sound simple, but it’s what needs to be done. And, in our experience, it can be achieved without becoming massively time-consuming and expensive. Technology for regulatory compliance management has evolved to unite processes and roles across all aspects of compliance throughout your organization.

Look, for example, at how technology like Salesforce (a cloud-based system with big data analytics) has transformed sales, marketing, and customer service. Now there’s similar technology that brings together different business units around regulatory compliance to improve processes and collaboration.

Where to start?

Let’s look at what’s involved in establishing a technology-driven compliance management process. One that’s driven by data and fully integrated across your organization.

THE BEST PLACE TO START IS THE END

Step 1: Think about the desired end-state.

First, consider the objectives and the most important outcomes of your new process. How will it impact the different stakeholders? Take the time to clearly define the metrics you’ll use to measure your progress and success.

A few desired outcomes:

  • Accurately measure and manage the costs of regulatory and policy compliance.
  • Track how risks are trending over time, by regulation, and by region.
  • Understand, at any point in time, the effectiveness of compliance-related controls.
  • Standardize approaches and systems for managing compliance requirements and risks across the organization.
  • Efficiently integrate reporting on compliance activities with those of other risk management functions.
  • Create a quantified view for executive management of the risks posed by regulatory compliance failures.
  • Increase confidence and response times around changing and new regulations.
  • Reduce duplication of efforts and maximize overall efficiency.

NOW, WHAT DO YOU NEED TO SUPPORT YOUR OBJECTIVES?

Step 2: Identify the activities and capabilities that will get you the desired outcomes.

Consider the different parts of the compliance management process below. Then identify the steps you’ll need to take or the changes you’ll need to make to your current activity that will help you achieve your objectives. We’ve put together a cheat sheet to help this along.


IDENTIFY & IMPLEMENT COMPLIANCE CONTROL PROCEDURES

  • 01 Maintain a central library of regulatory requirements and internal corporate policies, allocated to owners and managers.
  • 02 Define control processes and procedures that will ensure compliance with regulations and policies.
  • 03 Link control processes to the corresponding regulations and corporate policies.
  • 04 Assess the risk of control weaknesses and failure to comply with regulations and policies.

RUN TRANSACTIONAL MONITORING ANALYTICS

  • 05 Monitor the effectiveness of controls and compliance activities with data analytics.
  • 06 Get up-to-date confirmation of the effectiveness of controls and compliance from owners with automated questionnaires or certification of adherence statements.

MANAGE RESULTS & RESPOND

  • 07 Manage the entire process of exceptions generated from analytic monitoring and from the generation of questionnaires and certifications.

REPORT RESULTS & UPDATE ASSESSMENTS

  • 08 Use the results of monitoring and exception management to produce risk assessments and trends.
  • 09 Identify new and changing regulations as they occur and update repositories and control and compliance procedures.
  • 10 Report on the current status of compliance management activities from high- to low-detail levels.

IMPROVE THE PROCESS

  • 11 Identify duplicate processes and fix procedures to combine and improve controls and compliance tests.
  • 12 Integrate regulatory compliance risk management, monitoring, and reporting with overall risk management activities.

Eight compliance processes in desperate need of technology

01 Centralize regulations & compliance requirements
A major part of regulatory compliance management is staying on top of countless regulations and all their details. A solid content repository includes not only the regulations themselves, but also related data. By centralizing your regulations and compliance requirements, you’ll be able to start classifying them, so you can eventually search regulations and requirements by type, region of applicability, effective dates, and modification dates.
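
As a rough illustration of what that classification can look like in practice, here is a minimal Python sketch of a searchable repository. The record fields, the two example regulations, and the search helper are illustrative assumptions, not any particular product’s data model.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Regulation:
        """One entry in a central regulatory content repository."""
        name: str
        reg_type: str        # classification, e.g. "anti-bribery", "data privacy"
        regions: list[str]   # regions of applicability
        effective: date      # effective date
        modified: date       # last modification date

    # Illustrative entries only; a real repository would be far larger.
    repository = [
        Regulation("FCPA", "anti-bribery", ["US"], date(1977, 12, 19), date(1998, 11, 10)),
        Regulation("GDPR", "data privacy", ["EU"], date(2018, 5, 25), date(2018, 5, 25)),
    ]

    def search(repo, reg_type=None, region=None):
        """Filter the repository by classification fields."""
        return [r for r in repo
                if (reg_type is None or r.reg_type == reg_type)
                and (region is None or region in r.regions)]

    print([r.name for r in search(repository, region="EU")])  # -> ['GDPR']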

02 Map to risks, policies, & controls
Classifying regulatory requirements is no good on its own. They need to be connected to risk management, control and compliance processes, and system functionality. This is the most critical part of a compliance management system.

Typically, in order to do this mapping, you need (a simple sketch follows the list below):

  • An assessment of non-compliant risks for each requirement.
  • Defined processes for how each requirement is met.
  • Defined controls that make sure the compliance process is effective in reducing non-compliance risks.
  • Controls mapped to specific analytics monitoring tests that confirm their effectiveness on an ongoing basis.
  • Assigned owners for each mapped requirement. Specific processes and controls may be assigned to sub-owners.
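
A minimal sketch of that mapping, assuming a simple in-memory Python model; the field names, the example requirement, and the relationships are illustrative rather than a prescribed schema.

    from dataclasses import dataclass, field

    @dataclass
    class Control:
        name: str
        owner: str
        monitoring_tests: list[str] = field(default_factory=list)  # names of analytics tests

    @dataclass
    class Requirement:
        regulation: str
        description: str
        owner: str
        risk_rating: str   # assessment of the non-compliance risk
        process: str       # how the requirement is met
        controls: list[Control] = field(default_factory=list)

    # Illustrative mapping of one requirement to its process, controls, and tests.
    req = Requirement(
        regulation="FCPA",
        description="No improper payments to foreign officials",
        owner="Head of Compliance",
        risk_rating="High",
        process="Third-party due diligence and payment approval workflow",
        controls=[Control("Vendor screening against watch lists", "AP Manager",
                          ["payments_to_flagged_vendors"])],
    )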

03 Connect to data & use advanced analytics

Using different automated tests to access and analyze data is foundational to a data-driven compliance management approach.

The range of data sources and data types needed to perform compliance monitoring can be humongous. When it comes to areas like FCPA or other anti-bribery and corruption regulations, you might need to access entire populations of purchase and payment transactions, general ledger entries, payroll, and travel and entertainment expenses. And that’s just the internal sources. External sources could include things like the Politically Exposed Persons database or Sanctions Checks.

Extensive suites of tests and analyses can be run against the data to determine whether compliance controls are working effectively and if there are any indications of transactions or activities that fail to comply with regulations. The results of these analyses identify specific anomalies and control exceptions, as well as provide statistical data and trend reports that indicate changes in compliance risk levels.
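
As an illustration only, the short sketch below shows two simplified analytic tests of this kind run over a full population of payment records. The record layout, the sanctions list, and the approval threshold are assumptions made for the example, not a recommended rule set.

    # Each payment is a dict; in practice the data would come from the ERP system.
    payments = [
        {"id": 1, "vendor": "Acme Ltd", "amount": 9900.0, "country": "XX"},
        {"id": 2, "vendor": "Acme Ltd", "amount": 9900.0, "country": "XX"},
        {"id": 3, "vendor": "Beta GmbH", "amount": 120.0, "country": "DE"},
    ]
    sanctioned_vendors = {"Acme Ltd"}   # e.g. loaded from a sanctions or PEP list
    approval_threshold = 10000.0        # assumed local approval limit

    # Test 1: payments to vendors appearing on a sanctions/PEP list.
    sanctions_hits = [p for p in payments if p["vendor"] in sanctioned_vendors]

    # Test 2: payments just under the approval threshold (possible structuring).
    near_threshold = [p for p in payments
                      if 0.9 * approval_threshold <= p["amount"] < approval_threshold]

    print(len(sanctions_hits), len(near_threshold))  # exceptions to route into issue management

In practice, tests like these would run on a schedule against the full transaction population, with the exceptions fed into the incident and issue management process described below.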

Truly delivering on this step involves using the right technology since the requirements for accessing and analyzing data for compliance are demanding. Generalized analytic software is seldom able to provide more than basic capabilities, which are far removed from the functionality of specialized risk and control monitoring technologies.

04 Monitor incidents & manage issues

It’s important to quickly and efficiently manage incidents once they’re flagged. But systems that create huge amounts of “false positives” or “false negatives” can end up wasting a lot of time and resources. On the other hand, a system that fails to detect high-risk activities creates risk of major financial and reputational damage. The monitoring technology you choose should let you fine-tune analytics to flag actual risks and compliance failures and minimize false alarms.

The system should also allow for an issues resolution process that’s timely and maintains the integrity of responses. If the people responsible for resolving a flagged issue don’t do it adequately, an automated workflow should escalate the issues to the next level.
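
Here is a minimal sketch of such an escalation rule, assuming a simple issue record with an escalation level, an age in days, and a fixed escalation path; real workflow engines are considerably richer.

    from dataclasses import dataclass

    ESCALATION_PATH = ["control owner", "compliance manager", "chief compliance officer"]
    SLA_DAYS = 10  # assumed response window at each level

    @dataclass
    class Issue:
        description: str
        level: int = 0       # index into ESCALATION_PATH
        age_days: int = 0
        resolved: bool = False

    def escalate_if_overdue(issue: Issue) -> Issue:
        """Move an unresolved issue to the next level once the SLA is exceeded."""
        if not issue.resolved and issue.age_days > SLA_DAYS:
            issue.level = min(issue.level + 1, len(ESCALATION_PATH) - 1)
            issue.age_days = 0  # restart the clock at the new level
        return issue

    stale = Issue("Payment to sanctioned vendor not reviewed", age_days=14)
    print(ESCALATION_PATH[escalate_if_overdue(stale).level])  # -> compliance manager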

Older software often can’t meet the huge range of incident monitoring and issues management requirements, or it requires a lot of effort and expense to modify procedures when needed.

05 Manage investigations

As exceptions and incidents are identified, some turn into issues that need in-depth investigation. Software helps this investigation process by allowing the user to document and log activities. It should also support easy collaboration among everyone involved in the investigation process.

Effective security must be in place around access to all aspects of a compliance management system. But it’s extra important to have a high level of security and privacy for the investigation management process.

06 Use surveys, questionnaires & certifications

Going beyond just transactional analysis and monitoring, it’s also important to understand what’s actually happening right now by collecting the input of those working on the front lines.

Software that has built-in automated surveys and questionnaires can gather large amounts of current information directly from these individuals in different compliance roles, then quickly interpret the responses.

For example, if you’re required to comply with the Sarbanes-Oxley Act (SOX), you can use automated questionnaires and certifications to collect individual sign-off on SOX control effectiveness questions. That information is consolidated and used to support the SOX certification process far more efficiently than using traditional ways of collecting sign-off.
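
For illustration, a stripped-down sketch of consolidating such sign-offs; the control names, respondents, and statuses are hypothetical.

    # Hypothetical responses to one SOX control-effectiveness question.
    responses = [
        {"owner": "AP Manager",     "control": "Three-way match",  "effective": True},
        {"owner": "Payroll Lead",   "control": "Payroll review",   "effective": True},
        {"owner": "IT Ops Manager", "control": "Access review",    "effective": False},
    ]

    certified = [r for r in responses if r["effective"]]
    exceptions = [r for r in responses if not r["effective"]]

    print(f"{len(certified)}/{len(responses)} controls certified effective")
    for r in exceptions:
        print("Follow up:", r["control"], "-", r["owner"])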

07 Manage regulatory changes

Regulations change constantly, and to remain compliant, you need to know—quickly—when those changes happen. This is because changes can often mean modifications to your established procedures or controls, and that could impact your entire compliance management process.

A good compliance software system is built to withstand these revisions. It allows for easy updates to existing definitions of controls, processes, and monitoring activities.

Before software, any regulatory changes would involve huge amounts of manual activities, causing backlogs and delays. Now much (if not most) of the regulatory change process can be automated, freeing your time to manage your part of the overall compliance program.

08 Ensure regulatory examination & oversight

No one likes going through compliance reviews by regulatory bodies. It’s even worse if failures or weaknesses surface during the examination.

But if that happens to you, it’s good to know that many regulatory authorities have proven to be more accommodating and (dare we say) lenient when your compliance process is strategic, deliberate, and well designed.

There are huge benefits, in terms of efficiency and cost savings, from using a structured and well-managed regulatory compliance system. But the greatest economic benefit happens when you can avoid a potentially major financial penalty by replacing an inherently unreliable and complicated legacy system with one that’s purpose-built and data-driven.

Click here to access Galvanize’s new White Paper

Fintech, regtech and the role of compliance in 2020

The ebb and flow of attitudes on the adoption and use of technology has evolving ramifications for financial services firms and their compliance functions, according to the findings of the Thomson Reuters Regulatory Intelligence’s fourth annual survey on fintech, regtech and the role of compliance. This year’s survey results represent the views and experiences of almost 400 compliance and risk practitioners worldwide.

During the lifetime of the report it has had nearly 2,000 responses and been downloaded nearly 10,000 times by firms, risk and compliance practitioners, regulators, consultancies, law firms and global systemically important financial institutions (G-SIFIs). The report also highlights the shifting role of the regulator and concerns about best or better practice approaches to tackle the rise of cyber risk.

The findings have become a trusted source of insight for firms, regulators and their advisers alike. They are intended to help regulated firms with planning, resourcing and direction, and to allow them to benchmark whether their resources, skills, strategy and expectations are in line with those of the wider industry. As with previous reports, regional and G-SIFI results are split out where they highlight any particular trend.

One challenge for firms is the need to acquire the skill sets which are essential if they are to reap the expected benefits of technological solutions. Equally, regulators and policymakers need to have the appropriate up-to-date skill sets to enable consistent oversight of the use of technology in financial services. Firms themselves, and G-SIFIs in particular, have made substantial investments in skills and the upgrading of legacy systems.

Key findings

  • The involvement of risk and compliance functions in their firm’s approach to fintech, regtech and insurtech continues to evolve. Some 65% of firms reported their risk and compliance function was either fully engaged and consulted or had some involvement (59% in prior year). In the G-SIFI population, 69% reported at least some involvement, with those reporting their compliance function as being fully engaged and consulted almost doubling from 13% in 2018 to 25% in 2019. There is an even more positive picture presented on increasing board involvement in the firm’s approach to fintech, regtech and insurtech. A total of 62% of firms reported their board being fully engaged and consulted or having some involvement, up from 54% in the prior year. For G-SIFIs, 85% reported their board being fully engaged and consulted or having some involvement, up from 56% in the prior year. In particular, 37% of G-SIFIs reported their board was fully engaged with and consulted on the firm’s approach to fintech, regtech and insurtech, up from 13% in the prior year.
  • Opinion on technological innovation and digital disruption has fluctuated in the past couple of years. Overall, the level of positivity about fintech innovation and digital disruption has increased, after a slight dip in 2018. In 2019, 83% of firms have a positive view of fintech innovation (23% extremely positive, 60% mostly positive), compared with 74% in 2018 and 83% in 2017. In the G-SIFI population the positivity rises to 92%. There are regional variations, with the UK and Europe reporting a 97% positive view at one end going down to a 75% positive view in the United States.
  • There has been a similar ebb and flow of opinion about regtech innovation and digital disruption although at lower levels. A total of 77% reported either an extremely or mostly positive view, up from 71% in the prior year. For G-SIFIs 81% had a positive view, up from 76% in the prior year.
  • G-SIFIs have reported a significant investment in specialist skills for both risk and compliance functions and at board level. Some 21% of G-SIFIs reported they had invested in and/or appointed people with specialist skills to the board to accommodate developments in fintech, insurtech and regtech, up from 2% in the prior year. This means in turn 79% of G-SIFIs have not completed their work in this area, which is potentially disturbing. Similarly, 25% of G-SIFIs have invested in specialist skills for the risk and compliance functions, up from 9% in the prior year. In the wider population 10% reported investing in specialist skills at board level and 16% reported investing in specialist skills for the risk and compliance function. A quarter (26%) reported they have yet to invest in specialist skills for the risk and compliance function, but they know it is needed (32% for board-level specialist skills). Again, these figures suggest 75% of G-SIFIs have not fully upgraded their risk and compliance functions, rising to 84% in the wider population.
  • The greatest financial technology challenges firms expect to face in the next 12 months have changed in nature since the previous survey, with the top three challenges cited as keeping up with technological advancements; budgetary limitations, lack of investment and cost; and data security. In prior years, the biggest challenges related to the need to upgrade legacy systems and processes as well as budgetary limitations, the adequacy and availability of skilled resources together with the need for cyber resilience. In terms of the greatest benefits expected to be seen from financial technology in the next 12 months, the top three are a strengthening of operational efficiency, improved services for customers and greater business opportunities.
  • G-SIFIs are leading the way on the implementation of regtech solutions. Some 14% of G-SIFIs have implemented a regtech solution, up from 9% in the prior year with 75% (52% in the prior year) reporting they have either fully or partially implemented a regtech solution to help manage compliance. In the wider population, 17% reported implementing a regtech solution, up from 8% in the prior year. The 2018 numbers overall showed a profound dip from 2017 when 29% of G-SIFIs and 30% of firms reported implementing a regtech solution, perhaps highlighting that early adoption of regtech solutions was less than smooth.
  • Where firms have not yet deployed fintech or regtech solutions, various reasons were cited as to what was holding them back. Significantly, one third of firms cited lack of investment; a similar number of firms pointed to a lack of in-house skills and information security/data protection concerns. Some 14% of firms and 12% of G-SIFIs reported they had taken a deliberate strategic decision not to deploy fintech or regtech solutions yet.
  • There continues to be substantial variation in the overall budget available for regtech solutions. A total of 38% of firms (31% in prior year) reported that the expected budget would grow in the coming year, however, 31% said they lack a budget for regtech (25% in the prior year). For G-SIFIs 48% expected the budget to grow (36% in prior year), with 12% reporting no budget for regtech solutions (6% in the prior year).

Focus: Challenges for firms

Technological challenges for firms come in all shapes and sizes. There is the potentially marketplace-changing challenge posed by the rise of bigtech. There is also the evolving approach of regulators and the need to invest in specialist skill sets. Lastly, there is the emerging need to keep up with technological advances themselves.


The challenges for firms have moved on. In the first three years of the report the biggest financial technology challenge facing firms was that of the need to upgrade legacy systems and processes. This year the top three challenges are expected to be the need to keep up with technology advancements; perceived budgetary limitations, lack of investment and cost, and then data security.

Focus: Cyber risk

Cyber risk and the need to be cyber-resilient is a major challenge for financial services firms which are targets for hackers. They must be prepared and be able to respond to any kind of cyber incident. Good customer outcomes will be under threat if cyber resilience fails.

One of the most prevalent forms of cyber attack is ransomware. There are different types of ransomware, all of which will seek to prevent a firm or an individual from using their IT systems and will ask for something (usually payment of a ransom) to be done before access will be restored. Even then, there is no guarantee that paying the ransom or acceding to the ransomware attacker’s demands will restore full access to all IT systems, data or files. Many firms have found that critical files, often containing client data, have been encrypted as part of an attack and large amounts of money are demanded for restoration. Encryption is in this instance used as a weapon, and it can be practically impossible to reverse-engineer the encryption or “crack” the files without the original encryption key – which cyber attackers deliberately withhold.

What was previously viewed often as an IT problem has become a significant issue for risk and compliance functions. The regulatory stance is typified by the UK Financial Conduct Authority (FCA), which has said its goal is to “help firms become more resilient to cyber attacks, while ensuring that consumers are protected and market integrity is upheld”. Regulators do not expect firms to be impervious but do expect cyber risk management to become a core competency.

Good and better practice on defending against ransomware attacks

Risk and compliance officers do not need to become technological experts overnight but must ensure cyber risks are effectively managed and reported on within their firm’s corporate governance framework. For some compliance officers, cyber risk may be well outside their comfort zone, but there is evidence that simple steps implemented rigorously can go a long way towards protecting a firm and its customers. Any basic cyber-security hygiene aimed at protecting businesses from ransomware attacks should make full use of the wide range of resources available on cyber resilience, IT security and protecting against malware attacks. The UK National Cyber Security Centre (NCSC) has produced some practical guidance on how organizations can protect themselves in cyberspace, which it updates regularly. Indeed, the NCSC’s 10 steps to cyber security have now been adopted by most of the FTSE 350.


Closing thoughts

The financial services industry has much to gain from the effective implementation of fintech, regtech and insurtech, but the practical reality is that there are numerous challenges to overcome before the potential benefits can be realised. Investment continues to be needed in skill sets, systems upgrades and cyber resilience before firms can deliver technological innovation without endangering good customer outcomes.

An added complication is the business need to innovate while looking over one shoulder at the threat posed by bigtech. There are also concerns for solution providers. The last year has seen many technology start-ups going bust and far fewer new start-ups getting off the ground – an apparent parallel, at least on the surface, to the dotcom bubble. Solutions need to be practical, providers need to be careful not to over-promise and under-deliver, and above all developments should be aimed at genuine problems and not be solutions looking for a problem.

There are nevertheless potentially substantive benefits to be gained from implementing fintech, regtech and insurtech solutions. For risk and compliance functions, much of the benefit may come from the ability to automate rote processes with increasing accuracy and speed. Indeed, when 900 respondents to the 10th annual cost of compliance survey report were asked to look into their crystal balls and predict the biggest change for compliance in the next 10 years, the largest response was automation.

Technology and its failure or misuse is increasingly being linked to the personal liability and accountability of senior managers. Chief executives, board members and other senior individuals will be held accountable for failures in technology and should therefore ensure their skill set is up-to-date. Regulators and politicians alike have shown themselves to be increasingly intolerant of senior managers who fail to take the expected reasonable steps with regards to any lack of resilience in their firm’s technology.

This year’s findings suggest firms may find it beneficial to consider:

  • Is fintech (and regtech) properly considered as part of the firm’s strategy? It is important for regtech especially not to be forgotten about in strategic terms: a systemic failure arising from a regtech solution has great capacity to cause problems for the firm – the UK FCA’s actions on regulatory reporting, among other things, are an indicator of this.
  • Not all firms seem to have fully tackled the governance challenge fintech implies: greater specialist skills may be needed at board level and in risk and compliance functions.
  • Lack of in-house skills was given as a main reason for failing to develop fintech or regtech solutions. It is heartening that firms understand the need for those skills. As fintech/regtech becomes mainstream, however, firms may be pressed into developing such solutions. Is there a plan in place to plug the skills gap?
  • Only 22% of firms reported that they need more resources to evaluate, understand and deploy fintech/ regtech solutions. This suggests 78% of firms are unduly relaxed about the resources needed in the second line of defence to ensure fintech/regtech solutions are properly monitored. This may be a correct conclusion, but seems potentially bullish.

Click here to access Thomson Reuters’ Survey Results

From Risk to Strategy: Embracing the Technology Shift

The role of the risk manager has always been to understand and manage threats to a given business. In theory, this involves a very broad mandate to capture all possible risks, both current and future. In practice, however, some risk managers are assigned to narrower, siloed roles, with tasks that can seem somewhat disconnected from key business objectives.

Amidst a changing risk landscape and increasing availability of technological tools that enable risk managers to do more, there is both a need and an opportunity to move toward that broader risk manager role. This need for change – not only in the risk manager’s role, but also in the broader approach to organizational risk management and technological change – is driven by five factors.

[Marsh Exhibit 1]

The rapid pace of change has many C-suite members questioning what will happen to their business models. Research shows that 73 percent of executives predict significant industry disruption in the next three years (up from 26 percent in 2018). In this challenging environment, risk managers have a great opportunity to demonstrate their relevance.

USING NEW TOOLS TO MANAGE RISKS

Emerging technologies present compelling opportunities for the field of risk management. As discussed in our 2017 report, the three levers of data, analytics, and processes allow risk professionals a framework to consider technology initiatives and their potential gains. Emerging tools can support risk managers in delivering a more dynamic, in-depth view of risks in addition to potential cost-savings.

However, this year’s survey shows that across Asia-Pacific, risk managers still feel they are severely lacking knowledge of emerging technologies across the business. Confidence scores were low in all but one category, risk management information systems (RMIS). These scores were only marginally higher for respondents in highly regulated industries (financial services and energy utilities), underscoring the need for further training across all industries.

[Marsh Exhibit 3]

When it comes to technology, risk managers should aim for “digital fluency,” a level of familiarity that allows them to

  • first determine how technologies can help address different risk areas,
  • and then understand the implications of doing so.

They need not understand the inner workings of various technologies, as their niche should remain aligned with their core expertise: applying risk technical skills, principles, and practices.

CULTIVATING A “DIGITAL-FIRST” MIND-SET

Successful technology adoption does not only present a technical skills challenge. If risk function digitalization is to be effective, risk managers must champion a cultural shift to a “digital-first” mindset across the organization, where all stakeholders develop a habit of thinking about how technology can be used for organizational benefit.

For example, the risk manager of the future will be looking to glean greater insights using increasingly advanced analytics capabilities. To do this, they will need to actively encourage their organization

  • to collect more data,
  • to use their data more effectively,
  • and to conduct more accurate and comprehensive analyses.

Underlying the risk manager’s digital-first mind-set will be three supporting mentalities:

1. The first of these is the perception of technology as an opportunity rather than a threat. Some understandable anxiety exists on this topic, since technology vendors often portray technology as a means of eliminating human input and labor. This framing neglects the gains in effectiveness and efficiency that allow risk managers to improve their judgment and decision making, and spend their time on more value-adding activities. In addition, the success of digital risk transformations will depend on the risk professionals who understand the tasks being digitalized; these professionals will need to be brought into the design and implementation process right from the start. After all, as the Japanese saying goes, “it is workers who give wisdom to the machines.” Fortunately, 87 percent of surveyed PARIMA members indicated that automating parts of the risk manager’s job to allow greater efficiency represents an opportunity for the risk function. Furthermore, 63 percent of respondents indicated that this was not merely a small opportunity, but a significant one (Exhibit 6). This positive outlook makes an even stronger statement than findings from an earlier global study in which 72 percent of employees said they see technology as a benefit to their work.

2. The second supporting mentality will be a habit of looking for ways in which technology can be used for benefit across the organization, not just within the risk function but also in business processes and client solutions. Concretely, the risk manager can embody this culture by adopting a data-driven approach, whereby they consider:

  • How existing organizational data sources can be better leveraged for risk management
  • How new data sources – both internal and external – can be explored
  • How data accuracy and completeness can be improved

“Risk managers can also benefit from considering outside-the-box use cases, as well as keeping up with the technologies used by competitors,” adds Keith Xia, Chief Risk Officer of OneHealth Healthcare in China.

This is an illustrative rather than comprehensive list, as a data-driven approach – and more broadly, a digital mind-set – is fundamentally about a new way of thinking. If risk managers can grow accustomed to reflecting on technologies’ potential applications, they will be able to pre-emptively spot opportunities, as well as identify and resolve issues such as data gaps.

3. All of this will be complemented by a third mentality: the willingness to accept change, experiment, and learn, such as in testing new data collection and analysis methods. Propelled by cultural transformation and shifting mind-sets, risk managers will need to learn to feel comfortable with – and ultimately be in the driver’s seat for – the trial, error, and adjustment that accompanies digitalization.

MANAGING THE NEW RISKS FROM EMERGING TECHNOLOGIES

The same technological developments and tools that are enabling organizations to transform and advance are also introducing their own set of potential threats.

Our survey shows the PARIMA community is aware of this dynamic, with 96 percent of surveyed members expecting that emerging technologies will introduce some – if not substantial – new risks in the next five years.

The following exhibit gives a further breakdown of views from this 96 percent of respondents, and the perceived sufficiency of their existing frameworks. These risks are evolving in an environment where there are already questions about the relevance and sufficiency of risk identification frameworks. Risk management has become more challenging due to the added complexity from rapid shifts in technology, and individual teams are using risk taxonomies with inconsistent methodologies, which further highlights the challenges that risk managers face in managing their responses to new risk types.

[Marsh Exhibit 9]

To assess how new technology in any part of the organization might introduce new risks, consider the following checklist:

HIGH-LEVEL RISK CHECKLIST FOR EMERGING TECHNOLOGY

  1. Does the use of this technology cut across existing risk types (for example, AI risk presents a composite of technology risk, cyber risk, information security risk, and so on depending on the use case and application)? If so, has my organization designated this risk as a new, distinct category of risk with a clear definition and risk appetite?
  2. Is use of this technology aligned to my company’s strategic ambitions and risk appetite? Are the cost and ease of implementation feasible given my company’s circumstances?
  3. Can this technology’s implications be sufficiently explained and understood within my company (e.g. what systems would rely on it)? Would our use of this technology make sense to a customer?
  4. Is there a clear view of how this technology will be supported and maintained internally, for example, with a digitally fluent workforce and designated second line owner for risks introduced by this technology (e.g. additional cyber risk)?
  5. Has my company considered the business continuity risks associated with this technology malfunctioning?
  6. Am I confident that there are minimal data quality or management risks? Do I have the high quality, large-scale data necessary for advanced analytics? Would customers perceive use of their data as reasonable, and will this data remain private, complete, and safe from cyberattacks?
  7. Am I aware of any potential knock-on effects or reputational risks – for example, through exposure to third (and fourth) parties that may not act in adherence to my values, or through invasive uses of private customer information?
  8. Does my organization understand all implications for accounting, tax, and any other financial reporting obligations?
  9. Are there any additional compliance or regulatory implications of using this technology? Do I need to engage with regulators or seek expert advice?
  10. For financial services companies: Could I explain any algorithms in use to a customer, and would they perceive them to be fair? Am I confident that this technology will not violate sanctions or support crime (for example, fraud, money laundering, terrorism finance)?

SECURING A MORE TECHNOLOGY-CONVERSANT RISK WORKFORCE

As risk managers focus on digitalizing their function, it is important that organizations support this with an equally deliberate approach to their people strategy. This is for two reasons, as Kate Bravery, Global Solutions Leader, Career at Mercer, explains: “First, each technological leap requires an equivalent revolution in talent; and second, talent typically becomes more important following disruption.”

While upskilling the current workforce is a positive step, as addressed before, organizations must also consider a more holistic talent management approach. Risk managers understand this imperative, with survey respondents indicating a strong desire to increase technology expertise in their function within the next five years.

Yet, little progress has been made in adding these skills to the risk function, with a significant gap persisting between aspirations and the reality on the ground. In both 2017 and 2019 surveys, the number of risk managers hoping to recruit technology experts has been at least 4.5 times the number of teams currently possessing those skills.

[Marsh Exhibit 15]

EMBEDDING RISK CULTURE THROUGHOUT THE ORGANIZATION

Our survey found that a lack of risk management thinking in other parts of the organization is the biggest barrier the risk function faces in working with other business units. This is a crucial and somewhat alarming finding – but new technologies may be able to help.

[Marsh Exhibit 19]

As technology allows for increasingly accurate, relevant, and holistic risk measures, organizations should find it easier to develop risk-based KPIs and incentives that can help employees throughout the business incorporate a risk-aware approach into their daily activities.

From an organizational perspective, a first step would be to describe risk limits and risk tolerance in a language that all stakeholders can relate to, such as potential losses. Organizations can then cascade these firm-wide risk concepts down to operational business units, translating risk language into tangible and relevant incentives that encourage behavior that is consistent with firm values. Research shows that employees in Asia want this linkage, citing a desire to better align their individual goals with business goals.
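
As a rough sketch of that cascade, the example below allocates an assumed firm-wide loss tolerance to business units in proportion to revenue. The allocation rule and all figures are illustrative assumptions; a real cascade would reflect the organization’s own risk appetite framework.

    # Illustrative only: allocate a firm-wide annual loss tolerance to business
    # units in proportion to revenue, so each unit gets a limit it can relate to.
    firm_loss_tolerance = 5_000_000  # assumed firm-wide figure, in local currency

    unit_revenue = {"Retail": 40_000_000, "Commercial": 50_000_000, "Wealth": 10_000_000}
    total = sum(unit_revenue.values())

    unit_limits = {u: firm_loss_tolerance * rev / total for u, rev in unit_revenue.items()}
    for unit, limit in unit_limits.items():
        print(f"{unit}: potential-loss limit {limit:,.0f}")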

The question thus becomes how risk processes can be made an easy, intuitive part of employee routines. It is also important to consider KPIs for the risk team itself as a way of encouraging desirable behavior and further embedding a risk-aware culture. Already a majority of surveyed PARIMA members use some form of KPIs in their teams (81 percent), and the fact that reporting performance is the most popular service level measure supports the expectation that PARIMA members actively keep their organization informed.

[Marsh Exhibit 21]

At the same time, these survey responses also raise a number of questions. Forty percent of organizations indicate that they measure reporting performance, but far fewer are measuring accuracy (15 percent) or timeliness (16 percent) of risk analytics – which are necessary to achieve improved reporting performance. Moreover, the most-utilized KPIs in this year’s survey tended to be tangible measures around cost, from which it can be difficult to distinguish a mature risk function from a lucky one.

SUPPORTING TRANSFORMATIONAL CHANGE PROGRAMS

Even with a desire from individual risk managers to digitalize and complement organizational intentions, barriers still exist that can leave risk managers using basic tools. In 2017, cost and budgeting concerns were the single, standout barrier to risk function digitalization, chosen by 67 percent of respondents, well clear of second-placed human capital concerns at 18 percent. This year’s survey responses were much closer, with a host of ongoing barriers, six of which were cited by more than 40 percent of respondents.

[Marsh Exhibit 22]

Implementing the nuts and bolts of digitalization will require a holistic transformation program to address all these barriers. That is not to say that initiatives must necessarily be massive in scale. In fact, well-designed initiatives targeting specific business problems can be a great way to demonstrate success that can then be replicated elsewhere to boost innovation.

Transformational change is inherently difficult, in particular where it spans both technological as well as people dimensions. Many large organizations have generally relied solely on IT teams for their “digital transformation” initiatives. This approach has had limited success, as such teams are usually designed to deliver very specific business functionalities, as opposed to leading change initiatives. If risk managers are to realize the benefits of such transformation, it is incumbent on them to take a more active role in influencing and leading transformation programs.

Click here to access Marsh’s and Parima’s detailed report

Internal Audit’s Guide to Planning, Managing and Addressing Risks

As time passes and the modern-day enterprise evolves, so does the role of the internal auditor. What was once a function perceived as rule enforcers and compliance police is expanding into one that is a trusted advisor within the business. The last several years have introduced an enormous amount of change, but the proliferation of technology within the enterprise is accelerating every aspect, from operations to decision making.

The progressive steps organizations are taking as a result of the digital age present a bevy of benefits, but in turn, create a slew of challenges and risks. Subsequently, the internal audit function has been forced to adapt along the way, assuring key stakeholders in the business that risks have been identified, but above all, addressed and mitigated.

While identifying and managing risks tied to the business falls on management, it’s internal audit’s responsibility to focus on closing the loop. That’s why our second article focuses on effective audit follow-up, in addition to outlining the how and when of escalating risks.

A DYNAMIC AND ITERATIVE PROCESS

The COSO Internal Control – Integrated Framework (2013) provides that a “risk assessment involves a dynamic and iterative process for identifying and assessing risks to the achievement of objectives.” (emphasis added). To be effective, internal audit should be aware of and responsive to changes in known risks and additionally the emergence of new ones.

A purpose of the traditional (i.e., annual) risk assessment is to allow internal audit to develop a planning horizon which is understood by stakeholders, in particular executive management and the audit committee, as a basis for the risks identified. In this process there can also be a push to finalize the internal audit “plan” so that budgets, schedules and staffing can be arranged.

With the emerging concept of “risk velocity”—measuring how fast a risk may affect an organization—comes recognition that the typical risk assessment process is neither dynamic and iterative nor responsive to change in real time. Change does not occur on an annual basis. The move to a continuous and dynamic audit plan is significant for most internal audit departments. Some departments are already moving on this path and have had to adjust from a static process focused on listening to management on a seasonal basis to monitoring business objectives and risks that are rapidly changing.

Tony Redlinger, internal audit director with IHS Markit, observes the difficulties of the timely capture of risks as “asking the pertinent questions often without the broader knowledge of what the business is getting into, where the technology often advances much faster than the controls.”

BEYOND THE TYPICAL INTERNAL AUDIT RISK ASSESSMENT

What approaches can internal audit functions take to ramp up the process and achieve more dynamic audit planning?

One technique is to increase the frequency of the process and design a rolling series of assessments and audit planning. If existing processes can be made more streamlined and efficient, the cycle can be shortened to occur more frequently. Potentially, a concerted effort can result in an audit plan being updated every six months instead of annually. Since the risk identification process ideally is ongoing, management should be encouraged to implement a schedule to periodically review risks, while reserving the ability to accelerate reviews if a company objective changes or risk factors increase.

For example, if management is considering an acquisition in a new jurisdiction, it could require the reevaluation of risk factors to determine how the decision could impact operations. Such processes can be formally linked into internal audit planning. Of course, existing sources of risk information should be identified and integrated into internal audit planning.

Other assessment processes, including Enterprise Risk Management activities, department self-assessments and other function-specific reviews in high-impact areas depending on industry (e.g., environmental hazards, cybersecurity threats, etc.), should connect and feed into internal audit processes.

[Exhibit: Internal Audit 1]

TECHNOLOGY TOOLS AND REALISM ABOUT SURVEYS

In the typical risk assessment, preparatory materials are provided and participants are asked a series of questions during sessions with audit staff. This process is expected to produce information to guide the allocation of resources and activities within internal audit so as to optimize the match between the company’s greatest risks and the corresponding mitigation efforts. The availability of sophisticated technology tools such as online surveys can seem to make it cheap and easy to gather voluminous data from a larger population, and to conduct statistical analysis of that data.

Dr. Hernan Murdock, vice president of the audit division at MISTI, finds surveys and questionnaires to be a technique to collect information. “[Questionnaires] promote risk and control awareness, while encouraging transparency and accountability,” he says.

Potentially, this means we can conduct a much larger assessment with the same resources. There is definitely a place for crowdsourcing risk as well as casting a wide net for particular fact patterns of concern, such as use of third-party sales intermediaries or collection of consumer personal data. Still, more data is not always better data. The essence of a good risk assessment is not popular opinion, mechanically sliced and diced; it is informed opinion and expert judgment applied to the facts. Be careful with gathering far more data than can be followed up on or analyzed meaningfully, which can result in human-judgment bottlenecks in the process.

Ordinarily, risk assessments gather information from senior executives and managers, as well as a sample of senior operational personnel in the business units. To the extent that “risk owners” are not in these groups, they are usually sought out, and sometimes manager-level input is also requested.

Front-line workers should be considered as well. It’s usually those who are in the details on a daily basis that have the best perspectives on risks and low-hanging fruit when it comes to increasing operational efficiency.

THE RISK OF THE INTERNAL AUDIT RISK ASSESSMENT

Here we are not talking about the risk assessment that drives the audit plan. Rather, this is the risk that the internal audit function itself will not achieve its objectives as a result of the risk assessment. Should you perform this type of quality engagement as well? See IIA’s Standards for the Professional Practice of Internal Auditing 2120—Risk Management: “The internal audit activity must evaluate the effectiveness and contribute to the improvement of risk management processes.”

The internal audit function in this regard should consider risks such as:

  • The potential that the audit risk assessment is inaccurate or incomplete leading to an ineffective audit plan
  • Audit staffing that is insufficient in terms of quality and capacity to deliver useful results on every engagement
  • Changes in business and risk not promptly identified so that the audit plan can be updated
  • Audit communications failing to provide information organizational stakeholders need, when they need it
  • Governance roles not able to understand audit results and their implications for management of the organization

[Exhibit: Internal Audit 2]

Beyond Quality: The Four-Part Approach for Audit Efficiency and Effectiveness

STEP 1: PLAN FOR ORGANIZATIONAL GROWTH

While the concept of quality is uniform for internal auditors of different varieties and capacities, effectiveness and efficiency can vary from organization to organization. Accordingly, clear definitions for these terms—the expectations for your team—must be established and adopted to plan for growth.

Use these questions as guidance when defining exactly what effectiveness and efficiency mean for you and your team:

  • Are we equipped with the up-to-date tools needed to conduct the best work possible?
  • Do we have the right resources and skill sets required to deliver our audit plan?
  • Are we contributing to organizational improvement? If so, can others see this?
  • Have we obtained any validation of our team’s quality, such as notification from managers or executives?
  • Is feedback effectively distributed to team members, so they know what areas to improve?
  • What quantifiable metrics can we associate with these definitions?

While you and your team’s definitions of effectiveness and efficiency are crucial, it is also important to gain the approval of key stakeholders involved in internal audit.

A major reason that process improvement initiatives fail, according to one Harvard Business Review article, is that the people whose work will be directly impacted are often left out of the process.

Accordingly, feedback from stakeholders at the helm of the financial success of your company should also be incorporated. Here are a few stakeholders who should weigh in on your definitions of effectiveness and efficiency:

  1. Internal stakeholders: Board of directors, audit committee, executives, senior management and department leads
  2. External stakeholders: Regulators, standard-setters, vendors, customers and external audit teams

STEP 2: DO THE WORK NEEDED TO SET EXPECTATIONS

The second step of this process continues to articulate the definitions of effectiveness and efficiency, and sets expectations for your team.

By this stage, you should have an internal definition of effectiveness and efficiency, and you have tempered that definition in the context of what key internal and external stakeholders need. To better set your organization up for success, make these definitions more actionable and specific through the assignment of qualitative and quantitative metrics.

As described in a Forbes article, Forrester reports 74 percent of firms say they want to be “data-driven,” but only 29 percent are actually successful at connecting analytics to action. Actionable insights appear to be the missing link for companies that want to drive business outcomes from their data.

Make these definitions more actionable and specific for your team by assigning qualitative and quantitative metrics for each. To collect qualitative and quantitative metrics, try the following tactics (a small sketch of the quantitative side follows the list):

  • Look back at past performance data to determine quantitative metrics:
    • How many audits were scheduled?
    • How many were completed?
    • How was staff utilized?
    • What were the budgeted hours as compared to the actual hours?
  • Go on a listening tour of departments impacted by your work to determine qualitative metrics:
    • What do clients think of your team’s performance?
    • What do other internal stakeholders think of your team’s performance?
    • Do they consider you and your team to be leaders in your role, or order-takers?
    • Would they want to engage in future projects with your team?

With these actionable definitions in hand, the expectations for your team should be crystal clear. It is ultimately up to chief audit executives to hold their teams accountable for efficient and effective—along with quality—work.

STEP 3: CHECK PROGRESS AGAINST SET EXPECTATIONS

To check the quality, effectiveness, and efficiency of your team’s work, internal audit leaders should look at individual performance on an ongoing basis—not just an annual one. After all, it is easier and less problematic for leaders to reevaluate individual performance in small increments before it becomes a major issue.

In organizations of all sizes, a traditional once-per-year approach to employee reviews is fading away in favor of more ongoing ones. As a Washington Post article describes, today’s employees have come to expect instant feedback in many other areas of their lives, and performance reviews should be the same. Besides, the article states, one report found that two-thirds of employees who receive the highest scores in a typical performance management system are not actually the organization’s highest performers.

Chief audit executives should encourage the completion of self-appraisals. A Harvard Business Review article explains that an effective self-appraisal should focus on what you have accomplished and talk about weaknesses carefully, using language with an emphasis on growth and improvement, rather than admonishment. Highlight your team’s blind spots that they might not be aware exist.

In short, employees want more frequent and iterative assessments of their work, and internal audit leaders need to step up to deliver this and ensure quality, effectiveness, and efficiency at all stages.

STEP 4: ACT UPON WHAT YOU HAVE LEARNED

By this step, internal audit leaders have an array of tools at their disposal, including:

  • Actionable definitions of effectiveness and efficiency for their teams
  • Qualitative and quantitative metrics to bolster these definitions
  • Information gathered from self- and manager-guided evaluations
  • An understanding of how team members have performed along these guidelines

With this information in hand, many opportunities for growth are apparent—simply compare where you want your team members to be against where they are right now. By implementing these fact-based changes into your internal audit processes, leaders set the stage for cyclical organizational and personal improvement.

According to a survey, this type of continuous improvement yields a positive ROI for organizations, helping increase revenue, along with saving time and money—an average annual impact of $6,000. Additionally, these improvements are designed to compound with each cycle.

Just as the approach to monitoring and improving audit quality is ongoing and cyclical—there are always improvements yet to be made—this approach to improving effectiveness and efficiency is fluid as well.

By weaving this four-part process into the fabric of your internal audit methodology, leaders can improve effectiveness and efficiency in their organizations.

 

Click here to access Workiva’s and MISTI’s White Paper

Mastering Risk with “Data-Driven GRC”

Overview

The world is changing. The emerging risk landscape in almost every industry vertical has changed. Effective methodologies for managing risk have changed (whatever your perspective:

  • internal audit,
  • external audit/consulting,
  • compliance,
  • enterprise risk management,

or otherwise). Finally, technology itself has changed, and technology consumers expect to realize more value from technology that is more approachable, at lower cost.

How are these factors driving change in organizations?

Emerging Risk Landscapes

Risk has the attention of top executives. Risk shifts quickly in an economy where “speed of change” is the true currency of business, and it emerges in entirely new forms in a world where globalization and automation are forcing shifts in the core values and initiatives of global enterprises.

Evolving Governance, Risk, and Compliance Methodologies

Across risk- and control-oriented functions spanning a variety of audit functions, fraud, compliance, quality management, enterprise risk management, financial control, and many more, global organizations are acknowledging a need to provide more risk coverage at lower cost (measured in both time and currency), which is driving re-inventions of methodology and automation.

Empowerment Through Technology

Gartner, the leading analyst firm in the enterprise IT space, is very clear that the convergence of four forces—Cloud, Mobile, Data, and Social—is driving the empowerment of individuals as they interact with each other and their information through well-designed technology.

In most organizations, there is no coordinated effort to leverage the organizational changes emerging from these three factors in order to develop an integrated approach to mastering risk management. The emerging opportunity is to leverage the change that is occurring to develop new programs, covering not just technology, of course, but also the critical people, methodology, and process issues. The goal is to provide senior management with a comprehensive and dynamic view of how effectively the organization is managing risk and embracing change, set in the context of overall strategic and operational objectives.

Where are organizations heading?

“Data-Driven GRC” represents a consolidation of methodologies, both functional and technological, that dramatically enhances the opportunity to address emerging risk landscapes and, in turn, maximize the reliability of organizational performance.

This paper examines the key opportunities to leverage change—both from a risk and an organizational performance management perspective—to build integrated, data-driven GRC processes that optimize the value of audit and risk management activities, as well as the investments in supporting tools and techniques.

Functional Stakeholders of GRC Processes and Technology

The Institute of Internal Auditors’ (IIA) “Three Lines of Defense in Effective Risk Management and Control” model specifically addresses the “who and what” of risk management and control. It distinguishes and describes three role- and responsibility-driven functions:

  • Those that own and manage risks (management – the “first line”)
  • Those that oversee risks (risk, compliance, financial controls, IT – the “second line”)
  • Those functions that provide independent assurance over risks (internal audit – the “third line”)

The overarching context of these three lines acknowledges the broader role of organizational governance and governing bodies.

[Figure: The IIA’s Three Lines of Defense model]

Technology Solutions

Data-Driven GRC is not achievable without a technology platform that supports the steps illustrated above, and integrates directly with the organization’s broader technology environment to acquire the data needed to objectively assess and drive GRC activities.

From a technology perspective, there are four main components required to enable the major steps in Data-Driven GRC methodology:

1. Integrated Risk Assessment

Integrated risk assessment technology maintains the inventory of strategic risks and the assessment of how well they are managed. As the interface of the organization’s most senior professionals into GRC processes, it must be a tool relevant to and usable by executive management. This technology sets the priorities for risk mitigation efforts, thereby driving the development of project plans crafted by each of the functions in the different lines of defense.

2. Project & Controls Management

A project and controls management system (often referred to more narrowly as an audit management or eGRC system) enables the establishment of project plans in each risk and control function that map against the risk mitigation efforts identified as required. Projects can then be broken down into actionable sets of tactical-level risks, controls that mitigate those risks, and tests that assess those controls.

This becomes the backbone of the organization’s internal control environment and related documentation and evaluation, all setting context for what data is actually required to be tested or monitored in order to meet the organization’s strategic objectives.
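
To make this breakdown concrete, here is a minimal, hypothetical sketch (the class and field names are invented for illustration, not taken from any vendor’s product) of how a project plan might decompose into tactical-level risks, the controls that mitigate them, and the tests that assess those controls:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlTest:
    name: str
    passed: bool                   # outcome of the most recent execution of the test

@dataclass
class Control:
    name: str
    tests: List[ControlTest] = field(default_factory=list)

    def is_effective(self) -> bool:
        # A control only counts as effective if it has tests and all of them pass.
        return bool(self.tests) and all(t.passed for t in self.tests)

@dataclass
class Risk:
    description: str
    controls: List[Control] = field(default_factory=list)

@dataclass
class Project:
    name: str                      # one risk-mitigation effort from the integrated risk assessment
    risks: List[Risk] = field(default_factory=list)

    def open_items(self) -> List[str]:
        """Risks whose controls are missing, untested or failing."""
        return [r.description for r in self.risks
                if not r.controls or not all(c.is_effective() for c in r.controls)]

plan = Project("Revenue recognition review", risks=[
    Risk("Duplicate invoices", controls=[
        Control("Three-way match", tests=[ControlTest("Q1 sample test", passed=True)]),
    ]),
    Risk("Unauthorized credit notes"),         # no control mapped yet
])
print(plan.open_items())                       # -> ['Unauthorized credit notes']
```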

3. Risk & Control Analytics

If you think of Integrated Risk Assessment as the brain of the Data-Driven GRC program and the Project & Controls Management component as the backbone, then Risk & Control Analytics are the heart and lungs.

An analytic toolset is critical for reaching into the organizational environment and acquiring all of the inputs (data) that must be aggregated, filtered, and processed, then routed back to the brain for objective decision making. It is important that this toolset be specifically geared toward risk and control analytics, so that the filtering and processing functionality is optimized for identifying anomalies that represent individual occurrences of risk, while coping with huge populations of data and illustrating trends over time.
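
As an illustration only (the table of payments and the approval rule are invented for this sketch, and pandas stands in for whatever analytic engine is used), a single analytic pass over a large population might both surface individual exceptions and summarize their trend over time:

```python
import pandas as pd

# Hypothetical extract of payment transactions (in practice the full population,
# often millions of rows, pulled by a data extractor from the source system).
payments = pd.DataFrame({
    "payment_id": [1, 2, 3, 4],
    "date": pd.to_datetime(["2024-01-15", "2024-02-03", "2024-02-20", "2024-03-08"]),
    "amount": [950.0, 12_500.0, 8_200.0, 15_000.0],
    "approved_by": ["jsmith", None, "klee", None],
})

# Rule: payments above the approval threshold must have an approver on record.
THRESHOLD = 10_000
exceptions = payments[(payments["amount"] > THRESHOLD) & payments["approved_by"].isna()]

# Individual occurrences of risk, routed back for follow-up...
print(exceptions[["payment_id", "amount"]])

# ...and the trend over time, for reporting back to the "brain" of the program.
print(exceptions.groupby(exceptions["date"].dt.to_period("M"))["payment_id"].count())
```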

4. Knowledge Content

Supporting all of the technology components, knowledge content comes in many forms and provides the specialized knowledge of risks, controls, tests, and data required to perform and automate the methodology across a wide range of organizational risk areas.

Knowledge content should be acquired in support of individual risk and control objectives and may include items such as:

  • Risk and control templates for addressing specific business processes, problems, or high-level risk areas
  • Integrated compliance frameworks that balance multiple compliance requirements into a single set of implemented and tested controls
  • Data extractors that access specific key corporate systems and extract the data sets required for evaluation (e.g., an SAP-supported organization may need an extractor that pulls a complete set of fixed asset data from its specific version of SAP, which can then be used to run all required tests of controls related to fixed assets)
  • Data analysis rule sets (or analytic scripts) that take a specific data set and evaluate which transactions in it violate the rules, indicating that control failures occurred (a simple, hypothetical rule set is sketched below)
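
As a simple, hypothetical illustration of such a rule set (the rule names, fields and thresholds are invented for this sketch), rules can be expressed as data and applied uniformly to an extracted data set, with every hit indicating a potential control failure:

```python
# Each rule pairs an identifier with a predicate over a single record.
RULES = {
    "FA-01 non-positive useful life":  lambda a: a["useful_life_months"] <= 0,
    "FA-02 missing cost center":       lambda a: not a.get("cost_center"),
    "FA-03 depreciation exceeds cost": lambda a: a["accum_depreciation"] > a["cost"],
}

def evaluate(records, rules=RULES):
    """Return (record_id, rule_name) pairs for every violation found."""
    return [(r["asset_id"], name)
            for r in records
            for name, predicate in rules.items()
            if predicate(r)]

# Illustrative extract from a fixed-asset data set.
sample = [
    {"asset_id": "A-100", "useful_life_months": 60, "cost_center": "CC-7",
     "cost": 10_000, "accum_depreciation": 2_500},
    {"asset_id": "A-101", "useful_life_months": 0, "cost_center": "",
     "cost": 5_000, "accum_depreciation": 6_000},
]
print(evaluate(sample))   # -> A-101 violates FA-01, FA-02 and FA-03
```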

Mapping these key technology pieces that make up an integrated risk and control technology platform against the completely integrated Data-Driven GRC methodology looks as follows:

[Figure: Mapping of the four technology components onto the Data-Driven GRC methodology]

When evaluating technology platforms, it is imperative that each piece of this puzzle integrates directly with the others; otherwise, manual aggregation of results will be required, which is not only laborious but also inconsistent and disorganized, and (by definition) violates the Data-Driven GRC methodology.


 

Click here to access ACL’s study

Integrating Finance, Risk and Regulatory Reporting (FRR) through Comprehensive Data Management

Data travels faster than ever, anywhere and all the time. Yet as fast as it moves, it has barely been able to keep up with the expanding agendas of financial supervisors. You might not know it to look at them, but the authorities in Basel, Washington, London, Singapore and other financial and political centers are pretty swift themselves when it comes to devising new requirements for compiling and reporting data. They seem to want nothing less than a renaissance in the way institutions organize and manage their finance, risk and regulatory reporting activities.

The institutions themselves might want the same thing. Some of the business strategies and tactics that made good money for banks before the global financial crisis have become unsustainable and cut into their profitability. More stringent regulatory frameworks imposed since the crisis require the implementation of complex, data-intensive stress testing procedures and forecasting models that call for unceasing monitoring and updating. The days of static reports capturing a moment in a firm’s life are gone.

One of the most challenging data management burdens is rooted in duplication. The evolution of regulations has left banks with various bespoke databases across five core functions:

  • credit,
  • treasury,
  • profitability analytics,
  • financial reporting
  • and regulatory reporting,

with the same data inevitably appearing and being processed in multiple places. This hodgepodge of bespoke marts leads both to the duplication of data and processes and to the risk of inconsistencies, which tend to rear their heads at inopportune moments (i.e., when consistent data needs to be presented to regulators). For example,

  • credit extracts core loan, customer and credit data;
  • treasury pulls core cash flow data from all instruments;
  • profitability departments pull the same instrument data as credit and treasury and add ledger information for allocations;
  • financial reporting pulls ledgers and some subledgers for reporting;
  • and regulatory reporting pulls the same data yet again to submit reports to regulators per prescribed templates.

The ever-growing list of considerations has compelled firms to revise, continually and on the fly, not just how they manage their data but how they manage their people and basic organizational structures. An effort to integrate activities and foster transparency, in particular through greater cooperation between risk and finance, has emerged across financial services. This has often been in response to demands from regulators, but some of the more enlightened leaders in the industry see it as the most sensible way to comply with supervisory mandates and respond to commercial exigencies as well.

Their ability to do that has been constrained by the variety, frequency and sheer quantity of information sought by regulators, boards and senior executives. But that is beginning to change as a result of new technological capabilities and, at least as important, new management strategies. This is where the convergence of Finance, Risk and Regulatory Reporting (FRR) comes in.

The idea behind the FRR theme is that sound regulatory compliance and sound business analytics are manifestations of the same set of processes. Satisfying the demands of supervisory authorities and maximizing profitability and competitiveness in the marketplace involve similar types of analysis, modeling and forecasting. Each is best achieved, therefore, through a comprehensive, collaborative organizational structure that places the key functions of finance, risk and regulatory reporting at its heart.

The glue that binds this entity together and enables it to function as efficiently and cost-effectively as possible – financially and in the demands placed on staff – is a similarly comprehensive and unified approach to FRR data management. The right architecture will permit data to be drawn from all relevant sources across an organization, including the disparate legacy hardware and software accumulated over the years in silos erected for different activities and geographies. Such an approach will reconcile and integrate this data and present it in a common, consistent, transparent fashion, permitting it to be deployed in the most efficient way within each department and for every analytical and reporting need, internal and external.

The immense demands for data, and for a solution to manage it effectively, have served as a catalyst for a revolutionary development in data management: Regulatory Technology, or RegTech. The definition is somewhat flexible and tends to vary with the motivations of whoever is doing the defining, but RegTech is basically the application of cutting-edge hardware, software, design techniques and services to the idiosyncratic challenges of financial reporting and compliance. The myriad advances that fall under the RegTech rubric, such as centralized FRR or RegTech data management and analysis, data mapping and data visualization, are helping financial institutions get out in front of the stringent reporting requirements at last and integrate their finance, risk and regulatory reporting duties more fully, easily and creatively.

A note of caution though: While new technologies and new thinking about how to employ them will present opportunities to eliminate weaknesses that are likely to have crept into the current architecture, ferreting out those shortcomings may be tricky because some of them will be so ingrained and pervasive as to be barely recognizable. But it will have to be done to make the most of the systems intended to improve or replace existing ones.

Just what a solution should encompass to enable a firm to meet its data management objectives depends on the

  • specifics of its business, including its size and product lines,
  • the jurisdictions in which it operates,
  • its IT budget
  • and the tech it has in place already.

But it should accomplish three main goals:

  1. Improving data lineage by establishing a trail for each piece of information at every stage of processing (a minimal illustration follows this list)
  2. Providing a user-friendly view of the different processing steps to foster transparency
  3. Working seamlessly with legacy systems, so that implementation takes less time and money and imposes less of a burden on employees
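
As a minimal sketch of what the first goal could look like in practice (the class and step names are illustrative, not taken from any particular product), each derived figure can carry the trail of processing steps that produced it:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TracedValue:
    """A figure together with the trail of processing steps that produced it."""
    value: float
    lineage: List[str] = field(default_factory=list)

    def apply(self, step_name: str, fn: Callable[[float], float]) -> "TracedValue":
        # Record the step alongside the transformation itself.
        return TracedValue(fn(self.value), self.lineage + [step_name])

# Example: a reported exposure traceable back through each stage of processing.
raw = TracedValue(1_250_000.0, ["extracted from loan subledger"])
reported = (raw
            .apply("converted EUR to USD at 1.08", lambda v: v * 1.08)
            .apply("added accrued interest of 4,500", lambda v: v + 4_500)
            .apply("rounded for regulatory template", round))
print(reported.value, reported.lineage)
```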

The two great trends in financial supervision – the rapid rise in data management and reporting requirements, and the demands for greater organizational integration – can be attributed to a single culprit: the lingering silo structure. Fragmentation continues to be supported by such factors as a failure to integrate the systems of component businesses after a merger and the tendency of some firms to find it more sensible, even if it may be more costly and less efficient in the long run, to install new hardware and software whenever a new set of rules comes along. That makes regulators – the people pressing institutions to break down silos in the first place – inadvertently responsible for erecting new barriers.

This bunker mentality – an entrenched system of entrenchment – made it impossible to recognize the massive buildup of credit difficulties that resulted in the global crisis. It took a series of interrelated events to spark the wave of losses and insolvencies that all but brought down the financial system. Each of them might have appeared benign or perhaps ominous but containable when taken individually, and so the occupants of each silo, who could only see a limited number of the warning signs, were oblivious to the extent of the danger. More than a decade has passed since the crisis began, and many new supervisory regimens have been introduced in its aftermath. Yet bankers, regulators and lawmakers still feel the need, with justification, to press institutions to implement greater organizational integration to try to forestall the next meltdown. That shows how deeply embedded the silo system is in the industry.

Data requirements for the development that, knock on wood, will limit the damage from the next crisis – determining what will happen, rather than identifying and explaining what has already happened – are enormous. The same goes for running an institution in a more integrated way. It’s not just more data that’s needed, but more kinds of data and more reliable data. A holistic, coordinated organizational structure, moreover, demands that data be analyzed at a higher level to reconcile the massive quantities and types of information produced within each department. And institutions must do more than compile and sort through all that data. They have to report it to authorities – often quarterly or monthly, sometimes daily and always when something is flagged that could become a problem. Indeed, some data needs to be reported in real time. That is a nearly impossible task for a firm still dominated by silos and highlights the need for genuinely new design and implementation methods that facilitate the seamless integration of finance, risk and regulatory reporting functions. Among the more data-intensive regulatory frameworks introduced or enhanced in recent years are:

  • IFRS 9 Financial Instruments and Current Expected Credit Loss. The respective protocols of the International Accounting Standards Board and the Financial Accounting Standards Board may provide the best examples of the forward-thinking approach – and the rigorous reporting, data management and compliance procedures – being demanded. The standards call for firms to forecast credit impairments to assets on their books in near real time. The incurred-loss model being replaced merely had banks present bad news after the fact. The number of variables required to make useful forecasts, plus the need for perpetually running estimates that hardly allow a chance to take a breath, make these standards some of the most data-heavy exercises of all (a simplified expected-loss calculation is sketched after this list).
  • Stress tests here, there and everywhere. Whether for the Federal Reserve’s Comprehensive Capital Analysis and Review (CCAR) for banks operating in the United States, the Firm Data Submission Framework (FDSF) in Britain, or the Asset Quality Reviews conducted by the European Banking Authority (EBA) for institutions in the euro zone, stress testing has become more frequent and more free-form, too, with firms encouraged to create stress scenarios they believe fit their risk profiles and the characteristics of their markets. Indeed, the EBA is implementing a policy calling on banks to conduct stress tests as an ongoing risk management procedure and not merely an assessment of conditions at certain discrete moments.
  • Dodd-Frank Wall Street Reform and Consumer Protection Act. The American law expands stress testing to smaller institutions that escape the CCAR. The act also features extensive compliance and reporting procedures for swaps and other over-the-counter derivative contracts.
  • European Market Infrastructure Regulation. Although less broad in scope than Dodd-Frank, EMIR has similar reporting requirements for European institutions regarding OTC derivatives.
  • AnaCredit, Becris and FR Y-14. The European Central Bank project, known formally as the Analytical Credit Dataset, and its Federal Reserve equivalent for American banks, respectively, introduce a step change in the amount and granularity of data that needs to be reported. Information on loans and counterparties must be reported contract by contract under AnaCredit, for example. Adding to the complication and the data demands, the European framework permits national variations, including some with particularly rigorous requirements, such as the Belgian Extended Credit Risk Information System (Becris).
  • MAS 610. The core set of returns that banks file to the Monetary Authority of Singapore are being revised to require information at a far more granular level beginning next year. The number of data elements that firms have to report will rise from about 4,000 to about 300,000.
  • Economic and Financial Statistics (EFS) Review. The Australian Prudential Regulation Authority’s EFS Review constitutes a wide-ranging update to the regulator’s statistical data collection demands. The sweeping changes include requests for more granular data and new forms in what would be a three-phase implementation spanning two years, with parallel and trial periods running through 2019 and beyond.
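
To give a sense of why the expected-loss approach is so data-hungry, here is a deliberately simplified, illustrative calculation using the standard building blocks of probability of default (PD), loss given default (LGD) and exposure at default (EAD); real IFRS 9 and CECL models are far richer, with staging rules, forward-looking scenarios and lifetime horizons:

```python
# Simplified expected credit loss for a small portfolio:
# ECL = PD x LGD x EAD per exposure, summed across the book.
portfolio = [
    # (probability of default, loss given default, exposure at default)
    (0.02, 0.45, 1_000_000),   # corporate loan
    (0.05, 0.60,   250_000),   # SME loan
    (0.10, 0.80,    50_000),   # unsecured retail loan
]

ecl = sum(pd_ * lgd * ead for pd_, lgd, ead in portfolio)
print(f"Portfolio expected credit loss: {ecl:,.0f}")   # -> 20,500
```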

All of those authorities, all over the world, requiring that much more information present a daunting challenge, but they aren’t the only ones demanding that finance, risk and regulatory reporting staffs raise their games. Boards, senior executives and the real bosses – shareholders – have more stringent requirements of their own for profitability, capital efficiency, safety and competitiveness. Firms need to develop more effective data management and analysis in this cause, too.

The critical role of data management was emphasized and codified in Document 239 of the Basel Committee on Banking Supervision (BCBS), “Principles for Effective Risk Data Aggregation and Risk Reporting.” PERDARR, as it has come to be called in the industry, assigns data management a central position in the global supervisory architecture, and the influence of the 2013 paper can be seen in mandates far and wide. BCBS 239 explicitly linked a bank’s ability to gauge and manage risk with its ability to function as an integrated, cooperative unit rather than a collection of semiautonomous fiefdoms. The process of managing and reporting data, the document makes clear, enforces the link and binds holistic risk assessment to holistic operating practices. The Basel committee’s chief aim was to make sure that institutions got the big picture of their risk profile so as to reveal unhealthy concentrations of exposure that might be obscured by focusing on risk segment by segment. Just in case that idea might escape some executive’s notice, the document mentions the word “aggregate,” in one form or another, 86 times in the 89 ideas, observations, rules and principles it sets forth.

The importance of aggregating risks, and having data management and reporting capabilities that allow firms to do it, is spelled out in the first of these: ‘One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks’ information technology (IT) and data architectures were inadequate to support the broad management of financial risks. Many banks lacked the ability to aggregate risk exposures and identify concentrations quickly and accurately at the bank group level, across business lines and between legal entities. Some banks were unable to manage their risks properly because of weak risk data aggregation capabilities and risk reporting practices. This had severe consequences to the banks themselves and to the stability of the financial system as a whole.’
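
A minimal sketch (with invented exposure records) of the kind of group-level aggregation the principle describes, rolling exposures up across business lines and legal entities so that concentrations become visible:

```python
import pandas as pd

# Hypothetical exposures scattered across legal entities and business lines.
exposures = pd.DataFrame({
    "counterparty":  ["AlphaCorp", "AlphaCorp", "BetaBank", "AlphaCorp"],
    "legal_entity":  ["Bank US",   "Bank UK",   "Bank US",  "Bank SG"],
    "business_line": ["Lending",   "Markets",   "Markets",  "Lending"],
    "exposure":      [40_000_000,  25_000_000,  30_000_000, 20_000_000],
})

# Group-level view: total exposure per counterparty, across every silo.
print(exposures.groupby("counterparty")["exposure"].sum().sort_values(ascending=False))
# AlphaCorp's 85m concentration is invisible to any single entity or desk on its own.
```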

If risk data management was an idea whose time had come when BCBS 239 was published five years ago, then RegTech should have been the means to implement it. RegTech was being touted even then, or soon after, as a set of solutions that would allow banks to increase the quantity and quality of the data they generate, in part because RegTech itself was quantitatively and qualitatively ahead of the hardware and software with which the industry had been making do.

There was just one ironic problem: many of the RegTech solutions on the market at the time were highly specialized and localized products and services from small providers. That encouraged financial institutions to approach data management deficiencies gap by gap, project by project, perpetuating the compartmentalized, siloed thinking that was the scourge of regulators and banks alike after the global crisis. The one-problem-at-a-time approach also displayed to full effect another deficiency of silos: a tendency for work to be duplicated, with several departments each producing the same information, often in different ways and with different results. That is expensive and time consuming, of course, and the inconsistencies that are likely to crop up make the data untrustworthy for regulators and for executives within the firm who are counting on it.

Probably the most critical feature of a well thought-out solution is a dedicated, focused and central FRR data warehouse that can chisel away at the barriers between functions, even at institutions that have been slow to abandon a siloed organizational structure reinforced with legacy systems.

[Figure: FRR data warehouse architecture, with:]

  • E: Extract
  • L: Load
  • T: Transform Structures
  • C: Calculations
  • A: Aggregation
  • P: Presentation
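
As a purely illustrative sketch of how these stages might chain together (the function names and figures are invented and do not describe the vendor's product), a single measure could flow through the E-L-T-C-A-P sequence like this:

```python
# A toy end-to-end pass through the E-L-T-C-A-P stages for one measure.

def extract():                       # E: pull raw records from source systems
    return [{"desk": "Rates", "notional": 100.0, "rate": 0.04},
            {"desk": "Rates", "notional": 250.0, "rate": 0.03},
            {"desk": "FX",    "notional": 180.0, "rate": 0.05}]

def load(records):                   # L: land the records in the central FRR store
    return list(records)             # stand-in for a database write

def transform(records):              # T: normalize into the common structure
    return [{**r, "desk": r["desk"].upper()} for r in records]

def calculate(records):              # C: derive the measure of interest
    return [{**r, "interest": r["notional"] * r["rate"]} for r in records]

def aggregate(records):              # A: roll up by desk
    totals = {}
    for r in records:
        totals[r["desk"]] = totals.get(r["desk"], 0.0) + r["interest"]
    return totals

def present(totals):                 # P: format for a report or regulatory template
    return "\n".join(f"{desk}: {amount:.2f}" for desk, amount in sorted(totals.items()))

print(present(aggregate(calculate(transform(load(extract()))))))
```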

 

Click here to access Wolters Kluwer’s White Paper

 

 

Front Office Risk Management Technology

A complex tangle of embedded components

Over the past three decades, Front Office Risk Management (FORM) has developed in a piecemeal way. As a result of historical business drivers and the varying needs of teams focused on different products within banks, FORM systems were created for individual business silos, products and trading desks. Typically, different risk components and systems were entwined and embedded within trading systems and transaction processing platforms, and ran on different analytics, trade capture and data management technology. As a result, many banks now have multiple, varied and overlapping FORM systems.

Increasingly, however, FORM systems are emerging as a fully fledged risk solution category, rather than remaining as embedded components inside trading systems or transactional platforms (although those components still exist). For many institutions, FORM, along with the front-office operating environment, has fundamentally changed following the global financial crisis of 2008. Banks are now dealing with a wider environment of systemically reduced profitability in which cluttered and inefficient operating models are no longer sustainable, and there are strong cost pressures for them to simplify their operations.

Equally, a more stringent and prescriptive regulatory environment is having significant direct and indirect impacts on front-office risk technology. Because of regulators’ intense scrutiny of banks’ capital management, the front office is continuously and far more acutely aware of its capital usage (and cost), and this is having a fundamental impact on the way the systems it uses are evolving. The imperative for risk-adjusted pricing means that traditional trading systems are struggling to cope with the growing importance of, and demand for, Valuation Adjustment (xVA) systems at scale. Meanwhile, regulations such as the Fundamental Review of the Trading Book (FRTB) will have profound implications for front-office risk systems.
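
To illustrate why xVA is so computationally demanding at scale, here is a highly simplified, illustrative unilateral CVA calculation for a single netting set (the exposure and default figures are invented; production systems revalue whole portfolios across thousands of simulated paths and time steps to obtain them):

```python
# Simplified unilateral CVA: (1 - recovery) x sum over time buckets of
# discount factor x expected positive exposure x marginal default probability.
recovery       = 0.40
expected_expos = [1.2e6, 1.5e6, 1.1e6, 0.8e6]   # expected positive exposure per bucket
default_probs  = [0.004, 0.005, 0.006, 0.007]   # marginal PD in each bucket
discount       = [0.99, 0.97, 0.95, 0.93]       # discount factors per bucket

cva = (1 - recovery) * sum(df * ee * pd_
                           for df, ee, pd_ in zip(discount, expected_expos, default_probs))
print(f"CVA charge: {cva:,.0f}")   # -> 14,103
```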

As a result of these direct and indirect regulatory pressures, several factors are changing the front-office risk technology landscape:

  • The scale and complexity involved in data management.
  • Requirements for more computational power.
  • The imperative for integration and consistency with middle-office risk systems.

Evolving to survive

As banks recognize the need for change, FORM is slowly but steadily evolving. Banks can no longer put off upgrades to systems that were built for a different era, and consensus around the need for a flexible, cross-asset, externalized front-office risk system has emerged.

Over the past few years, most Tier 1 and Tier 2 banks have started working toward the difficult goal of

  • standardizing,
  • consolidating
  • and externalizing

their risk systems, extracting them from trading and transaction processing platforms (if that’s where they existed). These efforts are complicated by the nature of FORM – specifically that it cuts across several functional areas.

Vendors, meanwhile, are struggling with the challenges of meeting the often contradictory nature of front-office demands (such as the need for flexibility vs. scalability). As the front-office risk landscape shifts under the weight of all these demand-side changes, many leading vendors have been slow to adapt to the significant competitive challenges. Not only are they dealing with competition from new market entrants with different business models; in many instances they are also playing catch-up with more innovative Tier 1 banks. What’s more, the willingness to experiment and innovate with front-office risk systems is now filtering down to Tier 2s and smaller institutions across the board. Chartis is seeing an increase in ‘build and buy’ hybrid solutions that leverage open-source and open HPC (high-performance computing) infrastructure.

The rapid development of new technologies is radically altering the dynamics of the market through several developments:

  • A wave of new, more focused tools.
  • Platforms that leverage popular computational paradigms.
  • Software as a Service (SaaS) risk systems.

More often than not, incumbent vendors are failing to harness the opportunities that these technologies and new open-source languages bring, increasing the risk that they could become irrelevant within the FORM sector. Chartis contends that, as the market develops, the future landscape will be dominated by a combination of agile new entrants and existing players that can successfully transform their current offerings. Many different vendor strategies are in evidence, but the evolution required for vendors to survive and flourish has only just begun.

With that in mind, we have outlined several recommendations for vendors seeking to stay relevant in the new front-office risk environment:

  • Above all, focus on an open, flexible environment.
  • Create consistent risk data and risk factor frameworks.
  • Develop highly standardized interfaces.
  • Develop matrices and arrays as ‘first-class constructs’.
  • Embrace open-source languages and ecosystems.
  • Consider options such as partnerships and acquisitions to acquire the requisite new skills and technology capabilities in a relatively short period of time.


Click here to access Chartis’ Vendor Spotlight Report