News

EIOPA: Potential macroprudential tools and measures to enhance the current insurance regulatory framework

In 2017, the European Insurance and Occupational Pensions Authority (EIOPA) began publishing a series of papers on systemic risk and macroprudential policy in insurance. So far, most discussions of macroprudential policy have focused on the banking sector. EIOPA's aim is to contribute to the debate, whilst taking into consideration the specific nature of the insurance business.

With this purpose, EIOPA has followed a step-by-step approach, seeking to address the following questions:

  • Does insurance create or amplify systemic risk?
  • If yes, what tools already exist in the current framework, and how do they contribute to mitigating the sources of systemic risk?
  • Are other tools needed and, if yes, which ones could be promoted?

While the first two questions were addressed in previous papers, the purpose of the present paper is to identify, classify and provide a preliminary assessment of potential additional tools and measures to enhance the current framework in the EU from a macroprudential perspective.

EIOPA carried out an analysis focusing on four categories of tools:

  1. Capital and reserving-based tools;
  2. Liquidity-based tools;
  3. Exposure-based tools; and
  4. Pre-emptive planning.

EIOPA also considers whether the tools should be used for enhanced reporting and monitoring or as intervention powers. Following this preliminary analysis, EIOPA concludes the following (Table 1):

Table 1 Macro

It is important to stress that the paper essentially focuses on whether a specific instrument should or should not be further considered. This is an important aspect in light of future work in the context of the Solvency II review. As such, this work should be understood as a first step of the process and not as a formal proposal yet. Furthermore, EIOPA is aware that the implementation of tools also has important challenges. In this respect this report provides an overview of tools, main conclusions and observations, stressing also the main challenges.

Table 2 puts together the findings of all three papers published by EIOPA by linking

  1. sources of systemic risk and operational objectives (first paper),
  2. tools already available in the current framework (second paper)
  3. and other potential tools and measures to be further considered (current paper).

Table 2 Papers

The first paper, ‘Systemic risk and macroprudential policy in insurance’, aimed at identifying and analysing the sources of systemic risk in insurance from a conceptual point of view and at developing a macroprudential framework specifically designed for the insurance sector.

The second paper, ‘Solvency II tools with macroprudential impact’, identified, classified and provided a preliminary assessment of the tools or measures already existing within the Solvency II framework, which could mitigate any of the sources of systemic risk.

This third paper carries out an initial assessment of potential tools or measures to be included in a macroprudential framework designed for insurers, in order to mitigate the sources of systemic risk and contribute to the achievement of the operational objectives.

It covers six main issues:

  1. Identification of potential new instruments/measures. The tools will be grouped according to the following blocks:
    • Capital and reserving-based tools
    • Liquidity-based tools
    • Exposure-based tools
    • Pre-emptive planning
  2. Way in which the tools in each block contribute to achieving one or more of the operational objectives identified in previous papers.
  3. Interaction with Solvency II.
  4. Individual description of all the tools identified for each of the blocks. The following classification will be considered:
    • Enhanced reporting and monitoring tools and measures. They provide supervisors and other authorities with additional relevant information about potential risks and vulnerabilities that are or could be building up in the system. Authorities could then implement an array of measures to address them both at micro and macroprudential level (see annex for an inventory of powers potentially available to national supervisory authorities (NSAs)).
    • Intervention powers. These powers are currently not available as macroprudential tools. They are more intrusive and intervene more severely in the management of the companies. Examples could be additional buffers, limits or restrictions. They are only justified where the existing measures may not suffice to address the sources of systemic risk identified.
  5. Preliminary analysis per tool.
  6. Preliminary conclusion.

Four initial remarks should be made.

  1. First, although in several instances the measures and instruments are originally microprudential in nature, they could also be implemented as macroprudential instruments when they target a systemically important institution, a set of institutions, or the whole market.
  2. Secondly, although out of the scope of this paper, analysing potential changes to the long-term guarantees (LTG) measures and the measures on equity risk introduced in the Solvency II directive could further enhance the framework from a macroprudential perspective. The focus of this paper is essentially on new tools, leaving aside the analysis of potential changes to the current LTG measures and measures on equity risk, which will be carried out in the context of the Solvency II review by 1 January 2021.
  3. Thirdly, when used as a macroprudential tool, the decision process may differ, given that there are different institutional models for the implementation of macroprudential policies across EU countries, in some cases involving different parties (e.g. ministries, supervisors, etc.). This paper seeks to adopt a neutral approach by referring to the concept of the ‘relevant authority in charge of macroprudential policy’, which should encompass the different institutional models existing across jurisdictions.
  4. Fourthly, there seems to be no single solution when it comes to the level of application of each tool (single vs. group level).

Concerning the different proposed monitoring tools, in the follow-up work, the structure and content of the additional data requirements should be defined. This should then be followed by an assessment of the potential burden of collecting this information from undertakings.


Figure ORSA

Click here to access EIOPA’s detailed discussion paper

Failures and near misses in insurance – Overview of the causes and early identification

General approach

The approach to dealing with failures of financial institutions has witnessed significant changes since the eruption of the financial crisis in 2008, both from the crisis prevention and the crisis management perspective. A changing perspective in the interpretation of the causes, early identification and corrective measures used in the context of (near) failures may create difficulties when trying to compare past failures with current ones, particularly with the advent of recovery and resolution frameworks in finance.

EIOPA has developed its own conceptual approach, which is followed throughout this report. It should be stressed that there is no universally agreed conceptual approach. The aim of the present chapter is to explain the approach followed by EIOPA, in order to achieve a common understanding and support the classification of the different cases of insurance failures and near misses.

This chapter focuses on the following two issues:

  • The definition of the concepts of “failure” and “near miss”, which are essential to understanding the database construction process and the scope of the cases to be included.
  • The need to have a common understanding of the framework for crisis prevention and management, as well as the recovery and resolution tools to be used.

In terms of crisis prevention and management, the fundamental approach followed by EIOPA can be understood as part of a continuum of supervisory activities. Illustration 1 below summarizes the whole process: During business as usual, and in the normal stages of supervision, an initial problem can be identified, and insurers may seek to implement measures to overcome the problem. Supervisors would, in turn, normally intensify supervision and follow up more closely on developments at the insurer. Should the initial problem become a real financial threat (e.g. being in breach of, or about to breach, solvency capital requirements) the insurer enters into a new stage, which is linked to an increased risk of failure, i.e. a near miss situation. In this context, the insurer should trigger certain recovery actions to restore its financial position, while supervisors can intervene more intrusively. In general, there should be a reasonable prospect of recovery if effective and credible measures are implemented. Nevertheless, if the situation of distress is extremely severe and the measures taken do not yield the expected results, the insurer enters into resolution.

Eventually, the insurer (or parts of it) is (are) wound-up and exits the market.

EIOPA - Resolution

Near miss

In the context of this report, a near miss is defined as a case where an insurer faces specific financial difficulties (for example, when the solvency requirements are breached or likely to be breached) and the supervisor feels it necessary to intervene or to place the insurer under some form of special measures.

The elements to identify a near miss are the following:

  • The insurer is still in operation under its original form;
  • Nevertheless, it is subject to severe financial distress, to an extent that the supervisory authority deems it necessary to intervene; and
  • In the absence of this intervention, the insurer will not survive in its current form and may eventually go into resolution or be wound-up.

Underlying this is the idea that the measures taken succeed. As such, a near miss should not involve public money or losses for policyholders.

In other words, a near miss presupposes that the supervisory intervention, either directly (e.g. replacing the management) or indirectly (e.g. request for an increase in capital), contributed in a clear way to overcome the insurer’s financial distress and bring it back to a “business-as-usual” environment. Shareholders generally keep their rights and could potentially oppose any of the measures undertaken.

On a day-to-day basis, insurers and NSAs might have to take different actions that require a certain degree of coordination. A “near miss” in the sense described in this report should be distinguished from these types of situations. Near misses only refer to cases where severe problems were detected or reported and supervisory measures were necessary to ensure the viability of the insurer.

Near misses actually constitute an area of particular interest for this report. In effect, their correct reporting and analysis would allow valuable lessons to be learned from successfully managed distress situations – prospective failure of an insurer and supervisory actions that permitted recovery.

Insurance failure

A failure, for the purposes of the present database, exists from the moment when an insurer is no longer viable or likely to be no longer viable, and has no reasonable prospect of becoming so.

The processes of winding-up/liquidation, which are usually initiated after insolvency, either on a balance sheet basis (the insurer’s liabilities are greater than its assets) or cash-flow basis (the insurer is unable to pay its debts as they fall due), are also encompassed within the definition of failure for the purposes of the database. Failure is thus triggered by “non-viability”.
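The two insolvency tests above lend themselves to a simple illustration. The following is a minimal sketch, not part of the EIOPA report; the function names and figures are hypothetical and serve only to make the balance-sheet and cash-flow conditions concrete:

```python
def balance_sheet_insolvent(assets: float, liabilities: float) -> bool:
    """Balance-sheet test: the insurer's liabilities exceed its assets."""
    return liabilities > assets

def cash_flow_insolvent(cash_available: float, debts_due: float) -> bool:
    """Cash-flow test: the insurer cannot pay its debts as they fall due."""
    return cash_available < debts_due

def is_failure(assets: float, liabilities: float,
               cash_available: float, debts_due: float) -> bool:
    """For the purposes of the database, either insolvency test
    triggers the 'non-viability' that defines a failure."""
    return (balance_sheet_insolvent(assets, liabilities)
            or cash_flow_insolvent(cash_available, debts_due))

# Hypothetical case: assets 90, liabilities 100 -> balance-sheet insolvent
print(is_failure(90, 100, 20, 10))  # True
```

Note that an insurer can fail either test independently: a solvent balance sheet does not rule out a cash-flow failure on debts falling due.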

The failed insurer ceases to operate in its current form. Shareholders generally lose some or all of their rights and cannot oppose the measures taken by the authority in charge of resolution, which has formally taken over the reins from the supervisory authority.

For classification purposes, any case is considered as a failure (regardless of the final result of the intervention) when:

  • Private external support (e.g. by means of an insurance guarantee system (IGS)) has been received.
  • Public funds by taxpayers were needed for policyholders’ protection or financial stability reasons.
  • Policyholders have suffered any type of loss, be it in financial terms or in a deterioration of their insurance coverage.

The following are examples of resolution tools that may be used by authorities in a case of failure:

  • Sale of all or part of the insurer’s business to a private purchaser. A particular case is the transfer of an insurer’s portfolio, moving all or part of its business to another insurer without the consent of each and every policyholder.
  • Discontinue the writing of new business and continue administering the existing contractual policy obligations for in-force business (run-off).
  • Set up a bridge institution as a temporary public entity to which all or part of the business of the insurer is transferred in order to preserve its critical functions.
  • Separate toxic assets from good assets by establishing an asset management vehicle (i.e. a “bad insurer”, similar to the concept used in banking) wholly owned by one or more public authorities, for managing and running down those assets in an orderly manner.
  • Restructure, limit or write down liabilities (including insurance and reinsurance liabilities) and allocate losses following the hierarchy of claims. This also includes the bail-in of liabilities, whereby they are converted into equity.

  • Closure and orderly liquidation of the whole or part of a failing insurer.
  • Withdrawal of authorisation.
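The "allocate losses following the hierarchy of claims" tool above is essentially a waterfall: the most junior class absorbs losses first and is exhausted before the next more senior class is touched. The sketch below is an illustration of that general mechanism, not the report's methodology; the class names and amounts are hypothetical:

```python
def allocate_losses(total_loss: float, claims: list) -> tuple:
    """Allocate a loss across creditor classes following a hierarchy of
    claims. `claims` is a list of (class_name, amount) pairs ordered
    from most senior to most junior. Losses are absorbed bottom-up:
    each class is written down fully before the next senior class.
    Returns (write_downs per class, unallocated shortfall)."""
    write_downs = {}
    remaining = total_loss
    for name, amount in reversed(claims):  # start with the most junior class
        hit = min(amount, remaining)
        write_downs[name] = hit
        remaining -= hit
    return write_downs, remaining

# Hypothetical hierarchy, most senior first
hierarchy = [("policyholders", 500), ("senior debt", 200), ("equity", 100)]
wd, shortfall = allocate_losses(250, hierarchy)
# equity absorbs 100, senior debt 150, policyholders 0
```

A bail-in would work on the same ordering, converting the written-down debt claims into equity instead of extinguishing them outright.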

Lastly, it should be mentioned that the flow of events shown in Illustration 1 does not necessarily take place in a sequential way. For example, there could be cases in which an insurer goes directly into resolution. Thus, what is relevant for the classification of a particular case is whether the insurer recovers (which would then be considered as a near miss, or as a case of resolution/return to the market if some kind of resolution action/tool is used) or has to be fully resolved and/or liquidated.

EIOPA - Sharma Risks

Click here to access EIOPA’s detailed report

The Future of Planning Budgeting and Forecasting

The world of planning, budgeting and forecasting is changing rapidly as new technologies emerge, but the actual pace of change within the finance departments of most organizations is rather more sluggish. The progress companies have made in the year since The Future of Planning, Budgeting and Forecasting 2016 has been incremental, with a little accuracy gained but very little change to the reliance on insight-limiting technologies like spreadsheets.

That said, CFOs and senior finance executives are beginning to recognize the factors that contribute to forecasting excellence, and there is a groundswell of support for change. They’ll even make time to do it, and we all know how precious a CFO’s time can be, especially when basic improvements like automation and standardization haven’t yet been implemented.

The survey shows that most PBF functions are still using relatively basic tools, but it also highlights the positive difference more advanced technology like visualization techniques and charting can make to forecasting outcomes. For the early adopters of even more experimental technologies like machine learning and artificial intelligence, there is some benefit to being at the forefront of technological change. But the survey suggests that there is still some way to go before machines take over the planning, budgeting and forecasting function.

In the meantime, senior finance executives who are already delivering a respected, inclusive and strategic PBF service need to focus on becoming more insightful, which means using smart technologies in concert with non-financial data to deliver accurate, timely, long term forecasts that add real value to the business.

Making headway

CFOs are making incremental headway in improving their planning, budgeting and forecasting processes, reforecasting more frequently to improve accuracy. But spreadsheet use remains a substantial drag on process improvements, despite organizations increasingly looking towards new technologies to progress the PBF landscape.

That said, respondents seem open to change, recognizing the importance of financial planning and analysis as a separate discipline, which will help channel resources in that direction. At the moment, a slow and steady approach is enough to remain competitive, but as more companies make increasingly substantial changes to their PBF processes to generate better insight, those that fail to speed up will find they fall behind.

Leading the debate

FSN’s insights gleaned from across the finance function shed light on the changes happening within the planning, budgeting and forecasting function, and identify the processes that make a real difference to outcomes. Senior finance executives are taking heed of these insights and making changes within the finance function. The most important one is the increasing inclusion of non-financial data into forecasting and planning processes. The Future of The Finance Function 2016 identified this as a game-changer, for the finance function as a whole, and for PBF in particular. It is starting to happen now. Companies are looking towards data from functions outside of finance, like customer relationship management systems and other non-financial data sources.

Senior executives are also finally recognizing the importance of automation and standardization as the key to building a strong PBF foundation. Last year it languished near the bottom of CFOs’ priority lists, but now it is at the top. With the right foundation, PBF can start to take advantage of the new technology that will improve forecasting outcomes, particularly in the cloud.

There is increasing maturity in the recognition of cloud solution benefits, beyond just cost, towards agility and scalability. With recognition comes implementation, and it is hoped that uptake of these technologies will follow with greater momentum.

Man vs machine

Cloud computing has enabled the growth of machine learning and artificial intelligence solutions, and we see these being embedded into our daily lives, in our cars, personal digital assistants and home appliances. In the workplace, machine learning tools are being used for

  • predictive maintenance,
  • fraud detection,
  • customer personalization
  • and automating finance processes.

In the planning, budgeting and forecasting function, machine learning tools can take data over time, apply parameters to the analysis, and then learn from the outcomes to improve forecasts.
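That loop — take data over time, apply parameters, learn from the outcomes — can be made concrete with even a very simple model. The sketch below uses exponential smoothing with the smoothing parameter "learned" by minimizing past one-step-ahead forecast errors; it is an illustrative toy, not any tool referenced in the survey, and the revenue figures are hypothetical:

```python
def ses_forecast(series: list, alpha: float) -> float:
    """Simple exponential smoothing: one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def forecast_error(series: list, alpha: float) -> float:
    """Sum of squared one-step-ahead errors over the history."""
    level = series[0]
    err = 0.0
    for x in series[1:]:
        err += (x - level) ** 2          # error of the forecast made so far
        level = alpha * x + (1 - alpha) * level
    return err

def learn_alpha(series: list) -> float:
    """'Learn from outcomes': pick the parameter that would have
    produced the smallest forecast errors on past data."""
    grid = [i / 100 for i in range(1, 100)]
    return min(grid, key=lambda a: forecast_error(series, a))

revenue = [100, 104, 103, 108, 112, 111, 116]  # hypothetical monthly figures
alpha = learn_alpha(revenue)
next_month = ses_forecast(revenue, alpha)
```

Real machine-learning forecasters generalize this idea: richer models, more parameters, and many more data sources, but the same principle of fitting parameters against historical forecast outcomes.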

On the face of it, machine learning appears to be a game changer, adding unbiased logic and immeasurable processing power to the forecasting process, but the survey doesn’t show a substantial improvement in forecasting outcomes for organizations that use experimental technologies like these. And the CFOs and senior finance executives who responded to the survey believe there are substantial limitations to the effectiveness of machine forecasts. As the technology matures, and finance functions become more integrated, machine learning will proliferate, but right now it remains the domain of early adopters.

Analytic tools

Many of the cloud solutions for planning, budgeting and forecasting involve advanced analytic tools, from visualization techniques to machine learning. Yet the majority of respondents still use basic spreadsheets, pivot tables and business intelligence tools to mine their data for forecasting insight. They need to upgrade their toolbox.

The survey identifies users of cutting edge visualization tools as the most effective forecasters. They are more likely to utilize specialist PBF systems, and have an arsenal of PBF technology they have prioritized for implementation in the next three years to improve their forecasts.

Even experimental organizations that aren’t yet able to harness the full power of machine learning and AI, are still generating better forecasts than the analytic novices.

The survey results are clear: advanced analytics must become the new baseline technology. It is no longer enough to rely on simple spreadsheets and pivot tables when your competitors are several steps ahead.

Insight – the top trump

But technology can’t operate in isolation. Cutting edge tools alone won’t provide the in-depth insight that is needed to properly compete against nimble start-ups. CFOs must ensure their PBF processes are inclusive, drawing input from outside the financial bubble to build a rounded view of the organization. This will engender respect for the PBF outcomes and align them with the strategic direction of the business.

Most importantly though, organizations need to promote an insightful planning, budgeting and forecasting function, by using advanced analytic techniques and tools, coupled with a broad data pool, to reveal unexpected insights and pathways that lead to better business performance.

As FSN stated, today’s finance organizations are looking to:

  • provide in-depth insights;
  • anticipate change; and
  • verify business opportunities before they become apparent to competitors.

But AI and machine learning technologies are still too immature. And spreadsheet-based processes don’t have the necessary functions to fill these advanced needs. While some might argue that spreadsheet-based processes could work for small businesses, they become unmanageable as companies grow.

PBF

Click here to access Wolters Kluwer’s FSN detailed survey report

The Innovation Game – How Data is Driving Digital Transformation

Technology waits for no one. And those who strike first will have an advantage. The steady decline in business profitability across multiple industries threatens to erode future investment, innovation and shareholder value. Fortunately, the emergence of artificial intelligence (AI) can help kick-start profitability. Accenture research shows that AI has the potential to boost rates of profitability by an average of 38 percent by 2035 and lead to an economic boost of US$14 trillion across 16 industries in 12 economies by 2035.

Driven by these economic forces, the age of digital transformation is in full swing. Today we can’t be “digital to the core” if we don’t leverage all new data sources – unstructured data, dark data and third-party sources. Similarly, we have to take advantage of the convergence of AI and analytics to uncover previously hidden insights. But, with the increasing use of AI, we also have to be responsible and take into account the social implications.

Finding answers to the biggest questions starts with data, and ensuring you are capitalizing on the vast data sources available within your own business. Thanks to the power of AI/machine learning and advanced algorithms, we have moved from the era of big data to the era of ALL data, and that is helping clients create a more holistic view of their customer and more operational efficiencies.

Embracing the convergence of AI and analytics is crucial to success in our digital transformation. Together,

  • AI-powered analytics unlock tremendous value from data that was previously hidden or unreachable,
  • changing the way we interact with people and technology,
  • improving the way we make decisions, and giving way to new agility and opportunities.

While businesses are still in the infancy of tapping into the vast potential of these combined technologies, now is the time to accelerate. But to thrive, we need to be pragmatic in finding the right skills and partners to guide our strategy.

Finally, whenever we envision the possibilities of AI, we should consider the responsibility that comes with it. Trust in the digital era or “responsible AI” cannot be overlooked. Explainable AI and AI transparency are critical, particularly in such areas as

  • financial services,
  • healthcare,
  • and life sciences.

The new imperative of our digital transformation is to balance intelligent technology and human ingenuity to innovate every facet of business and become a smarter enterprise.

The exponential growth of data underlying the strategic imperative of enterprise digital transformation has created new business opportunities along with tremendous challenges. Today, we see organizations of all shapes and sizes embarking on digital transformation. As uncovered in Corinium Digital’s research, the primary drivers of digital transformation are those businesses focused on addressing increasing customer expectations and implementing efficient internal processes.

Data is at the heart of this transformation and provides the fuel to generate meaningful insights. We have reached the tipping point where all businesses recognize they cannot compete in a digital age using analog-era legacy solutions and architectures. The winners in the next phase of business will be those enterprises that obtain a clear handle on the foundations of modern data management, specifically the nexus of

  • data quality,
  • cloud,
  • and artificial intelligence (AI).

While most enterprises have invested in on-premises data warehouses as the backbone of their analytic data management practices, many are shifting their new workloads to the cloud. The proliferation of new data types and sources is accelerating the development of data lakes with aspirations of gaining integrated analytics that can accelerate new business opportunities. We found in the research that over 60% of global enterprises are now investing in a hybrid, multi-cloud strategy with both data from cloud environments such as Microsoft Azure along with existing on-premises infrastructures. Hence, this hybrid, multi-cloud strategy will need to correlate with their investments in data analytics, and it will become imperative to manage data seamlessly across all platforms. At Paxata, our mission is to give everyone the power to intelligently profile and transform data into consumable information at the speed of thought – to empower everyone, not just technical users, to prepare their data and make it ready for analytics and decision making.

The first step in making this transition is to eliminate the bottlenecks of traditional IT-led data management practices through AI-powered automation.

Second, you need to apply modern data preparation and data quality principles and technology platforms to support both analytical and operational use cases.

Thirdly, you need a technology infrastructure that embraces the hybrid, multi-cloud world. Paxata sits right at the center stage of this new shift, helping enterprises profile and transform complex data types in high-variety, high-volume environments. As such, we’re excited about partnering with Accenture and Microsoft to accelerate businesses with our ability to deliver modern analytical and operational platforms to address today’s digital transformation requirements.

Artificial intelligence is causing two major revolutions simultaneously among developers and enterprises. These revolutions will drive the technology decisions for the next decade. Developers are massively embracing AI. As a platform company, Microsoft is focused on enabling developers to make the shift to the next app development pattern, driven by the intelligent cloud and intelligent edge.

AI is the runtime that will power the apps of the future. At the same time, enterprises are eager to adopt and integrate AI. Cloud and AI are the most requested topics in Microsoft Executive Briefing Centers. AI is changing how companies serve their customers, run their operations, and innovate.

Ultimately, every business process in every industry will be redefined in profound ways. If it used to be true that “software was eating the world,” it is now true to say that “AI is eating software”. A new competitive differentiator is emerging: how well an enterprise exploits AI to reinvent and accelerate its processes, value chain and business models. Enterprises need a strategic partner who can help them transform their organization with AI. Microsoft is emerging as a solid AI leader as it is in a unique position to address both revolutions. Our strength and differentiation lie in the combination of multiple assets:

  • Azure AI services that bring AI to every developer. Over one million developers are accessing our pre-built and customizable AI services. We have the most comprehensive solution for building bots, combined with a powerful platform for Custom AI development with Azure Machine Learning that spans the entire AI development lifecycle, and a market leading portfolio of pre-built cognitive services that can be readily attached to applications.
  • A unique cloud infrastructure including CPU, GPU, and soon FPGA, makes Azure the most reliable, scalable and fastest cloud to run AI workloads.
  • Unparalleled tools. Visual Studio, used by over 6 million developers, is the most preferred tool in the world for application development. Visual Studio and Visual Studio Code are powerful “front doors” through which to attract developers seeking to add AI to their applications.
  • Ability to add AI to the edge. We enable developers, through our tools and services, to develop an AI model and deploy that model anywhere. Through our support for ONNX – the open source representation for AI models in partnership with Facebook, Amazon, IBM and others – as well as for generic containers, we allow developers to run their models on the IoT edge and leverage the entire IoT solution from Azure.

But the competition to win enterprises is not only played on the platform battlefield; enterprises are also demanding solutions. Microsoft AI solutions provide turnkey implementations for customers who want to transform their core processes with AI. Our unique combination of IP and consulting services addresses common scenarios such as business agents, sales intelligence or marketing intelligence. As our solutions are built on top of our compelling AI platform, unlike our competitors’, our customers are not locked in to any one consulting provider; they remain in full control of their data and can extend the scenarios or target new scenarios themselves or through our rich partner ecosystem.

AI Analytics

Click here to access Corinium’s White Paper

RPA – A programmatic approach to intelligent automation to scale growth, manage risk, and drive enterprise value

Business leaders and chief information officers around the world are jumping on the robotic process automation (RPA) pilot bandwagon to start their companies on the automation journey. Some RPA pilots are evaluating software designed to stitch together known technology concepts—such as screen scraping and macrobased automation—through user-friendly tools to take process automation to the next level. Other pilots are venturing into the use of machine learning and cognitive automation to unleash new business insights.

These pilots—or proof-of-concept programs—help leaders set a foundation for their understanding of RPA, while at the same time introducing new ideas for how automation can help scale operations or define new business strategies. Now the pilots have proved successful, and leaders are seeing the possibilities. So what happens next?

When performing RPA pilots, many companies get stuck in basic automation and stop there. Other companies have basic and cognitive automation pilots going on simultaneously.

Aligning the goals of basic RPA with cognitive computing and artificial intelligence can seem improbable. But are the objectives really that different? Leaders want to use all levels of automation to

  • drive business growth,
  • manage risk,
  • and increase value.

The trick is having a strategy for getting from pilot to program, and putting in place a comprehensive structure looking beyond the RPA pilots to intelligent automation (IA) as an across-the-board investment. This ensures IA ventures become more than speculation and remain significant to the business.

  • But how can leaders ensure that IA is more than a one-time cost play?
  • How are future automation opportunities identified and evaluated for both risk and benefit?
  • How is “electronic employee” service performance monitored?
  • How do leaders ensure the optimal mix of basic, enhanced, and cognitive automation?
  • How is business continuity maintained if the IA solution fails?
  • How are
    • system security,
    • change management,
    • system processing,
    • and authentication controls
    maintained as automation risk becomes more complex?
  • How will IA be used to transform the business?

Leaders know technology is changing rapidly, and IA is a moving target. Implementing a “bullet-proof” value-based program is critical to managing the automation revolution and ensuring it delivers positive business impacts over time. Robust program management balances risk and reward with structures driving sustainable IA value. An IA program model delivers these ideals.

An Intelligent Automation program can help enhance and expedite the implementation of IA throughout an organization. Here are four critical characteristics for success:

  1. It is strategically positioned – Positioning IA on par with other business strategies as integral to enterprise objectives is the best place to start. As with outsourcing (OS), these IA vendor dependencies are treated as strategic relationships. Global process owners (GPO) use IA to transform end-to-end services. Global teams engage in IA opportunity evaluation to ensure bad processes are not automated.
  2. It uses a “center of excellence” service model – Establishing a center of excellence (CoE) demonstrates a commitment to IA success. Focus drives effectiveness, and CoEs drive transparency to IA results. CoEs have varied formats (virtual, centralized, regional, etc.) and engage cross-functional teams. CoE governance guides IA strategy and validates results. Clarifying decision rights balances governance and operations accountabilities. Incorporating IA support roles (e.g., HR, IT Security, Internal Audit, risk) in decision-making ensures change integration is well managed.
  3. It has a robust delivery framework – Integrating technologies, toolkits, and tactics into IA program execution safeguards sustainability. Including relevant designers, IT professionals, and operations teams in testing makes sure solutions work. Socializing and managing life cycle compliance (e.g., intake, approvals, testing) ensures team interaction is clear. Program management, repository, and workflow tools make oversight effective.
  4. It incorporates a proactive risk management structure – Making IT risk and security control oversight a part of IA development ensures solutions are sound. Like any technology integration, change control is critical to implementation success. An IT security risk and control framework provides this support. Risk mitigation strategies linking security reviews to IA validation ensure business goals and technology risks are appropriately considered.

RPA

Click here to access KPMG’s detailed RPA report

Global Governance Insights on Emerging Risks

A HEIGHTENED FOCUS ON RESPONSE AND RECOVERY

Over a third of directors of US public companies now discuss cybersecurity at every board meeting. Cyber risks are being driven onto the agenda by

  • high-profile data breaches,
  • distributed denial of services (DDoS) attacks,
  • and rising ransomware and cyber extortion attacks.

The concern about cyber risks is justified. The annual economic cost of cyber-crime is estimated at US$1.5 trillion and only about 15% of that loss is currently covered by insurance.

MMC Global Risk Center conducted research and interviews with directors from WCD to understand the scope and depth of cyber risk management discussions in the boardroom. The risk of cyberattack is a constantly evolving threat and the interviews highlighted the rising focus on resilience and recovery in boardroom cyber discussions. Approaches to cyber risks are maturing as organizations recognize them as an enterprise business risk, not just an information technology (IT) problem.

However, board focus varies significantly across industries, geographies, organization size and regulatory context. For example, business executives ranked cyberattacks among the top five risks of doing business in the Asia Pacific region but Asian organizations take 1.7 times longer than the global median to discover a breach and spend on average 47% less on information security than North American firms.

REGULATION ON THE RISE

Tightening regulatory requirements for cybersecurity and breach notification across the globe such as

  • the EU GDPR,
  • China’s new Cyber Security Law,
  • and Australia’s Privacy Amendment,

are also propelling cyber onto the board agenda. Most recently, in February 2018, the US Securities and Exchange Commission (SEC) provided interpretive guidance to assist public companies in preparing disclosures about cybersecurity risks and incidents.

Regulations relating to transparency and notifications around cyber breaches drive greater discussion and awareness of cyber risks. Industries such as

  • financial services,
  • telecommunications
  • and utilities,

are subject to a large number of cyberattacks on a daily basis and have stringent regulatory requirements for cybersecurity.

Kris Manos, Director, KeyCorp, Columbia Forest Products, and Dexter Apache Holdings, observed, “The manufacturing sector is less advanced in addressing cyber threats; the NotPetya and WannaCry attacks flagged that sector’s vulnerability and have led to a greater focus in the boardroom.” For example, the virus forced a transportation company to shut down all of its communications with customers and also within the company. It took several weeks before business was back to normal, and the loss of business was estimated to have been as high as US$300 million. Overall, it is estimated that as a result of supply chain disruptions, consumer goods manufacturers, transport and logistics companies, pharmaceutical firms and utilities reportedly suffered, in aggregate, over US$1 billion in economic losses from the NotPetya attacks. Also, as Cristina Finocchi Mahne, Director, Inwit, Italiaonline, Banco Desio, Natuzzi and Trevi Group, noted, “The focus on cyber can vary across industries depending also on their perception of their own clients’ concerns regarding privacy and data breaches.”

LESSONS LEARNED: UPDATE RESPONSE PLANS AND EVALUATE THIRD-PARTY RISK

The high-profile cyberattacks in 2017, along with new and evolving ransomware onslaughts, were learning events for many organizations. Lessons included the need to establish relationships with organizations that can assist in the event of a cyberattack, such as

  • law enforcement,
  • regulatory agencies,
  • and recovery service providers, including forensic accountants and crisis management firms.

Many boards need to increase their focus on their organization’s cyber incident response plans. A recent global survey found that only 30% of companies have a cyber response plan and a survey by the National Association of Corporate Directors (NACD) suggests that only 60% of boards have reviewed their breach response plan over the past 12 months. Kris Manos noted, “[If an attack occurs,] it’s important to be able to quickly access a response plan. This also helps demonstrate that the organization was prepared to respond effectively.”

Experienced directors emphasized the need for effective response plans alongside robust cyber risk mitigation programs to ensure resilience, as well as operational and reputation recovery. As Jan Babiak, Director, Walgreens Boots Alliance, Euromoney Institutional Investor, and Bank of Montreal, stressed, “The importance of the ’respond and recover’ phase cannot be overstated, and this focus needs to rapidly improve.”

Directors need to review how the organization will communicate and report breaches. Response plans should include preliminary drafts of communications to all stakeholders including customers, suppliers, regulators, employees, the board, shareholders, and even the general public. The plan should also consider legal requirements around timelines to report breaches so the organization is not hit with financial penalties that can add to an already expensive and reputationally damaging situation. Finally, the response plan also needs to consider that normal methods of communication (websites, email, etc.) may be casualties of the breach. A cyber response plan housed only on the corporate network may be of little use in a ransomware attack.

Other lessons included the need to focus on cyber risks posed by third-party suppliers, vendors and other impacts throughout the supply chain. Shirley Daniel, Director, American Savings Bank, and Pacific Asian Management Institute, noted, “Such events highlight vulnerability beyond your organization’s control and are raising the focus on IT security throughout the supply chain.” Survey data suggests that about a third of organizations do not assess the cyber risk of vendors and suppliers. This is a critical area of focus as third-party service providers (e.g., software providers, cloud services providers, etc.) are increasingly embedded in value chains.

FRUSTRATIONS WITH OVERSIGHT

Most directors expressed frustrations and challenges with cyber risk oversight even though the topic is frequently on meeting agendas. Part of the challenge is that director-level cyber experts are thin on the ground; most boards have only one individual serving as the “tech” or “cyber” person. A Spencer Stuart survey found that 41% of respondents said their board had at least one director with cyber expertise, with an additional 7% who are in the process of recruiting one. Boards would benefit from the addition of experienced individuals who can identify the connections between cybersecurity and overall company strategy.

A crucial additional challenge is obtaining clarity on the organization’s overall cyber risk management framework. (See Exhibit 1: Boards Need More Information on Cyber Investments.) Olga Botero, Director, Evertec, Inc., and Founding Partner, C&S Customers and Strategy, observed, “There are still many questions unanswered for boards, including:

  • How good is our security program?
  • How do we compare to peers?

There is a big lack of benchmarking on practices.” Anastassia Lauterbach, Director, Dun & Bradstreet, and member of Evolution Partners Advisory Board, summarized it well, “Boards need a set of KPIs for cybersecurity highlighting their company’s

  • unique business model,
  • legacy IT,
  • supplier and partner relationships,
  • and geographical scope.”

CR Ex 1

Nearly a quarter of boards are dissatisfied with the quality of management-provided information related to cybersecurity because of insufficient transparency, inability to benchmark and difficulty of interpretation.

EFFECTIVE OVERSIGHT IS BUILT ON A COMPREHENSIVE CYBER RISK MANAGEMENT FRAMEWORK

Organizations are maturing from a “harden the shell” approach to a protocol based on understanding and protecting core assets and optimizing resources. This includes the application of risk disciplines to assess and manage risk, including quantification and analytics. (See Exhibit 2: Focus Areas of a Comprehensive Cyber Risk Management Framework.) Quantification shifts the conversation from a technical discussion about threat vectors and system vulnerabilities to one focused on maximizing the return on an organization’s cyber spending and lowering its total cost of risk.

CR Ex 2

Directors also emphasized the need to embed the process in an overall cyber risk management framework and culture. “The culture must emphasize openness and learning from mistakes. Culture and cyber risk oversight go hand in hand,” said Anastassia Lauterbach. Employees should be encouraged to flag and highlight potential cyber incidents, such as phishing attacks, as every employee plays a vital role in cyber risk management. Jan Babiak noted, “If every person in the organization doesn’t view themselves as a human firewall, you have a soft underbelly.” Mary Beth Vitale, Director, GEHA and CoBiz Financial, Inc., also noted, “Much of cyber risk mitigation is related to good housekeeping such as timely patching of servers and ongoing employee training and alertness.”

Boards also need to be alert. “Our board undertakes the same cybersecurity training as employees,” noted Wendy Webb, Director, ABM Industries. Other boards are putting cyber updates and visits to security centers on board “offsite” agendas.

THE ROLE OF CYBER INSURANCE

Although the perception of many directors is that cyber insurance provides for limited coverage, the insurance is increasingly viewed as an important component of a cyber risk management framework and can support response and recovery plans. Echoing this sentiment, Geeta Mathur, Director, Motherson Sumi Ltd, IIFL Holdings Ltd, and Tata Communication Transformation Services Ltd., commented, “There is a lack of information and discussion on risk transfer options at the board level. The perception is that it doesn’t cover much particularly relating to business interruption on account of cyber threats.” Cristina Finocchi Mahne also noted, “Currently, management teams may not have a positive awareness of cyber insurance, but we expect this to rapidly evolve over the short-term.”

Insurance does not release the board or management from the development and execution of a robust risk management plan but it can provide a financial safeguard against costs associated with a cyber event. Cyber insurance coverage should be considered in the context of an overall cyber risk management process and cyber risk appetite.

With a robust analysis, the organization can

  • quantify the price of cyber risk,
  • develop an effective risk mitigation, transfer, and risk financing strategy,
  • and decide if – and how much – cyber insurance to purchase.

This allows the board to have a robust conversation on the relationship between risk, reward and the cost of mitigation and can also prompt an evaluation of potential consequences by using statistical modeling to assess different damage scenarios.
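The statistical modeling of damage scenarios mentioned here can be illustrated with a small Monte Carlo sketch. The frequency and severity parameters below are purely hypothetical assumptions chosen for illustration, not figures from the report; a real quantification exercise would calibrate them to incident data and threat intelligence.

```python
import math
import random
import statistics

def simulate_annual_cyber_loss(n_trials=20_000, seed=42,
                               freq_mean=0.8,     # assumed incidents per year (Poisson)
                               sev_mu=13.0,       # assumed lognormal severity, log-scale mean
                               sev_sigma=1.5):    # assumed lognormal severity, log-scale std dev
    """Monte Carlo sketch: annual cyber loss = sum of per-incident severities."""
    rng = random.Random(seed)
    threshold = math.exp(-freq_mean)
    losses = []
    for _ in range(n_trials):
        # Knuth's inversion method for a Poisson-distributed incident count
        count, p = 0, rng.random()
        while p > threshold:
            count += 1
            p *= rng.random()
        losses.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(count)))
    losses.sort()
    return {
        "expected_annual_loss": statistics.mean(losses),
        "var_95": losses[int(0.95 * n_trials)],  # 95th-percentile annual loss
    }
```

Comparing the tail loss (`var_95`) with a proposed policy limit, and the expected annual loss plus premium with the uninsured expected cost, gives a board a concrete basis for the risk, reward and cost-of-mitigation conversation described above.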

CYBER INSURANCE ADOPTION IS INCREASING

The role of insurance in enhancing cyber resilience is increasingly being recognized by policymakers around the world, and the Organisation for Economic Co-operation and Development (OECD) is recommending actions to stimulate cyber insurance adoption.

Globally, the level of future demand for cyber insurance is expected to depend on the frequency of high-profile cyber incidents as well as the evolving legislative and regulatory environment for privacy protections in many countries. In India, for example, there was a 50% increase in companies buying cybersecurity coverage from 2016 to 2017. Research suggests that only 40% of US boards have reviewed their organization’s cyber insurance coverage in the past 12 months.

LIMITING FINANCIAL LOSSES

In the event of a debilitating attack, cyber insurance and associated services can limit an organization’s financial damage from direct and indirect costs and help accelerate its recovery. (See Exhibit 3: Direct and Indirect Costs Associated with a Cyber Attack.) For example, as a result of the NotPetya attack, one global company reported a decline in operating margins and income, with losses in excess of US$500 million in the last fiscal year. The company noted the costs were driven by

  • investments in enhanced systems to prevent future attacks;
  • the cost of incentives offered to customers to restore confidence and maintain business relationships;
  • additional costs due to claims for service failures;
  • costs associated with data breaches or data loss caused by third parties;
  • and “other consequences of which we are not currently aware but may subsequently discover.”

Indeed, the very process of assessing and purchasing cyber insurance can bolster cyber resilience by creating important incentives that drive behavioral change, including:

  • Raising awareness inside the organization on the importance of information security.
  • Fostering a broader dialogue among the cyber risk stakeholders within an organization.
  • Generating an organization-wide approach to ongoing cyber risk management by all aspects of the organization.
  • Assessing the strength of cyber defenses, particularly amid a rapidly changing cyber environment.

CR Ex 3

Click here to access Marsh’s and WCD’s detailed report

 

A Transformation in Progress – Perspectives and approaches to IFRS 17

The International Financial Reporting Standard 17 (IFRS 17) was issued in May 2017 by the International Accounting Standards Board (IASB) and has an effective date of 1st January 2021. The standard represents the most significant change in financial reporting for decades, placing greater demand on legacy accounting and actuarial systems. The regulation is intended to increase transparency and provide greater comparability of profitability across the insurance sector.

IFRS 17 will fundamentally change the face of profit and loss reporting. It will introduce a new set of Key Performance Indicators (KPIs), and change the way that base dividend or gross payments are calculated. To give an example, gross premiums will no longer be recorded under profit and loss. This is just one of the wide-ranging shifts that insurers must take on board in the way they structure their business to achieve the best possible commercial outcomes.

In early 2018 SAS asked 100 executives working in the insurance industry to share their opinions about the standard and strategies for compliance. The research shed light on the sector’s sentiment towards the regulation, challenges and opportunities that IFRS 17 presents, along with the steps organisations are taking to achieve compliance. The aims of the study were to better understand the views of the industry and how insurers are preparing to implement the standard. The objective was to share an unbiased view of the peer group’s analysis of, and approach to, tackling the challenges during the adjustment period. The information garnered is intended to help inform insurers’ decision-making during the early stages of their own projects, helping them arrive at the best-placed strategy for their business.

This report reveals the findings of the survey and provides guidance on how organisations might best achieve compliance. It offers an objective, data-driven view of IFRS 17 along with valuable market context for insurance professionals who are developing their own strategies for tackling the new standard.

SAS’ research indicates that UK insurers do not underestimate the cost of IFRS 17 or the level of change it will likely introduce. Overall, 97 per cent of survey respondents said that they expected the standard to increase the cost and complexity of operating in insurance.

Companies will need to

  • introduce a new system of KPIs
  • and make changes in management information reports

to monitor performance under the revised profitability metrics. Forward looking strategic planning will also need to incorporate potential volatility and any ramifications within the insurance industry. To achieve this, firms will need to ensure the main parties involved co-operate and work together in a more integrated way.

The cost of these measures will, of course, differ considerably between organisations of different sizes, specialisms and complexities. However, the cost of compliance also greatly depends on

  • the approach taken by decision-makers,
  • the partners they choose
  • and the solutions they select.

Perhaps more instructive is that 90 per cent believe compliance costs will be greater than those demanded by the Solvency II Directive, aimed at insurers retaining strong financial buffers so they can meet claims from policyholders.

The European Commission estimated that it cost EU insurers between £3 and £4 billion to implement Solvency II, which was designed to standardise what had been a piecemeal approach to insurance regulations across the EU. Almost half (48 per cent) predict that IFRS 17 will cost substantially more.

Respondents are preparing for major alterations to their current accounting and actuarial systems, from minor upgrades all the way to wholesale replacements. Data management systems will be the prime target for review, with 84 per cent of respondents planning to either make additional investment (25 per cent), upgrade (34 per cent), or replace them (25 per cent). Finance, accounting and actuarial systems will also see significant innovation, as 83 per cent and 81 per cent respectively prepare for significant investment.

The use of analytics appears to be the most divisive area for insurers. While 27 per cent of participants are confident they will need to make no changes to their analytics systems or processes, 28 per cent plan to replace them entirely. A majority of 71 per cent still expect to make at least some reform.

IFRS17

IFRS17 2

Click here to access SAS’ Whitepaper

 

The IFRS 9 Impairment Model and its Interaction with the Basel Framework

In the wake of the 2008 financial crisis, the International Accounting Standards Board (IASB) in cooperation with the Financial Accounting Standards Board (FASB) launched a project to address the weaknesses of both International Accounting Standard (IAS) 39 and the US generally accepted accounting principles (GAAP), which had been the international standards for determining financial assets and liabilities accounting in financial statements since 2001.

By July 2014, the IASB finalized and published its new International Financial Reporting Standard (IFRS) 9 methodology, to be implemented by January 1, 2018 (with the standard available for early adoption). IFRS 9 will cover financial organizations across Europe, the Middle East, Asia, Africa, Oceania, and the Americas (excluding the US). For financial assets that fall within the scope of the IFRS 9 impairment approach, the impairment accounting expresses a financial asset’s expected credit loss as the projected present value of the estimated cash shortfalls over the expected life of the asset. Expected losses may be considered on either a 12-month or lifetime basis, depending on the level of credit risk associated with the asset, and should be reassessed at each reporting date. The projected value is then recognized in the profit and loss (P&L) statement.
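The expected-credit-loss measurement described above can be sketched, in heavily simplified form, as a discounted sum of expected cash shortfalls. The loan figures and the annual bucketing below are illustrative assumptions; actual IFRS 9 models use granular, forward-looking term structures of PD, LGD and EAD.

```python
def expected_credit_loss(marginal_pds, lgd, ead, eir, lifetime=True):
    """Simplified ECL: discounted sum of expected cash shortfalls.

    marginal_pds : marginal probability of default in each future year
    lgd          : loss given default (fraction of exposure lost on default)
    ead          : exposure at default
    eir          : effective interest rate used for discounting
    lifetime     : False restricts the horizon to 12 months (stage 1)
    """
    horizon = len(marginal_pds) if lifetime else 1
    return sum(marginal_pds[t] * lgd * ead / (1 + eir) ** (t + 1)
               for t in range(horizon))

# Hypothetical 5-year exposure: annual marginal PDs, 45% LGD, 1m EAD, 5% EIR
pds = [0.02, 0.025, 0.03, 0.03, 0.035]
ecl_12m = expected_credit_loss(pds, 0.45, 1_000_000, 0.05, lifetime=False)
ecl_lifetime = expected_credit_loss(pds, 0.45, 1_000_000, 0.05, lifetime=True)
```

The switch from the 12-month to the lifetime basis after a significant increase in credit risk is what makes the IFRS 9 provision both larger and more volatile than its incurred-loss predecessor under IAS 39.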

Most banks subject to IFRS 9 are also subject to Basel III Accord capital requirements and, to calculate credit risk-weighted assets, use either standardized or internal ratings-based approaches. The new IFRS 9 provisions will impact the P&L that in turn needs to be reflected in the calculation for impairment provisions for regulatory capital. The infrastructure to calculate and report on expected loss drivers of capital adequacy is already in place. The data, models, and processes used today in the Basel framework can in some instances be used for IFRS 9 provision modeling, albeit with significant adjustments. Not surprisingly, a Moody’s Analytics survey conducted with 28 banks found that more than 40% of respondents planned to integrate IFRS 9 requirements into their Basel infrastructure.

Arguably the biggest change brought by IFRS 9 is incorporation of credit risk data into an accounting and therefore financial reporting process. Essentially, a new kind of interaction between finance and risk functions at the organization level is needed, and these functions will in turn impact data management processes. The implementation of the IFRS 9 impairment model challenges the way risk and finance data analytics are defined, used, and governed throughout an institution. IFRS 9 is not the only driver of this change.

Basel Committee recommendations, European Banking Authority (EBA) guidelines and consultation papers, and specific supervisory exercises, such as stress testing and Internal Capital Adequacy Assessment Process (ICAAP), are forcing firms to consider a more data-driven and forward-looking approach in risk management and financial reporting.

Accounting and Risk Management: An Organization and Cultural Perspective

The implementation of IFRS 9 processes that touch on both finance and risk functions creates the need to take into account differences in culture, as well as often different understandings of the concept of loss in the two functions.

  • The finance function is focused on product (i.e., internal reporting based on internal data) and is driven by accounting standards.
  • The risk function, however, is focused on the counterparty (i.e., probability of default) and is driven by a different set of regulations and guidelines.

This difference in focus leads the two functions to adopt these differing approaches when dealing with impairment:

  • The risk function uses a stochastic approach to model losses, and a database to store data and run the calculations.
  • Finance uses arithmetical operations to report the expected/incurred losses on the P&L, and uses decentralized data to populate reporting templates.

In other words, finance is driven by economics, and risk by statistical analysis. Thus, the concept of loss differs between teams or groups: A finance team views it as part of a process and analyzes loss in isolation from other variables, while the risk team sees loss as absolute and objectively observable with an aggregated view.

IFRS 9 requires a cross-functional approach, highlighting the need to reconcile risk and finance methodologies.

The data from finance in combination with the credit risk models from risk should drive the process.

  • The risk function runs the impairment calculation, whilst providing objective, independent, and challenger views (risk has no P&L or bonus-driven incentive) to the business assumptions.
  • Finance supports the process by providing data and qualitative overlay.

Credit Risk Modeling and IFRS 9 Impairment Model

Considering concurrent requirements across a range of regulatory guidelines, such as stress testing, and reporting requirements, such as common reporting (COREP) and financial reporting (FINREP), the challenge around the IFRS 9 impairment model is two-fold:

  • Models: How to harness the current Basel-prescribed credit risk models to make them compliant with the IFRS 9 impairment model.
  • Data: How (and whether) the data captured for Basel capital calculation can be used to model expected credit losses under IFRS 9.
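As one illustration of the "Models" challenge, a through-the-cycle Basel PD and a downturn LGD can be nudged toward the point-in-time, best-estimate parameters IFRS 9 expects. The scalar cycle adjustment and the 10-point downturn add-on below are hypothetical simplifications for the sketch; production approaches condition on macroeconomic scenario forecasts rather than a single multiplier.

```python
def ttc_pd_to_pit_pd(ttc_pd, cycle_index):
    """Naive point-in-time adjustment of a through-the-cycle PD.

    cycle_index is an assumed scalar: > 1 in a downturn, < 1 in benign
    conditions. Real conversions are model-specific and scenario-driven.
    """
    return min(1.0, ttc_pd * cycle_index)

def downturn_lgd_to_best_estimate(downturn_lgd, assumed_addon=0.10):
    """Strip an assumed downturn add-on to approximate a best-estimate LGD."""
    return max(0.0, downturn_lgd - assumed_addon)

pit_pd = ttc_pd_to_pit_pd(0.02, cycle_index=1.3)         # ≈ 0.026
best_estimate_lgd = downturn_lgd_to_best_estimate(0.45)  # ≈ 0.35
```

Whether shortcut adjustments like these are defensible, and whether the underlying Basel data is granular enough to support them, is exactly the two-fold model-and-data question raised above.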

IFRS9 Basel3

Click here to access Moody’s detailed report

Targeting A Technology Dividend In Risk Management

Many drivers are shaping the context of risk management today. Macroeconomic headwinds, global geopolitical uncertainty, and ever more frequent and damaging cyber events have been in the vanguard of the challenges leading to heightened risk perceptions.

MACROECONOMIC HEADWINDS

Macroeconomic headwinds driven by global and Asian debt levels, low growth, anti-globalization sentiments, increasing policy uncertainty and the expected hike in US interest rates, all represent significant challenges. As Andrew Glenister, Regional Risk Advisor at BT Hong Kong, notes: “Macroeconomic and geopolitical risks are an increasing part of our internal discussions, particularly across Asia and Africa, and recent surprises on the world’s political scene have demonstrated that nothing can be taken for granted, and that the experts aren’t always right! At the same time our business is facing new challenges from the changing regulatory and global environment and can be impacted by a far greater range and variety of events from across the world.”

These challenges are particularly pronounced for export-dependent economies, which comprise most of Asia. Concurrently, many leading economies in Asia-Pacific such as China, Singapore, and Australia are struggling to maintain labor productivity and productivity growth. Productivity-enhancing policies are required, including capital investments in new technology and workforce development. These new technology-powered productivity strategies will inevitably bring modifications to risk management and the role of the risk function. Risk teams will need to use their established capabilities to anticipate potential implications of this context, and develop new capabilities for managing risks using emerging technologies.

HIDDEN RISKS ARISING FROM NEW TECHNOLOGIES

Global perceptions of risk, as measured in Marsh & McLennan Company’s annual work with the World Economic Forum, are more elevated than ever. Technological advancements, for example, are increasingly exposing organizations to emerging risks such as data fraud and cybersecurity threats. Indeed, the WannaCry and Petya ransomware attacks were a harsh reminder of this for firms across the globe. This point of view is well echoed in our survey, in which 51 percent of respondents state that cybersecurity risk is the second-most impactful risk for their firms, following strategic risk.

In fact, two of the three most pressing global risks identified by risk managers relate to technology and cybersecurity. Moreover, as reflected in the MMC Asia Pacific Risk Center’s annual Evolving Risk Concern in Asia-Pacific report, the interconnectedness of risks – which may not be apparent to businesses – compounds the impacts of risk events. For example, the effects of advancement in automation may lead to rising economic inequality as it threatens to displace manufacturing jobs that have been the main livelihood of millions of lower-income Asians. As Susan Valdez, Senior Vice President and Chief Corporate Services Officer of Aboitiz Equity Ventures (and a PARIMA Philippines board member) points out, “Corporate digital transformation creates a whole new set of risks and could alter the context of cyber risk and information security risk. Because of the evolving nature of threats from hacking, malware, phishing and other forms of attacks, existing mitigations are constantly challenged and need to be continually updated to address vulnerabilities.” The confluence of risks facing Asia-Pacific is posing significant challenges to businesses.

THE EVOLVING REGULATORY LANDSCAPE

A “deluge of regulation” has followed the dramatic events of the Global Financial Crisis, especially in financial service industries. Non-financial service industries also face a rising tide of regulation, motivated by trends such as cybersecurity concerns, rising anti-globalization sentiments and climate change, just to name a few. Asia-Pacific regulators are following international precedent by increasing oversight of multiple areas including stress testing, recovery and resolution planning, as well as in required capital estimation regulation.

An increasing number of Asia-Pacific countries including China, Singapore, and Australia have recently introduced cybersecurity laws to be in line with the global best practice. Moreover, rising protectionism including sudden changes in trade policies, taxes or tariff regulations have been witnessed in other regions, which also create increased pressure on risk management.

RM Function.png

Click here to access Marsh Parima study

Cybersecurity Risk Management Oversight – A Tool for Board Members

Companies are facing not only increasing cyber threats but also new laws and regulations for managing and reporting on data security and cybersecurity risks.

Boards of directors face an enormous challenge: to oversee how their companies manage cybersecurity risk. As boards tackle this oversight challenge, they have a valuable resource in Certified Public Accountants (CPAs) and in the public company auditing profession.

CPAs bring to bear core values – including independence, objectivity and skepticism – as well as deep expertise in providing independent assurance, both in financial statement audits and across a variety of other subject matters. CPA firms have assisted companies with information security for decades; in fact, four of the 13 leading information security and cybersecurity consultants are public accounting firms.

This tool provides questions that board members charged with cybersecurity risk oversight can use as they engage in discussions about cybersecurity risks and disclosures with management and CPA firms.

The questions are grouped under four key areas:

  1. Understanding how the financial statement auditor considers cybersecurity risk
  2. Understanding the role of management and responsibilities of the financial statement auditor related to cybersecurity disclosures
  3. Understanding management’s approach to cybersecurity risk management
  4. Understanding how CPA firms can assist boards of directors in their oversight of cybersecurity risk management

This publication is not meant to provide an all-inclusive list of questions or to be seen as a checklist; rather, it provides examples of the types of questions board members may ask of management and the financial statement auditor. The dialogue that these questions spark can help clarify the financial statement auditor’s responsibility for cybersecurity risk considerations in the context of the financial statement audit and, if applicable, the audit of internal control over financial reporting (ICFR). This dialogue can be a way to help board members develop their understanding of how the company is managing its cybersecurity risks.

Additionally, this tool may help board members with cybersecurity risk oversight learn more about other incremental offerings from CPA firms. One example is the cybersecurity risk management reporting framework developed by the American Institute of CPAs (AICPA). The framework enables CPAs to examine and report on management-prepared cybersecurity information, thereby boosting the confidence that stakeholders place in a company’s initiatives.

With this voluntary, market-driven framework, companies can also communicate pertinent information regarding their cybersecurity risk management efforts and educate stakeholders about the systems, processes, and controls that are in place to detect, prevent, and respond to breaches.

AICPA

Click here to access CAQ’s detailed White Paper and Questionnaires