Achieving Effective IFRS 17 Reporting – Enabling the right accounting policy through technology

Executive summary

International Financial Reporting Standard (IFRS) 17, the first comprehensive global accounting standard for insurance contracts, is due to take effect in 2023. It is the latest standard developed by the International Accounting Standards Board (IASB) in its push for consistent international accounting standards.

IFRS 17, following other standards such as IFRS 9 and Current Expected Credit Losses (CECL), is the latest move toward ‘risk-aware accounting’, a framework that aims to incorporate financial and non-financial risk into accounting valuation.

As a principles-based standard, IFRS 17 provides room for different interpretations, meaning that insurers have choices to make about how to comply. The explicit integration of financial and non-financial risk has caused much discussion about the unprecedented and distinctive modeling challenges that IFRS 17 presents. These could cause ‘tunnel vision’ among insurers when it comes to how they approach compliance.

But all stages of IFRS 17 compliance are important, and each raises distinct challenges. An insurer that concentrates its efforts on any single aspect of the full compliance value chain risks failing to comply adequately. In the case of IFRS 17, what is at stake is not necessarily accidental non-compliance, but rather the sub-optimal presentation of the business’s profits.

To achieve ‘ideal’ compliance, firms need to focus on the logistics of reporting as much as on the mechanics of modeling. Effective and efficient reporting comprises two elements: presentation and disclosure. Reporting is the culmination of the entire compliance value chain, and decisions made further up the chain can have a significant impact on the way that value is presented. Good reporting is achieved through a mixture of technology and accounting policy, and firms should pursue several strategies to achieve it:

  • Anticipate how the different IFRS 17 measurement models will affect balance sheet volatility.
  • Understand the different options for disclosure, and which approach is best for specific institutional needs.
  • Streamline IFRS 17 reporting with other reporting duties.
  • Where possible, aim for collaborative report generation while maintaining data integrity.
  • Explore and implement technology that can service IFRS 17’s technical requirements for financial reporting.
  • Store and track data on a unified platform.

In this report we focus on the challenges associated with IFRS 17 reporting, and consider solutions to those challenges from the perspectives of accounting policy and technology implementation. And in highlighting the reporting stage of IFRS 17 compliance, we focus specifically on how decisions about the presentation of data can dictate the character of final disclosure.

Introduction: more than modeling

IFRS 17 compliance necessitates repeated stochastic calculations to capture financial and non-financial risk (especially in the case of long-term insurance contracts). Insurance firms consistently identify modeling and data management as the challenges they most anticipate having to address in their efforts to comply. Much of the conversation and ‘buzz’ surrounding IFRS 17 has therefore centered on its modeling requirements, and in particular the contractual service margin (CSM) calculation.

But there is always a danger that firms will get lost in the complexity of compliance and forget the aim of IFRS 17. Although complying with IFRS 17 involves multiple disparate process elements and activities, it is still essentially an accounting standard. First and foremost, its aim is to ensure the transparent and comparable disclosure of the value of insurance services. So while IFRS 17 calculations are crucial, they are just one stage in the compliance process, and ultimately enable the intended outcome: reporting.

Complying with the modeling requirements of IFRS 17 should not create ‘compliance tunnel vision’ at the expense of the presentation and disclosure of results. Rather, presentation and disclosure are the culmination of the IFRS 17 compliance process flow and are key elements of effective reporting (see Figure 1).

Developing an IFRS 17 accounting policy

A key step in developing reporting compliance is having an accounting policy tailored to a firm’s specific interaction with IFRS 17. Firms have decisions to make about how to comply, and must consider the knock-on effects IFRS 17 will have on the presentation of their statements of comprehensive income.

There are a variety of considerations: in some areas IFRS 17 affords a degree of flexibility; in others it does not. Areas that will substantially affect the appearance of firms’ profits are:

• The up-front recognition of loss and the amortization of profit.
• The new unit of account.
• The separation of investment components from insurance services.
• The recognition of interest rate changes under the general measurement model (GMM).
• Deferred acquisition costs under the premium allocation approach (PAA).

As a principles-based standard, IFRS 17 affords a degree of flexibility in how firms approach valuation. One of its aims is to ensure that entity-specific risks and diverse contract features are adequately reflected in valuations, while still safeguarding reporting comparability. This flexibility also gives firms some degree of control over the way that value and risk are portrayed in financial statements. However, some IFRS 17 stipulations will lead to inevitable accounting mismatches and balance-sheet volatility.

Accounting policy impacts and choices – Balance sheet volatility

One unintended consequence of IFRS 17 compliance is balance sheet volatility. As an instance of risk-aware accounting, IFRS 17 requires the value of insurance services to be market-adjusted. This adjustment is based on a firm’s projection of future cash flows, informed by calculated financial risk. And although this will not be the first time firms incorporate non-financial risk into valuations, it is the first time that this has to be explicit.

Market volatility will be reflected in the balance sheet, as liabilities and assets are subject to interest rate fluctuation and other financial risks. The way financial risk is incorporated into the value of a contract can also contribute to balance sheet volatility. The way it is incorporated is dictated by the measurement model used to value it, which depends on the eligibility of the contract.

There are three measurement models, the PAA, the GMM and the variable fee approach (VFA). All three are considered in the next section.

The three measurement models

Features of the three measurement models (see Figure 2) can have significant effects on how profit – represented by the CSM – is presented and ultimately disclosed.

To illustrate the choices around accounting policy that insurance firms will need to consider and make, we provide two specific examples, for the PAA and the GMM.

Accounting policy choices: the PAA

When applying the PAA to shorter contracts – generally those of 12 months or less – firms have several choices to make about accounting policy. One is whether to defer acquisition costs. Unlike under previous reporting regimes, indirect costs cannot be deferred as acquisition costs under IFRS 17’s PAA. Firms can either expense directly attributable acquisition costs as they are incurred, or defer them and amortize the cost over the length of the contract. Expensing acquisition costs as they are incurred may affect whether a group of contracts is characterized as onerous at inception. Deferring acquisition costs reduces the liability for remaining coverage; however, it may also increase the loss recognized in the income statement for onerous contracts.
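To make the timing effect concrete, the sketch below contrasts the two policy choices for a single 12-month contract. The figures and the even earning pattern are illustrative assumptions only, not anything prescribed by the standard.

```python
# Minimal sketch (hypothetical figures): the two PAA policy choices for
# directly attributable acquisition costs on a 12-month contract.

def expense_upfront(premium, acq_costs, claims, months=12):
    """Expense acquisition costs as incurred: the full cost hits month 1."""
    pnl = [(premium - claims) / months] * months
    pnl[0] -= acq_costs
    return pnl

def defer_and_amortize(premium, acq_costs, claims, months=12):
    """Defer acquisition costs and amortize them over the coverage period."""
    return [(premium - claims - acq_costs) / months] * months

premium, acq_costs, claims = 1200.0, 120.0, 900.0
print(sum(expense_upfront(premium, acq_costs, claims)))     # 180.0 total profit
print(sum(defer_and_amortize(premium, acq_costs, claims)))  # 180.0 - same total
print(expense_upfront(premium, acq_costs, claims)[0])       # -95.0 in month 1...
print(defer_and_amortize(premium, acq_costs, claims)[0])    # ...versus +15.0
```

Total profit is identical under both choices; what differs is its timing, and a large month-one expense is what can tip a group of contracts toward being characterized as onerous at inception.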

Accounting policy choices: the GMM

Under IFRS 17, revenue is the sum of

  • the release of CSM,
  • changes in the risk adjustment,
  • and expected net cash outflows, excluding any investment components.

Excluding any investment component from revenue recognition will have significant impacts on contracts being sold by life insurers.
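A minimal sketch of this revenue build-up, with hypothetical figures (a real calculation operates on groups of contracts and on actuarially projected cash flows):

```python
# Sketch of the IFRS 17 insurance revenue build-up described above.
# All inputs are hypothetical single-period amounts.

def insurance_revenue(csm_release, risk_adjustment_change,
                      expected_claims_and_expenses, investment_component):
    # The investment component - amounts repaid to policyholders regardless
    # of whether an insured event occurs - is excluded from revenue.
    return (csm_release
            + risk_adjustment_change
            + expected_claims_and_expenses
            - investment_component)

print(insurance_revenue(csm_release=50.0,
                        risk_adjustment_change=10.0,
                        expected_claims_and_expenses=400.0,
                        investment_component=120.0))  # 340.0
```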

Contracts without direct participation features measured under the GMM use a locked-in discount rate – whether this is calculated ‘top down’ or ‘bottom up’ is at the discretion of the firm. Changes to the CSM have to be made using the discount rate set at the initial recognition of the contract. Changes in financial variables that differ from the locked-in discount rate cannot be absorbed by the CSM, so they are recognized instead as insurance finance income or expenses.

A firm must account for these changes directly in the statement of comprehensive income, and this too can contribute to balance sheet volatility.

As part of their accounting policy, firms have a choice about how to recognize changes in discount rates and other changes to financial risk assumptions – between other comprehensive income (OCI) and profit and loss (P&L). Recognizing fluctuations in discount rates and financial risk in OCI reduces some volatility in P&L. Firms can similarly recognize fair value changes on certain assets in OCI under IFRS 9.
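The sketch below illustrates these mechanics under deliberately simplified assumptions (a single fixed cash flow and illustrative rates): the CSM accretes at the locked-in rate, while the effect of a market rate change bypasses the CSM and is routed to OCI or P&L according to the accounting policy choice.

```python
# Hedged sketch of locked-in discount rate mechanics under the GMM.
# Real fulfilment cash flows are full projected curves plus a risk adjustment.

def present_value(cash_flow, years, rate):
    return cash_flow / (1 + rate) ** years

locked_in_rate = 0.03   # set at initial recognition of the group
current_rate   = 0.05   # market rate at the reporting date
cash_flow, years = 1000.0, 10

# The CSM is accreted at the locked-in rate only.
csm = 80.0 * (1 + locked_in_rate)

# The liability effect of the rate change cannot be absorbed by the CSM.
finance_effect = (present_value(cash_flow, years, locked_in_rate)
                  - present_value(cash_flow, years, current_rate))

# Accounting policy choice: route the effect through OCI or through P&L.
USE_OCI = True
oci, pnl = (finance_effect, 0.0) if USE_OCI else (0.0, finance_effect)
print(round(csm, 2), round(oci, 2), round(pnl, 2))
```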

The technology perspective

Data integrity and control

At the center of IFRS 17 compliance and reporting is the management of a wide spectrum of data – firms will have to gather and generate data from historic, current and forward-looking perspectives.

Creating IFRS 17 reports will be a non-linear process, and data will be incorporated as it becomes available from multiple sources. For many firms, contending with this level of data granularity and volume will be a big leap from other reporting requirements. The maturity of an insurer’s data infrastructure is partly defined by the regulatory and reporting context it was built in, and in which it operates – entities across the board will have to upgrade their data management technology.

In regions such as Southeast Asia and the Middle East, however, data management on the scale of IFRS 17 is unprecedented. Entities operating in these regions in particular will have to expend considerable effort to upgrade their infrastructure. Manual spreadsheets and complex legacy systems will have to be replaced with data management technology across the compliance value chain.

According to a 2018 survey by Deloitte, 87% of insurers believed their technology systems required upgrades to capture the new data they must handle and to perform the calculations compliance requires. Capturing data inputs was cited as the biggest technology challenge.

Tracking and linking the data lifecycle

Compliance with IFRS 17 demands data governance across the entire insurance contract valuation process. The data journey starts at the data source and travels through aggregation and modeling processes all the way to the disclosure stage (see Figure 3).

In this section we focus on the specific areas of data lineage, data tracking and the auditing processes that run along the entire data compliance value chain. For contracts longer than 12 months, the valuation process will be iterative, as data is transformed multiple times by different users. Having a single version of reporting data makes it easier to collaborate, track and manage the iterative process of adapting to IFRS 17. Cloud platforms help to address this challenge, providing an effective means of storing and managing the large volumes of reporting data generated by IFRS 17. The cloud allows highly scalable, flexible technology to be delivered on demand, enabling simultaneous access to the same data for internal teams and external advisors.

It is essential that amendments are tracked and stored as data passes through different hands and different IFRS 17 ‘compliance stages’. Data lineage processes can systematically track users’ interactions with data, improving the ‘auditability’ of the compliance process and users’ ‘ownership’ of activity.

Data linking is another method of managing IFRS 17 reporting data. Data linking contributes to data integrity while enabling multiple users to make changes to data. It enables the creation of relationships across values while maintaining the integrity of the source value, so changing the source value creates corresponding changes across all linked values. Data linking also enables the automated movement of data from spreadsheets to financial reports, updating data as it is changed and tracking users’ changes to it.
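As a rough illustration of the idea, the sketch below models a source value whose linked report cells refresh automatically when it changes, with every change logged for audit. The class and field names are hypothetical rather than any particular vendor’s API.

```python
# Sketch: a source value propagates changes to linked report cells and keeps
# an audit log of who changed what, and when (simple data linking + lineage).

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ReportCell:
    label: str
    value: float = 0.0
    def refresh(self, new_value):
        self.value = new_value

@dataclass
class SourceValue:
    name: str
    value: float
    links: list = field(default_factory=list)      # linked report cells
    audit_log: list = field(default_factory=list)  # (when, who, old, new)

    def link(self, cell):
        self.links.append(cell)
        cell.refresh(self.value)

    def update(self, new_value, user):
        self.audit_log.append((datetime.now(timezone.utc), user,
                               self.value, new_value))
        self.value = new_value
        for cell in self.links:    # propagate to every linked value
            cell.refresh(new_value)

csm = SourceValue("group_7_csm", 125.4)
note = ReportCell("disclosure_note_csm")
csm.link(note)
csm.update(131.2, user="actuarial_team")
print(note.value)            # 131.2 - the linked value stays consistent
print(csm.audit_log[-1][1])  # actuarial_team
```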

Disclosing the data

IFRS 17 is more than just a compliance exercise: it will have a fundamental impact on how insurance companies report their data internally, to regulators, and to financial markets. For the final stage of compliance, firms will need to adopt a new format for the balance sheet, the P&L statement and the cash flow statement.

In addition to the standard preparation of financial statements, IFRS 17 will require a number of disclosures, including the explanation of recognized amounts, significant judgements made in applying IFRS 17, and the nature and extent of risks arising from insurance contracts. As part of their conversion to IFRS 17, firms will need to assess how data will have to be managed on a variety of levels, including

  • transactions,
  • financial statements,
  • regulatory disclosures,
  • internal key performance indicators,
  • and communications to financial markets.

Communication with capital markets will be more complex, because of changes that will have to be made in several areas:

  • The presentation of financial results.
  • Explanations of how the calculations were made, given their increased complexity.
  • Footnotes to explain how data is being reported in ‘before’ and ‘after’ conversion scenarios.

During their transition, organizations will have to report and explain to the investor community which changes were the result of business performance and which were the result of a change in accounting basis. The new reporting basis will also affect how data is reported internally, as well as performance management overall. The current set of key metrics used for performance purposes, including volume, revenue, risk and profitability, will have to be adjusted for the new methodology and accounting basis. This could affect how data is reported and reconciled for current regulatory reporting requirements, including Solvency II, local solvency standards, and broader statutory and tax reporting.

IFRS 17 will drive significant changes in the current reporting environment. To address this challenge, firms must plan how they will manage both the pre-conversion and post-conversion data sets, the preparation of pre-, post-, and comparative financial statements, and the process of capturing and disclosing all of the narrative that will support and explain these financial results.

In addition, in managing the complexity of the numbers and the narrative before, during and after the conversion, reporting systems will also need to scale to meet the requirements of regulatory reporting – including disclosure in eXtensible Business Reporting Language (XBRL) in some jurisdictions. XBRL is a global reporting markup language that encodes documents in a format legible to both humans and machines (the IASB publishes its IFRS Taxonomy files in XBRL).

But XBRL tagging can be a complex, time-consuming and repetitive process, and firms should consider using available technology partners to support the tagging and mapping demands of document drafting.
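For intuition, the hypothetical sketch below shows what tagging a single figure amounts to: wrapping it in a machine-readable element that points at a taxonomy concept and a reporting context. The namespace, element and context names are placeholders, not the actual IFRS Taxonomy.

```python
# Sketch of an XBRL-style tagged fact using only the standard library.

import xml.etree.ElementTree as ET

root = ET.Element("xbrl", {
    "xmlns:ifrs": "https://example.com/ifrs-taxonomy",  # placeholder namespace
})
fact = ET.SubElement(root, "ifrs:InsuranceContractLiabilities", {
    "contextRef": "FY2023",   # which entity/period the figure belongs to
    "unitRef": "USD",
    "decimals": "0",
})
fact.text = "1250000"

print(ET.tostring(root, encoding="unicode"))
```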

Building your data and analytics strategy

When it comes to being data-driven, organizations run the gamut in maturity. Most believe that data and analytics provide insights, but only one-third of respondents to a TDWI survey said they were truly data-driven, meaning they analyze data to drive decisions and actions.

Successful data-driven businesses foster a collaborative, goal-oriented culture. Leaders believe in data and are governance-oriented. The technology side of the business ensures sound data quality and puts analytics into operation. The data management strategy spans the full analytics life cycle. Data is accessible and usable by multiple people – data engineers and data scientists, business analysts and less-technical business users.

TDWI analyst Fern Halper surveyed analytics and data professionals across industries and identified the following five best practices for becoming a data-driven organization.

1. Build relationships to support collaboration

If IT and business teams don’t collaborate, the organization can’t operate in a data-driven way – so eliminating barriers between groups is crucial. Achieving this can improve market performance and innovation; but collaboration is challenging. Business decision makers often don’t think IT understands the importance of fast results, and conversely, IT doesn’t think the business understands data management priorities. Office politics come into play.

But having clearly defined roles and responsibilities with shared goals across departments encourages teamwork. These roles should include: IT/architecture, business and others who manage various tasks on the business and IT sides (from business sponsors to DevOps).

2. Make data accessible and trustworthy

Making data accessible – and ensuring its quality – are key to breaking down barriers and becoming data-driven. Whether it’s a data engineer assembling and transforming data for analysis or a data scientist building a model, everyone benefits from trustworthy data that’s unified and built around a common vocabulary.

As organizations analyze new forms of data – text, sensor, image and streaming – they’ll need to do so across multiple platforms like data warehouses, Hadoop, streaming platforms and data lakes. Such systems may reside on-site or in the cloud. TDWI recommends several best practices to help:

  • Establish a data integration and pipeline environment with tools that provide federated access and join data across sources. It helps to have point-and-click interfaces for building workflows, and tools that support ETL, ELT and advanced specifications like conditional logic or parallel jobs.
  • Manage, reuse and govern metadata – that is, the data about your data. This includes size, author, database column structure, security and more.
  • Provide reusable data quality tools with built-in analytics capabilities that can profile data for accuracy, completeness and ambiguity.
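As a concrete illustration of the last point, the following sketch profiles a small, hypothetical policy extract for completeness, duplication and validity. A production data quality tool wraps checks like these in reusable, governed services.

```python
# Sketch: profiling a hypothetical policy extract with pandas.

import pandas as pd

policies = pd.DataFrame({
    "policy_id": ["P001", "P002", "P002", "P004"],
    "premium":   [1200.0, 950.0, 950.0, None],          # one missing value
    "start":     ["2023-01-01", "2023-02-15", "2023-02-15", "2023-13-01"],
})

completeness = 1 - policies.isna().mean()               # share of non-null values
duplicate_rows = int(policies.duplicated().sum())       # exact duplicate rows
invalid_dates = int(pd.to_datetime(policies["start"],   # unparseable dates
                                   errors="coerce").isna().sum())

print(completeness)
print(duplicate_rows, invalid_dates)  # 1 duplicate, 1 invalid date
```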

3. Provide tools to help the business work with data

From marketing and finance to operations and HR, business teams need self-service tools to speed and simplify data preparation and analytics tasks. Such tools may include built-in, advanced techniques like machine learning, and many work across the analytics life cycle – from data collection and profiling to monitoring analytical models in production.

These “smart” tools feature three capabilities:

  • Automation helps during model building and model management processes. Data preparation tools often use machine learning and natural language processing to understand semantics and accelerate data matching.
  • Reusability pulls from what has already been created for data management and analytics. For example, a source-to-target data pipeline workflow can be saved and embedded into an analytics workflow to create a predictive model.
  • Explainability helps business users understand the output when, for example, they’ve built a predictive model using an automated tool. Tools that explain what they’ve done are ideal for a data-driven company.

4. Consider a cohesive platform that supports collaboration and analytics

As organizations mature analytically, it’s important for their platform to support multiple roles in a common interface with a unified data infrastructure. This strengthens collaboration and makes it easier for people to do their jobs.

For example, a business analyst can use a discussion space to collaborate with a data scientist while building a predictive model, and during testing. The data scientist can use a notebook environment to test and validate the model as it’s versioned and metadata is captured. The data scientist can then notify the DevOps team when the model is ready for production – and they can use the platform’s tools to continually monitor the model.

5. Use modern governance technologies and practices

Governance – that is, rules and policies that prescribe how organizations protect and manage their data and analytics – is critical in learning to trust data and become data-driven. But TDWI research indicates that one-third of organizations don’t govern their data at all. Instead, many focus on security and privacy rules. Their research also indicates that fewer than 20 percent of organizations do any type of analytics governance, which includes vetting and monitoring models in production.

Decisions based on poor data – or models that have degraded – can have a negative effect on the business. As more people across an organization access data and build models, and as new types of data and technologies emerge (big data, cloud, stream mining), data governance practices need to evolve. TDWI recommends three features of governance software that can strengthen your data and analytics governance:

  • Data catalogs, glossaries and dictionaries. These tools often include sophisticated tagging and automated procedures for building and keeping catalogs up to date – as well as discovering metadata from existing data sets.
  • Data lineage. Data lineage combined with metadata helps organizations understand where data originated and track how it was changed and transformed.
  • Model management. Ongoing model tracking is crucial for analytics governance. Many tools automate model monitoring, schedule updates to keep models current and send alerts when a model is degrading.
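As a simple illustration of the model management point, the sketch below compares a model’s live accuracy against its validation baseline and raises an alert once degradation exceeds a tolerance. The names and thresholds are illustrative.

```python
# Sketch: automated model monitoring for analytics governance.

def monitor_model(model_name, baseline_accuracy, live_accuracy, tolerance=0.05):
    degradation = baseline_accuracy - live_accuracy
    if degradation > tolerance:
        # In production this would notify the model owner or open a ticket.
        return f"ALERT: {model_name} degraded by {degradation:.1%} - retrain."
    return f"OK: {model_name} within tolerance ({degradation:.1%} drift)."

print(monitor_model("churn_model", baseline_accuracy=0.91, live_accuracy=0.84))
print(monitor_model("fraud_model", baseline_accuracy=0.88, live_accuracy=0.87))
```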

In the future, organizations may move beyond traditional governance council models to new approaches like agile governance, embedded governance or crowdsourced governance.

But involving both IT and business stakeholders in the decision-making process – including data owners, data stewards and others – will always be key to robust governance at data-driven organizations.

Starting a data analytics project: ten questions to ask

There’s no single blueprint for beginning a data analytics project – never mind ensuring a successful one.

However, the following questions help individuals and organizations frame their data analytics projects in instructive ways. Put differently, think of these questions as more of a guide than a comprehensive how-to list.

1. Is this your organization’s first attempt at a data analytics project?

When it comes to data analytics projects, culture matters. Consider Netflix, Google and Amazon. All things being equal, organizations like these have successfully completed data analytics projects. Even better, they have built analytics into their cultures and become data-driven businesses.

As a result, they will do better than neophytes. Fortunately, first-timers are not destined for failure. They should just temper their expectations.

2. What business problem do you think you’re trying to solve?

This might seem obvious, but plenty of folks fail to ask it before jumping in. Note how I qualified the question with “do you think.” Sometimes the root cause of a problem isn’t what we believe it to be; in other words, it’s often not what we first think.

In any case, you don’t need to solve the entire problem all at once by trying to boil the ocean. In fact, you shouldn’t take this approach. Project methodologies (like agile) allow organizations to take an iterative approach and embrace the power of small batches.

3. What types and sources of data are available to you?

Most if not all organizations store vast amounts of enterprise data. Looking at internal databases and data sources makes sense. Don’t make the mistake of believing, though, that the discussion ends there.

External data sources in the form of open data sets (such as data.gov) continue to proliferate. There are easy methods for retrieving data from the web and getting it back in a usable format – scraping, for example. This tactic can work well in academic environments, but scraping could be a sign of data immaturity for businesses. It’s always best to get your hands on the original data source when possible.
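For illustration only, here is a scraping sketch that extracts table cells using nothing but the standard library. The URL is a placeholder; in practice, check a site’s terms of use and prefer an official API or the original data source.

```python
# Sketch: pull the text of <td> cells from an HTML page (stdlib only).

from html.parser import HTMLParser
from urllib.request import urlopen

class TableTextExtractor(HTMLParser):
    """Collects the text content of <td> cells."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []
    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

html = urlopen("https://example.com/open-data-table").read().decode("utf-8")
parser = TableTextExtractor()
parser.feed(html)
print(parser.cells[:10])
```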

Caveat: Just because the organization stores it doesn’t mean you’ll be able to easily access it. Pernicious internal politics stifle many an analytics endeavor.

4. What types and sources of data are you allowed to use?

With all the hubbub over privacy and security these days, foolish is the soul who fails to ask this question. As some retail executives have learned in recent years, a company can abide by the law completely and still make people feel decidedly icky about the privacy of their purchases. Or, consider a health care organization – it may not technically violate the Health Insurance Portability and Accountability Act of 1996 (HIPAA), yet it could still raise privacy concerns.

Another example is the GDPR. Adhering to this regulation means that organizations won’t necessarily be able to use personal data they previously could use – at least not in the same way.

5. What is the quality of your organization’s data?

Common mistakes here include assuming your data is complete, accurate and unique (read: nonduplicate). During my consulting career, I could count on one hand the number of times a client handed me a “perfect” data set. While it’s important to cleanse your data, you don’t need pristine data just to get started. As Voltaire said, “Perfect is the enemy of good.”

6. What tools are available to extract, clean, analyze and present the data?

This isn’t the 1990s, so please don’t tell me that your analytic efforts are limited to spreadsheets. Sure, Microsoft Excel works with structured data – if the data set isn’t all that big. Make no mistake, though: Everyone’s favorite spreadsheet program suffers from plenty of limitations, in areas like:

  • Handling semistructured and unstructured data.
  • Tracking changes/version control.
  • Dealing with size restrictions.
  • Ensuring governance.
  • Providing security.

For now, suffice it to say that if you’re trying to analyze large, complex data sets, there are many tools well worth exploring. The same holds true for visualization. Never before have we seen such an array of powerful, affordable and user-friendly tools designed to present data in interesting ways.

Caveat 1: While software vendors often ape each other’s features, don’t assume that each application can do everything that the others can.

Caveat 2: With open source software, remember that “free” software could be compared to a “free” puppy. To be direct: Even with open source software, expect to spend some time and effort on training and education.

7. Do your employees possess the right skills to work on the data analytics project?

The database administrator may well be a whiz at SQL. That doesn’t mean, though, that she can easily analyze gigabytes of unstructured data. Many of my students need to learn new programs over the course of the semester, and the same holds true for employees. In fact, organizations often find that they need to:

  • Provide training for existing employees.
  • Hire new employees.
  • Contract consultants.
  • Post the project on sites such as Kaggle.
  • All of the above.

Don’t assume that your employees can pick up new applications and frameworks 15 minutes at a time every other week. They can’t.

8. What will be done with the results of your analysis?

Consider an example from my own consulting work: a company routinely spent millions of dollars recruiting MBAs from Ivy League schools, only to see them leave within two years. Rutgers MBAs, for their part, stayed much longer and performed much better.

Despite my findings, the company continued to press on. It refused to stop going to Harvard, Cornell, etc. because of vanity. In his own words, the head of recruiting just “liked” going to these schools, data be damned.

Food for thought: What will an individual, group, department or organization do with keen new insights from your data analytics projects? Will the result be real action? Or will a report just sit in someone’s inbox?

9. What types of resistance can you expect?

You might think that people always and willingly embrace the results of data-oriented analysis. And you’d be spectacularly wrong.

Case in point: Major League Baseball (MLB) umpires get close ball and strike calls wrong more often than you’d think. Why wouldn’t they want to improve their performance when presented with objective data? It turns out that many don’t. In some cases, human nature makes people want to reject data and analytics that contrast with their world views. Years ago, before the subscription model became wildly popular, some Blockbuster executives didn’t want to believe that more convenient ways to watch movies existed.

Caveat: Ignore the power of internal resistance at your own peril.

10. What are the costs of inaction?

Sure, this is a high-level query and the answers depend on myriad factors.

For instance, a pharma company with years of patent protection will respond differently than a startup with a novel idea and competitors nipping at its heels. Interesting subquestions here include:

  • Do the data analytics projects merely confirm what we already know?
  • Do the numbers show anything conclusive?
  • Could we be capturing false positives and false negatives?

Think about these questions before undertaking data analytics projects. Don’t take the queries above as gospel. By and large, though, experience proves that asking these questions frames the problem well and sets the organization up for success – or at least minimizes the chance of a disaster.

Data governance and customer data

Most organizations understand the importance of data governance in concept. But they may not realize all the multifaceted, positive impacts of applying good governance practices to data across the organization. For example, ensuring that your sales and marketing analytics relies on measurably trustworthy customer data can lead to increased revenue and shorter sales cycles. And having a solid governance program to ensure your enterprise data meets regulatory requirements could help you avoid penalties.

Companies that start data governance programs are motivated by a variety of factors, internal and external. Regardless of the reasons, two common themes underlie most data governance activities: the desire for high-quality customer information, and the need to adhere to requirements for protecting and securing that data.

What’s the best way to ensure you have accurate customer data that meets stringent requirements for privacy and security?

For obvious reasons, companies exert significant effort using tools and third-party data sets to enforce the consistency and accuracy of customer data. But there will always be situations in which the managed data set cannot be adequately synchronized and made consistent with “real-world” data. Even strictly defined and enforced internal data policies can’t prevent inaccuracies from creeping into the environment.


Why you should move beyond a conventional approach to data governance

When it comes to customer data, the most accurate sources for validation are the customers themselves! In essence, every customer owns his or her information, and is the most reliable authority for ensuring its quality, consistency and currency. So why not develop policies and methods that empower the actual owners to be accountable for their data?

Doing this means extending the concept of data governance to the customers and defining data policies that engage them to take an active role in overseeing their own data quality. The starting point for this process fits within the data governance framework – define the policies for customer data validation.

A good template for formulating those policies can be adapted from existing regulations regarding data protection. This approach will assure customers that your organization is serious about protecting their data’s security and integrity, and it will encourage them to actively participate in that effort.

Examples of customer data engagement policies

  • Data protection defines the levels of protection the organization will use to protect the customer’s data, as well as what responsibilities the organization will assume in the event of a breach. The protection will be enforced in relation to the customer’s selected preferences (which presumes that customers have reviewed and approved their profiles).
  • Data access control and security define the protocols used to control access to customer data and the criteria for authenticating users and authorizing them for particular uses.
  • Data use describes the ways the organization will use customer data.
  • Customer opt-in describes the customers’ options for setting up the ways the organization can use their data.
  • Customer data review asserts that customers have the right to review their data profiles and to verify the integrity, consistency and currency of their data. The policy also specifies the time frame in which customers are expected to do this.
  • Customer data update describes how customers can alert the organization to changes in their data profiles. It allows customers to ensure their data’s validity, integrity, consistency and currency.
  • Right-to-use defines the organization’s right to use the data as described in the data use policy (and based on the customer’s selected profile options). This policy may also set a time frame associated with the right-to-use based on the elapsed time since the customer’s last date of profile verification.

The goal of such policies is to establish an agreement between the customer and the organization that basically says the organization will protect the customer’s data and only use it in ways the customer has authorized – in return for the customer ensuring the data’s accuracy and specifying preferences for its use. This model empowers customers to take ownership of their data profile and assume responsibility for its quality.

Clearly articulating each party’s responsibilities for data stewardship benefits both the organization and the customer by ensuring that customer data is high-quality and properly maintained. Better yet, recognize that the value goes beyond improved revenues or better compliance.

Empowering customers to take control and ownership of their data just might be enough to motivate self-validation.

Click here to access SAS’ detailed analysis

The Future of CFO’s Business Partnering

BP² – the next generation of Business Partner

The role of business partner has become almost ubiquitous in organizations today. According to respondents of this survey, 88% of senior finance professionals already consider themselves to be business partners. This key finding suggests that the silo mentality is breaking down and, at last, departments and functions are joining forces to teach and learn from each other to deliver better performance. But the scope of the role, how it is defined, and how senior finance executives characterize their own business partnering are all open to interpretation. And many of the ideas are still hamstrung by traditional finance behaviors and aspirations, so that the next generation of business partners – agents of change and innovation – languishes at the bottom of the priority list.

The scope of business partnering

According to the survey, most CFOs see business partnering as a blend of traditional finance and commercial support, while innovation and change are more likely to be seen as outside the scope of business partnering. 57% of senior finance executives strongly agree that a business partner should challenge budgets, plans and forecasts. Being involved in strategy and development followed closely behind with 56% strongly agreeing that it forms part of the scope of business partnering, while influencing commercial decisions was a close third.

The pattern that emerges from the survey is that traditional and commercial elements are given more weight within the scope of business partnering than being a catalyst for change and innovation. This more radical change agenda is only shared by around 36% of respondents, indicating that finance professionals still largely see their role in traditional or commercial terms. They have yet to recognize the finance function’s role in the next generation of business partnering, which can be

  • the catalyst for innovation in business models,
  • for process improvements
  • and for organizational change.

Traditional and commercial business partners aren’t necessarily less important than change agents, but the latter has the potential to add the most value in the longer term, and should at least be in the purview of progressive CFOs who want to drive change and encourage growth.

Unfortunately, this is not an easy thing to change. Finding time for any business partnering can be a struggle, but CFOs spend disproportionately less time on activities that bring about change than on traditional business partnering roles. Without investing time and effort into it, CFOs will struggle to fulfill their role as the next generation of business partner.

Overall 45% of CFOs struggle to make time for any business partnering, so it won’t come as a surprise that, ultimately, only 57% of CFOs believe their finance team efforts as business partners are well regarded by the operational functions.

The four personas of business partnering

Ask a room full of CFOs what business partnering means and you’ll get a room full of answers, each one influenced by their personal journey through the changing business landscape. By its very variability, this important business process is being enacted in many ways. FSN, the survey authors, did not seek to define business partnering. Instead, the survey asked respondents to define business partnering in their own words, and the 366 detailed answers were all different. But underlying the diversity were patterns of emphasis that defined four ‘personas’ or styles of business partnering, each exerting its own influence on the growth of the business over time.

A detailed analysis of the definitions and the frequency of occurrence of key phrases and expressions allowed us to plot these personas and their relative weight, together with their likely impact on growth over time.

Figure: The four personas of business partnering (bubble chart)

The size of the bubbles denotes the frequency (number) of times an attribute of business partnering was referenced in the definitions and these were plotted in terms of their likely contribution to growth in the short to long term.

The greatest number of comments by far coalesced around the bottom left-hand quadrant denoting a finance-centric focus on short to medium term outcomes, i.e., the traditional finance business partner. But there was an encouraging drift upwards and rightwards towards the quadrant denoting what we call the next generation of business partner, “BP²” (BP Squared), a super-charged business partner using his or her wide experience, purview and remit to help bring about change in the organization, for example, new business models, new processes and innovative methods of organizational deployment.

Relatively few of the 383 business partners who offered definitions concerned themselves with top-line growth, i.e., with involvement in commercial sales negotiations or the sales pipeline – a critical part of influencing growth.

Finally, surprisingly few finance business partners immersed themselves in strategy development or saw their role as helping to ensure strategic alignment. It suggests that the ongoing transition of the CFO’s role from financial steward to strategic advisor is not as advanced as some would suggest.

Financial Performance Drivers

Most CFOs and senior finance executives define the role of the business partner in traditional financial terms. They are there to explain and illuminate the financial operations, be a trusted, safe pair of hands that manages business risk, and provide some operational support. The focus for these CFOs is on communicating a clear understanding of the financial imperative in order to steer the performance of the business prudently.

This ideal reflects the status quo and perpetuates the traditional view of finance, and the role of the CFO. It’s one where the finance function remains a static force, opening up only so far as to allow the rest of the business to see how it functions and make them more accountable to it. While it is obviously necessary for other functions to understand and support a financial strategy, the drawback of this approach is the shortcomings for the business as a whole. Finance-centric business partnering provides some short-term outcomes but does little to promote more than pedestrian growth. It’s better than nothing, but it’s far from the best.

Top-Line Drivers

In the upper quadrant, top line drivers focus on driving growth and sales with a collaborative approach to commercial decision-making. This style of business partnering can have a positive effect on earnings, as improvements in commercial operations and the management of the sales pipeline are translated into revenue.

But while top line drivers are linked to higher growth than financial-focused business partners, the outcome tends to be only short term. The key issue for CFOs is that very few of them even allude to commercial partnerships when defining the scope of business partnering. They ignore the potential for the finance function to help improve the commercial outcomes, like sales or the collection of debt or even a change in business models.

Strategic Aligners

Those CFOs who focus on strategic alignment in their business partnering approach tend to see longer term results. They use analysis and strategy to drive decision-making, bringing business goals into focus through partnerships and collaborative working. This helps to strengthen the foundation of the business in the long term, but it isn’t the most effective in driving substantial growth. And again, few CFOs and senior finance executives cited strategy development and analysis in their definitions of business partnering.

Catalysts for change

The CFOs who were the most progressive and visionary in their definition of business partnering use the role as a catalyst for change. They challenge their colleagues, influence the strategic direction of the business, and generate momentum through change and innovation from the very heart of the finance function. These finance executives get involved in decision-making, and understand the need to influence, advise and challenge in order to promote change. This definition is the one that translates into sustained high growth.

The four personas are not mutually exclusive. Some CFOs view business partnering as a combination of some or all of these attributes. But the preponderance of opinion is clustered around the traditional view of finance, while very little is to do with being a catalyst for change.

How do CFOs characterize their finance function?

However CFOs choose to define the role of business partnering, each function has its own character and style. According to the survey, 17% have a finance-centric approach to business partnering, limiting the relationship to financial stewardship and performance. A further 18% have to settle for a light-touch approach where they are occasionally invited to become involved in commercial decision-making. This means 35% of senior finance executives are barely involved in any commercial decision-making at all.

More positively, the survey showed that 46% are considered to be trusted advisors, and are sought out by operational business teams for opinions before they make big commercial or financial decisions.

But at the apex of the business partnering journey are the change agents, who make up a paltry 19% of the senior finance executives surveyed. These forward thinkers are frequently catalysts for change, suggesting new business processes and areas where the company can benefit from innovation. This is the next stage in the evolution of both the role of the modern CFO and the role of the finance function at the heart of business innovation. We call CFOs in this category BP² (BP Squared) to denote the huge distance between these forward-thinking individuals and the rest of the pack.

Measuring up

Business partnering can be a subtle yet effective process, but it’s not easy to measure. 57% of organizations have no agreed way of measuring the success of business partnering, and 34% don’t think it’s possible to separate and quantify the value added through this collaboration.

Yet CFOs believe there is a strong correlation between business partnering and profitability – with 91% of respondents saying their business partnering efforts significantly add to profitability. While it’s true that some of the outcomes of business partnering are intangible, it is still important to be able to make a direct connection between it and improved performance; otherwise, ineffective efforts may simply be allowed to continue.

One solution is to use 360 degree appraisals, drawing in a wider gamut of feedback including business partners and internal customers to ascertain the effectiveness of the process. Finance business partnering can also be quantified if there are business model changes, like the move from product sales to services, which require a generous underpinning of financial input to be carried out effectively.

Business partnering offers companies a way to inexpensively

  • pool all their best resources to generate ideas,
  • spark innovation
  • and positively add value to the business.

First, CFOs need to recognize the importance of business partnering, widen their idea of how it can add value, and then actually set aside enough time to become agents of change and growth.

Data unlocks business partnering

Data is the most valuable organizational currency in today’s competitive business environment. Most companies are still in the process of working out the best method to collect, collate and use the tsunami of data available to them in order to generate insight. Some organizations are just at the start of their data journey, others are more advanced, and our research confirms that their data profile will make a significant difference to how well their business partnering works.

Figure: How well respondents’ data supports business partnering

The survey asked how well respondents’ data supported the role of business partnering, and the responses showed that 18% were data overloaded. This meant business partners have too many conflicting data sources and poor data governance, leaving them with little actual usable data to support the partnering process.

26% were data constrained, meaning they cannot get hold of the data they need to drive insight and decision making.

And a further 34% were technology constrained, muddling through without the tech-savvy resources or tools to fully exploit the data they already have. These senior finance executives may know the data is there, sitting in an ERP or CRM system, but can’t exploit it because they lack the right technology tools.

The final 22% have achieved data mastery, where they actively manage their data as a corporate asset, and have the tools and resources to exploit it in order to give their company a competitive edge.

This means 78% overall are hampered by data constraints and are failing to use data effectively to get the best out of their business partnering. While the good intentions are there, it is a weak partnership because there is little of substance to work with.

Figure: The Business Partnering Maturity Model

The diagram above is the Business Partnering Maturity Model as it relates to data. It illustrates that there is a huge gap in performance between how effective data masters and data laggards are at business partnering.

The percentage of business partners falling into each category of data management (‘data overloaded’, ‘data constrained’, etc.) has been plotted together with how well these finance functions feel that business partnering is regarded by the operational units, as well as their perceived influence on change.

The analysis reveals that “Data masters” are in a league of their own. They are significantly more likely to be well regarded by the operations and are more likely to act as change agents in their business partnering role.

We know from FSN’s 2018 Innovation in Financial Reporting survey that data masters, who similarly made up around one fifth of senior finance executives surveyed, are also more innovative. That research showed they were more likely to have worked on innovative projects in the last three years, and were less likely to be troubled by obstacles to reporting and innovation.

Data masters also have a more sophisticated approach to business partnering. They’re more likely to be change agents, are more often seen as a trusted advisor and they’re more involved in decision making. Interestingly, two-thirds of data masters have a formal or agreed way to measure the success of business partnering, compared to less than 41% of data constrained CFOs, and 36% of technology constrained and data overloaded finance executives. They’re also more inclined to perform 360 degree appraisals with their internal customers to assess the success of their business partnering. This means they can monitor and measure their success, which allows them to adapt and improve their processes.

The remainder, i.e. those that have not mastered their data, are clustered around a similar position on the Business Partnering Maturity Model, i.e., there is little to separate them around how well they are regarded by operational business units or whether they are in a position to influence change.

The key message from this survey is that data masters are the stars of the modern finance function, and it is a sentiment echoed through many of FSN’s surveys over the last few years.

The Innovation in Financial Reporting survey also found that data masters outperformed their less able competitors in three key performance measures that are indicative of financial health and efficiency: 

  • they close their books faster,
  • reforecast quicker and generate more accurate forecasts,
  • and crucially they have the time to add value to the organization.

People, processes and technology

So, if data is the key to driving business partnerships, where do the people, processes and technology come in? Business partnering doesn’t necessarily come naturally to everyone. Where there is no experience of it in previous positions, or if the culture is normally quite insular, sometimes CFOs and senior finance executives need focused guidance. But according to the survey, 77% of organizations expect employees to pick up business partnering on the job. And only just over half offer specialized training courses to support them.

Each company and department or function will be different, but businesses need to support their partnerships, either with formal structures or at the very least with guidance from experienced executives to maximize the outcome. Meanwhile processes can be a hindrance to business partnering in organizations where there is a lack of standardization and automation. The survey found that 71% of respondents agreed or strongly agreed that a lack of automation hinders the process of business partnering.

This was followed closely by a lack of standardization, and a lack of unification, or integration in corporate systems. Surprisingly the constraints of too many or too complex spreadsheets only hindered 61% of CFOs, the lowest of all obstacles but still a substantial stumbling block to effective partnerships. The hindrances reflect the need for better technology to manage the data that will unlock real inter-departmental insight, and 83% of CFOs said that better software to support data analytics is their most pressing need when supporting effective business partnerships.

Meanwhile 81% are looking to future technology to assist in data visualization to make improvements to their business partnering.

Figure: Technology priorities for supporting business partnering

This echoes the findings of FSN’s The Future of Planning, Budgeting and Forecasting survey which identified users of cutting edge visualization tools as the most effective forecasters. Being able to visually demonstrate financial data and ideas in an engaging and accessible way is particularly important in business partnering, when the counterparty doesn’t work in finance and may have only rudimentary knowledge of complex financial concepts.

Data is a clear differentiator. Business partners who can access, analyze and explain organizational data are more likely to

  • generate real insight,
  • engage their business partners
  • and become a positive agent of change and growth.

Click here to access Workiva’s and FSN’s Survey²

Integrating Finance, Risk and Regulatory Reporting (FRR) through Comprehensive Data Management

Data travels faster than ever, anywhere and all the time. Yet as fast as it moves, it has barely been able to keep up with the expanding agendas of financial supervisors. You might not know it to look at them, but the authorities in Basel, Washington, London, Singapore and other financial and political centers are pretty swift themselves when it comes to devising new requirements for compiling and reporting data. They seem to want nothing less than a renaissance in the way institutions organize and manage their finance, risk and regulatory reporting activities.

The institutions themselves might want the same thing. Some of the business strategies and tactics that made good money for banks before the global financial crisis have become unsustainable and cut into their profitability. More stringent regulatory frameworks imposed since the crisis require the implementation of complex, data-intensive stress testing procedures and forecasting models that call for unceasing monitoring and updating. The days of static reports capturing a moment in a firm’s life are gone. One of the most challenging data management burdens is rooted in duplication. The evolution of regulations has left banks with various bespoke databases across five core functions:

  • credit,
  • treasury,
  • profitability analytics,
  • financial reporting
  • and regulatory reporting,

with the same data inevitably appearing and processed in multiple places. This hodgepodge of bespoke marts simultaneously leads to both the duplication of data and processes, and the risk of inconsistencies – which tend to rear their head at inopportune moments (i.e. when consistent data needs to be presented to regulators). For example,

  • credit extracts core loan, customer and credit data;
  • treasury pulls core cash flow data from all instruments;
  • profitability departments pull the same instrument data as credit and treasury and add ledger information for allocations;
  • financial reporting pulls ledgers and some subledgers for reporting;
  • and regulatory reporting pulls the same data yet again to submit reports to regulators per prescribed templates.

The ever-growing list of considerations has compelled firms to revise, continually and on the fly, not just how they manage their data but how they manage their people and basic organizational structures. An effort to integrate activities and foster transparency – in particular through greater cooperation between risk and finance – has emerged across financial services. This has often been in response to demands from regulators, but some of the more enlightened leaders in the industry see it as the most sensible way to comply with supervisory mandates and respond to commercial exigencies as well. Their ability to do that has been constrained by the variety, frequency and sheer quantity of information sought by regulators, boards and senior executives. But that is beginning to change as a result of new technological capabilities and, at least as important, new management strategies.

This is where the convergence of Finance, Risk and Regulatory Reporting (FRR) comes in. The idea behind the FRR theme is that sound regulatory compliance and sound business analytics are manifestations of the same set of processes. Satisfying the demands of supervisory authorities and maximizing profitability and competitiveness in the marketplace involve similar types of analysis, modeling and forecasting. Each is best achieved, therefore, through a comprehensive, collaborative organizational structure that places the key functions of finance, risk and regulatory reporting at its heart.

The glue that binds this entity together and enables it to function as efficiently and cost-effectively as possible – financially and in the demands placed on staff – is a similarly comprehensive and unified approach to FRR data management. The right architecture will permit data to be drawn from all relevant sources across an organization, including disparate legacy hardware and software accumulated over the years in silos erected for different activities and geographies. Such an approach will reconcile and integrate this data and present it in a common, consistent, transparent fashion, permitting it to be deployed in the most efficient way within each department and for every analytical and reporting need, internal and external.

The immense demands for data, and for a solution to manage it effectively, have served as a catalyst for a revolutionary development in data management: Regulatory Technology, or RegTech. The definition is somewhat flexible and tends to vary with the motivations of whoever is doing the defining, but RegTech is basically the application of cutting-edge hardware, software, design techniques and services to the idiosyncratic challenges related to financial reporting and compliance. The myriad advances that fall under the RegTech rubric, such as centralized FRR data management and analysis, data mapping and data visualization, are helping financial institutions to get out in front of the stringent reporting requirements at last and to integrate their finance, risk and regulatory reporting duties more fully, easily and creatively.

A note of caution though: While new technologies and new thinking about how to employ them will present opportunities to eliminate weaknesses that are likely to have crept into the current architecture, ferreting out those shortcomings may be tricky because some of them will be so ingrained and pervasive as to be barely recognizable. But it will have to be done to make the most of the systems intended to improve or replace existing ones.

Just what a solution should encompass to enable a firm to meet its data management objectives depends on the

  • specifics of its business, including its size and product lines,
  • the jurisdictions in which it operates,
  • its IT budget
  • and the tech it has in place already.

But it should accomplish three main goals:

  1. Improving data lineage by establishing a trail for each piece of information at every stage of processing (see the sketch below)
  2. Providing a user-friendly view of the different processing steps to foster transparency
  3. Working seamlessly with legacy systems so that implementation takes less time and money and imposes less of a burden on employees
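
On the first goal, a minimal sketch (illustrative Python, with invented names and steps rather than a reference design) shows the idea of a lineage trail: each value carries its history with it, so any number on a report can be traced back through every processing step:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    """One entry in the audit trail of a single piece of information."""
    step: str          # e.g. "extract", "transform", "aggregate"
    source: str        # system or dataset the value came from
    value: float
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class TrackedValue:
    """A value that carries its full processing history with it."""
    value: float
    lineage: list[LineageRecord] = field(default_factory=list)

    def apply(self, step: str, source: str, fn):
        self.value = fn(self.value)
        self.lineage.append(LineageRecord(step, source, self.value))
        return self

# Usage: every stage appends to the trail instead of overwriting it.
exposure = TrackedValue(1_000_000.0)
exposure.apply("extract", "loan_system", lambda v: v)
exposure.apply("transform", "fx_service", lambda v: v * 1.08)  # hypothetical FX rate
for rec in exposure.lineage:
    print(rec.step, rec.source, rec.value, rec.timestamp)
```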

The two great trends in financial supervision – the rapid rise in data management and reporting requirements, and the demands for greater organizational integration – can be attributed to a single culprit: the lingering silo structure. Fragmentation continues to be supported by such factors as a failure to integrate the systems of component businesses after a merger and the tendency of some firms to find it more sensible, even if it may be more costly and less efficient in the long run, to install new hardware and software whenever a new set of rules comes along. That makes regulators – the people pressing institutions to break down silos in the first place – inadvertently responsible for erecting new barriers.

This bunker mentality – an entrenched system of entrenchment – made it impossible to recognize the massive buildup of credit difficulties that resulted in the global crisis. It took a series of interrelated events to spark the wave of losses and insolvencies that all but brought down the financial system. Each of them might have appeared benign or perhaps ominous but containable when taken individually, and so the occupants of each silo, who could only see a limited number of the warning signs, were oblivious to the extent of the danger. More than a decade has passed since the crisis began, and many new supervisory regimens have been introduced in its aftermath. Yet bankers, regulators and lawmakers still feel the need, with justification, to press institutions to implement greater organizational integration to try to forestall the next meltdown. That shows how deeply embedded the silo system is in the industry.

Data requirements for the kind of forward-looking analysis that, knock on wood, will limit the damage from the next crisis – determining what will happen, rather than identifying and explaining what has already happened – are enormous. The same goes for running an institution in a more integrated way. It’s not just more data that’s needed, but more kinds of data and more reliable data. A holistic, coordinated organizational structure, moreover, demands that data be analyzed at a higher level to reconcile the massive quantities and types of information produced within each department. And institutions must do more than compile and sort through all that data. They have to report it to authorities – often quarterly or monthly, sometimes daily and always when something is flagged that could become a problem. Indeed, some data needs to be reported in real time. That is a nearly impossible task for a firm still dominated by silos and highlights the need for genuinely new design and implementation methods that facilitate the seamless integration of finance, risk and regulatory reporting functions. Among the more data-intensive regulatory frameworks introduced or enhanced in recent years are:

  • IFRS 9 Financial Instruments and Current Expected Credit Loss. The respective protocols of the International Accounting Standards Board and the Financial Accounting Standards Board may provide the best examples of the forward-thinking approach – and rigorous reporting, data management and compliance procedures – being demanded. The standards call for firms to forecast credit impairments to assets on their books in near real time; the incurred-loss model being replaced merely had banks present bad news after the fact. The number of variables required to make useful forecasts, plus the need for perpetually running estimates that hardly allow a chance to take a breath, make the standards some of the most data-heavy exercises of all (a simplified expected-loss calculation is sketched after this list).
  • Stress tests here, there and everywhere. Whether for the Federal Reserve’s Comprehensive Capital Analysis and Review (CCAR) for banks operating in the United States, the Firm Data Submission Framework (FDSF) in Britain or the Asset Quality Reviews conducted by the European Banking Authority (EBA) for institutions in the euro zone, stress testing has become more frequent and more free-form, too, with firms encouraged to create stress scenarios they believe fit their risk profiles and the characteristics of their markets. Indeed, the EBA is implementing a policy calling on banks to conduct stress tests as an ongoing risk management procedure and not merely an assessment of conditions at certain discrete moments.
  • Dodd-Frank Wall Street Reform and Consumer Protection Act. The American law expands stress testing to smaller institutions that escape the CCAR. The act also features extensive compliance and reporting procedures for swaps and other over-the-counter derivative contracts.
  • European Market Infrastructure Regulation. Although less broad in scope than Dodd-Frank, EMIR has similar reporting requirements for European institutions regarding OTC derivatives.
  • AnaCredit, Becris and FR Y-14. The European Central Bank project, known formally as the Analytical Credit Dataset, and its Federal Reserve equivalent for American banks, respectively, introduce a step change in the amount and granularity of data that needs to be reported. Information on loans and counterparties must be reported contract by contract under AnaCredit, for example. Adding to the complication and the data demands, the European framework permits national variations, including some with particularly rigorous requirements, such as the Belgian Extended Credit Risk Information System (Becris).
  • MAS 610. The core set of returns that banks file to the Monetary Authority of Singapore is being revised to require information at a far more granular level beginning next year. The number of data elements that firms have to report will rise from about 4,000 to about 300,000.
  • Economic and Financial Statistics (EFS) Review. The Australian Prudential Regulation Authority’s EFS Review constitutes a wide-ranging update to the regulator’s statistical data collection demands. The sweeping changes include requests for more granular data and new forms in a three-phase implementation spanning two years, requiring parallel and trial periods running through 2019 and beyond.
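
As flagged above, here is a simplified sketch of the expected-loss arithmetic behind IFRS 9 and CECL: per-period probabilities of default (PD) are combined with loss given default (LGD) and exposure at default (EAD), then discounted. All inputs are hypothetical, and real implementations add staging, scenario weighting and far richer term structures:

```python
# A deliberately simplified lifetime expected-credit-loss estimate:
# ECL = sum over periods of discounted PD_t * LGD * EAD_t.
def lifetime_ecl(pds, lgd, eads, discount_rate):
    """pds/eads are per-period marginal default probabilities and exposures."""
    ecl = 0.0
    for t, (pd_t, ead_t) in enumerate(zip(pds, eads), start=1):
        ecl += (pd_t * lgd * ead_t) / (1 + discount_rate) ** t
    return ecl

# Hypothetical three-year amortizing loan profile.
print(round(lifetime_ecl(
    pds=[0.010, 0.015, 0.020],           # marginal annual default probabilities
    lgd=0.45,                            # loss given default
    eads=[1_000_000, 650_000, 300_000],  # exposure at default per year
    discount_rate=0.03,
), 2))
```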

All of those authorities, all over the world, requiring that much more information present a daunting challenge, but they aren’t the only ones demanding that finance, risk and regulatory reporting staffs raise their games. Boards, senior executives and the real bosses – shareholders – have more stringent requirements of their own for profitability, capital efficiency, safety and competitiveness. Firms need to develop more effective data management and analysis in this cause, too.

The critical role of data management was emphasized and codified in Document 239 of the Basel Committee on Banking Supervision (BCBS), “Principles for Effective Risk Data Aggregation and Risk Reporting.” PERDARR, as it has come to be called in the industry, assigns data management a central position in the global supervisory architecture, and the influence of the 2013 paper can be seen in mandates far and wide. BCBS 239 explicitly linked a bank’s ability to gauge and manage risk with its ability to function as an integrated, cooperative unit rather than a collection of semiautonomous fiefdoms. The process of managing and reporting data, the document makes clear, enforces the link and binds holistic risk assessment to holistic operating practices. The Basel committee’s chief aim was to make sure that institutions got the big picture of their risk profile so as to reveal unhealthy concentrations of exposure that might be obscured by focusing on risk segment by segment. Just in case that idea might escape some executive’s notice, the document mentions the word “aggregate,” in one form or another, 86 times in the 89 ideas, observations, rules and principles it sets forth.

The importance of aggregating risks, and having data management and reporting capabilities that allow firms to do it, is spelled out in the first of these: ‘One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks’ information technology (IT) and data architectures were inadequate to support the broad management of financial risks. Many banks lacked the ability to aggregate risk exposures and identify concentrations quickly and accurately at the bank group level, across business lines and between legal entities. Some banks were unable to manage their risks properly because of weak risk data aggregation capabilities and risk reporting practices. This had severe consequences to the banks themselves and to the stability of the financial system as a whole.’
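
The aggregation idea can be made concrete in a few lines. The sketch below uses invented exposures and a hypothetical group-level limit (nothing here reflects an actual BCBS 239 template) to show how rolling exposures up by counterparty group, across entities and business lines, surfaces a concentration that no single silo would see:

```python
from collections import defaultdict

# Toy illustration: exposures scattered across business lines and legal
# entities are rolled up by counterparty group so that concentrations
# invisible to any single silo become visible at group level.
exposures = [
    {"entity": "Bank plc", "line": "corporate", "group": "ACME",   "amount": 40e6},
    {"entity": "Bank AG",  "line": "markets",   "group": "ACME",   "amount": 35e6},
    {"entity": "Bank Inc", "line": "trade fin", "group": "ACME",   "amount": 30e6},
    {"entity": "Bank plc", "line": "corporate", "group": "Globex", "amount": 20e6},
]

LIMIT = 75e6  # hypothetical group-level concentration limit

totals = defaultdict(float)
for e in exposures:
    totals[e["group"]] += e["amount"]

for group, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    flag = "BREACH" if total > LIMIT else "ok"
    print(f"{group}: {total / 1e6:.0f}m aggregated across entities [{flag}]")
```

No single entity here holds more than 40m against ACME, yet the aggregated position breaches the hypothetical limit – precisely the kind of picture the principles demand.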

If risk data management was an idea whose time had come when BCBS 239 was published five years ago, then RegTech should have been the means to implement the idea. RegTech was being touted even then, or soon after, as a set of solutions that would allow banks to increase the quantity and quality of the data they generate, in part because RegTech itself was quantitatively and qualitatively ahead of the hardware and software with which the industry had been making do. There was just one ironic problem: Many of the RegTech solutions on the market at the time were highly specialized and localized products and services from small providers. That encouraged financial institutions to approach data management deficiencies gap by gap, project by project, perpetuating the compartmentalized, siloed thinking that was the scourge of regulators and banks alike after the global crisis. The one-problem-at-a-time approach also displayed to full effect another deficiency of silos: a tendency for work to be duplicated, with several departments each producing the same information, often in different ways and with different results. That is expensive and time-consuming, of course, and the inconsistencies that are likely to crop up make the data untrustworthy for regulators and for executives within the firm who are counting on it.

Probably the most critical feature of a well thought-out solution is a dedicated, focused and central FRR data warehouse that can chisel away at the barriers between functions, even at institutions that have been slow to abandon a siloed organizational structure reinforced with legacy systems.

[Graphic: the FRR data flow – E: Extract, L: Load, T: Transform structures, C: Calculations, A: Aggregation, P: Presentation]
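
To make those stages concrete, here is a skeletal Python sketch of such an extract-load-transform-calculate-aggregate-present pipeline. The function bodies are deliberately trivial placeholders over toy data, not a representation of any vendor’s product:

```python
# A skeletal sketch of the six stages named in the graphic above.
def extract(sources):      # E: pull raw records from source systems
    return [row for src in sources for row in src]

def load(rows):            # L: land raw data in the central FRR warehouse
    return list(rows)

def transform(rows):       # T: map records onto common structures
    return [{**row, "currency": row.get("currency", "EUR")} for row in rows]

def calculate(rows):       # C: derive measures (here, a simple accrual)
    for row in rows:
        row["accrual"] = row["balance"] * row["rate"]
    return rows

def aggregate(rows):       # A: roll results up to the reporting level
    return {"total_accrual": sum(r["accrual"] for r in rows), "count": len(rows)}

def present(summary):      # P: serve one consistent view to every consumer
    print(f"{summary['count']} positions, total accrual {summary['total_accrual']:.2f}")

present(aggregate(calculate(transform(load(extract([
    [{"balance": 1_000_000.0, "rate": 0.03}],
    [{"balance": 250_000.0, "rate": 0.05, "currency": "USD"}],
]))))))
```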


Click here to access Wolters Kluwer’s White Paper


Navigating the new world – Preparing for insurance accounting change (IFRS 17)

If implementation of the forthcoming insurance contracts standard is to reach the best possible outcome for your organization, we believe it needs to be seen as more than just a compliance exercise. This will entail

  • combining multiple strands into a common program,
  • identifying linkages
  • and addressing dependencies

across the business in a logical sequence, and thinking strategically about possible effects on the organization and its stakeholders. A well-developed and ‘living’ plan assigns clear accountabilities and breaks objectives down into manageable tasks for delivery to realistic timescales, in order to establish an effective blueprint for success.

Our methodology groups activities into four manageable phases:

  1. assess the change
  2. design your response
  3. implement your design
  4. sustain your new practices, securely embedding them in business as usual.

Key success factors

Our experience shows us there are many factors that will contribute to successfully implementing insurance accounting change, including:

  1. Dedicated staff: In our experience, the single biggest factor contributing to program success is the presence of full-time staff dedicated to the project – people with a wide range of skills, including data management, IT implementation and project management, who also know your business.
  2. Spend sufficient time and energy on the initial impact phase: It is essential that an insurer plans for this critical phase and allows sufficient time to perform a gap analysis on a line-by-line basis through the income statement, balance sheet and supporting disclosures.
  3. Consider fundamental questions surrounding core business drivers: earnings trends, growth opportunities and target operating models. The earlier effects are identified, the more time an insurer will have to develop and implement a strategic response.
  4. Training staff: Many organizations underestimate the amount of personnel training required. Designing a comprehensive training strategy and program is highly complex and requires careful planning.
  5. Robust project planning: The plan must be achievable and continuously refined with formal tracking and monitoring.
  6. Clear communications: Communication needs to be both formal and informal and applied throughout the life of the program.
  7. Careful change management: IFRS conversion will lead to significant changes in how people do their jobs. Some of the biggest challenges have arisen when the cultural issues have not been acknowledged and addressed.
  8. More than just an accounting and actuarial project: Implementing the forthcoming insurance contracts standard will undoubtedly be a multi-disciplinary effort.
    1. IT specialists consider the functionality of source systems and enterprise performance management (EPM) systems;
    2. Change management specialists focus on behavioral change and communication;
    3. Specialists in commercial functions (tax, data management, executive incentives, etc.) bring a holistic approach to the program.

Robust project management helps to bring everything together coherently.

Assessing what the forthcoming standard will mean for you

Accounting, actuarial, tax and reporting

Q1. What are the key accounting, actuarial, tax and disclosure differences between our current generally accepted accounting principles (GAAP) and the new standards? What are the key decisions that need to be made by management regarding the alternative treatments that are available?

Data, systems and processes

Q2. What will the impact be for our data requirements, and on the systems and processes used for

  • data collection,
  • actuarial projections,
  • calculating and accruing interest on the contractual service margin (see the sketch below)
  • and consolidation and financial reporting systems?

Are there quick fixes that we can use? Can we leverage recent investments in infrastructure or will we need a major overhaul?
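
On that third data point, a minimal illustration may help. Under IFRS 17’s general measurement model the CSM accretes interest at rates locked in at initial recognition and is released as coverage is provided; the sketch below rolls a CSM balance forward one period with hypothetical figures and a crude coverage-unit release, leaving out unlocking adjustments and the other complications a production actuarial engine must handle:

```python
# Minimal sketch: accrete interest on an opening CSM balance at the
# locked-in rate, then release a share for services provided in the period.
def roll_forward_csm(opening_csm, locked_in_rate,
                     coverage_units_released, coverage_units_total):
    after_interest = opening_csm * (1 + locked_in_rate)
    release = after_interest * coverage_units_released / coverage_units_total
    return after_interest - release, release

closing, recognized = roll_forward_csm(
    opening_csm=10_000_000.0,
    locked_in_rate=0.025,        # hypothetical rate locked in at recognition
    coverage_units_released=1,   # e.g. one year of coverage provided
    coverage_units_total=10,
)
print(f"P&L release: {recognized:,.0f}; closing CSM: {closing:,.0f}")
```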

Q3. How will the group’s close and other processes be impacted?

Business

Q4. What is the estimated directional impact on profit and equity and what are the key decisions and judgments that this will influence?

Q5. What are the key impacts for my business and how will these be influenced by the choices open to us? Who will need to understand results and metrics on the new basis?

People and change management

Q6. Who will be impacted by the conversion, what skills and resources are likely to be needed and what training needs can we identify?

Program management

Q7. What would a high-level conversion plan look like and what is its likely impact on resources?


Click here to access KPMG’s methodology paper

Global Governance Insights on Emerging Risks

A HEIGHTENED FOCUS ON RESPONSE AND RECOVERY

Over a third of directors of US public companies now discuss cybersecurity at every board meeting. Cyber risks are being driven onto the agenda by

  • high-profile data breaches,
  • distributed denial of service (DDoS) attacks,
  • and rising ransomware and cyber extortion attacks.

The concern about cyber risks is justified. The annual economic cost of cyber-crime is estimated at US$1.5 trillion and only about 15% of that loss is currently covered by insurance.

MMC Global Risk Center conducted research and interviews with directors from WomenCorporateDirectors (WCD) to understand the scope and depth of cyber risk management discussions in the boardroom. The risk of cyberattack is a constantly evolving threat, and the interviews highlighted the rising focus on resilience and recovery in boardroom cyber discussions. Approaches to cyber risks are maturing as organizations recognize them as an enterprise business risk, not just an information technology (IT) problem.

However, board focus varies significantly across industries, geographies, organization size and regulatory context. For example, business executives ranked cyberattacks among the top five risks of doing business in the Asia Pacific region but Asian organizations take 1.7 times longer than the global median to discover a breach and spend on average 47% less on information security than North American firms.

REGULATION ON THE RISE

Tightening regulatory requirements for cybersecurity and breach notification across the globe such as

  • the EU GDPR,
  • China’s new Cyber Security Law,
  • and Australia’s Privacy Amendment,

are also propelling cyber onto the board agenda. Most recently, in February 2018, the USA’s Securities and Exchange Commission (SEC) provided interpretive guidance to assist public companies in preparing disclosures about cybersecurity risks and incidents.

Regulations relating to transparency and notifications around cyber breaches drive greater discussion and awareness of cyber risks. Industries such as

  • financial services,
  • telecommunications
  • and utilities,

are subject to a large number of cyberattacks on a daily basis and have stringent regulatory requirements for cybersecurity.

Kris Manos, Director, KeyCorp, Columbia Forest Products, and Dexter Apache Holdings, observed, “The manufacturing sector is less advanced in addressing cyber threats; the NotPetya and WannaCry attacks flagged that sector’s vulnerability and have led to a greater focus in the boardroom.” For example, NotPetya forced a transportation company to shut down all of its communications, both with customers and within the company. It took several weeks before business was back to normal, and the loss of business was estimated to have been as high as US$300 million. Overall, it is estimated that, as a result of supply chain disruptions, consumer goods manufacturers, transport and logistics companies, pharmaceutical firms and utilities suffered, in aggregate, over US$1 billion in economic losses from the NotPetya attacks. Also, as Cristina Finocchi Mahne, Director, Inwit, Italiaonline, Banco Desio, Natuzzi and Trevi Group, noted, “The focus on cyber can vary across industries depending also on their perception of their own clients’ concerns regarding privacy and data breaches.”

LESSONS LEARNED: UPDATE RESPONSE PLANS AND EVALUATE THIRD-PARTY RISK

The high-profile cyberattacks in 2017, along with new and evolving ransomware onslaughts, were learning events for many organizations. Lessons included the need to establish relationships with organizations that can assist in the event of a cyberattack, such as

  • law enforcement,
  • regulatory agencies,
  • and recovery service providers, including forensic accountants and crisis management firms.

Many boards need to increase their focus on their organization’s cyber incident response plans. A recent global survey found that only 30% of companies have a cyber response plan and a survey by the National Association of Corporate Directors (NACD) suggests that only 60% of boards have reviewed their breach response plan over the past 12 months. Kris Manos noted, “[If an attack occurs,] it’s important to be able to quickly access a response plan. This also helps demonstrate that the organization was prepared to respond effectively.”

Experienced directors emphasized the need for effective response plans alongside robust cyber risk mitigation programs to ensure resilience, as well as operational and reputation recovery. As Jan Babiak, Director, Walgreens Boots Alliance, Euromoney Institutional Investor, and Bank of Montreal, stressed, “The importance of the ’respond and recover’ phase cannot be overstated, and this focus needs to rapidly improve.”

Directors need to review how the organization will communicate and report breaches. Response plans should include preliminary drafts of communications to all stakeholders including customers, suppliers, regulators, employees, the board, shareholders, and even the general public. The plan should also consider legal requirements around timelines to report breaches so the organization is not hit with financial penalties that can add to an already expensive and reputationally damaging situation. Finally, the response plan also needs to consider that normal methods of communication (websites, email, etc.) may be casualties of the breach. A cyber response plan housed only on the corporate network may be of little use in a ransomware attack.

Other lessons included the need to focus on cyber risks posed by third-party suppliers, vendors and other impacts throughout the supply chain. Shirley Daniel, Director, American Savings Bank, and Pacific Asian Management Institute, noted, “Such events highlight vulnerability beyond your organization’s control and are raising the focus on IT security throughout the supply chain.” Survey data suggests that about a third of organizations do not assess the cyber risk of vendors and suppliers. This is a critical area of focus as third-party service providers (e.g., software providers, cloud services providers, etc.) are increasingly embedded in value chains.

FRUSTRATIONS WITH OVERSIGHT

Most directors expressed frustrations and challenges with cyber risk oversight even though the topic is frequently on meeting agendas. Part of the challenge is that director-level cyber experts are thin on the ground; most boards have only one individual serving as the “tech” or “cyber” person. A Spencer Stuart survey found that 41% of respondents said their board had at least one director with cyber expertise, with an additional 7% who are in the process of recruiting one. Boards would benefit from the addition of experienced individuals who can identify the connections between cybersecurity and overall company strategy.

A crucial additional challenge is obtaining clarity on the organization’s overall cyber risk management framework. (See Exhibit 1: Boards Need More Information on Cyber Investments.) Olga Botero, Director, Evertec, Inc., and Founding Partner, C&S Customers and Strategy, observed, “There are still many questions unanswered for boards, including:

  • How good is our security program?
  • How do we compare to peers?

There is a big lack of benchmarking on practices.” Anastassia Lauterbach, Director, Dun & Bradstreet, and member of Evolution Partners Advisory Board, summarized it well, “Boards need a set of KPIs for cybersecurity highlighting their company’s

  • unique business model,
  • legacy IT,
  • supplier and partner relationships,
  • and geographical scope.”

[Exhibit 1: Boards Need More Information on Cyber Investments]

Nearly a quarter of boards are dissatisfied with the quality of management-provided information related to cybersecurity because of insufficient transparency, inability to benchmark and difficulty of interpretation.

EFFECTIVE OVERSIGHT IS BUILT ON A COMPREHENSIVE CYBER RISK MANAGEMENT FRAMEWORK

Organizations are maturing from a “harden the shell” approach to a protocol based on understanding and protecting core assets and optimizing resources. This includes the application of risk disciplines to assess and manage risk, including quantification and analytics. (See Exhibit 2: Focus Areas of a Comprehensive Cyber Risk Management Framework.) Quantification shifts the conversation from a technical discussion about threat vectors and system vulnerabilities to one focused on maximizing the return on an organization’s cyber spending and lowering its total cost of risk.

[Exhibit 2: Focus Areas of a Comprehensive Cyber Risk Management Framework]

Directors also emphasized the need to embed the process in an overall cyber risk management framework and culture. “The culture must emphasize openness and learning from mistakes. Culture and cyber risk oversight go hand in hand,” said Anastassia Lauterbach. Employees should be encouraged to flag and highlight potential cyber incidents, such as phishing attacks, as every employee plays a vital role in cyber risk management. Jan Babiak noted, “If every person in the organization doesn’t view themselves as a human firewall, you have a soft underbelly.” Mary Beth Vitale, Director, GEHA and CoBiz Financial, Inc., also noted, “Much of cyber risk mitigation is related to good housekeeping such as timely patching of servers and ongoing employee training and alertness.”

Boards also need to be alert. “Our board undertakes the same cybersecurity training as employees,” noted Wendy Webb, Director, ABM Industries. Other boards are putting cyber updates and visits to security centers on board “offsite” agendas.

THE ROLE OF CYBER INSURANCE

Although the perception of many directors is that cyber insurance provides for limited coverage, the insurance is increasingly viewed as an important component of a cyber risk management framework and can support response and recovery plans. Echoing this sentiment, Geeta Mathur, Director, Motherson Sumi Ltd, IIFL Holdings Ltd, and Tata Communication Transformation Services Ltd., commented, “There is a lack of information and discussion on risk transfer options at the board level. The perception is that it doesn’t cover much, particularly relating to business interruption on account of cyber threats.” Cristina Finocchi Mahne also noted, “Currently, management teams may not have a positive awareness of cyber insurance, but we expect this to rapidly evolve over the short-term.”

Insurance does not release the board or management from the development and execution of a robust risk management plan but it can provide a financial safeguard against costs associated with a cyber event. Cyber insurance coverage should be considered in the context of an overall cyber risk management process and cyber risk appetite.

With a robust analysis, the organization can

  • quantify the price of cyber risk,
  • develop an effective risk mitigation, risk transfer and risk financing strategy,
  • and decide if – and how much – cyber insurance to purchase.

This allows the board to have a robust conversation on the relationship between risk, reward and the cost of mitigation and can also prompt an evaluation of potential consequences by using statistical modeling to assess different damage scenarios.
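
To illustrate what such statistical modeling of damage scenarios might look like in miniature, the sketch below runs a toy frequency/severity Monte Carlo simulation of annual cyber losses. Every parameter is an illustrative assumption rather than calibrated data, but the outputs – an expected annual loss and a tail percentile – are the kinds of quantities a board can use when weighing retention against insurance:

```python
import random

# Monte Carlo sketch of annual cyber loss: event frequency is approximately
# Poisson, severity lognormal; all parameters are illustrative assumptions.
random.seed(42)

def simulate_annual_loss(freq_mean=2.0, sev_mu=11.0, sev_sigma=1.5):
    # Binomial approximation of a Poisson(freq_mean) event count.
    n_events = sum(1 for _ in range(1000) if random.random() < freq_mean / 1000)
    return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(n_events))

losses = sorted(simulate_annual_loss() for _ in range(10_000))
expected = sum(losses) / len(losses)
var_95 = losses[int(0.95 * len(losses))]
print(f"Expected annual loss: {expected:,.0f}")
print(f"95th percentile (a candidate insurance attachment point): {var_95:,.0f}")
```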

CYBER INSURANCE ADOPTION IS INCREASING

The role of insurance in enhancing cyber resilience is increasingly being recognized by policymakers around the world, and the Organisation for Economic Co-operation and Development (OECD) is recommending actions to stimulate cyber insurance adoption.

Globally, it is expected the level of future demand for cyber insurance will depend on the frequency of high-profile cyber incidents as well as the evolving legislative and regulatory environment for privacy protections in many countries. In India, for example, there was a 50% increase in companies buying cybersecurity coverage from 2016 to 2017. Research suggests that only 40% of US boards have reviewed their organization’s cyber insurance coverage in the past 12 months.

LIMITING FINANCIAL LOSSES

In the event of a debilitating attack, cyber insurance and associated services can limit an organization’s financial damage from direct and indirect costs and help accelerate its recovery. (See Exhibit 3: Direct and Indirect Costs Associated with a Cyber Attack.) For example, as a result of the NotPetya attack, one global company reported a decline in operating margins and income, with losses in excess of US$500 million in the last fiscal year. The company noted the costs were driven by

  • investments in enhanced systems in order to prevent future attacks;
  • cost of incentives offered to customers to restore confidence and maintain business relationships;
  • additional costs due to claims for service failures;
  • costs associated with data breach or data loss due to third parties;
  • and “other consequences of which we are not currently aware but may subsequently discover.”

Indeed, the very process of assessing and purchasing cyber insurance can bolster cyber resilience by creating important incentives that drive behavioral change, including:

  • Raising awareness inside the organization on the importance of information security.
  • Fostering a broader dialogue among the cyber risk stakeholders within an organization.
  • Generating an organization-wide approach to ongoing cyber risk management across all parts of the organization.
  • Assessing the strength of cyber defenses, particularly amid a rapidly changing cyber environment.

[Exhibit 3: Direct and Indirect Costs Associated with a Cyber Attack]

Click here to access Marsh’s and WCD’s detailed report


2018 AI predictions – 8 insights to shape your business strategy

  1. AI will impact employers before it impacts employment
  2. AI will come down to earth—and get to work
  3. AI will help answer the big question about data
  4. Functional specialists, not techies, will decide the AI talent race
  5. Cyberattacks will be more powerful because of AI—but so will cyberdefense
  6. Opening AI’s black box will become a priority
  7. Nations will spar over AI
  8. Pressure for responsible AI won’t be on tech companies alone

Key implications

1) AI will impact employers before it impacts employment

As signs grow this year that the great AI jobs disruption will be a false alarm, people are likely to more readily accept AI in the workplace and society. We may hear less about robots taking our jobs, and more about robots making our jobs (and lives) easier. That in turn may lead to a faster uptake of AI than some organizations are expecting.

2) AI will come down to earth—and get to work

Leaders don’t need to adopt AI for AI’s sake. Instead, when they look for the best solution to a business need, AI will increasingly play a role. Does the organization want to automate billing, general accounting and budgeting, and many compliance functions? How about automating parts of procurement, logistics, and customer care? AI will likely be a part of the solution, whether or not users even perceive it.

3) AI will help answer the big question about data

Those enterprises that have already addressed data governance for one application will have a head start on the next initiative. They’ll be on their way to developing best practices for effectively leveraging their data resources and working across organizational boundaries. There’s no substitute for organizations getting their internal data ready to support AI and other innovations, but there is a supplement: Vendors are increasingly taking public sources of data, organizing it into data lakes, and preparing it for AI to use.

4) Functional specialists, not techies, will decide the AI talent race

Enterprises that intend to take full advantage of AI shouldn’t just bid for the most brilliant computer scientists. If they want to get AI up and running quickly, they should move to provide functional specialists with AI literacy. Larger organizations should prioritize by determining where AI is likely to disrupt operations first and start upskilling there.

5) Cyberattacks will be more powerful because of AI—but so will cyberdefense

In other parts of the enterprise, many organizations may choose to go slow on AI, but in cybersecurity there’s no holding back: Attackers will use AI, so defenders will have to use it too. If an organization’s IT department or cybersecurity provider isn’t already using AI, it has to start thinking immediately about AI’s short- and long-term security applications. Sample use cases include distributed denial of service (DDoS) pattern recognition, prioritization of log alerts for escalation and investigation, and risk-based authentication. Since even AI-wary organizations will have to use AI for cybersecurity, cyberdefense will be many enterprises’ first experience with AI. We see this fostering familiarity with AI and willingness to use it elsewhere. A further spur to AI acceptance will come from its hunger for data: The greater AI’s presence and access to data throughout an organization, the better it can defend against cyberthreats. Some organizations are already building out on-premises and cloud-based “threat lakes” that will enable AI capabilities.
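
As a toy illustration of the log-alert prioritization use case, the sketch below scores synthetic alert features with an unsupervised anomaly detector, assuming scikit-learn and NumPy are available. The features and numbers are invented for the example; a production system would engineer far richer signals:

```python
from sklearn.ensemble import IsolationForest
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical alert features: [events/minute, distinct source IPs, bytes out]
normal = rng.normal(loc=[100, 5, 1e6], scale=[20, 2, 2e5], size=(500, 3))
suspect = np.array([[900, 120, 9e6]])          # burst that resembles a DDoS
alerts = np.vstack([normal, suspect])

model = IsolationForest(contamination=0.01, random_state=0).fit(alerts)
scores = model.decision_function(alerts)       # lower = more anomalous
priority_order = np.argsort(scores)            # escalate lowest scores first
print("Top alert to escalate:", alerts[priority_order[0]])
```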

6) Opening AI’s black box will become a priority

We expect organizations to face growing pressure from end users and regulators to deploy AI that is explainable, transparent, and provable. That may require vendors to share some secrets. It may also require users of deep learning and other advanced AI to deploy new techniques that can explain previously incomprehensible AI. Most AI can be made explainable—but at a cost. As with any other process, if every step must be documented and explained, the process becomes slower and may be more expensive. But opening black boxes will reduce certain risks and help establish stakeholder trust.

7) Nations will spar over AI

If China starts to produce leading AI developments, the West may respond. Whether it’s a “Sputnik moment” or a more gradual realization that they’re losing their lead, policymakers may feel pressure to change regulations and provide funding for AI. Expect more countries to issue AI strategies, with implications for companies. It wouldn’t surprise us to see Europe, which is already moving to protect individuals’ data through its General Data Protection Regulation (GDPR), issue policies to foster AI in the region.

8) Pressure for responsible AI won’t be on tech companies alone

As organizations face pressure to design, build, and deploy AI systems that deserve trust and inspire it, many will establish teams and processes to look for bias in data and models and closely monitor ways malicious actors could “trick” algorithms. Governance boards for AI may also be appropriate for many enterprises.


Click here to access PWC’s detailed predictions report


Keeping up with shifting compliance goalposts in 2018 – Five focal areas for investment

Stakeholders across the organization are increasingly seeking greater compliance effectiveness, efficiency, cost cutting, and agility in compliance activities to further compete in the expanding digital and automated world.

Organizations are thus being pushed to continuously improve their compliance activities because, going forward, the integration and automation of compliance activities will be an imperative. To prepare for tomorrow, organizations must invest today.

When positioning your organization for the future, keep in mind the following five areas for investment:

1. Operational integration

Regulators are increasingly spotlighting the need for operational integration within a compliance risk management program, meaning that compliance needs to be integrated in business processes and into people’s performance of their job duties on a day-to-day basis.

When approaching the governance of compliance efforts, a more centralized – or hybrid – approach strengthens the organization’s overall compliance risk management control environment.

2. Automation of compliance activities

The effectiveness of compliance increases when there is integration across an enterprise and successful automation of processes. Compliance leaders are turning toward intelligent automation as an answer for slimming down compliance costs and becoming more nimble and agile in an increasingly competitive world. When intelligent automation is on the table to support possible compliance activities, some important considerations must be made:

  • Compliance program goals for the future
  • Implementation dependencies and interdependencies
  • Determining how automation will and can support the business
  • Enhancing competitiveness and agility in executing its compliance activities

Automating compliance activities can also help improve resource allocation and achieve greater accuracy by shifting repetitive tasks into automation.

3. Accountability

Regulators increasingly expect organizations to implement performance management and compensation programs that encourage prudent risk-taking. In fact, in the KPMG CCO Survey, 55% of CCOs identified “enhancing accountability and compliance responsibilities” as a top-three priority in 2017.

It is essential that disciplinary and incentive protocols be consistently applied to high-level employees. To do so sends a message that seniority and success do not exempt anyone from following the rules.

4. Formalized risk assessments

Regulatory guidelines and expectations released in 2017 set forth specific focal areas that compliance leaders should ensure are covered in their risk assessments.

  • Evaluating the data needs of the compliance program can help the organization migrate to a more data-driven metrics environment in a controlled way.
  • Availability, integrity, and accuracy of data is needed to understand and assess compliance risks enterprise-wide. The use of data quality assessments to evaluate the compliance impact can help address this challenge.
  • Implementing a data governance model to share data across the 3 lines of defense is a good way of reassuring data owners and stakeholders that the data will be used consistent with the agreed upon model.
  • Further integration and aggregation of data is needed to avoid unintentionally ‘underestimating’ compliance risks because of continuous change in the measurement of compliance programs and data & analytics.
  • To maximize the benefits of data & analytics, leading organizations are building analytics directly into their compliance processes in order to identify risk scenarios in real time and to enhance their risk coverage in a cost-effective way (a minimal sketch follows this list).
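
As flagged above, here is a minimal illustration of a rules-based data quality assessment: a handful of invented compliance records is scored against simple completeness and validity rules. The fields and rules are assumptions for the example, not a regulatory template:

```python
# Minimal sketch of a data quality assessment over compliance records.
records = [
    {"id": "TX-1", "amount": 12_500.0, "country": "DE", "customer_id": "C-9"},
    {"id": "TX-2", "amount": None,     "country": "DE", "customer_id": "C-4"},
    {"id": "TX-3", "amount": -50.0,    "country": "",   "customer_id": "C-9"},
]

rules = {
    "amount_present":    lambda r: r["amount"] is not None,
    "amount_positive":   lambda r: r["amount"] is not None and r["amount"] > 0,
    "country_populated": lambda r: bool(r["country"]),
}

def assess(records, rules):
    """Return per-rule pass rates plus the ids of failing records."""
    report = {}
    for name, rule in rules.items():
        failures = [r["id"] for r in records if not rule(r)]
        report[name] = {
            "pass_rate": 1 - len(failures) / len(records),
            "failures": failures,
        }
    return report

for name, result in assess(records, rules).items():
    print(name, result)
```

Run continuously against live feeds, the same pattern turns a periodic quality review into the kind of real-time risk identification described above.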

5. Continuous improvement

Compliance efforts by organizations need to evolve continuously so that the control environment remains firm as risk trends shift, new risks emerge and regulatory expectations change.

Compliance and business leaders must continuously improve their compliance activities in pursuit of greater effectiveness, efficiency, agility and resiliency. By continuously improving, organizations can methodically position themselves for the future.


Click here to access KPMG’s detailed White Paper

The General Data Protection Regulation (GDPR) Primer – What The Insurance Industry Needs To Know, And How To Overcome Cyber Risk Liability As A Result.

SCOPE

The regulation applies if the

  • data controller (organization that collects data from EU residents)
  • or processor (organization that processes data on behalf of data controller e.g. cloud service providers)
  • or the data subject (person)

is based in the EU. Furthermore, the Regulation also applies to organizations based outside the European Union if they collect or process personal data of EU residents. Per the European Commission, “personal data is any information relating to an individual, whether it relates to his or her private, professional or public life. It can be anything from

  • a name,
  • a home address,
  • a photo,
  • an email address,
  • bank details,
  • posts on social networking websites,
  • medical information,
  • or a computer’s IP address.”

The regulation does not apply to the processing of personal data for national security activities or law enforcement; however, the data protection reform package includes a separate Data Protection Directive for the police and criminal justice sector that provides robust rules on personal data exchanges at national, European and international level.

SINGLE SET OF RULES AND ONE-STOP SHOP

A single set of rules will apply to all EU member states. Each member state will establish an independent Supervisory Authority (SA) to hear and investigate complaints, sanction administrative breaches, etc. SAs in each member state will cooperate with other SAs, providing mutual assistance and organizing joint operations. Where a business has multiple establishments in the EU, it will have a single SA as its “lead authority”, based on the location of its “main establishment” (i.e., the place where the main processing activities take place). The lead authority will act as a “one-stop shop” to supervise all the processing activities of that business throughout the EU. A European Data Protection Board (EDPB) will coordinate the SAs.

There are exceptions for data processed in an employment context and for data processed for the purposes of national security, which may still be subject to individual country regulations.

RESPONSIBILITY AND ACCOUNTABILITY

The notice requirements remain and are expanded. They must include the retention time for personal data, and contact information for the data controller and the data protection officer must be provided.

Automated individual decision-making, including profiling (Article 22), is made contestable. Citizens now have the right to question and contest decisions that affect them when those decisions have been made on a purely automated basis.

To be able to demonstrate compliance with the GDPR, the data controller should implement measures which meet the principles of data protection by design and data protection by default. Privacy by Design and by Default require that data protection measures are designed into the development of business processes for products and services. Such measures include pseudonymizing personal data, by the controller, as soon as possible.
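
As an illustration of what pseudonymization by design can look like in code, the sketch below applies keyed hashing: the same input always maps to the same token, so records remain linkable for analysis, while re-identification requires a secret key held separately. It is a minimal example, not a complete GDPR control:

```python
import hmac
import hashlib

# Keyed pseudonymization sketch: deterministic tokens, reversible only by
# whoever holds the secret key, which should live in a separate key vault.
SECRET_KEY = b"replace-with-a-key-from-a-vault"   # illustrative placeholder

def pseudonymize(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

record = {"name": "Jane Doe", "email": "jane@example.com", "premium": 420.0}
safe_record = {
    "name": pseudonymize(record["name"]),
    "email": pseudonymize(record["email"]),
    "premium": record["premium"],               # non-identifying field kept
}
print(safe_record)
```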

It is the responsibility and liability of the data controller to implement effective measures and to be able to demonstrate the compliance of processing activities, even if the processing is carried out by a data processor on behalf of the controller.

Data Protection Impact Assessments must be conducted when specific risks occur to the rights and freedoms of data subjects. Risk assessment and mitigation is required and prior approval of the Data Protection Authorities (DPA) is required for high risks. Data Protection Officers (DPO) are to ensure compliance within organizations.

A DPO must be appointed:

  • for all public authorities, except for courts acting in their judicial capacity;
  • if the core activities of the controller or the processor consist of processing operations which, by their nature, their scope and/or their purposes, require regular and systematic monitoring of data subjects on a large scale;
  • or if those core activities consist of processing on a large scale of special categories of data pursuant to Article 9, or of personal data relating to criminal convictions and offences referred to in Article 10.

[Infographic: GDPR in a Box]


Click here to access Clarium’s detailed paper

Mastering Risk with “Data-Driven GRC”

Overview

The world is changing. The emerging risk landscape in almost every industry vertical has changed. Effective methodologies for managing risk have changed (whatever your perspective:

  • internal audit,
  • external audit/consulting,
  • compliance,
  • enterprise risk management,

or otherwise).

Finally, technology itself has changed, and technology consumers expect to realize more value from technology that is more approachable and available at lower cost.

How are these factors driving change in organizations?

Emerging Risk Landscapes

Risk has the attention of top executives. Risk shifts quickly in an economy where “speed of change” is the true currency of business, and it emerges in entirely new forms in a world where globalization and automation are forcing shifts in the core values and initiatives of global enterprises.

Evolving Governance, Risk, and Compliance Methodologies

Across risk and control oriented functions spanning a variety of

  • audit functions,
  • fraud,
  • compliance,
  • quality management,
  • enterprise risk management,
  • financial control,

and many more, global organizations are acknowledging a need to provide more risk coverage at lower cost (measured in both time and currency), which is driving reinventions of methodology and automation.

Empowerment Through Technology

Gartner, the leading analyst firm in the enterprise IT space, is very clear that the convergence of four forces,

  • Cloud,
  • Mobile,
  • Data,
  • and Social

is driving the empowerment of individuals as they interact with each other and their information through well-designed technology. In most organizations, however, there is no coordinated effort to leverage the organizational changes emerging from these three factors – the shifting risk landscape, evolving GRC methodologies and new technology – in order to develop an integrated approach to mastering risk management. The emerging opportunity is to leverage the change that is occurring to develop new programs; not just for technology, of course, but also for the critical people, methodology and process issues. The goal is to provide senior management with a comprehensive and dynamic view of the effectiveness of how an organization is managing risk and embracing change, set in the context of overall strategic and operational objectives.

Where are organizations heading?

“Data-Driven GRC” represents a consolidation of methodologies, both functional and technological, that dramatically enhances the opportunity to address emerging risk landscapes and, in turn, to maximize the reliability of organizational performance. This paper examines the key opportunities to leverage change – both from a risk and an organizational performance management perspective – to build integrated, data-driven GRC processes that optimize the value of audit and risk management activities, as well as the investments in supporting tools and techniques.


Click here to access ACL’s detailed White Paper