Building your data and analytics strategy

When it comes to being data-driven, organizations run the gamut in maturity. Most believe that data and analytics provide insights. But only one-third of respondents to a TDWI survey said they were truly data-driven, meaning they analyze data to drive decisions and actions.

Successful data-driven businesses foster a collaborative, goal-oriented culture. Leaders believe in data and are governance-oriented. The technology side of the business ensures sound data quality and puts analytics into operation. The data management strategy spans the full analytics life cycle. Data is accessible and usable by multiple people – data engineers and data scientists, business analysts and less-technical business users.

TDWI analyst Fern Halper surveyed analytics and data professionals across industries and identified the following five best practices for becoming a data-driven organization.

1. Build relationships to support collaboration

If IT and business teams don’t collaborate, the organization can’t operate in a data-driven way – so eliminating barriers between groups is crucial. Achieving this can improve market performance and innovation, but collaboration is challenging. Business decision makers often don’t think IT understands the importance of fast results, and conversely, IT doesn’t think the business understands data management priorities. Office politics come into play.

But having clearly defined roles and responsibilities with shared goals across departments encourages teamwork. These roles should include IT/architecture, business and others who manage various tasks on the business and IT sides (from business sponsors to DevOps).

2. Make data accessible and trustworthy

Making data accessible – and ensuring its quality – are key to breaking down barriers and becoming data-driven. Whether it’s a data engineer assembling and transforming data for analysis or a data scientist building a model, everyone benefits from trustworthy data that’s unified and built around a common vocabulary.

As organizations analyze new forms of data – text, sensor, image and streaming – they’ll need to do so across multiple platforms like data warehouses, Hadoop, streaming platforms and data lakes. Such systems may reside on-site or in the cloud. TDWI recommends several best practices to help:

  • Establish a data integration and pipeline environment with tools that provide federated access and join data across sources. It helps to have point-and-click interfaces for building workflows, and tools that support ETL, ELT and advanced specifications like conditional logic or parallel jobs.
  • Manage, reuse and govern metadata – that is, the data about your data. This includes size, author, database column structure, security and more.
  • Provide reusable data quality tools with built-in analytics capabilities that can profile data for accuracy, completeness and ambiguity.
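The profiling idea in the last bullet can be sketched in a few lines. This is an illustrative example, not any vendor's tool: it checks a small customer data set for completeness (missing values) and uniqueness (duplicate rows) using only the Python standard library; the field names and sample records are assumptions.

```python
# Minimal data-profiling sketch: count missing values per field and
# detect exact-duplicate rows in a list of record dicts.
from collections import Counter

def profile(records, fields):
    """Return per-field missing-value counts and the number of duplicate rows."""
    missing = {f: sum(1 for r in records if not r.get(f)) for f in fields}
    row_keys = [tuple(r.get(f) for f in fields) for r in records]
    duplicates = sum(n - 1 for n in Counter(row_keys).values() if n > 1)
    return {"missing": missing, "duplicates": duplicates}

customers = [
    {"id": "1", "email": "a@example.com", "country": "US"},
    {"id": "2", "email": "", "country": "US"},           # missing email
    {"id": "1", "email": "a@example.com", "country": "US"},  # duplicate row
]
report = profile(customers, ["id", "email", "country"])
# report["missing"]["email"] == 1 and report["duplicates"] == 1
```

A real data quality tool adds accuracy checks against reference data and fuzzy matching, but the core of profiling is exactly this kind of systematic counting.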

3. Provide tools to help the business work with data

From marketing and finance to operations and HR, business teams need self-service tools to speed and simplify data preparation and analytics tasks. Such tools may include built-in, advanced techniques like machine learning, and many work across the analytics life cycle – from data collection and profiling to monitoring analytical models in production.

These “smart” tools feature three capabilities:

  • Automation helps during model building and model management processes. Data preparation tools often use machine learning and natural language processing to understand semantics and accelerate data matching.
  • Reusability pulls from what has already been created for data management and analytics. For example, a source-to-target data pipeline workflow can be saved and embedded into an analytics workflow to create a predictive model.
  • Explainability helps business users understand the output when, for example, they’ve built a predictive model using an automated tool. Tools that explain what they’ve done are ideal for a data-driven company.

4. Consider a cohesive platform that supports collaboration and analytics

As organizations mature analytically, it’s important for their platform to support multiple roles in a common interface with a unified data infrastructure. This strengthens collaboration and makes it easier for people to do their jobs.

For example, a business analyst can use a discussion space to collaborate with a data scientist while building a predictive model, and during testing. The data scientist can use a notebook environment to test and validate the model as it’s versioned and metadata is captured. The data scientist can then notify the DevOps team when the model is ready for production – and they can use the platform’s tools to continually monitor the model.

5. Use modern governance technologies and practices

Governance – that is, rules and policies that prescribe how organizations protect and manage their data and analytics – is critical in learning to trust data and become data-driven. But TDWI research indicates that one-third of organizations don’t govern their data at all. Instead, many focus on security and privacy rules. TDWI’s research also indicates that fewer than 20 percent of organizations do any type of analytics governance, which includes vetting and monitoring models in production.

Decisions based on poor data – or models that have degraded – can have a negative effect on the business. As more people across an organization access data and build models, and as new types of data and technologies emerge (big data, cloud, stream mining), data governance practices need to evolve. TDWI recommends three features of governance software that can strengthen your data and analytics governance:

  • Data catalogs, glossaries and dictionaries. These tools often include sophisticated tagging and automated procedures for building and keeping catalogs up to date – as well as discovering metadata from existing data sets.
  • Data lineage. Data lineage combined with metadata helps organizations understand where data originated and track how it was changed and transformed.
  • Model management. Ongoing model tracking is crucial for analytics governance. Many tools automate model monitoring, schedule updates to keep models current and send alerts when a model is degrading.
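The model-monitoring idea in the last bullet reduces to a simple comparison: watch a model's recent performance and alert when it drops too far below its validated baseline. The sketch below is a hypothetical illustration; the tolerance and sample figures are assumptions, not taken from any particular tool.

```python
# Hypothetical model-degradation check: alert when recent accuracy falls
# more than a tolerance below the accuracy measured at validation time.
def is_degrading(baseline_accuracy, recent_outcomes, tolerance=0.05):
    """recent_outcomes: list of booleans (was each prediction correct?)."""
    if not recent_outcomes:
        return False  # no evidence yet, so no alert
    recent_accuracy = sum(recent_outcomes) / len(recent_outcomes)
    return recent_accuracy < baseline_accuracy - tolerance

# A model validated at 90% accuracy, now correct on 33 of its last 40 cases:
alert = is_degrading(0.90, [True] * 33 + [False] * 7)  # 0.825 < 0.85, so alert
```

Production tools wrap this logic in scheduling, alert routing and automated retraining, but the governance principle is the same: a model's fitness is checked continuously, not just at deployment.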

In the future, organizations may move beyond traditional governance council models to new approaches like agile governance, embedded governance or crowdsourced governance.

But involving both IT and business stakeholders in the decision-making process – including data owners, data stewards and others – will always be key to robust governance at data-driven organizations.

[SAS excerpt 1]

There’s no single blueprint for beginning a data analytics project – never mind ensuring a successful one.

However, the following questions help individuals and organizations frame their data analytics projects in instructive ways. Put differently, think of these questions as more of a guide than a comprehensive how-to list.

1. Is this your organization’s first attempt at a data analytics project?

When it comes to data analytics projects, culture matters. Consider Netflix, Google and Amazon. Organizations like these have successfully completed data analytics projects. Even better, they have built analytics into their cultures and become data-driven businesses.

As a result, they will do better than neophytes. Fortunately, first-timers are not destined for failure. They should just temper their expectations.

2. What business problem do you think you’re trying to solve?

This might seem obvious, but plenty of folks fail to ask it before jumping in. Note how I qualified the question with “do you think.” Sometimes the root cause of a problem isn’t what we first believe it to be.

In any case, you don’t need to solve the entire problem all at once by trying to boil the ocean. In fact, you shouldn’t take this approach. Project methodologies (like agile) allow organizations to take an iterative approach and embrace the power of small batches.

3. What types and sources of data are available to you?

Most if not all organizations store vast amounts of enterprise data. Looking at internal databases and data sources makes sense. Don’t make the mistake of believing, though, that the discussion ends there.

External data sources in the form of open data sets (such as data.gov) continue to proliferate. There are easy methods for retrieving data from the web and getting it back in a usable format – scraping, for example. This tactic can work well in academic environments, but scraping could be a sign of data immaturity for businesses. It’s always best to get your hands on the original data source when possible.
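When an open data portal publishes files directly, retrieval can be as simple as a download and a parse – no scraping needed. The sketch below uses only the Python standard library; the URL is a hypothetical placeholder for a data.gov-style CSV endpoint, not a real dataset.

```python
# Fetching an open data set and turning it into usable records.
import csv
import io
import urllib.request

def parse_csv(text):
    """Turn raw CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

def fetch_open_data(url):
    with urllib.request.urlopen(url) as resp:
        return parse_csv(resp.read().decode("utf-8"))

# records = fetch_open_data("https://example.gov/datasets/permits.csv")

# The parsing step, shown on inline sample data:
sample = "city,permits\nAustin,120\nBoston,95\n"
records = parse_csv(sample)
# records[0] == {"city": "Austin", "permits": "120"}
```

Going to the published file like this is exactly the “original data source” preference mentioned above: it avoids the fragility and maturity concerns that come with scraping rendered web pages.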

Caveat: Just because the organization stores it doesn’t mean you’ll be able to easily access it. Pernicious internal politics stifle many an analytics endeavor.

4. What types and sources of data are you allowed to use?

With all the hubbub over privacy and security these days, foolish is the soul who fails to ask this question. As some retail executives have learned in recent years, a company can abide by the law completely and still make people feel decidedly icky about the privacy of their purchases. Or, consider a health care organization – it may not technically violate the Health Insurance Portability and Accountability Act of 1996 (HIPAA), yet it could still raise privacy concerns.

Another example is the GDPR. Adhering to this regulation means that organizations won’t necessarily be able to use personal data they previously could use – at least not in the same way.

5. What is the quality of your organization’s data?

Common mistakes here include assuming your data is complete, accurate and unique (read: nonduplicate). During my consulting career, I could count on one hand the number of times a client handed me a “perfect” data set. While it’s important to cleanse your data, you don’t need pristine data just to get started. As Voltaire said, “Perfect is the enemy of good.”
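The “nonduplicate” assumption fails in subtle ways: exact matching misses records that differ only in case or stray whitespace. A minimal sketch of near-duplicate detection, with the key fields and sample data as illustrative assumptions:

```python
# Near-duplicate detection by normalizing key fields before comparing,
# since exact matching misses "Jane Doe " vs "jane doe".
def normalize(record):
    return tuple(str(record.get(k, "")).strip().lower() for k in ("name", "email"))

def find_duplicates(records):
    """Return (first_index, duplicate_index) pairs of matching records."""
    seen, dupes = {}, []
    for i, r in enumerate(records):
        key = normalize(r)
        if key in seen:
            dupes.append((seen[key], i))
        else:
            seen[key] = i
    return dupes

people = [
    {"name": "Jane Doe", "email": "JANE@EXAMPLE.COM"},
    {"name": " jane doe ", "email": "jane@example.com"},
]
# find_duplicates(people) == [(0, 1)]
```

Even a crude pass like this surfaces enough issues to size the cleansing effort – which is all you need to get started.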

6. What tools are available to extract, clean, analyze and present the data?

This isn’t the 1990s, so please don’t tell me that your analytic efforts are limited to spreadsheets. Sure, Microsoft Excel works with structured data – if the data set isn’t all that big. Make no mistake, though: Everyone’s favorite spreadsheet program suffers from plenty of limitations, in areas like:

  • Handling semistructured and unstructured data.
  • Tracking changes/version control.
  • Dealing with size restrictions.
  • Ensuring governance.
  • Providing security.

For now, suffice it to say that if you’re trying to analyze large, complex data sets, there are many tools well worth exploring. The same holds true for visualization. Never before have we seen such an array of powerful, affordable and user-friendly tools designed to present data in interesting ways.

Caveat 1: While software vendors often ape each other’s features, don’t assume that each application can do everything that the others can.

Caveat 2: With open source software, remember that “free” software could be compared to a “free” puppy. To be direct: Even with open source software, expect to spend some time and effort on training and education.

7. Do your employees possess the right skills to work on the data analytics project?

The database administrator may well be a whiz at SQL. That doesn’t mean, though, that she can easily analyze gigabytes of unstructured data. Many of my students need to learn new programs over the course of the semester, and the same holds true for employees. In fact, organizations often find that they need to:

  • Provide training for existing employees.
  • Hire new employees.
  • Contract consultants.
  • Post the project on sites such as Kaggle.
  • All of the above.

Don’t assume that your employees can pick up new applications and frameworks 15 minutes at a time every other week. They can’t.

8. What will be done with the results of your analysis?

Consider an example from my own consulting work: a company routinely spent millions of dollars recruiting MBAs from Ivy League schools only to see them leave within two years. Rutgers MBAs, for their part, stayed much longer and performed much better.

Despite my findings, the company continued to press on. It refused to stop going to Harvard, Cornell, etc. because of vanity. In his own words, the head of recruiting just “liked” going to these schools, data be damned.

Food for thought: What will an individual, group, department or organization do with keen new insights from your data analytics projects? Will the result be real action? Or will a report just sit in someone’s inbox?

9. What types of resistance can you expect?

You might think that people always and willingly embrace the results of data-oriented analysis. And you’d be spectacularly wrong.

Case in point: Major League Baseball (MLB) umpires get close ball and strike calls wrong more often than you’d think. Why wouldn’t they want to improve their performance when presented with objective data? It turns out that many don’t. In some cases, human nature makes people want to reject data and analytics that contrast with their world views. Years ago, before the subscription model became wildly popular, some Blockbuster executives didn’t want to believe that more convenient ways to watch movies existed.

Caveat: Ignore the power of internal resistance at your own peril.

10. What are the costs of inaction?

Sure, this is a high-level query and the answers depend on myriad factors.

For instance, a pharma company with years of patent protection will respond differently than a startup with a novel idea and competitors nipping at its heels. Interesting subquestions here include:

  • Do the data analytics projects merely confirm what we already know?
  • Do the numbers show anything conclusive?
  • Could we be capturing false positives and false negatives?

Think about these questions before undertaking data analytics projects. Don’t take the queries above as gospel. By and large, though, experience proves that asking these questions frames the problem well and sets the organization up for success – or at least minimizes the chance of a disaster.

[SAS excerpt 2]

Most organizations understand the importance of data governance in concept. But they may not realize all the multifaceted, positive impacts of applying good governance practices to data across the organization. For example, ensuring that your sales and marketing analytics relies on measurably trustworthy customer data can lead to increased revenue and shorter sales cycles. And having a solid governance program to ensure your enterprise data meets regulatory requirements could help you avoid penalties.

Companies that start data governance programs are motivated by a variety of factors, internal and external. Regardless of the reasons, two common themes underlie most data governance activities: the desire for high-quality customer information, and the need to adhere to requirements for protecting and securing that data.

What’s the best way to ensure you have accurate customer data that meets stringent requirements for privacy and security?

For obvious reasons, companies exert significant effort using tools and third-party data sets to enforce the consistency and accuracy of customer data. But there will always be situations in which the managed data set cannot be adequately synchronized and made consistent with “real-world” data. Even strictly defined and enforced internal data policies can’t prevent inaccuracies from creeping into the environment.

[SAS excerpt 3]

Why should you move beyond a conventional approach to data governance?

When it comes to customer data, the most accurate sources for validation are the customers themselves! In essence, every customer owns his or her information, and is the most reliable authority for ensuring its quality, consistency and currency. So why not develop policies and methods that empower the actual owners to be accountable for their data?

Doing this means extending the concept of data governance to the customers and defining data policies that engage them to take an active role in overseeing their own data quality. The starting point for this process fits within the data governance framework – define the policies for customer data validation.

A good template for formulating those policies can be adapted from existing regulations regarding data protection. This approach will assure customers that your organization is serious about protecting their data’s security and integrity, and it will encourage them to actively participate in that effort.

Examples of customer data engagement policies

  • Data protection defines the levels of protection the organization will use to protect the customer’s data, as well as what responsibilities the organization will assume in the event of a breach. The protection will be enforced in relation to the customer’s selected preferences (which presumes that customers have reviewed and approved their profiles).
  • Data access control and security define the protocols used to control access to customer data and the criteria for authenticating users and authorizing them for particular uses.
  • Data use describes the ways the organization will use customer data.
  • Customer opt-in describes the customers’ options for setting up the ways the organization can use their data.
  • Customer data review asserts that customers have the right to review their data profiles and to verify the integrity, consistency and currency of their data. The policy also specifies the time frame in which customers are expected to do this.
  • Customer data update describes how customers can alert the organization to changes in their data profiles. It allows customers to ensure their data’s validity, integrity, consistency and currency.
  • Right-to-use defines the organization’s right to use the data as described in the data use policy (and based on the customer’s selected profile options). This policy may also set a time frame associated with the right-to-use based on the elapsed time since the customer’s last date of profile verification.

The goal of such policies is to establish an agreement between the customer and the organization that basically says the organization will protect the customer’s data and only use it in ways the customer has authorized – in return for the customer ensuring the data’s accuracy and specifying preferences for its use. This model empowers customers to take ownership of their data profile and assume responsibility for its quality.
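The right-to-use policy above hinges on one mechanical check: the organization's right lapses if the customer hasn't re-verified their profile within an agreed window. A minimal sketch, where the 365-day window is an illustrative assumption rather than a prescribed value:

```python
# Right-to-use check: usage rights lapse when the customer's last profile
# verification falls outside the agreed window.
from datetime import date, timedelta

def right_to_use_valid(last_verified, today, window_days=365):
    return today - last_verified <= timedelta(days=window_days)

ok = right_to_use_valid(date(2024, 1, 1), date(2024, 6, 1))      # True: 152 days ago
lapsed = right_to_use_valid(date(2022, 1, 1), date(2024, 6, 1))  # False: well past the window
```

In practice this check would gate downstream data use and trigger a re-verification prompt to the customer, closing the loop between the right-to-use and customer data review policies.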

Clearly articulating each party’s responsibilities for data stewardship benefits both the organization and the customer by ensuring that customer data is high-quality and properly maintained. Better yet, recognize that the value goes beyond improved revenues or better compliance.

Empowering customers to take control and ownership of their data just might be enough to motivate self-validation.

Click here to access SAS’ detailed analysis

From Risk to Strategy: Embracing the Technology Shift

The role of the risk manager has always been to understand and manage threats to a given business. In theory, this involves a very broad mandate to capture all possible risks, both current and future. In practice, however, some risk managers are assigned to narrower, siloed roles, with tasks that can seem somewhat disconnected from key business objectives.

Amidst a changing risk landscape and increasing availability of technological tools that enable risk managers to do more, there is both a need and an opportunity to move toward that broader risk manager role. This need for change – not only in the risk manager’s role, but also in the broader approach to organizational risk management and technological change – is driven by five factors.

[Marsh Exhibit 1]

The rapid pace of change has many C-suite members questioning what will happen to their business models. Research shows that 73 percent of executives predict significant industry disruption in the next three years (up from 26 percent in 2018). In this challenging environment, risk managers have a great opportunity to demonstrate their relevance.

USING NEW TOOLS TO MANAGE RISKS

Emerging technologies present compelling opportunities for the field of risk management. As discussed in our 2017 report, the three levers of data, analytics, and processes allow risk professionals a framework to consider technology initiatives and their potential gains. Emerging tools can support risk managers in delivering a more dynamic, in-depth view of risks in addition to potential cost-savings.

However, this year’s survey shows that across Asia-Pacific, risk managers still feel they severely lack knowledge of emerging technologies across the business. Confidence scores were low in all but one category, risk management information systems (RMIS). These scores were only marginally higher for respondents in highly regulated industries (financial services and energy utilities), underscoring the need for further training across all industries.

[Marsh Exhibit 3]

When it comes to technology, risk managers should aim for “digital fluency” – a level of familiarity that allows them to

  • first determine how technologies can help address different risk areas,
  • and then understand the implications of doing so.

They need not understand the inner workings of various technologies, as their niche should remain aligned with their core expertise: applying risk technical skills, principles, and practices.

CULTIVATING A “DIGITAL-FIRST” MIND-SET

Successful technology adoption does not only present a technical skills challenge. If risk function digitalization is to be effective, risk managers must champion a cultural shift to a “digital-first” mindset across the organization, where all stakeholders develop a habit of thinking about how technology can be used for organizational benefit.

For example, the risk manager of the future will be looking to glean greater insights using increasingly advanced analytics capabilities. To do this, they will need to actively encourage their organization

  • to collect more data,
  • to use their data more effectively,
  • and to conduct more accurate and comprehensive analyses.

Underlying the risk manager’s digital-first mind-set will be three supporting mentalities:

1. The first of these is the perception of technology as an opportunity rather than a threat. Some understandable anxiety exists on this topic, since technology vendors often portray technology as a means of eliminating human input and labor. This framing neglects the gains in effectiveness and efficiency that allow risk managers to improve their judgment and decision making, and spend their time on more value-adding activities. In addition, the success of digital risk transformations will depend on the risk professionals who understand the tasks being digitalized; these professionals will need to be brought into the design and implementation process right from the start. After all, as the Japanese saying goes, “it is workers who give wisdom to the machines.” Fortunately, 87 percent of surveyed PARIMA members indicated that automating parts of the risk manager’s job to allow greater efficiency represents an opportunity for the risk function. Furthermore, 63 percent of respondents indicated that this was not merely a small opportunity, but a significant one (Exhibit 6). This positive outlook makes an even stronger statement than findings from an earlier global study in which 72 percent of employees said they see technology as a benefit to their work.

2. The second supporting mentality will be a habit of looking for ways in which technology can be used for benefit across the organization, not just within the risk function but also in business processes and client solutions. Concretely, the risk manager can embody this culture by adopting a data-driven approach, whereby they consider:

  • How existing organizational data sources can be better leveraged for risk management
  • How new data sources – both internal and external – can be explored
  • How data accuracy and completeness can be improved

“Risk managers can also benefit from considering outside-the-box use cases, as well as keeping up with the technologies used by competitors,” adds Keith Xia, Chief Risk Officer of OneHealth Healthcare in China.

This is an illustrative rather than comprehensive list, as a data-driven approach – and more broadly, a digital mind-set – is fundamentally about a new way of thinking. If risk managers can grow accustomed to reflecting on technologies’ potential applications, they will be able to pre-emptively spot opportunities, as well as identify and resolve issues such as data gaps.

3. All of this will be complemented by a third mentality: the willingness to accept change, experiment, and learn, such as in testing new data collection and analysis methods. Propelled by cultural transformation and shifting mind-sets, risk managers will need to learn to feel comfortable with – and ultimately be in the driver’s seat for – the trial, error, and adjustment that accompanies digitalization.

MANAGING THE NEW RISKS FROM EMERGING TECHNOLOGIES

The same technological developments and tools that are enabling organizations to transform and advance are also introducing their own set of potential threats.

Our survey shows the PARIMA community is aware of this dynamic, with 96 percent of surveyed members expecting that emerging technologies will introduce some – if not substantial – new risks in the next five years.

The following exhibit gives a further breakdown of views from this 96 percent of respondents, along with the perceived sufficiency of their existing frameworks. These risks are evolving in an environment where there are already questions about the relevance and sufficiency of risk identification frameworks. Rapid shifts in technology have added complexity, and individual teams are using risk taxonomies with inconsistent methodologies – further highlighting the challenges risk managers face in managing their responses to new risk types.

[Marsh Exhibit 9]

To assess how new technology in any part of the organization might introduce new risks, consider the following checklist:

HIGH-LEVEL RISK CHECKLIST FOR EMERGING TECHNOLOGY

  1. Does the use of this technology cut across existing risk types (for example, AI risk presents a composite of technology risk, cyber risk, information security risk, and so on depending on the use case and application)? If so, has my organization designated this risk as a new, distinct category of risk with a clear definition and risk appetite?
  2. Is use of this technology aligned to my company’s strategic ambitions and risk appetite? Are the cost and ease of implementation feasible given my company’s circumstances?
  3. Can this technology’s implications be sufficiently explained and understood within my company (e.g. what systems would rely on it)? Would our use of this technology make sense to a customer?
  4. Is there a clear view of how this technology will be supported and maintained internally, for example, with a digitally fluent workforce and designated second line owner for risks introduced by this technology (e.g. additional cyber risk)?
  5. Has my company considered the business continuity risks associated with this technology malfunctioning?
  6. Am I confident that there are minimal data quality or management risks? Do I have the high quality, large-scale data necessary for advanced analytics? Would customers perceive use of their data as reasonable, and will this data remain private, complete, and safe from cyberattacks?
  7. Am I aware of any potential knock-on effects or reputational risks – for example, through exposure to third (and fourth) parties that may not act in adherence to my values, or through invasive uses of private customer information?
  8. Does my organization understand all implications for accounting, tax, and any other financial reporting obligations?
  9. Are there any additional compliance or regulatory implications of using this technology? Do I need to engage with regulators or seek expert advice?
  10. For financial services companies: Could I explain any algorithms in use to a customer, and would they perceive them to be fair? Am I confident that this technology will not violate sanctions or support crime (for example, fraud, money laundering, terrorism finance)?

SECURING A MORE TECHNOLOGY-CONVERSANT RISK WORKFORCE

As risk managers focus on digitalizing their function, it is important that organizations support this with an equally deliberate approach to their people strategy. This is for two reasons, as Kate Bravery, Global Solutions Leader, Career at Mercer, explains: “First, each technological leap requires an equivalent revolution in talent; and second, talent typically becomes more important following disruption.”

While upskilling the current workforce is a positive step, as addressed before, organizations must also consider a more holistic talent management approach. Risk managers understand this imperative, with survey respondents indicating a strong desire to increase technology expertise in their function within the next five years.

Yet, little progress has been made in adding these skills to the risk function, with a significant gap persisting between aspirations and the reality on the ground. In both the 2017 and 2019 surveys, the number of risk managers hoping to recruit technology experts has been at least 4.5 times the number of teams currently possessing those skills.

[Marsh Exhibit 15]

EMBEDDING RISK CULTURE THROUGHOUT THE ORGANIZATION

Our survey found that a lack of risk management thinking in other parts of the organization is the biggest barrier the risk function faces in working with other business units. This is a crucial and somewhat alarming finding – but new technologies may be able to help.

[Marsh Exhibit 19]

As technology allows for increasingly accurate, relevant, and holistic risk measures, organizations should find it easier to develop risk-based KPIs and incentives that can help employees throughout the business incorporate a risk-aware approach into their daily activities.

From an organizational perspective, a first step would be to describe risk limits and risk tolerance in a language that all stakeholders can relate to, such as potential losses. Organizations can then cascade these firm-wide risk concepts down to operational business units, translating risk language into tangible and relevant incentives that encourage behavior consistent with firm values. Research shows that employees in Asia want this linkage, citing a desire to better align their individual goals with business goals.

The question thus becomes how risk processes can be made an easy, intuitive part of employee routines. It is also important to consider KPIs for the risk team itself as a way of encouraging desirable behavior and further embedding a risk-aware culture. Already a majority of surveyed PARIMA members use some form of KPIs in their teams (81 percent), and the fact that reporting performance is the most popular service level measure supports the expectation that PARIMA members actively keep their organization informed.

[Marsh Exhibit 21]

At the same time, these survey responses also raise a number of questions. Forty percent of organizations indicate that they measure reporting performance, but far fewer are measuring accuracy (15 percent) or timeliness (16 percent) of risk analytics – which are necessary to achieve improved reporting performance. Moreover, the most-utilized KPIs in this year’s survey tended to be tangible measures around cost, which make it difficult to distinguish a mature risk function from a lucky one.

SUPPORTING TRANSFORMATIONAL CHANGE PROGRAMS

Even where individual risk managers’ desire to digitalize complements organizational intentions, barriers still exist that can leave risk managers using basic tools. In 2017, cost and budgeting concerns were the single standout barrier to risk function digitalization, chosen by 67 percent of respondents, well clear of second-place human capital concerns at 18 percent. This year’s survey responses were much closer, with a host of ongoing barriers, six of which were cited by more than 40 percent of respondents.

Marsh Ex 22

Implementing the nuts and bolts of digitalization will require a holistic transformation program to address all these barriers. That is not to say that initiatives must necessarily be massive in scale. In fact, well-designed initiatives targeting specific business problems can be a great way to demonstrate success that can then be replicated elsewhere to boost innovation.

Transformational change is inherently difficult, particularly where it spans both technological and human dimensions. Many large organizations have relied solely on IT teams for their “digital transformation” initiatives. This approach has had limited success, as such teams are usually designed to deliver very specific business functionalities rather than to lead change initiatives. If risk managers are to realize the benefits of such transformation, it is incumbent on them to take a more active role in influencing and leading transformation programs.

Click here to access Marsh’s and PARIMA’s detailed report

The Future of CFO’s Business Partnering

BP² – the next generation of Business Partner

The role of business partner has become almost ubiquitous in organizations today. According to respondents of this survey, 88% of senior finance professionals already consider themselves to be business partners. This key finding suggests that the silo mentality is breaking down and, at last, departments and functions are joining forces to teach and learn from each other to deliver better performance. But the scope of the role, how it is defined, and how senior finance executives characterize their own business partnering are all open to interpretation. And many of these ideas are still hamstrung by traditional finance behaviors and aspirations, so the next generation of business partners – agents of change and innovation – languishes at the bottom of the priority list.

The scope of business partnering

According to the survey, most CFOs see business partnering as a blend of traditional finance and commercial support, while innovation and change are more likely to be seen as outside the scope of business partnering. 57% of senior finance executives strongly agree that a business partner should challenge budgets, plans and forecasts. Being involved in strategy and development followed closely behind with 56% strongly agreeing that it forms part of the scope of business partnering, while influencing commercial decisions was a close third.

The pattern that emerges from the survey is that traditional and commercial elements are given more weight within the scope of business partnering than being a catalyst for change and innovation. This more radical change agenda is only shared by around 36% of respondents, indicating that finance professionals still largely see their role in traditional or commercial terms. They have yet to recognize the finance function’s role in the next generation of business partnering, which can be

  • the catalyst for innovation in business models,
  • for process improvements
  • and for organizational change.

Traditional and commercial business partners aren’t necessarily less important than change agents, but the latter have the potential to add the most value in the longer term, and should at least be in the purview of progressive CFOs who want to drive change and encourage growth.

Unfortunately, this is not an easy thing to change. Finding time for any business partnering can be a struggle, but CFOs spend disproportionately less time on activities that bring about change than on traditional business partnering roles. Without investing time and effort into it, CFOs will struggle to fulfill their role as the next generation of business partner.

Overall, 45% of CFOs struggle to make time for any business partnering, so it won’t come as a surprise that, ultimately, only 57% of CFOs believe their finance team’s efforts as business partners are well regarded by the operational functions.

The four personas of business partnering

Ask a room full of CFOs what business partnering means and you’ll get a room full of answers, each one influenced by their personal journey through the changing business landscape. This variability means the process is being enacted in many different ways. FSN, the survey authors, did not seek to define business partnering. Instead, the survey asked respondents to define business partnering in their own words, and the 366 detailed answers were all different. But underlying the diversity were patterns of emphasis that defined four ‘personas’ or styles of business partnering, each exerting its own influence on the growth of the business over time.

A detailed analysis of the definitions and the frequency of occurrence of key phrases and expressions allowed us to plot these personas, their relative weight, together with their likely impact on growth over time.

FSN1

The size of the bubbles denotes the frequency (number) of times an attribute of business partnering was referenced in the definitions and these were plotted in terms of their likely contribution to growth in the short to long term.

The greatest number of comments by far coalesced around the bottom left-hand quadrant denoting a finance-centric focus on short to medium term outcomes, i.e., the traditional finance business partner. But there was an encouraging drift upwards and rightwards towards the quadrant denoting what we call the next generation of business partner, “BP²” (BP Squared), a super-charged business partner using his or her wide experience, purview and remit to help bring about change in the organization, for example, new business models, new processes and innovative methods of organizational deployment.

Relatively few of the 383 business partners offering definitions of a business partner concerned themselves with top-line growth, i.e., with involvement in commercial sales negotiations or the sales pipeline – a critical part of influencing growth.

Finally, surprisingly few finance business partners immersed themselves in strategy development or saw their role as helping to ensure strategic alignment. It suggests that the ongoing transition of the CFO’s role from financial steward to strategic advisor is not as advanced as some would suggest.

Financial Performance Drivers

Most CFOs and senior finance executives define the role of the business partner in traditional financial terms. They are there to explain and illuminate the financial operations, be a trusted, safe pair of hands that manages business risk, and provide some operational support. The focus for these CFOs is on communicating a clear understanding of the financial imperative in order to steer the performance of the business prudently.

This ideal reflects the status quo and perpetuates the traditional view of finance and the role of the CFO. It’s one where the finance function remains a static force, opening up only so far as to allow the rest of the business to see how it functions and become more accountable to it. While it is obviously necessary for other functions to understand and support a financial strategy, this approach holds the business back as a whole. Finance-centric business partnering provides some short-term outcomes but does little to promote more than pedestrian growth. It’s better than nothing, but it’s far from the best.

Top-Line Drivers

In the upper quadrant, top line drivers focus on driving growth and sales with a collaborative approach to commercial decision-making. This style of business partnering can have a positive effect on earnings, as improvements in commercial operations and the management of the sales pipeline are translated into revenue.

But while top line drivers are linked to higher growth than finance-focused business partners, the outcome tends to be only short term. The key issue for CFOs is that very few of them even allude to commercial partnerships when defining the scope of business partnering. They ignore the potential for the finance function to help improve commercial outcomes, like sales, the collection of debt or even a change in business models.

Strategic Aligners

Those CFOs who focus on strategic alignment in their business partnering approach tend to see longer term results. They use analysis and strategy to drive decision-making, bringing business goals into focus through partnerships and collaborative working. This helps to strengthen the foundation of the business in the long term, but it isn’t the most effective approach for driving substantial growth. And again, few CFOs and senior finance executives cited strategy development and analysis in their definition of business partnering.

Catalysts for change

The CFOs who were the most progressive and visionary in their definition of business partnering use the role as a catalyst for change. They challenge their colleagues, influence the strategic direction of the business, and generate momentum through change and innovation from the very heart of the finance function. These finance executives get involved in decision-making, and understand the need to influence, advise and challenge in order to promote change. This definition is the one that translates into sustained high growth.

The four personas are not mutually exclusive. Some CFOs view business partnering as a combination of some or all of these attributes. But the preponderance of opinion is clustered around the traditional view of finance, while very little is to do with being a catalyst for change.

How do CFOs characterize their finance function?

However CFOs choose to define the role of business partnering, each function has its own character and style. According to the survey, 17% have a finance-centric approach to business partnering, limiting the relationship to financial stewardship and performance. A further 18% have to settle for a light-touch approach where they are occasionally invited to become involved in commercial decision-making. This means 35% of senior finance executives are barely involved in any commercial decision-making at all.

More positively, the survey showed that 46% are considered to be trusted advisors, and are sought out by operational business teams for opinions before they make big commercial or financial decisions.

But at the apex of the business partnering journey are the change agents, who make up a paltry 19% of the senior finance executives surveyed. These forward thinkers are frequently catalysts for change, suggesting new business processes and areas where the company can benefit from innovation. This is the next stage in the evolution of both the role of the modern CFO and the role of the finance function at the heart of business innovation. We call CFOs in this category BP² (BP Squared) to denote the huge distance between these forward-thinking individuals and the rest of the pack.

Measuring up

Business partnering can be a subtle yet effective process, but it’s not easy to measure. 57% of organizations have no agreed way of measuring the success of business partnering, and 34% don’t think it’s possible to separate and quantify the value added through this collaboration.

Yet CFOs believe there is a strong correlation between business partnering and profitability – with 91% of respondents saying their business partnering efforts significantly add to profitability. While it’s true that some of the outcomes of business partnering are intangible, it is still important to be able to make a direct connection between it and improved performance; otherwise, ineffective efforts may simply be allowed to continue.

One solution is to use 360 degree appraisals, drawing in a wider gamut of feedback including business partners and internal customers to ascertain the effectiveness of the process. Finance business partnering can also be quantified if there are business model changes, like the move from product sales to services, which require a generous underpinning of financial input to be carried out effectively.

Business partnering offers companies a way to inexpensively

  • pool all their best resources to generate ideas,
  • spark innovation
  • and positively add value to the business.

First, CFOs need to recognize the importance of business partnering, widen their idea of how it can add value, and then actually set aside enough time to become agents of change and growth.

Data unlocks business partnering

Data is the most valuable organizational currency in today’s competitive business environment. Most companies are still in the process of working out the best method to collect, collate and use the tsunami of data available to them in order to generate insight. Some organizations are just at the start of their data journey, others are more advanced, and our research confirms that their data profile will make a significant difference to how well their business partnering works.

FSN2

The survey asked how well respondents’ data supported the role of business partnering, and the responses showed that 18% were data overloaded. This means these business partners have too many conflicting data sources and poor data governance, leaving them with little usable data to support the partnering process.

26% were data constrained, meaning they cannot get hold of the data they need to drive insight and decision making.

And a further 34% were technology constrained, muddling through without the tech savvy resources or tools to fully exploit the data they already have. These senior finance executives may know the data is there, sitting in an ERP or CRM system, but can’t exploit it because they lack the right technology tools.

The final 22% have achieved data mastery, where they actively manage their data as a corporate asset, and have the tools and resources to exploit it in order to give their company a competitive edge.

This means 78% overall are hampered by data constraints and are failing to use data effectively to get the best out of their business partnering. While the good intentions are there, it is a weak partnership because there is little of substance to work with.

FSN3

The diagram above is the Business Partnering Maturity Model as it relates to data. It illustrates that there is a huge gap in performance between how effective data masters and data laggards are at business partnering.

The percentage of business partners falling into each category of data management (‘data overloaded’, ‘data constrained’, etc.) has been plotted together with how well these finance functions feel that business partnering is regarded by the operational units, as well as their perceived influence on change.

The analysis reveals that “Data masters” are in a league of their own. They are significantly more likely to be well regarded by the operations and are more likely to act as change agents in their business partnering role.

We know from FSN’s 2018 Innovation in Financial Reporting survey that data masters, who similarly made up around one fifth of senior finance executives surveyed, are also more innovative. That research showed they were more likely to have worked on innovative projects in the last three years, and were less likely to be troubled by obstacles to reporting and innovation.

Data masters also have a more sophisticated approach to business partnering. They’re more likely to be change agents, are more often seen as a trusted advisor and they’re more involved in decision making. Interestingly, two-thirds of data masters have a formal or agreed way to measure the success of business partnering, compared to less than 41% of data constrained CFOs, and 36% of technology constrained and data overloaded finance executives. They’re also more inclined to perform 360 degree appraisals with their internal customers to assess the success of their business partnering. This means they can monitor and measure their success, which allows them to adapt and improve their processes.

The remainder, i.e. those that have not mastered their data, are clustered around a similar position on the Business Partnering Maturity Model, i.e., there is little to separate them around how well they are regarded by operational business units or whether they are in a position to influence change.

The key message from this survey is that data masters are the stars of the modern finance function, and it is a sentiment echoed through many of FSN’s surveys over the last few years.

The Innovation in Financial Reporting survey also found that data masters outperformed their less able competitors in three key performance measures that are indicative of financial health and efficiency: 

  • they close their books faster,
  • reforecast quicker and generate more accurate forecasts,
  • and crucially they have the time to add value to the organization.

People, processes and technology

So, if data is the key to driving business partnerships, where do the people, processes and technology come in? Business partnering doesn’t necessarily come naturally to everyone. Where there is no experience of it in previous positions, or if the culture is normally quite insular, sometimes CFOs and senior finance executives need focused guidance. But according to the survey, 77% of organizations expect employees to pick up business partnering on the job. And only just over half offer specialized training courses to support them.

Each company and department or function will be different, but businesses need to support their partnerships, either with formal structures or at the very least with guidance from experienced executives to maximize the outcome. Meanwhile processes can be a hindrance to business partnering in organizations where there is a lack of standardization and automation. The survey found that 71% of respondents agreed or strongly agreed that a lack of automation hinders the process of business partnering.

This was followed closely by a lack of standardization and a lack of unification, or integration, in corporate systems. Surprisingly, the constraint of too many or too complex spreadsheets hindered only 61% of CFOs, the lowest of all obstacles but still a substantial stumbling block to effective partnerships. These hindrances reflect the need for better technology to manage the data that will unlock real inter-departmental insight, and 83% of CFOs said that better software to support data analytics is their most pressing need when supporting effective business partnerships.

Meanwhile 81% are looking to future technology to assist in data visualization to make improvements to their business partnering.

FSN4

This echoes the findings of FSN’s The Future of Planning, Budgeting and Forecasting survey which identified users of cutting edge visualization tools as the most effective forecasters. Being able to visually demonstrate financial data and ideas in an engaging and accessible way is particularly important in business partnering, when the counterparty doesn’t work in finance and may have only rudimentary knowledge of complex financial concepts.

Data is a clear differentiator. Business partners who can access, analyze and explain organizational data are more likely to

  • generate real insight,
  • engage their business partners
  • and become a positive agent of change and growth.

Click here to access Workiva’s and FSN’s Survey²

Mastering Risk with “Data-Driven GRC”

Overview

The world is changing. The emerging risk landscape in almost every industry vertical has changed. Effective methodologies for managing risk have changed (whatever your perspective:

  • internal audit,
  • external audit/consulting,
  • compliance,
  • enterprise risk management,

or otherwise). Finally, technology itself has changed, and technology consumers expect to realize more value, from technology that is more approachable, at lower cost.

How are these factors driving change in organizations?

Emerging Risk Landscapes

Risk has the attention of top executives. Risk shifts quickly in an economy where “speed of change” is the true currency of business, and it emerges in entirely new forms in a world where globalization and automation are forcing shifts in the core values and initiatives of global enterprises.

Evolving Governance, Risk, and Compliance Methodologies

Across risk- and control-oriented functions, spanning a variety of audit functions, fraud, compliance, quality management, enterprise risk management, financial control, and many more, global organizations are acknowledging a need to provide more risk coverage at lower cost (measured in both time and currency), which is driving re-inventions of methodology and automation.

Empowerment Through Technology

Gartner, the leading analyst firm in the enterprise IT space, is very clear that the convergence of four forces—Cloud, Mobile, Data, and Social—is driving the empowerment of individuals as they interact with each other and their information through well-designed technology.

In most organizations, there is no coordinated effort to leverage organizational changes emerging from these three factors in order to develop an integrated approach to mastering risk management. The emerging opportunity is to leverage the change that is occurring, to develop new programs; not just for technology, of course, but also for the critical people, methodology, and process issues. The goal is to provide senior management with a comprehensive and dynamic view of the effectiveness of how an organization is managing risk and embracing change, set in the context of overall strategic and operational objectives.

Where are organizations heading?

“Data-Driven GRC” represents a consolidation of methodologies, both functional and technological, that dramatically enhances the opportunity to address emerging risk landscapes and, in turn, to maximize the reliability of organizational performance.

This paper examines the key opportunities to leverage change—both from a risk and an organizational performance management perspective—to build integrated, data-driven GRC processes that optimize the value of audit and risk management activities, as well as the investments in supporting tools and techniques.

Functional Stakeholders of GRC Processes and Technology

The Institute of Internal Auditors’ (IIA) “Three Lines of Defense in Effective Risk Management and Control” model specifically addresses the “who and what” of risk management and control. It distinguishes and describes three role- and responsibility-driven functions:

  • Those that own and manage risks (management – the “first line”)
  • Those that oversee risks (risk, compliance, financial controls, IT – the “second line”)
  • Those functions that provide independent assurance over risks (internal audit – the “third line”)

The overarching context of these three lines acknowledges the broader role of organizational governance and governing bodies.

IIAA

Technology Solutions

Data-Driven GRC is not achievable without a technology platform that supports the steps illustrated above, and integrates directly with the organization’s broader technology environment to acquire the data needed to objectively assess and drive GRC activities.

From a technology perspective, there are four main components required to enable the major steps in Data-Driven GRC methodology:

1. Integrated Risk Assessment

Integrated risk assessment technology maintains the inventory of strategic risks and the assessment of how well they are managed. As the interface of the organization’s most senior professionals into GRC processes, it must be a tool relevant to and usable by executive management. This technology sets the priorities for risk mitigation efforts, thereby driving the development of project plans crafted by each of the functions in the different lines of defense.

2. Project & Controls Management

A project and controls management system (often referred to more narrowly as audit management systems or eGRC systems) enables the establishment of project plans in each risk and control function that map against the risk mitigation efforts identified as required. Projects can then be broken down into actionable sets of tactical level risks, controls that mitigate those risks, and tests that assess those controls.

This becomes the backbone of the organization’s internal control environment and related documentation and evaluation, all setting context for what data is actually required to be tested or monitored in order to meet the organization’s strategic objectives.
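The project → risks → controls → tests breakdown described above can be sketched as a simple hierarchy. This is a minimal, hypothetical model for illustration; the class and field names are assumptions, not drawn from any particular eGRC or audit management product.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ControlTest:
    name: str
    passed: Optional[bool] = None  # None means the test has not yet been run

@dataclass
class Control:
    name: str
    tests: list = field(default_factory=list)

@dataclass
class Risk:
    name: str
    controls: list = field(default_factory=list)

@dataclass
class Project:
    name: str
    risks: list = field(default_factory=list)

    def untested_controls(self):
        """Controls with at least one test that has not yet been executed."""
        return [c for r in self.risks for c in r.controls
                if any(t.passed is None for t in c.tests)]

# Example: one project, one risk, two controls, one still awaiting testing.
project = Project("Fixed assets review", risks=[
    Risk("Asset misstatement", controls=[
        Control("Depreciation recalculation",
                tests=[ControlTest("Recalc sample", passed=True)]),
        Control("Asset existence check",
                tests=[ControlTest("Physical inspection")]),
    ]),
])
print([c.name for c in project.untested_controls()])
```

Queries like `untested_controls` are what turn the documented hierarchy into a working backbone: they identify exactly which data still needs to be tested or monitored.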

3. Risk & Control Analytics

If you think of Integrated Risk Assessment as the brain of the Data-Driven GRC program and the Project & Controls Management component as the backbone, then Risk & Control Analytics are the heart and lungs.

An analytic toolset is critical to reaching out into the organizational environment and acquiring all of the inputs (data) that are required to be aggregated, filtered, and processed in order to route back to the brain for objective decision making. It is important that this toolset be specifically geared toward risk and control analytics so that the filtering and processing functionality is optimized for identifying anomalies representing individual occurrences of risk, while being able to cope with huge populations of data and illustrate trends over time.

4. Knowledge Content

Supporting all of the technology components, knowledge content comes in many forms and provides the specialized knowledge of risks, controls, tests, and data required to perform and automate the methodology across a wide-range of organizational risk areas.

Knowledge content should be acquired in support of individual risk and control objectives and may include items such as:

  • Risk and control templates for addressing specific business processes, problems, or high-level risk areas
  • Integrated compliance frameworks that balance multiple compliance requirements into a single set of implemented and tested controls
  • Data extractors that access specific key corporate systems and extract the data sets required for evaluation (e.g., an SAP-supported organization may need an extractor that pulls a complete set of fixed asset data from their specific version of SAP, which can then be used to run all required tests of controls related to fixed assets)
  • Data analysis rule sets (or analytic scripts) that take a specific data set and evaluate what transactions in the data set violate the rules, indicating control failures occurred
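As a rough illustration of the last item, a data-analysis rule set can be modeled as a collection of predicates applied to every transaction, with matches flagged as potential control failures. This is a hypothetical sketch; the rule names and transaction fields (amount, approver, submitter, approval_limit) are illustrative assumptions, not taken from any specific analytics product.

```python
# Each rule is a predicate over a transaction; a match indicates a
# potential control failure to be routed back for investigation.
RULES = {
    "self_approval": lambda t: t["approver"] == t["submitter"],
    "over_limit": lambda t: t["amount"] > t["approval_limit"],
}

def run_rules(transactions, rules=RULES):
    """Return {rule_name: [transactions that violate the rule]}."""
    violations = {name: [] for name in rules}
    for t in transactions:
        for name, rule in rules.items():
            if rule(t):
                violations[name].append(t)
    return violations

txns = [
    {"id": 1, "amount": 500, "approval_limit": 1000,
     "approver": "ann", "submitter": "bob"},
    {"id": 2, "amount": 2500, "approval_limit": 1000,
     "approver": "bob", "submitter": "bob"},
]
print(run_rules(txns))
```

In practice such rule sets run over full transaction populations rather than samples, which is what lets the analytics component surface individual anomalies as well as trends over time.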

Mapping these key technology pieces that make up an integrated risk and control technology platform against the completely integrated Data-Driven GRC methodology looks as follows:

DDGRC

When evaluating technology platforms, it is imperative that each piece of this puzzle integrates directly with the others; otherwise, manual aggregation of results will be required, which is not only laborious but also inconsistent and disorganized, and (by definition) violates the Data-Driven GRC methodology.

HiPerfGRC

 

Click here to access ACL’s study

A Transformation in Progress – Perspectives and approaches to IFRS 17

The International Financial Reporting Standard 17 (IFRS 17) was issued in May 2017 by the International Accounting Standards Board (IASB) and has an effective date of 1st January 2021. The standard represents the most significant change in financial reporting for decades, placing greater demand on legacy accounting and actuarial systems. The regulation is intended to increase transparency and provide greater comparability of profitability across the insurance sector.

IFRS 17 will fundamentally change the face of profit and loss reporting. It will introduce a new set of Key Performance Indicators (KPIs), and change the way that base dividend or gross payments are calculated. To give an example, gross premiums will no longer be recorded under profit and loss. This is just one of the wide-ranging shifts that insurers must take on board in the way they structure their business to achieve the best possible commercial outcomes.

In early 2018 SAS asked 100 executives working in the insurance industry to share their opinions about the standard and strategies for compliance. The research shed light on the sector’s sentiment towards the regulation, challenges and opportunities that IFRS 17 presents, along with the steps organisations are taking to achieve compliance. The aims of the study were to better understand the views of the industry and how insurers are preparing to implement the standard. The objective was to share an unbiased view of the peer group’s analysis of, and approach to, tackling the challenges during the adjustment period. The information garnered is intended to help inform insurers’ decision-making during the early stages of their own projects, helping them arrive at the best-placed strategy for their business.

This report reveals the findings of the survey and provides guidance on how organisations might best achieve compliance. It provides an objective, data-driven view of IFRS 17 along with valuable market context for insurance professionals who are developing their own strategies for tackling the new standard.

SAS’ research indicates that UK insurers do not underestimate the cost of IFRS 17 or the level of change it will likely introduce. Overall, 97 per cent of survey respondents said that they expected the standard to increase the cost and complexity of operating in insurance.

Companies will need to

  • introduce a new system of KPIs
  • and make changes in management information reports

to monitor performance under the revised profitability metrics. Forward looking strategic planning will also need to incorporate potential volatility and any ramifications within the insurance industry. To achieve this, firms will need to ensure the main parties involved co-operate and work together in a more integrated way.

The cost of these measures will, of course, differ considerably between organisations of different sizes, specialisms and complexities. However, the cost of compliance also greatly depends on

  • the approach taken by decision-makers,
  • the partners they choose
  • and the solutions they select.

Perhaps more instructive is that 90 per cent believe compliance costs will exceed those demanded by the Solvency II Directive, which aims to ensure insurers retain strong financial buffers so they can meet claims from policyholders.

The European Commission estimated that it cost EU insurers between £3 and £4 billion to implement Solvency II, which was designed to standardise what had been a piecemeal approach to insurance regulations across the EU. Almost half (48 per cent) predict that IFRS 17 will cost substantially more.

Respondents are preparing for major alterations to their current accounting and actuarial systems, from minor upgrades all the way to wholesale replacements. Data management systems will be the prime target for review, with 84 per cent of respondents planning to either make additional investment (25 per cent), upgrade (34 per cent), or replace them (25 per cent). Finance, accounting and actuarial systems will also see significant innovation, as 83 per cent and 81 per cent respectively prepare for significant investment.

The use of analytics appears to be the most divisive area for insurers. While 27 per cent of participants are confident they will need to make no changes to their analytics systems or processes, 28 per cent plan to replace them entirely. A majority of 71 per cent still expect to make at least some reform.


Click here to access SAS’ Whitepaper

 

The IFRS 9 Impairment Model and its Interaction with the Basel Framework

In the wake of the 2008 financial crisis, the International Accounting Standards Board (IASB), in cooperation with the Financial Accounting Standards Board (FASB), launched a project to address the weaknesses of both International Accounting Standard (IAS) 39 and US generally accepted accounting principles (GAAP), which had been the international standards governing the accounting for financial assets and liabilities in financial statements since 2001.

By July 2014, the IASB had finalized and published its new International Financial Reporting Standard (IFRS) 9 methodology, to be implemented by January 1, 2018 (with the standard available for early adoption). IFRS 9 covers financial organizations across Europe, the Middle East, Asia, Africa, Oceania, and the Americas (excluding the US). For financial assets that fall within the scope of the IFRS 9 impairment approach, the impairment accounting expresses a financial asset’s expected credit loss as the projected present value of the estimated cash shortfalls over the expected life of the asset. Expected losses may be measured on either a 12-month or lifetime basis, depending on the level of credit risk associated with the asset, and should be reassessed at each reporting date. The projected value is then recognized in the profit and loss (P&L) statement.
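
The expected credit loss mechanics just described can be sketched in a few lines. This is a minimal illustration only: the marginal PDs, loss given default, exposure profile and discount rate below are invented figures, and IFRS 9 prescribes the measurement principle rather than any particular model.

```python
# Illustrative ECL sketch: the present value of expected cash shortfalls,
# ECL = sum_t PD_t * LGD * EAD_t / (1 + r)^(t + 1).
# All inputs are hypothetical; none are prescribed by IFRS 9.

def expected_credit_loss(pds, lgd, eads, discount_rate, horizon=None):
    """Present value of expected cash shortfalls over `horizon` periods.

    pds     -- marginal probability of default per annual period
    lgd     -- loss given default (fraction of exposure lost on default)
    eads    -- exposure at default per period
    horizon -- periods to include (None = lifetime; 1 approximates
               the 12-month measure for annual periods)
    """
    periods = len(pds) if horizon is None else horizon
    return sum(
        pds[t] * lgd * eads[t] / (1 + discount_rate) ** (t + 1)
        for t in range(periods)
    )

# Hypothetical five-year amortizing loan.
pds = [0.02, 0.025, 0.03, 0.03, 0.035]                  # marginal PDs per year
eads = [1_000_000, 800_000, 600_000, 400_000, 200_000]  # amortizing exposure
lgd = 0.45
rate = 0.03

ecl_12m = expected_credit_loss(pds, lgd, eads, rate, horizon=1)
ecl_lifetime = expected_credit_loss(pds, lgd, eads, rate)
```

The 12-month measure applies while credit risk has not increased significantly since initial recognition; once it has, the asset moves to the lifetime measure, which is why reassessment at each reporting date matters.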

Most banks subject to IFRS 9 are also subject to Basel III Accord capital requirements and use either standardized or internal ratings-based approaches to calculate credit risk-weighted assets. The new IFRS 9 provisions will affect the P&L, which in turn must be reflected in the calculation of impairment provisions for regulatory capital. The infrastructure to calculate and report on the expected-loss drivers of capital adequacy is already in place: the data, models, and processes used today in the Basel framework can in some instances be reused for IFRS 9 provision modeling, albeit with significant adjustments. Not surprisingly, a Moody’s Analytics survey of 28 banks found that more than 40% of respondents planned to integrate IFRS 9 requirements into their Basel infrastructure.

Arguably the biggest change brought by IFRS 9 is the incorporation of credit risk data into an accounting, and therefore financial reporting, process. Essentially, a new kind of interaction between the finance and risk functions is needed at the organization level, and these functions will in turn shape data management processes. The implementation of the IFRS 9 impairment model challenges the way risk and finance data analytics are defined, used, and governed throughout an institution. IFRS 9 is not the only driver of this change.

Basel Committee recommendations, European Banking Authority (EBA) guidelines and consultation papers, and specific supervisory exercises, such as stress testing and Internal Capital Adequacy Assessment Process (ICAAP), are forcing firms to consider a more data-driven and forward-looking approach in risk management and financial reporting.

Accounting and Risk Management: An Organization and Cultural Perspective

The implementation of IFRS 9 processes that touch on both finance and risk functions creates the need to take into account differences in culture, as well as often different understandings of the concept of loss in the two functions.

  • The finance function is focused on product (i.e., internal reporting based on internal data) and is driven by accounting standards.
  • The risk function, however, is focused on the counterparty (i.e., probability of default) and is driven by a different set of regulations and guidelines.

This difference in focus leads the two functions to adopt these differing approaches when dealing with impairment:

  • The risk function uses a stochastic approach to model losses, and a database to store data and run the calculations.
  • Finance uses arithmetical operations to report the expected/incurred losses on the P&L, and uses decentralized data to populate reporting templates.

In other words, finance is driven by economics, and risk by statistical analysis. The concept of loss therefore differs between the two groups: a finance team views loss as part of a process and analyzes it in isolation from other variables, while the risk team sees loss as absolute and objectively observable, taking an aggregated view.

IFRS 9 requires a cross-functional approach, highlighting the need to reconcile risk and finance methodologies.

The data from finance in combination with the credit risk models from risk should drive the process.

  • The risk function runs the impairment calculation, while providing objective, independent, challenger views on the business assumptions (risk has no P&L or bonus-driven incentive).
  • Finance supports the process by providing data and qualitative overlay.

Credit Risk Modeling and IFRS 9 Impairment Model

Considering concurrent requirements across a range of regulatory guidelines, such as stress testing, and reporting requirements, such as common reporting (COREP) and financial reporting (FINREP), the challenge around the IFRS 9 impairment model is two-fold:

  • Models: How to harness the current Basel-prescribed credit risk models to make them compliant with the IFRS 9 impairment model.
  • Data: How (and whether) the data captured for Basel capital calculation can be used to model expected credit losses under IFRS 9.
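
On the models side, one frequently discussed route is to extend a Basel-style 12-month PD into a lifetime term structure. The sketch below is illustrative only: the constant-hazard assumption and the single point-in-time adjustment scalar are simplifications chosen for clarity, not a prescribed IFRS 9 method.

```python
# Illustrative sketch: deriving marginal lifetime PDs from a Basel-style
# through-the-cycle (TTC) 12-month PD. The constant hazard and the given
# point-in-time (PIT) adjustment factor are simplifying assumptions.

def lifetime_marginal_pds(pd_12m_ttc, years, pit_adjustment=1.0):
    """Return marginal (per-period) default probabilities over `years`.

    pd_12m_ttc     -- through-the-cycle 12-month PD from the Basel model
    pit_adjustment -- scalar converting the TTC view to a point-in-time
                      view (e.g. > 1 in a downturn); assumed given here
    """
    pd_pit = min(pd_12m_ttc * pit_adjustment, 1.0)
    survival = 1.0
    marginals = []
    for _ in range(years):
        # probability of defaulting in this period, given survival so far
        marginals.append(survival * pd_pit)
        survival *= 1.0 - pd_pit
    return marginals

pds = lifetime_marginal_pds(0.02, years=5, pit_adjustment=1.2)
cumulative = sum(pds)  # equals 1 - (1 - pd_pit)**years by construction
```

In practice the adjustment from through-the-cycle to point-in-time PDs would itself be modeled, typically using forward-looking macroeconomic scenarios, rather than supplied as a single scalar.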


Click here to access Moody’s detailed report

Mastering Risk with “Data-Driven GRC”

Where are organizations heading?

“Data-Driven GRC” represents a consolidation of methodologies, both functional and technological, that dramatically enhances the opportunity to address emerging risk landscapes and, in turn, to maximize the reliability of organizational performance. This paper examines the key opportunities to leverage change – both from a risk and an organizational performance management perspective – to build integrated, data-driven GRC processes that optimize the value of audit and risk management activities, as well as the investments in supporting tools and techniques.

Functional Stakeholders of GRC Processes and Technology

The Institute of Internal Auditors’ (IIA) “Three Lines of Defense in Effective Risk Management and Control” model specifically addresses the “who and what” of risk management and control. It distinguishes and describes three role- and responsibility-driven functions:

  • Those that own and manage risks (management – the “first line”)
  • Those that oversee risks (risk, compliance, financial controls, IT – the “second line”)
  • Those that provide independent assurance over risks (internal audit – the “third line”)

The overarching context of these three lines acknowledges the broader role of organizational governance and governing bodies.

Technology Deficiencies in the Three Lines of Defense

Since the emergence of Sarbanes-Oxley, the use of technology in risk- and control-related processes has started to take meaningful shape in many organizations. However, across the risk- and control-oriented functions of most organizations, technology is still typically used on a departmental or point-solution basis.

Third Line (internal audit) use of risk & control technology

For the past decade, surveys of internal auditors have consistently identified more effective use of technology as among the most pressing issues facing the profession. Responses also pointed to the need for increased use of technology in audit analysis, fraud detection and continuous auditing. Other surveys highlight a shortage of technology and data analysis skills within audit departments.

Much of the driving force for improving the use of technology is based on the desire to make the audit process itself more efficient and more effective, as well as to deliver more tangible value to the rest of the organization.

During the past decade, the role of the internal audit function itself has changed considerably. Internal audit’s traditional focus on cyclical audits and testing internal controls is evolving into one in which internal audit is expected to assess and report on the effectiveness of management’s processes to address risk overall. This often includes providing guidance and consultation to the business on best practices for managing risk and compliance within business process areas and maintaining effective control systems. The use of technology is an increasingly critical component of these best practices and in some cases internal audit is able to champion the implementation of high-impact, high-value technology within the business’s risk management and compliance processes, based on their own experience in using technology for assurance purposes.

There is considerable variation in the extent to which internal audit departments leverage technology. However, for audit to be truly valuable and relevant within the context of organizational strategy, significant improvement is required across the board. Internal audit as a profession is simply not moving forward at the pace of technology.

Specific statistics from recent research reveal:

  • Only around 40% of internal audit departments use audit and documentation management systems from specialized vendors. The remainder rely on disorganized tools and processes, typically based on Microsoft Office® and shared folders.
  • Audit programs for specific business process areas and industries are usually developed through a combination of previously used programs and those shared on various audit-related websites. This approach does not address organization-specific risk.
  • Next generation testing techniques, especially data analytics, are overwhelmingly underutilized.

Second Line (risk, compliance, financial controls, IT) use of risk & control technology

Outside of audit, in other areas of risk and compliance, some organizations have acquired specialized departmental software, but the majority use only basic Office tools to maintain inventories of risks, document controls and perform risk assessments. In larger enterprises, it is not unusual to have a variety of different technologies and approaches applied in different operational entities or in different functional areas. This approach is usually more costly and less effective than one based on a common platform. Effective testing methods using technology are usually unavailable or left unconsidered.

In fact, second-line-of-defense functions often rely heavily on inquiry-based methods such as surveys, which have proven ineffective at identifying the actual manifestations of risk in the organization. Where analytical software is used in the business for investigations or transaction monitoring, it often amounts to standard query tools or some form of generic business intelligence (BI) technology. Although good for providing summary-level information or high-level trends, BI tools struggle to show the root cause of problems. And while they may have certain capabilities to prevent fraud and errors, or to flag exceptions, they are not sufficient to effectively trap the typical problem transactions that occur.
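
As an illustration of the kind of transaction-level exception test that summary-level BI reporting tends to aggregate away, the sketch below flags potential duplicate payments. All records and the matching rule are invented for illustration; real tests would use richer matching logic.

```python
# Illustrative transaction-level exception test: flag potential duplicate
# payments by grouping on (vendor, invoice, amount). Hypothetical data.

from collections import defaultdict

payments = [
    {"id": 1, "vendor": "ACME", "invoice": "INV-100", "amount": 5000.00},
    {"id": 2, "vendor": "ACME", "invoice": "INV-100", "amount": 5000.00},
    {"id": 3, "vendor": "Globex", "invoice": "INV-200", "amount": 1200.00},
]

def duplicate_payments(records):
    """Group payments on (vendor, invoice, amount); flag groups of two or more."""
    groups = defaultdict(list)
    for rec in records:
        key = (rec["vendor"], rec["invoice"], rec["amount"])
        groups[key].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

flagged = duplicate_payments(payments)  # -> [[1, 2]]
```

The point is not this specific rule but the level of granularity: because the test operates on individual transactions, the flagged records themselves, not just an aggregate count, are available for follow-up.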

First Line (management) use of risk & control technology

While first-line management sometimes has access to better technology for specific pain points (e.g., continuous transaction monitoring within finance departments), there is a common tendency for management to place far too much reliance on core business systems for effective control. While large ERP and other system vendors appear to offer extensive capabilities for preventing control deficiencies, these are extremely complex systems, and internal controls are usually an afterthought for those implementing them rather than a core focus. For example, certain control settings are often turned off so the ERP system runs more efficiently.

An integrated approach to managing risks and monitoring controls, conducted in collaboration with the second and third lines of defense on a common, independent methodology and technology platform, typically proves the most effective way to accomplish management’s key risk mitigation strategies.


Click here to access ACL’s White Paper