Can Data and Technology Support the Insurance Industry to Regain Lost Relevance?

Since the start of the Third Industrial Revolution in the 1980s, the world has changed in many different ways:

  • rapid introduction and adoption of technological innovation (global internet; social networks; mobile technologies; evolving payment solutions; data availability);
  • new economic realities (volatile and shorter economic cycles; interconnected financial climate; under-utilisation of assets);
  • structural shifts in society’s values (desire for community; generational altruism; active citizenship);
  • and demographic readjustment (increasing population; urbanization; longer life expectancy; millennials in the work force).

While these changes have been happening, the Insurance industry has seemingly preferred to operate in a closed environment, oblivious to much of the impact these changes could bring:

  • Resistance to change,
  • Failure to meet changing customer demands,
  • Decrease in the importance of attritional risks

have led the Insurance industry to lose relevance.

However

  • the availability of data,
  • the introduction of new capital providers,
  • the impact of new business models emerging from the sharing economy
  • and the challenge of InsurTechs

are shaking the industry's complacency. Collectively, these factors are creating a perfect storm that is forcing incumbents to re-evaluate their preference for maintaining the status quo. There is an ever-increasing expectation that the industry will become more innovative and deliver a vastly improved customer experience.

As data and emerging technology accelerate the need for change, they are also opening doors. The industry is at a crossroads where it can either regain relevance by adapting to the new world order or continue to decline. Should it choose the latter, it could expose the US$ 5 trillion market to approaches from large technology firms and manufacturers who have access to customers, transformational capabilities and more than enough capital to fill the void left by the traditional players.

Insurance industry is slow to evolve

The Insurance industry has historically lacked an appetite to evolve and has shown reluctance in adopting industry-wide changes. A number of key elements have created high barriers to entry. New entrants have found it difficult to challenge the status quo and have lacked the appetite to win market share from incumbents with significantly larger balance sheets. Such high barriers have kept the impact of disruption to a minimum, allowing the industry to stay complacent even as most other industries have undergone significant structural shifts. In many ways ‘Darwin’ has not been at work.

  • A complex value chain

The Insurance industry started with a simple value chain involving four roles – the insured, a broker who advises the insured, an underwriter who prices the risk and an investor who provides the capital to secure the risk. Over centuries, the chain has expanded to include multiple other roles that are essential to spreading large risks across a broad investor community, as shown below.

[Figure: the expanded insurance value chain (Aon)]

These new parties have benefitted the chain by providing expertise, access to customers, secure handling of transactions, arbitration in case of disputes and spreading of risk coverage across multiple partners. However, this has also resulted in added complexities and inefficiencies as each risk now undergoes multiple handovers.

While a longer value chain offers new entrants opportunities to attack at multiple points, the added complexity and the importance of scale reduce the opportunity to cause real disruption.

  • Stringent regulations

Insurance is one of the most heavily regulated industries in the world. Since the global financial crisis of the last decade, when governments across the globe bailed out several financial services providers, including insurers, the focus on capital adequacy and customer protection has increased manifold.

While a proactive regulatory regime ensures healthy operating standards, with measures in place to avoid another financial meltdown, multiple surveys have highlighted the implications of the increased regulatory burden: higher costs and limited product innovation.

  • Scale and volatility of losses

The true value of any insurance product is realised when the customer receives payments for incurred losses. This means that insurers must maintain enough reserves at any time to meet these claims.

Over the years, volatility in high-severity losses has made it difficult for insurers to accurately predict the required capital levels.

In addition, regulators now require insurers to be adequately capitalised, with enough buffer to withstand extreme losses of very low probability (for example, a 1-in-100-year or 1-in-200-year event). This puts additional pressure on insurers to maintain bulky balance sheets.

At the same time, a large capital base gives established insurers the advantage of scale and limits growth opportunities for smaller players and new entrants.

  • Need for proprietary and historical data

Accurate pricing of risk is key to survival in the industry. Insurers (specifically underwriters, supported by actuaries) rely heavily on experience and statistical analysis to determine the premiums they are willing to accept to cover a risk.

Access to accurate historical data is of chief importance and has been a key differentiating factor amongst insurers. Since the dawn of the Third Industrial Revolution in the 1980s, insurers have been in a race to acquire, store and develop proprietary databases that allow them to price risks better than their competitors.

The collection of these extensive databases has given incumbent insurers an immense advantage over new entrants, which typically lack similar datasets. The incumbents have continued to add to these databases through an unchallenged continuation of underwriting, which has further widened the gap for new entrants.

Struggling to meet customer needs

Despite its long existence, the Insurance industry has failed to keep up with the demand for risk coverage. For example, the economic value of losses from natural disasters has consistently exceeded the insured value of those losses by an average multiple of 3x-4x.

The gap is not limited to natural disasters. As highlighted by Aon’s Global Risk Management Survey 2019, multiple top risks cited by customers are either uninsurable or only partially insurable, leading to a significant supply gap.

[Figure: top risks and their insurability (Aon Global Risk Management Survey 2019)]

Six of the top 10 risks, including Damage to reputation/brand and Cyber, require better data and analytical insights to achieve fully effective risk transfer. However, current capabilities are primarily applied to drive better pricing and claims certainty across existing risk pools, and have not yet reached their full potential for emerging risks.

This inability to meet customer needs has been driven both by an expensive model (for most risks, only 60% of premiums paid are actually returned to the insured) and by a lack of innovation. Historically, the need for long data trends meant insurance products always trailed emerging risks.

Status Quo is being challenged

While the industry has been losing relevance, it now faces new challenges that are creating pressure for change. These challenges put pressure on incumbents, but they also give insurance the potential to regain its key role in supporting innovation, creating opportunities for lower costs and new products.

The insurance customer landscape has changed considerably: traditional property and casualty losses are no longer the only main risks that corporations are focused on mitigating. The importance of intellectual property and brand/reputation in value creation is leading to a realignment in the customer risk profile.

Value in the corporate world is no longer driven by physical/tangible assets. As technology has advanced, it has led to the growth of intangible assets in the form of intellectual property. The graph below shows that 84% of market capitalization in 2018 was driven by intangible assets. While the five largest corporations in 1975 were manufacturing companies (IBM; Exxon Mobil; P&G; GE; 3M), by 2018 the top five positions were occupied by technology companies (Apple; Alphabet; Microsoft; Amazon; Facebook). Yet organizations are only able to secure coverage for a relatively small portion of their intangible assets (15%), compared with insurance coverage for legacy tangible assets (59%).

[Figure: intangible vs. tangible assets as drivers of market capitalization, 1975-2018 (Aon)]

This shift represents both a challenge and an opportunity for the Insurance industry. The ability to provide coverage for intangible assets would enable insurance to regain relevance and support innovation and investment. Until it can, its importance is likely to remain muted.

InsurTech

The Insurance industry has traditionally relied on manual, paper-driven processes with huge inefficiencies. While customers’ needs are evolving at an unprecedented pace, incumbents’ large legacy systems and naturally conservative approach make them slow to bring new products and an improved customer experience to market.

InsurTechs are companies that use technology to make the traditional insurance value chain more efficient. They are beginning to reshape the Insurance industry by targeting particular value pools or services in the sector, rather than seeking to provide end-to-end solutions.

InsurTechs have seen more than US$ 11 billion of funding since 2015, and the volume in 2018 is expected to reach US$ 3.8 billion (FT PARTNERS). While InsurTechs were originally viewed as a disruptive force competing with traditional insurers for market share, there is growing collaboration and partnership with the incumbent players. Most of them are launched to help solve legacy insurer problems across the organization, from general inefficiency in operations to enhancing underwriting, distribution, and claims functions, especially in consumer-facing insurance. More recently they are also moving into the commercial segment, focusing on loss prevention and efficiency (CATLIN, T. et al. 2017). Incumbent insurers have managed to leverage InsurTechs to speed up innovation (DELOITTE, 2018: 11). From a funding perspective, most of the US$ 2.6 billion that went into InsurTechs in the first nine months of 2018 came from incumbent insurers (MOODY'S, 2018: 6).

The accelerated use of technology and digital capabilities again represents both a challenge for the industry and an opportunity to innovate and develop more efficient products and services.

Data and technology with potential to transform

Traditionally, the Insurance industry has used proprietary historic data to match the demand from risk owners with the supply from capital providers, relying on relatively simplistic regression analysis as its main analytical approach.
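To make that contrast concrete, here is a minimal sketch of the traditional approach: a Poisson frequency model fitted on a few historical rating factors and multiplied by an assumed average severity to produce a pure premium. The rating factors, simulated data and severity figure are invented for illustration only; they are not drawn from Aon's material.

```python
# A minimal sketch of the "traditional" approach described above: a Poisson
# frequency model fitted on a few historical rating factors. All data below
# is simulated purely for illustration.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
n = 5_000
driver_age = rng.integers(18, 80, n)
vehicle_age = rng.integers(0, 15, n)
urban = rng.integers(0, 2, n)
exposure = rng.uniform(0.25, 1.0, n)            # policy-years

# Simulated claim counts: young drivers and urban policies are riskier.
true_rate = 0.08 * (1 + 2.0 * (driver_age < 25) + 0.6 * urban)
claims = rng.poisson(true_rate * exposure)

X = np.column_stack([driver_age, vehicle_age, urban])
freq_model = PoissonRegressor(alpha=1e-6, max_iter=1000)
freq_model.fit(X, claims / exposure, sample_weight=exposure)

# Expected claim frequency for a hypothetical 22-year-old urban driver,
# multiplied by an assumed average severity to get a pure premium.
expected_freq = freq_model.predict([[22, 3, 1]])[0]
avg_severity = 4_000                            # assumed, for illustration
print(f"pure premium ~ {expected_freq * avg_severity:.0f} per policy-year")
```

A sensor- or telematics-based score would simply add new columns to the feature matrix, which is exactly the shift the rest of this section describes.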

While robust, this approach relies on a long data history and limits insurers’ ability to move into new areas. Increasingly, the transformative power of data and technology is changing this relationship, as shown in the graph below. While underwriting data used to be in the hands of the incumbents only, emerging technologies, new analytical techniques and a huge increase in sensors are enabling the use of new forms of data that are much more freely accessible. In addition, these technologies support the instant delivery of in-depth analytics that can lead to significant efficiency gains and new types of products.

[Figure: how data and technology are changing the relationship between risk owners and capital providers (Aon)]

  • Artificial Intelligence

Artificial Intelligence – spanning Robotic Process Automation (RPA) and Cognitive Intelligence (CI) – refers to any system that can perceive the world around it, analyse and understand the information it receives, take actions based on that understanding and improve its own performance by learning from what happened.

Artificial Intelligence not only offers the opportunity to reduce costs (process automation; shorter cycle times; freeing up thousands of people-hours) but also improves accuracy, which results in better data quality. For insurers this offers significant potential both to enable new ways of interpreting data and understanding risks, and to reduce the cost of many critical processes such as claims assessment.

This dual impact of better understanding and lower costs is highly valuable. Insurers’ spend on cognitive/artificial intelligence technologies is expected to rise 48% globally on an annual basis over five years, reaching US$ 1.4 billion by 2021. (DELOITTE, 2017: 15).

  • Internet of Things

The Internet of Things refers to the digitization of objects around us. It works by embedding advanced hardware (e.g. sensors, cameras and meters) into everyday objects and even people themselves, linking those objects further to online networks. (MOODY`S, 2018: 11).

For example, connected devices in the home such as water leakage detectors, smoke alarms, CO2 detectors and sophisticated home security systems will support the prevention and reduction of losses from water damage, fire and burglary.

The Internet of Things has the potential to significantly change the way that risks are underwritten. Access to data in ‘real time’ will provide greater precision in the pricing of risk and also help insurers respond better to evolving customer needs. Consider the example of home insurance: customers will reconsider buying home insurance as it is currently packaged when their house is already monitored 24/7 for break-ins and sensors are constantly monitoring appliances to prevent fires. Insurers could use the same data to develop customised policies that depend on the usage and scope of the sensors.
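As a purely illustrative sketch of that idea, the snippet below adjusts a base home premium from a monthly summary of connected-device signals. The sensor fields, thresholds and discount levels are invented assumptions, not any insurer's actual pricing rules.

```python
# A toy illustration of sensor-informed pricing: a base home premium is
# adjusted each month from connected-device signals. All thresholds and
# discount levels are invented for illustration only.
from dataclasses import dataclass

@dataclass
class SensorSummary:
    leak_alerts: int             # water-leak detector triggers this month
    smoke_tests_passed: bool     # smoke alarm self-test OK
    security_armed_share: float  # share of nights the alarm was armed

def monthly_premium(base: float, s: SensorSummary) -> float:
    premium = base
    if s.leak_alerts == 0:
        premium *= 0.97          # small discount: no water-damage warnings
    if s.smoke_tests_passed:
        premium *= 0.98          # working smoke detection
    if s.security_armed_share > 0.9:
        premium *= 0.95          # alarm consistently armed against burglary
    return round(premium, 2)

print(monthly_premium(38.0, SensorSummary(0, True, 0.95)))  # -> 34.32
```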

The Internet of Things applies equally to wearable devices with embedded sensors for tracking vital statistics to improve the health, safety and productivity of individuals at work. It is predicted that the connected health market will be worth US$ 61 billion by 2026.

The Internet of Things offers the Insurance industry an opportunity to reinvent itself and to move from simply insuring against risk to helping customers protect their property and health. This integration of insurance with products through live sensor data can revolutionise how insurance is embedded into our everyday lives.

  • Blockchain

All disruptive technologies have a “tipping point” – the exact moment when they move from early adopters to widespread acceptance. Just as it was for Google in the late 1990s and smartphones in the 2000s, could we be approaching the tipping point for the next big disruptive technology – blockchain?

Essentially, blockchain is a shared digital ledger technology that allows a continuously growing number of transactions to be recorded and verified electronically over a network of computers. It holds an immutable record of data, stored locally by each party to remove the barrier of trust. Through smart contracts, blockchain can enable the automation of tasks for more efficient processing. It made its debut in 2009 as the system used to track dealings in the first cryptocurrency, Bitcoin, and, since then, organisations around the world have spotted blockchain’s potential to transform operations.

Most industries are currently experimenting with blockchain to identify and prove successful use cases to embrace the technology in business as usual. IDC, a leading market intelligence firm, expects the spend on blockchain to increase from US$ 1.8 billion in 2018 to US$ 11.7 billion in 2022 at a growth rate of 60%.

With all the aforementioned benefits, blockchain also has the potential to impact the Insurance industry. It can help insurers reduce operational and administrative costs through automated verification of policyholders, auditable registration of claims and third-party data, underwriting of small contracts and automation of claims procedures. Equally, it can help reduce fraud, which would contribute to lowering total costs.
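The toy sketch below illustrates the two ideas at play, a tamper-evident shared record and automated claims handling, using a hash-chained ledger and a parametric-style payout rule. It is a simplified illustration under invented assumptions (the delay threshold and payout amount), not a production blockchain or any insurer's smart-contract platform.

```python
# A toy hash-chained ledger plus a parametric-style payout rule, illustrating
# an immutable shared record and automated claims handling. This is a sketch,
# not a production blockchain or any insurer's system.
import hashlib, json, time

class Ledger:
    def __init__(self):
        self.blocks = [{"index": 0, "prev": "0" * 64, "data": "genesis"}]

    def add(self, data: dict) -> dict:
        # Each block stores the hash of the previous one, so past entries
        # cannot be altered without breaking the chain.
        prev_hash = hashlib.sha256(
            json.dumps(self.blocks[-1], sort_keys=True).encode()
        ).hexdigest()
        block = {"index": len(self.blocks), "prev": prev_hash,
                 "time": time.time(), "data": data}
        self.blocks.append(block)
        return block

def settle_flight_delay(ledger: Ledger, delay_minutes: int) -> float:
    # "Smart-contract" style rule: pay automatically once a verified data
    # feed reports a delay above the agreed (hypothetical) threshold.
    payout = 150.0 if delay_minutes >= 120 else 0.0
    ledger.add({"event": "flight_delay", "delay_min": delay_minutes,
                "payout": payout})
    return payout

ledger = Ledger()
print(settle_flight_delay(ledger, delay_minutes=140))  # 150.0, recorded on the ledger
```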

In an industry where ‘trust’ is critical, the ability to have guaranteed contracts with claims certainty will help the take-up of insurance in new areas. BCG estimates that blockchain could drastically improve the end-to-end processing of a motor insurance policy and any claims arising from it, as shown in the graph below.

[Figure: blockchain’s potential impact on end-to-end motor insurance processing (BCG)]

Conclusion

The relevance of insurance, which has declined over the last few decades, after peaking in the early 1980s, is set to increase again:

  • Big shifts in insurance needs, both in the commercial and consumer segments,
  • New sources of cheap capital,
  • Prevalence of cheap and accessible data and the technology to automate and analyse it

will transform the Insurance industry.

Not only is this important for insurers, it is also important for all of us. Insurance is the grease behind investment and innovation. The long-term decline in the Insurance industry’s ability to reduce risk could be a significant impediment to future growth.

However, we believe the reversal of this trend will mean that insurance can once again grow in importance, protecting our key investments and activities.

Click here to access Aon’s White Paper

 

Data Search and Discovery in Insurance – An Overview of AI Capabilities

Historically, the insurance industry has collected vast amounts of data relevant to their customers, claims, and so on. This can be unstructured data in the form of PDFs, text documents, images, and videos, or structured data that has been organized for big data analytics.

As with other industries, the existence of such a trove of data in the insurance industry led many of the larger firms to adopt big data analytics and techniques to find patterns in the data that might reveal insights that drive business value.

Any such big data applications may require several steps of data management, including collection, cleansing, consolidation, and storage. Insurance firms that have worked with some form of big data analytics in the past might have access to structured data which can be ingested by AI algorithms with little additional effort on the part of data scientists.

The insurance industry might be ripe for AI applications due to the availability of vast amounts of historical data records and the existence of large global companies with the resources to implement complex AI projects. The data being collected by these companies comes from several channels and in different formats, and AI search and discovery projects in the space require several initial steps to organize and manage data.

Radim Rehurek, who earned his PhD in Computer Science from Masaryk University in Brno and founded RARE Technologies, points out:

“A majority of the data that insurance firms collect is likely unstructured to some degree. This poses several challenges to insurance companies in terms of collecting and structuring data, which is key to the successful implementation of AI systems.”

Giacomo Domeniconi, a post-doctoral researcher at IBM Watson TJ Research Center and Adjunct Professor for the course “High-Performance Machine Learning” at New York University, mentions structuring the data as the largest challenge for businesses:

“Businesses need to structure their information and create labeled datasets, which can be used to train the AI system. Yet creating this labeled dataset might be very challenging, and in most cases it would involve manually labeling a part of the data using the expertise of a specialist in the domain.”

Businesses face many challenges in terms of collecting and structuring their data, which is key to the successful implementation of AI systems. An AI application is only as good as the data it consumes.

Natural language processing (NLP) and machine learning models often need to be trained on large volumes of data. Data scientists tweak these models to improve their accuracy.

This is a process that might last several months from start to finish, even in cases where the model is being taught relatively rudimentary tasks, such as identifying semantic trends in an insurance company’s internal documentation.
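As a minimal illustration of that kind of task, the sketch below routes short insurance documents to a category using a TF-IDF representation and a logistic-regression classifier. The snippets, labels and categories are invented; a real project would need far larger labeled datasets and careful evaluation.

```python
# A minimal sketch of the NLP task described above: routing short insurance
# documents to a category. The snippets and labels are invented; a real
# project would train on thousands of labeled examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "water damage in kitchen after pipe burst",
    "rear-end collision on highway, bumper damaged",
    "request to update beneficiary on life policy",
    "hail dented the roof of the insured vehicle",
    "flooded basement, carpets destroyed",
    "change of address for policyholder",
]
labels = ["property", "motor", "admin", "motor", "property", "admin"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(docs, labels)

print(clf.predict(["storm water entered the living room"]))  # likely 'property'
```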

Most AI systems require data to be input in a structured format, so businesses need to collect, clean, and organize their data to meet these requirements.

Although creating NLP and machine learning models to solve real-world business problems is by itself a challenging task, this process cannot be started without a plan for organizing and structuring enough data for these models to operate at reasonable accuracy levels.

Large insurance firms might need to think about how their data at different physical locations across the world might be affected by local data regulations or differences in data storage legacy systems at each location. Even with all the data being made accessible, businesses would find that data might still need to be scrubbed to remove any incorrect, incomplete, improperly formatted, duplicate, or outlying data. Businesses would also find that in some cases regulations might mandate the signing of data sharing agreements between the involved parties or data might need to be moved to locations where it can be analyzed. Since the data is highly voluminous, moving the data accurately can prove to be a challenge by itself.

[Figure: Iron Mountain / Emerj]

Click here to access Iron Mountain – Emerj’s White Paper

 

The State of Connected Planning

We identify four major planning trends revealed in the data.

  • Trend #1: Aggressively growing companies plan more, plan better, and prioritize planning throughout the organization.

  • Trend #2: Successful companies use enterprise-scale planning solutions.

  • Trend #3: The right decisions combine people, processes, and technology.

  • Trend #4: Advanced analytics yield the insights for competitive advantage.

TREND 01: Aggressively growing companies prioritize planning throughout the organization

Why do aggressively growing companies value planning so highly? To sustain an aggressive rate of growth, companies need to do two things:

  • Stay aggressively attuned to changes in the market, so they can accurately anticipate future trends
  • Keep employees across the company aligned on business objectives

This is why aggressively growing companies see planning as critical to realizing business goals.

Putting plans into action

Aggressively growing companies don’t see planning as an abstract idea: they plan more often and more efficiently than other companies, and they are much quicker to incorporate market data into their plans.

This emphasis on

  • efficiency,
  • speed,
  • and agility

produces real results. Compared to other companies, aggressively growing companies put more of their plans into action. Nearly half of aggressively growing companies turn more than three-quarters of their plans into reality.

For companies that experience a significant gap between planning and execution, here are three ways to begin to close it:

  1. Increase the frequency of your planning. By planning more often, you give yourself more flexibility, can incorporate market data more quickly, and have more time to change plans. A less frequent planning cadence, in contrast, leaves your organization working to incorporate plans that may lag months behind the market.
  2. Plan across the enterprise. Execution can go awry when plans made in one area of the business don’t take into account activities in another area. This disconnect can produce unreachable goals throughout the business, which can dramatically reduce the percentage of a plan that gets executed. To avoid this, create a culture of planning across the enterprise, ensuring that plans include relevant data from all business units.
  3. Leverage the best technology. As the statistic above shows, the companies who best execute on their plans are those who leverage cloud-based enterprise technology. This ensures that companies can plan with all relevant data and incorporate all necessary stakeholders. By doing this, companies can set their plans up for execution as they are made.

[Figure: putting plans into action (Anaplan)]

TREND 02: Successful companies use enterprise-scale planning solutions

Although the idea that planning assists all aspects of a business may seem like common sense, the survey data suggests that taking this assumption seriously can truly help companies come out ahead.

Executives across industries and geographies all agreed that planning benefits every single business outcome, including

  • enhancing revenues,
  • managing costs,
  • optimizing resources,
  • aligning priorities across the organization,
  • making strategies actionable,
  • anticipating market opportunities,
  • and responding to market changes.

In fact, 92 percent of businesses believe that better planning technology would provide better business outcomes for their company. Yet planning by itself is not always a panacea.

Planning does not always equal good planning. What prepares a company for the future isn’t the simple act of planning; it’s the less-simple act of planning well. In business planning, band-aids aren’t solutions.

What counts as good planning? As businesses know, planning is a complicated exercise, involving multiple processes, many different people, and data from across the organization. Doing planning right, therefore, requires adopting a wide-angle view. It requires planners to be able to see past their individual functions and understand how changes in one part of the organization affect the organization as a whole.

The survey results suggest that the best way to give planners this enterprise-level perspective is to use the right technology. Companies whose technology can incorporate data from the entire enterprise are more successful. Companies whose planning technology cannot link multiple areas of the organization, or remove multiple obstacles to planning, in contrast, plan less successfully.

Here are three areas of consideration that can help you begin your Connected Planning journey.

  1. Get the right tools. Uncertainty and volatility continue to grow, and spreadsheets and point solutions lack the agility to pivot or accommodate the volumes of data needed to spot risks and opportunities. Consider tools such as cloud-based, collaborative Connected Planning platforms that use in-memory technology and execute real-time modeling with large volumes of data. Not only can teams work together but plans become more easily embraced and achievable.
  2. Operate from a single platform with reliable data. Traditionally, companies have used individual applications to plan for each business function. These solutions are usually disconnected from one another, which makes data unreliable and cross-functional collaboration nearly impossible. A shared platform that brings together plans with access to shared data reduces or altogether eliminates process inefficiencies and common errors that can lead to bad decision-making.
  3. Transform planning into a continuous, connected process. Sales, supply chain, marketing, and finance fulfill different purposes within the business, but they are inextricably linked and rely on each other for success. The ability to connect different business units through shared technology, data, and processes is at the core of a continuous and connected business planning process.

[Figure: enterprise-scale planning and business outcomes (Anaplan)]

TREND 03: The right decisions combine people, processes, and technology

As businesses examine different ways to drive faster, more effective decision-making, planning plays a critical role in meeting this goal. Ninety-nine percent of businesses say that planning is important to managing costs. According to 97 percent of all survey respondents,

  • enhancing revenues,
  • optimizing resource allocation,
  • and converting strategies into actions

are all business objectives for which planning is extremely crucial. Eighty-two percent of executives consider planning to be “critically important” for enhancing revenues.

For planning to be successful across an organization, it needs to extend beyond one or two siloed business units. The survey makes this clear: 96 percent of businesses state that planning is important for aligning priorities across the organization. Yet even though companies recognize planning as a critical business activity, major inefficiencies exist: 97 percent of respondents say that their planning can be improved.

The more planners, the merrier the planning

When describing what they could improve in their planning, a majority of respondents named four components as essential:

  • Having the right processes
  • Involving the right people
  • Having the right data
  • Having the right technology

To support strong and effective change management initiatives, successful businesses can build a Center of Excellence (COE). A COE is an internal knowledge-sharing community that brings domain expertise in creating, maturing, and sustaining high-performing business disciplines. It comprises an in-house team of subject matter experts who train others and share best practices throughout the organization.

By designing a Center of Excellence framework, businesses can get more control over their planning processes with quality, speed, and value, especially as they continue to expand Connected Planning technology into more complex use cases across the company.

Here are six primary benefits that a COE can provide:

  1. Maintaining quality and control of the planning platform as use cases expand.
  2. Establishing consistency to ensure reliability within best practices and business data.
  3. Fostering a knowledge-sharing environment to cultivate and develop internal expertise.
  4. Enabling up- and downstream visibility within a single, shared tool.
  5. Driving efficiency in developing, releasing, and maintaining planning models.
  6. Upholding centralized governance and communicating progress, updates, and value to executive sponsors.

[Figure: benefits of a planning Center of Excellence (Anaplan)]

TREND 04: Advanced analytics yield the insights for competitive advantage

Disruption is no longer disruptive for businesses—it’s an expectation. Widespread globalization, fluid economies, emerging technologies, and fluctuating consumer demands make unexpected events and evolving business models the normal course of business today.

This emphasizes the critical need for a more proactive, agile, and responsive state of planning. As the data shows, companies that have implemented a more nimble approach to planning are more successful.

Planners don’t have to look far to find better insights. Companies that plan monthly or more often are more likely to quickly incorporate new market data into their plans—updating forecasts and plans, assessing the impact of changes, and keeping an altogether closer eye on ongoing business performance and targets.

However, not all companies are able to plan so continuously: Almost half of respondents indicate that it takes them weeks or longer to update plans with market changes. For businesses that operate in rapidly changing and competitive markets, this lag in planning can be a significant disadvantage.

Advancements in technology can alleviate this challenge. Ninety-two percent of businesses state that improved planning technology would provide better business outcomes for their company. The C-Suite, in particular, is even more optimistic about the adoption of improved technology: More than half of executives say that adopting better planning technology would result in “dramatically better” business performance.

Planning goes digital

Rather than planners hunting for data that simply validates a gut-feeling approach to planning, the survey results indicate that data now sits behind the wheel—informing, developing, improving, and measuring plans.

Organizations, as well as a majority of executives, describe digital transformation as a top priority. Over half of all organizations and 61 percent of executives say that digital transformation amplifies the importance of planning. As businesses move into the future, the increasing use of advanced analytics, which spans predictive analytics, machine learning, and artificial intelligence, will determine which businesses come out ahead.

Roadblocks to data-driven planning

Increasing uncertainty and market volatility make it imperative that businesses operate with agile planning that can be adjusted quickly and effectively. However, as planning response times inch closer to real time, nearly a third of organizations continue to cite two main roadblocks to implementing a more data-driven approach:

  • inaccurate planning data and
  • insufficient technology

Inaccurate data plagues businesses in all industries. Sixty-three percent of organizations that use departmental or point solutions, for example, and 59 percent of businesses that use on-premises solutions identify “having the right data” as a key area for improvement in planning. The use of point solutions, in particular, can keep data siloed. When data is stored in disparate technology across the organization, planners end up spending more time consolidating systems and information, which can compromise data integrity.

It is perhaps for these reasons that 46 percent of the organizations using point and on-premises solutions say that better technologies are necessary to accommodate current market conditions. In addition, 43 percent of executives say that a move to cloud-based technology would benefit existing planning.

In both cases, data-driven planning remains difficult, as businesses not employing cloud-based, enterprise technology struggle with poor data accuracy. By moving to cloud-based technology, businesses can automate and streamline tedious processes, which

  • reduces human error,
  • improves productivity,
  • and provides stakeholders with increased visibility into performance.

State-of-planning research reveals that organizations identify multiple business planning obstacles as equally problematic, indicating a need for increased analytics in solutions that can eliminate multiple challenges at once. Nearly half of all respondents expressed a strong desire for a collaborative platform that can be used by all functions and departments.

Highly analytical capabilities in planning solutions further support the evolving needs of today’s businesses. In sales forecasting, machine learning methodologies can quickly analyze past pipeline data to make accurate forecast recommendations. In financial planning, machine learning can help businesses analyze weather, social media, and historical sales data to quickly discern their impact on sales.
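As a purely illustrative sketch of that idea, the snippet below blends last period's sales with an external weather signal in a simple regression to produce the next forecast. All figures are synthetic, and a production model would use richer features and proper validation.

```python
# A minimal sketch of blending historical sales with an external signal
# (here, temperature) to forecast next period's sales. All numbers are
# synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
weeks = 104
temp = 15 + 10 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 2, weeks)
sales = 200 + 5 * temp + rng.normal(0, 10, weeks)     # warmer weeks sell more

# Features: last week's sales and this week's temperature forecast.
X = np.column_stack([sales[:-1], temp[1:]])
y = sales[1:]
model = LinearRegression().fit(X, y)

next_week = model.predict([[sales[-1], 24.0]])[0]     # assume a 24 C forecast
print(f"forecast for next week: {next_week:.0f} units")
```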

Here are some additional benefits that machine learning methodologies in a collaborative planning platform can offer businesses:

  1. Manage change to existing plans and respond to periods of uncertainty with accurate demand forecasting and demand sensing
  2. Develop enlightened operations, real-time forecasting, and smart sourcing and resourcing plans
  3. Operations that maintain higher productivity and more control with lower maintenance costs
  4. Targeted customer experience programs that increase loyalty and improve customer engagement
  5. Products and services that are offered at the right price with effective trade promotions, resulting in higher conversions

[Figure: advanced analytics in planning (Anaplan)]

Click here to access Anaplan’s detailed White Paper

Insurance Fraud Report 2019

Let’s start with some numbers. In this 2019 Insurance Fraud survey, loss ratios were 73% in the US. On average, 10% of the incurred losses were related to fraud, resulting in losses of $34 billion per year.

By actively fighting fraud we can improve these ratios and our customers’ experience. It’s time to take our anti-fraud efforts to a higher level. To effectively fight fraud, a company needs support and commitment throughout the organization, from top management to customer service. Detecting fraudulent claims is important. However, it can’t be the only priority. Insurance carriers must also focus on portfolio quality instead of quantity or volume.

It all comes down to profitable portfolio growth. Why should honest customers have to bear the risks brought in by others? In the end, our entire society suffers from fraud. We’re all paying higher premiums to cover for the dishonest. Things don’t change overnight, but an effective industry-wide fraud approach will result in healthy portfolios for insurers and fair insurance premiums for customers. You can call this honest insurance.

The Insurance Fraud Survey was conducted to gain a better understanding of

  • the current market state,
  • the challenges insurers must overcome
  • and the maturity level of the industry regarding insurance fraud.

This report is a follow-up to the Insurance Fraud & Digital Transformation Survey published in 2016. Fraudsters are constantly innovating, so it is important to continuously monitor developments. Today you are reading the latest update on insurance fraud. For some topics, the results of this survey are compared to those from the 2016 study.

This report explores global fraud trends in P&C insurance. This research addresses

  • challenges,
  • different approaches,
  • engagement,
  • priority,
  • maturity
  • and data sharing.

It provides insights for online presence, mobile apps, visual screening technology, telematics and predictive analytics.

Fraud-Fighting Culture

Fraudsters are getting smarter in their attempts to stay under their insurer’s radar. They are often one step ahead of the fraud investigator. As a result, money flows to the wrong people. Of course, these fraudulent claims payments have a negative effect on loss ratio and insurance premiums. Therefore, regulators in many countries around the globe created anti-fraud plans and fraud awareness campaigns. Several industry associations have also issued guidelines and proposed preventive measures to help insurers and their customers.

[Figure: fraud-fighting culture (FRISS)]

Engagement between Departments

Fraud affects the entire industry, and fighting it pays off. US insurers say that fraud has climbed over 60% over the last three years. Meanwhile, the total savings of proven fraud cases exceeded $116 million. Insurers are seeing an increase in fraudulent cases and believe awareness and cooperation between departments is key to stopping this costly problem.

[Figure: engagement between departments (FRISS)]

Weapons to Fight Fraud

Companies like Google, Spotify and Uber all deliver personalized products or services. Data is the engine of it all. The more you know, the better you can serve your customers. This also holds true for the insurance industry. Knowing your customer is very important, and with lots of data, insurers now know them even better. You’d think in today’s fast digital age, fighting fraud would be an automated task.

That’s not the case. Many companies still rely on their staff instead of automated fraud solutions: 67% of survey respondents state that their company fights fraud based on the gut feeling of their claims adjusters. There is little or no change compared to 2016.

[Figure: how insurers detect fraud today (FRISS)]

Data, Data, Data …

In the fight against fraud, insurance carriers face numerous challenges – many related to data. Compared to the 2016 survey results, there have been minor, yet important developments. Regulations around privacy and security have become stricter and clearer.

The General Data Protection Regulation (GDPR) is only one example of centralized rules being pushed from a governmental level. Laws like this improve clarity on what data can be used, how it may be leveraged, and for what purposes.

Indicating risks or detecting fraud is difficult when the quality of internal data is subpar. However, it is also a growing pain when trying to enhance the customer experience. To improve customer experience, internal data needs to be accurate.

[Figure: data-related challenges in fighting fraud (FRISS)]

Benefits of Using Fraud Detection Software

Fighting fraud can be a time-consuming and error-prone process, especially when done manually. This approach is often based on the knowledge of claims adjusters. But what if that knowledge leaves the company? The influence of bias or prejudice when investigating fraud also comes into play.

With well-organized and automated risk analysis and fraud detection, the chances of fraudsters slipping into the portfolio are diminished significantly. This is the common belief among 42% of insurers. Applications can also be processed faster: straight-through processing, or touchless claims handling, improves customer experience and thus customer satisfaction. The survey reported that 61% of insurers currently work with fraud detection software to improve real-time fraud detection.
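As an illustration of what such an automated screening step might look like, the sketch below computes a simple red-flag score before routing a claim either to straight-through processing or to an investigator. The indicators, weights and referral threshold are invented and are not taken from FRISS or the survey.

```python
# A toy red-flag score of the kind an automated screening step might compute
# before a claim goes to straight-through processing. Indicators, weights and
# the referral threshold are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Claim:
    amount: float
    days_since_policy_start: int
    prior_claims_12m: int
    police_report: bool

def fraud_score(c: Claim) -> float:
    score = 0.0
    if c.days_since_policy_start < 30:
        score += 0.4                     # loss very soon after inception
    if c.prior_claims_12m >= 2:
        score += 0.3                     # unusual claim frequency
    if c.amount > 20_000 and not c.police_report:
        score += 0.3                     # large loss without documentation
    return score

claim = Claim(amount=25_000, days_since_policy_start=12,
              prior_claims_12m=0, police_report=False)
score = fraud_score(claim)
print("refer to investigator" if score >= 0.5 else "straight-through processing")
```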

[Figure: adoption and benefits of fraud detection software (FRISS)]

Click here to access FRISS’ detailed Report

Is Your Company Ready for Artificial Intelligence?

Overview

Companies are rushing to invest in and pursue initiatives that use artificial intelligence (AI). Some hope to find opportunities to transform their business processes and gain competitive advantage; others are concerned about falling behind the technology curve. But the reality is that many AI initiatives don’t work as planned, largely because companies are not ready for AI.

However, it is possible to leverage AI to create real business value. The key to AI success is ensuring the organization is ready by having the basics in place, particularly structured analytics and automation. Other elements of AI readiness include

  • executive engagement and support,
  • data excellence,
  • organizational capabilities,
  • and completion of AI pilots.

Key Takeaways

There is tremendous AI hype and investment. Artificial intelligence is software that can make decisions without explicit instructions for each scenario, including an ability to learn and improve over time. The term “machine learning” is often used interchangeably with AI, but machine learning is just one approach to AI, though it is currently the approach generating the most attention. Today in most business situations where AI is relevant, machine learning is likely to be employed.

The hype around AI is tremendous and has accelerated in the last few years. It is rare to read a business-related article these days that doesn’t mention AI.

The AI hype is being accompanied by massive investments from corporations (like Amazon, Google, and Uber), as well as from venture capital firms.

Because organizations often pursue AI without fully understanding it or having the basics in place, many AI initiatives fail. The AI fervor is causing companies to hurriedly pursue AI. There is a rush to capitalize on AI, but significant frustration when it comes to actually delivering AI success. AI initiatives are often pursued for the wrong reasons and many AI initiatives experience pitfalls. Some key pitfalls are:

  • Expensive partnerships between large companies and startups without results.
  • Impenetrable black box systems.
  • Open source toolkits without programmers to code.

The root cause for these failures often boils down to companies confusing three different topics:

  • automation,
  • structured analytics,
  • and artificial intelligence.

[Figure: automation, structured analytics, and artificial intelligence (HBR/SAS)]

Despite the challenges, some organizations are experiencing success with AI. While the hype around AI is overblown, there are organizations having success by leveraging AI to create business value, particularly when AI is used for customer support and in the back office.

The key to AI success is first having the basics in place. In assessing AI successes and failures, the presenters drew three conclusions:

  1. There is a huge benefit from first getting the basics right: automation and structured analytics are prerequisites to AI.
  2. The benefits from AI are greater once these basics have been done right.
  3. Organizations are capable of working with AI at scale only when the basics have been done at scale.

GETTING THE BASICS RIGHT

The most important basics for AI are automation and structured analytics.

  • Automation: In most businesses there are many examples of data processes that can be automated. In many of these examples, there is no point having advanced AI if the basics are not yet automated.
  • Structured analytics means applying standard statistical techniques to well-structured data. In most companies there is huge value in getting automation and structured analytics right before getting to more complicated AI.

Examples of how businesses use structured analytics and automation include:

  • Competitor price checking. A retailer created real-time pricing intelligence by automatically scraping prices from competitors’ websites (a minimal sketch follows this list).
  • Small business cash flow lending product. Recognizing the need for small business customers to acquire loans in days, not weeks, a bank created an online lending product built on structured analytics.
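To make the competitor price-checking example concrete, here is a minimal sketch of the extraction-and-comparison step. The HTML is an embedded sample and the SKUs and prices are invented; a real pipeline would fetch competitor pages on a schedule and feed the results to the pricing team.

```python
# A minimal sketch of competitor price checking: extract prices from a
# product page and compare them to our own. The HTML, SKUs and prices are
# invented; a real pipeline would fetch pages on a schedule.
import re

sample_page = """
<div class="product"><span class="sku">A-100</span><span class="price">£19.99</span></div>
<div class="product"><span class="sku">A-200</span><span class="price">£7.49</span></div>
"""

our_prices = {"A-100": 21.50, "A-200": 7.49}

pattern = re.compile(
    r'class="sku">(?P<sku>[^<]+)</span><span class="price">£(?P<price>[\d.]+)'
)
for m in pattern.finditer(sample_page):
    sku, competitor = m["sku"], float(m["price"])
    if competitor < our_prices.get(sku, float("inf")):
        print(f"{sku}: competitor at £{competitor:.2f}, we charge £{our_prices[sku]:.2f}")
```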

BENEFITS WHEN THE BASICS ARE IN PLACE

Once the basics of structured analytics and automation are in place, organizations see more value from AI—when AI is used in specific situations.

[Figure: AI benefits once the basics are in place (HBR/SAS)]

Examples of how adding AI on top of the basics helps improve business results are:

  • New product assortment decisions. Adding AI on top of structured analytics allowed a retailer to predict the performance of new products for which there was no historic data. With this information, the retailer was able to decide whether or not to add the product to the stores.
  • Promotions forecasting. A retailer was able to improve forecasting of promotional sales using AI. Within two months of implementation, machine learning was better than the old forecasts plus the corrections made by the human forecasting team.
  • Customer churn predictions. A telephone company used AI and structured analytics to identify how to keep at-risk customers from leaving.
  • Defect detection. An aerospace manufacturer used AI to supplement human inspection and improve defect detection.

AI AT SCALE AFTER THE BASICS ARE AT SCALE

Once an organization proves it can work with automation and structured analytics at scale, it is ready for AI at scale. Readiness for AI at scale goes beyond completing a few AI pilots in defined but isolated areas of capability; the basics need to be in use across the business.

Before undertaking AI, organizations need to assess their AI readiness. To be successful, organizations need to be ready for AI. Readiness consists of multiple elements, including

  • executive engagement and support,
  • data excellence,
  • organizational capabilities,
  • and an analytical orientation.

Organizations often struggle with data excellence and organizational capabilities.

[Figure: elements of AI readiness (HBR/SAS)]

Click here to access HBR and SAS article collection

Analytics Behind the Perfect Risk Score & Predictive Model

We are living in a progressively more connected world where smarter products and changing consumer expectations are disrupting nearly every industry. While the connected world is data intensive, complex to manage and challenging to harness, the opportunities for generating more value and new propositions are nearly endless.

Octo Telematics has invested in the development of algorithms and analytical tools to help our industry partners maximize opportunities from the connected world – and we continue to do so today. Through actionable intelligence based on the accurate analysis of data, industry partners can differentiate their products and services with innovative customer experiences.

In building globally recognized analytical capabilities to serve the global insurance marketplace, Octo Telematics acquired the usage-based insurance (UBI) assets of Willis Towers Watson, including its market-leading DriveAbility® solution. DriveAbility aggregates and analyses granular telematics and insurance data to provide an industry-leading driving score and assist insurers to design, score, issue and bind telematics-based insurance policies. It also facilitates relationships between stakeholders including automotive OEMs, telecommunication companies and insurers to present convenient, personalized insurance offers to customers using pre-analyzed driving data. Today, a strategic alliance with Willis Towers Watson on additional opportunities continues to enhance both companies’ suite of products and services.

Historically, insurance companies have made underwriting and pricing decisions based on static risk factors that are largely proxies for how, how much, when and where a vehicle is operated. By leveraging actual driving data, data scientists can build telematics-based risk scores that are significantly more predictive than any risk factor used by insurance companies today.

To get the full value from telematics, data scientists must have the right data and employ different techniques than those used for traditional actuarial analysis. Done correctly, insurers can create a score that provides

  • double-digit lift,
  • optimizes the lift above and beyond traditional factors
  • and identifies factors that cause accidents to happen.

Failure to follow best practices for model development will result in sub-optimal lift that makes the business case less compelling. Lift is just one factor that should be considered. To be truly effective, any risk score should also be transparent, cost-effective, flexible, implementable and acceptable to regulatory bodies. Even the most predictive scores may not be effective if they fail one or more of these categories.
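As a purely illustrative sketch, the snippet below turns trip-level telematics events into a simple behavioural driving score. The event weights and the 0-100 scale are invented assumptions; they are not the DriveAbility methodology or any production scoring model.

```python
# A toy per-driver score built from trip-level telematics events, illustrating
# the kind of behavioural signal the text refers to. Weights and scale are
# invented; this is not the DriveAbility methodology.
def driving_score(km: float, harsh_brakes: int, speeding_km: float,
                  night_km: float) -> float:
    if km == 0:
        return 100.0
    penalty = (
        25 * (harsh_brakes / km * 100)   # harsh-braking events per 100 km
        + 30 * (speeding_km / km)        # share of distance above the limit
        + 10 * (night_km / km)           # share of distance driven at night
    )
    return max(0.0, round(100 - penalty, 1))

# Two hypothetical drivers over the same month.
print(driving_score(km=900, harsh_brakes=2, speeding_km=18, night_km=45))    # calmer driver, higher score
print(driving_score(km=900, harsh_brakes=14, speeding_km=120, night_km=260)) # riskier driver, lower score
```

In practice, the lift of such a score would be measured against the traditional rating factors already in the model, which is the comparison the paragraph above describes.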

[Figure: Octo Telematics]

Click here to access OCTO’s White Paper

The Future of Planning, Budgeting and Forecasting

The world of planning, budgeting and forecasting is changing rapidly as new technologies emerge, but the actual pace of change within the finance departments of most organizations is rather more sluggish. The progress companies have made in the year since The Future of Planning, Budgeting and Forecasting 2016 has been incremental, with a little accuracy gained but very little change to the reliance on insight-limiting technologies like spreadsheets.

That said, CFOs and senior finance executives are beginning to recognize the factors that contribute to forecasting excellence, and there is a groundswell of support for change. They will even make time to do it, and we all know how precious a CFO’s time can be, especially when basic improvements like automation and standardization haven’t yet been implemented.

The survey shows that most PBF functions are still using relatively basic tools, but it also highlights the positive difference more advanced technology like visualization techniques and charting can make to forecasting outcomes. For the early adopters of even more experimental technologies like machine learning and artificial intelligence, there is some benefit to being at the forefront of technological change. But the survey suggests that there is still some way to go before machines take over the planning, budgeting and forecasting function.

In the meantime, senior finance executives who are already delivering a respected, inclusive and strategic PBF service need to focus on becoming more insightful, which means using smart technologies in concert with non-financial data to deliver accurate, timely, long term forecasts that add real value to the business.

Making headway

CFOs are making incremental headway in improving their planning, budgeting and forecasting processes, reforecasting more frequently to improve accuracy. But spreadsheet use remains a substantial drag on process improvements, despite organizations increasingly looking towards new technologies to progress the PBF landscape.

That said, respondents seem open to change, recognizing the importance of financial planning and analysis as a separate discipline, which will help channel resources in that direction. At the moment, a slow and steady approach is enough to remain competitive, but as more companies make increasingly substantial changes to their PBF processes to generate better insight, those that fail to speed up will find they fall behind.

Leading the debate

FSN’s insights gleaned from across the finance function shed light on the changes happening within the planning, budgeting and forecasting function, and identify the processes that make a real difference to outcomes. Senior finance executives are taking heed of these insights and making changes within the finance function. The most important one is the increasing inclusion of non-financial data into forecasting and planning processes. The Future of The Finance Function 2016 identified this as a game-changer, for the finance function as a whole, and for PBF in particular. It is starting to happen now. Companies are looking towards data from functions outside of finance, like customer relationship management systems and other non-financial data sources.

Senior executives are also finally recognizing the importance of automation and standardization as the key to building a strong PBF foundation. Last year it languished near the bottom of CFOs’ priority lists, but now it is at the top. With the right foundation, PBF can start to take advantage of the new technology that will improve forecasting outcomes, particularly in the cloud.

There is increasing maturity in the recognition of cloud solution benefits, beyond just cost, towards agility and scalability. With recognition comes implementation, and it is hoped that uptake of these technologies will follow with greater momentum.

Man vs machine

Cloud computing has enabled the growth of machine learning and artificial intelligence solutions, and we see these being embedded into our daily lives, in our cars, personal digital assistants and home appliances. In the workplace, machine learning tools are being used for

  • predictive maintenance,
  • fraud detection,
  • customer personalization
  • and automating finance processes.

In the planning, budgeting and forecasting function, machine learning tools can take data over time, apply parameters to the analysis, and then learn from the outcomes to improve forecasts.
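A minimal sketch of that “learn from the outcomes” loop is shown below: a rolling forecast whose smoothing weight is re-tuned each period against realised forecast error. The monthly figures are synthetic and the method is deliberately simple compared with what commercial PBF tools offer.

```python
# A minimal sketch of "learning from outcomes": a rolling forecast whose
# smoothing parameter is re-tuned from realised forecast error. The monthly
# figures are synthetic and purely illustrative.
import numpy as np

actuals = np.array([100, 104, 103, 110, 115, 112, 120, 126, 124, 131], float)

def one_step_errors(series, alpha):
    # Exponential smoothing: return mean absolute one-step error and the
    # forecast for the next, unseen period.
    forecast, errors = series[0], []
    for actual in series[1:]:
        errors.append(abs(actual - forecast))
        forecast = alpha * actual + (1 - alpha) * forecast
    return np.mean(errors), forecast

# "Learning": pick the smoothing weight that would have minimised past error.
alphas = np.linspace(0.05, 0.95, 19)
best_alpha = min(alphas, key=lambda a: one_step_errors(actuals, a)[0])
_, next_forecast = one_step_errors(actuals, best_alpha)

print(f"chosen alpha={best_alpha:.2f}, next-month forecast={next_forecast:.1f}")
```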

On the face of it, machine learning appears to be a game changer, adding unbiased logic and immeasurable processing power to the forecasting process, but the survey doesn’t show a substantial improvement in forecasting outcomes for organizations that use experimental technologies like these. And the CFOs and senior finance executives who responded to the survey believe there are substantial limitations to the effectiveness of machine forecasts. As the technology matures and finance functions become more integrated, machine learning will proliferate, but right now it remains the domain of early adopters.

Analytic tools

Many of the cloud solutions for planning, budgeting and forecasting involve advanced analytic tools, from visualization techniques to machine learning. Yet the majority of respondents still use basic spreadsheets, pivot tables and business intelligence tools to mine their data for forecasting insight. But they need to be upgrading their toolbox.

The survey identifies users of cutting edge visualization tools as the most effective forecasters. They are more likely to utilize specialist PBF systems, and have an arsenal of PBF technology they have prioritized for implementation in the next three years to improve their forecasts.

Even experimental organizations that aren’t yet able to harness the full power of machine learning and AI, are still generating better forecasts than the analytic novices.

The survey results are clear: advanced analytics must become the new baseline technology. It is no longer enough to rely on simple spreadsheets and pivot tables when your competitors are several steps ahead.

Insight – the top trump

But technology can’t operate in isolation. Cutting edge tools alone won’t provide the in-depth insight that is needed to properly compete against nimble start-ups. CFOs must ensure their PBF processes are inclusive, drawing input from outside the financial bubble to build a rounded view of the organization. This will engender respect for the PBF outcomes and align them with the strategic direction of the business.

Most importantly though, organizations need to promote an insightful planning, budgeting and forecasting function, by using advanced analytic techniques and tools, coupled with a broad data pool, to reveal unexpected insights and pathways that lead to better business performance.

As FSN stated, today’s finance organizations are looking to:

  • provide in-depth insights;
  • anticipate change; and
  • verify business opportunities before they become apparent to competitors.

But AI and machine learning technologies are still too immature. And spreadsheet-based processes don’t have the necessary functions to fill these advanced needs. While some might argue that spreadsheet-based processes could work for small businesses, they become unmanageable as companies grow.

[Figure: planning, budgeting and forecasting survey findings (FSN)]

Click here to access Wolters Kluwer’s FSN detailed survey report