EIOPA reviews the use of Big Data Analytics in motor and health insurance

Data processing has historically been at the very core of the business of insurance undertakings, a business strongly rooted in data-led statistical analysis. Data has always been collected and processed to

  • inform underwriting decisions,
  • price policies,
  • settle claims
  • and prevent fraud.

The sector has long pursued more granular datasets and more predictive models, so the relevance of Big Data Analytics (BDA) to insurance is no surprise.

In view of this, and as a follow-up to the Joint Committee of the European Supervisory Authorities' (ESAs) cross-sectoral report on the use of Big Data by financial institutions, the European Insurance and Occupational Pensions Authority (EIOPA) decided to launch a thematic review on the use of BDA specifically by insurance firms. The aim is to gather further empirical evidence on the benefits and risks arising from BDA. To keep the exercise proportionate, the focus was limited to the motor and health insurance lines of business. The thematic review was officially launched during the summer of 2018.

A total of 222 insurance undertakings and intermediaries from 28 jurisdictions participated in the thematic review. The input collected from insurance undertakings represents approximately 60% of the total gross written premiums (GWP) of the motor and health insurance lines of business in the respective national markets, and it includes input from both incumbents and start-ups. In addition, EIOPA has collected input from its Members and Observers, i.e. national competent authorities (NCAs) from the European Economic Area, and from two consumer associations.

The thematic review has revealed a strong trend towards increasingly data-driven business models throughout the insurance value chain in motor and health insurance:

  • Traditional data sources such as demographic data or exposure data are increasingly combined with (not replaced by) new sources like online media data or telematics data, providing greater granularity and frequency of information about consumers' characteristics, behaviour and lifestyles. This enables the development of increasingly tailored products and services and more accurate risk assessments.

  • The use of data sourced from third-party data vendors, and of their corresponding algorithms for calculating credit scores, driving scores, claims scores and the like, is relatively widespread, and this information can be used in technical models.

  • BDA enables the development of new rating factors, leading to a larger number of smaller risk pools. Most rating factors have a causal link to risk, while others are perceived as proxies for other risk factors or for wealth / price elasticity of demand.
  • BDA tools such as artificial intelligence (AI) or machine learning (ML) are already actively used by 31% of firms, and another 24% are at a proof-of-concept stage. Models based on these tools are often correlational rather than causal, and they are used primarily in pricing and underwriting and in claims management; a minimal sketch of such a model follows below.
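
As a hypothetical illustration of the point about correlational models, the sketch below fits a machine-learning frequency model that combines a traditional rating factor with telematics-style features. All data, feature names and parameters are synthetic assumptions for demonstration; this is not EIOPA's or any insurer's actual model.

```python
# Minimal, illustrative sketch: an ML claim-frequency model mixing a
# traditional rating factor with telematics-style features (synthetic data).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
driver_age = rng.integers(18, 80, n)            # traditional rating factor
annual_km = rng.uniform(1_000, 30_000, n)       # telematics: annual mileage
night_share = rng.uniform(0.0, 0.5, n)          # telematics: night driving share

# Synthetic claim counts: the model below learns correlations in this data,
# not causal effects.
expected_freq = 0.04 + 0.15 * night_share + annual_km / 1_000_000
claims = rng.poisson(expected_freq)

X = np.column_stack([driver_age, annual_km, night_share])
X_train, X_test, y_train, y_test = train_test_split(X, claims, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```

The fitted model captures statistical associations in the data; whether night-time driving causes claims or merely correlates with them is exactly the distinction the review draws.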

  • Cloud computing services, which reportedly represent a key enabler of agility and data analytics, are already used by 33% of insurance firms, with a further 32% saying they will be moving to the cloud over the next three years. Data security and consumer protection are key concerns in this outsourcing activity.
  • Uptake of usage-based insurance products will continue gradually in the coming years, influenced by developments such as increasingly connected cars, health wearables and the introduction of 5G mobile technology. Robo-advisors and especially chatbots are also gaining momentum within consumer product and service journeys.

  • There is no evidence as yet that an increasing granularity of risk assessments is causing exclusion issues for high-risk consumers, although firms expect the impact of BDA to increase in the years to come.

In view of the evidence gathered from the different stakeholders, EIOPA considers that there are many opportunities arising from BDA, both for the insurance industry and for consumers. However, although insurance firms generally already have sound data governance arrangements in place or are developing them, there are also risks arising from BDA that need to be further addressed in practice. Some of these risks are not new, but their significance is amplified in the context of BDA. This is particularly the case for ethical issues concerning the fairness of the use of BDA, as well as for the

  • accuracy,
  • transparency,
  • auditability,
  • and explainability

of certain BDA tools such as AI and ML.

Going forward, in 2019 EIOPA’s InsurTech Task Force will conduct further work in these two key areas in collaboration with the industry, academia, consumer associations and other relevant stakeholders. The work being developed by the Joint Committee of the ESAs on AI as well as in other international fora will also be taken into account. EIOPA will also explore third-party data vendor issues, including transparency in the use of rating factors in the context of the EU-US insurance dialogue. Furthermore, EIOPA will develop guidelines on the use of cloud computing by insurance firms and will start a new workstream assessing new business models and ecosystems arising from InsurTech. EIOPA will also continue its ongoing work in the area of cyber insurance and cyber security risks.

Click here to access EIOPA’s detailed Big Data Report

Analytics Behind The Perfect Risk Score & Predictive Model

We are living in a progressively more connected world where smarter products and changing consumer expectations are disrupting nearly every industry. While the connected world is data intensive, complex to manage and challenging to harness, the opportunities for generating more value and new propositions are nearly endless.

Octo Telematics has invested in the development of algorithms and analytical tools to help our industry partners maximize opportunities from the connected world – and we continue to do so today. Through actionable intelligence based on the accurate analysis of data, industry partners can differentiate their products and services with innovative customer experiences.

In building globally recognized analytical capabilities to serve the global insurance marketplace, Octo Telematics acquired the usage-based insurance (UBI) assets of Willis Towers Watson, including its market-leading DriveAbility® solution. DriveAbility aggregates and analyses granular telematics and insurance data to provide an industry-leading driving score and assist insurers to design, score, issue and bind telematics-based insurance policies. It also facilitates relationships between stakeholders including automotive OEMs, telecommunication companies and insurers to present convenient, personalized insurance offers to customers using pre-analyzed driving data. Today, a strategic alliance with Willis Towers Watson on additional opportunities continues to enhance both companies’ suite of products and services.

Historically, insurance companies have made underwriting and pricing decisions based on static risk factors that are largely proxies for how, how much, when and where a vehicle is operated. By leveraging actual driving data, data scientists can build telematics-based risk scores that are significantly more predictive than any risk factor used by insurance companies today.

To get the full value from telematics, data scientists must have the right data and employ different techniques from those used in traditional actuarial analysis. Done correctly, this lets insurers create a score that

  • delivers double-digit lift,
  • adds lift above and beyond traditional factors
  • and identifies factors that cause accidents.

Failure to follow best practices for model development will result in sub-optimal lift that makes the business case less compelling. Lift is just one factor that should be considered. To be truly effective, any risk score should also be transparent, cost-effective, flexible, implementable and acceptable to regulatory bodies. Even the most predictive scores may not be effective if they fall short in one or more of these areas.
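
To make the lift concept concrete, the following is a minimal sketch (synthetic data; not Octo's DriveAbility methodology) of a decile lift calculation: policies are ranked by score, split into ten equal buckets, and each bucket's claim frequency is compared to the book average.

```python
# Decile lift sketch: claim frequency per score decile relative to the book
# average. A strongly predictive score shows a steep, monotonic gradient.
import numpy as np
import pandas as pd

def decile_lift(score: np.ndarray, claims: np.ndarray) -> pd.Series:
    df = pd.DataFrame({"score": score, "claims": claims})
    df["decile"] = pd.qcut(df["score"], 10, labels=False, duplicates="drop")
    return df.groupby("decile")["claims"].mean() / df["claims"].mean()

rng = np.random.default_rng(1)
score = rng.normal(size=50_000)                   # hypothetical driving score
claims = rng.poisson(0.08 * np.exp(0.4 * score))  # riskier drivers claim more
print(decile_lift(score, claims).round(2))
```

In practice the same calculation would also be run against the residuals of an existing pricing model, to measure lift above and beyond traditional factors.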

Click here to access OCTO’s White Paper

The Future of Planning, Budgeting and Forecasting

The world of planning, budgeting and forecasting is changing rapidly as new technologies emerge, but the actual pace of change within the finance departments of most organizations is rather more sluggish. The progress companies have made in the year since The Future of Planning, Budgeting and Forecasting 2016 has been incremental, with a little accuracy gained but very little change to the reliance on insight-limiting technologies like spreadsheets.

That said, CFOs and senior finance executives are beginning to recognize the factors that contribute to forecasting excellence, and there is a groundswell of support for change. They’ll even make time to do it, and we all know how precious a CFO’s time can be, especially when basic improvements like automation and standardization haven’t yet been implemented.

The survey shows that most PBF functions are still using relatively basic tools, but it also highlights the positive difference more advanced technology like visualization techniques and charting can make to forecasting outcomes. For the early adopters of even more experimental technologies like machine learning and artificial intelligence, there is some benefit to being at the forefront of technological change. But the survey suggests that there is still some way to go before machines take over the planning, budgeting and forecasting function.

In the meantime, senior finance executives who are already delivering a respected, inclusive and strategic PBF service need to focus on becoming more insightful, which means using smart technologies in concert with non-financial data to deliver accurate, timely, long-term forecasts that add real value to the business.

Making headway

CFOs are making incremental headway in improving their planning, budgeting and forecasting processes, reforecasting more frequently to improve accuracy. But spreadsheet use remains a substantial drag on process improvements, despite organizations increasingly looking towards new technologies to progress the PBF landscape.

That said, respondents seem open to change, recognizing the importance of financial planning and analysis as a separate discipline, which will help channel resources in that direction. At the moment, a slow and steady approach is enough to remain competitive, but as more companies make increasingly substantial changes to their PBF processes to generate better insight, those that fail to speed up will find they fall behind.

Leading the debate

FSN’s insights gleaned from across the finance function shed light on the changes happening within the planning, budgeting and forecasting function, and identify the processes that make a real difference to outcomes. Senior finance executives are taking heed of these insights and making changes within the finance function. The most important one is the increasing inclusion of non-financial data in forecasting and planning processes. The Future of The Finance Function 2016 identified this as a game-changer for the finance function as a whole and for PBF in particular. It is starting to happen now. Companies are looking towards data from functions outside of finance, like customer relationship management systems and other non-financial data sources.

Senior executives are also finally recognizing the importance of automation and standardization as the key to building a strong PBF foundation. Last year it languished near the bottom of CFOs’ priority lists, but now it is at the top. With the right foundation, PBF can start to take advantage of the new technology that will improve forecasting outcomes, particularly in the cloud.

There is increasing maturity in the recognition of cloud solution benefits, beyond just cost, towards agility and scalability. With recognition comes implementation, and it is hoped that uptake of these technologies will follow with greater momentum.

Man vs machine

Cloud computing has enabled the growth of machine learning and artificial intelligence solutions, and we see these being embedded into our daily lives, in our cars, personal digital assistants and home appliances. In the workplace, machine learning tools are being used for

  • predictive maintenance,
  • fraud detection,
  • customer personalization
  • and automating finance processes.

In the planning, budgeting and forecasting function, machine learning tools can take data over time, apply parameters to the analysis, and then learn from the outcomes to improve forecasts.
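
As a minimal illustration of that loop (a synthetic revenue series and a deliberately simple model; not drawn from the FSN survey), the sketch below refits a forecast each month as new actuals arrive, so the model keeps learning from outcomes:

```python
# Rolling-origin forecasting sketch: each month, refit on all actuals to date
# and forecast the next month, mimicking a tool that learns from outcomes.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
months = np.arange(48)
seasonality = np.sin(2 * np.pi * months / 12)
revenue = 100 + 2.5 * months + 10 * seasonality + rng.normal(0, 3, months.size)

errors = []
for t in range(24, months.size):                 # forecast from year 3 onward
    X_hist = np.column_stack([months[:t], seasonality[:t]])
    model = LinearRegression().fit(X_hist, revenue[:t])
    x_next = np.array([[months[t], seasonality[t]]])
    errors.append(abs(model.predict(x_next)[0] - revenue[t]))

print(f"mean absolute one-step-ahead error: {np.mean(errors):.2f}")
```

Real PBF tools add richer inputs (business drivers, non-financial data) and more capable models, but the refit-on-outcomes loop is the core idea.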

On the face of it, machine learning appears to be a game changer, adding unbiased logic and immeasurable processing power to the forecasting process, but the survey doesn’t show a substantial improvement in forecasting outcomes for organizations that use experimental technologies like these. And the CFOs and senior finance executives who responded to the survey believe there are substantial limitations to the effectiveness of machine forecasts. As the technology matures, and finance functions become more integrated, machine learning will proliferate, but right now it remains the domain of early adopters.

Analytic tools

Many of the cloud solutions for planning, budgeting and forecasting involve advanced analytic tools, from visualization techniques to machine learning. Yet the majority of respondents still use basic spreadsheets, pivot tables and business intelligence tools to mine their data for forecasting insight. But they need to upgrade their toolbox.

The survey identifies users of cutting edge visualization tools as the most effective forecasters. They are more likely to utilize specialist PBF systems, and have an arsenal of PBF technology they have prioritized for implementation in the next three years to improve their forecasts.

Even experimental organizations that aren’t yet able to harness the full power of machine learning and AI are still generating better forecasts than the analytic novices.

The survey results are clear: advanced analytics must become the new baseline technology. It is no longer enough to rely on simple spreadsheets and pivot tables when your competitors are several steps ahead.

Insight – the top trump

But technology can’t operate in isolation. Cutting edge tools alone won’t provide the in-depth insight that is needed to properly compete against nimble start-ups. CFOs must ensure their PBF processes are inclusive, drawing input from outside the financial bubble to build a rounded view of the organization. This will engender respect for the PBF outcomes and align them with the strategic direction of the business.

Most importantly though, organizations need to promote an insightful planning, budgeting and forecasting function, by using advanced analytic techniques and tools, coupled with a broad data pool, to reveal unexpected insights and pathways that lead to better business performance.

As FSN stated, today’s finance organizations are looking to:

  • provide in-depth insights;
  • anticipate change; and
  • verify business opportunities before they become apparent to competitors.

But AI and machine learning technologies are still too immature. And spreadsheet-based processes don’t have the necessary functions to fill these advanced needs. While some might argue that spreadsheet-based processes could work for small businesses, they become unmanageable as companies grow.

Click here to access Wolters Kluwer’s FSN detailed survey report

The Innovation Game – How Data is Driving Digital Transformation

Technology waits for no one. And those who strike first will have an advantage. The steady decline in business profitability across multiple industries threatens to erode future investment, innovation and shareholder value. Fortunately, the emergence of artificial intelligence (AI) can help kick-start profitability. Accenture research shows that AI has the potential to boost rates of profitability by an average of 38 percent by 2035 and lead to an economic boost of US$14 trillion across 16 industries in 12 economies by 2035.

Driven by these economic forces, the age of digital transformation is in full swing. Today we can’t be “digital to the core” if we don’t leverage all new data sources – unstructured data, dark data and third-party sources. Similarly, we have to take advantage of the convergence of AI and analytics to uncover previously hidden insights. But, with the increasing use of AI, we also have to be responsible and take into account the social implications.

Finding answers to the biggest questions starts with data, and ensuring you are capitalizing on the vast data sources available within your own business. Thanks to the power of AI/machine learning and advanced algorithms, we have moved from the era of big data to the era of ALL data, and that is helping clients create a more holistic view of their customer and more operational efficiencies.

Embracing the convergence of AI and analytics is crucial to success in our digital transformation. Together,

  • AI-powered analytics unlock tremendous value from data that was previously hidden or unreachable,
  • changing the way we interact with people and technology,
  • improving the way we make decisions, and giving way to new agility and opportunities.

While businesses are still in the infancy of tapping into the vast potential of these combined technologies, now is the time to accelerate. But to thrive, we need to be pragmatic in finding the right skills and partners to guide our strategy.

Finally, whenever we envision the possibilities of AI, we should consider the responsibility that comes with it. Trust in the digital era or “responsible AI” cannot be overlooked. Explainable AI and AI transparency are critical, particularly in such areas as

  • financial services,
  • healthcare,
  • and life sciences.

The new imperative of our digital transformation is to balance intelligent technology and human ingenuity to innovate every facet of business and become a smarter enterprise.

The exponential growth of data underlying the strategic imperative of enterprise digital transformation has created new business opportunities along with tremendous challenges. Today, we see organizations of all shapes and sizes embarking on digital transformation. As uncovered in Corinium Digital’s research, the primary drivers of digital transformation are addressing increasing customer expectations and implementing efficient internal processes.

Data is at the heart of this transformation and provides the fuel to generate meaningful insights. We have reached the tipping point where all businesses recognize they cannot compete in a digital age using analog-era legacy solutions and architectures. The winners in the next phase of business will be those enterprises that obtain a clear handle on the foundations of modern data management, specifically the nexus of

  • data quality,
  • cloud,
  • and artificial intelligence (AI).

While most enterprises have invested in on-premises data warehouses as the backbone of their analytic data management practices, many are shifting their new workloads to the cloud. The proliferation of new data types and sources is accelerating the development of data lakes, with aspirations of gaining integrated analytics that can accelerate new business opportunities. We found in the research that over 60% of global enterprises are now investing in a hybrid, multi-cloud strategy that combines data from cloud environments such as Microsoft Azure with existing on-premises infrastructures. Hence, this hybrid, multi-cloud strategy will need to correlate with their investments in data analytics, and it will become imperative to manage data seamlessly across all platforms. At Paxata, our mission is to give everyone the power to intelligently profile and transform data into consumable information at the speed of thought – empowering everyone, not just technical users, to prepare their data and make it ready for analytics and decision making.

The first step in making this transition is to eliminate the bottlenecks of traditional IT-led data management practices through AI-powered automation.

Second, you need to apply modern data preparation and data quality principles and technology platforms to support both analytical and operational use cases.

Thirdly, you need a technology infrastructure that embraces the hybrid, multi-cloud world. Paxata sits right at the center stage of this new shift, helping enterprises profile and transform complex data types in high-variety, high-volume environments. As such, we’re excited about partnering with Accenture and Microsoft to accelerate businesses with our ability to deliver modern analytical and operational platforms to address today’s digital transformation requirements.
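
To make the profiling step concrete, here is a minimal, generic sketch in plain pandas (illustrative only; this is not Paxata's product or API) of the kind of completeness and cardinality summary that data preparation platforms automate at scale:

```python
# Generic data-profiling sketch: per-column type, null rate, cardinality and
# a sample value, so quality issues surface before analysis begins.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_rate": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "example": df.apply(
            lambda col: col.dropna().iloc[0] if col.notna().any() else None
        ),
    })

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [120.5, None, 87.0, 43.2],
    "country": ["DE", "DE", "FR", None],
})
print(profile(orders))
```

At enterprise scale the same idea runs continuously across thousands of sources, which is where AI-powered automation earns its keep.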

Artificial intelligence is causing two major revolutions simultaneously among developers and enterprises. These revolutions will drive the technology decisions for the next decade. Developers are massively embracing AI. As a platform company, Microsoft is focused on enabling developers to make the shift to the next app development pattern, driven by the intelligent cloud and intelligent edge.

AI is the runtime that will power the apps of the future. At the same time, enterprises are eager to adopt and integrate AI. Cloud and AI are the most requested topics in Microsoft Executive Briefing Centers. AI is changing how companies serve their customers, run their operations, and innovate.

Ultimately, every business process in every industry will be redefined in profound ways. If it used to be true that “software was eating the world,” it is now true to say that “AI is eating software”. A new competitive differentiator is emerging: how well an enterprise exploits AI to reinvent and accelerate its processes, value chain and business models. Enterprises need a strategic partner who can help them transform their organization with AI. Microsoft is emerging as a solid AI leader as it is in a unique position to address both revolutions. Our strength and differentiation lie in the combination of multiple assets:

  • Azure AI services that bring AI to every developer. Over one million developers are accessing our pre-built and customizable AI services. We have the most comprehensive solution for building bots, combined with a powerful platform for Custom AI development with Azure Machine Learning that spans the entire AI development lifecycle, and a market leading portfolio of pre-built cognitive services that can be readily attached to applications.
  • A unique cloud infrastructure, including CPU, GPU, and soon FPGA, makes Azure the most reliable, scalable and fastest cloud on which to run AI workloads.
  • Unparalleled tools. Visual Studio, used by over 6 million developers, is the most preferred tool in the world for application development. Visual Studio and Visual Studio Code are powerful “front doors” through which to attract developers seeking to add AI to their applications.
  • Ability to add AI to the edge. We enable developers, through our tools and services, to develop an AI model and deploy that model anywhere. Through our support for ONNX – the open source representation for AI models in partnership with Facebook, Amazon, IBM and others – as well as for generic containers, we allow developers to run their models on the IoT edge and leverage the entire IoT solution from Azure.
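
As a brief, generic illustration of the ONNX portability described in the last point (a PyTorch example chosen for familiarity; not Microsoft-specific tooling), a model can be exported once to the open format and then run on any ONNX-compatible runtime, including the IoT edge:

```python
# Export a small PyTorch model to ONNX so it can run on any ONNX-compatible
# runtime (e.g., ONNX Runtime on cloud or edge devices). Illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
model.eval()

dummy_input = torch.randn(1, 4)  # fixes the expected input shape
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["features"],
    output_names=["score"],
)
# The resulting model.onnx can be loaded by another runtime or packaged
# into a generic container for edge deployment.
```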

But the competition to win enterprises is not played only on the platform battlefield; enterprises are demanding solutions. Microsoft AI solutions provide turnkey implementations for customers who want to transform their core processes with AI. Our unique combination of IP and consulting services addresses common scenarios such as business agents, sales intelligence or marketing intelligence. Because our solutions are built on top of our compelling AI platform, unlike our competitors’, our customers are not locked in to any one consulting provider; they remain in full control of their data and can extend the scenarios or target new scenarios themselves or through our rich partner ecosystem.

Click here to access Corinium’s White Paper

Mastering Risk with “Data-Driven GRC”

Where are organizations heading?

“Data-Driven GRC” represents a consolidation of methodologies, both functional and technological, that dramatically enhance the opportunity to address emerging risk landscapes and, in turn, maximize the reliability of organizational performance. This paper examines the key opportunities to leverage change—both from a risk and an organizational performance management perspective—to build integrated, data-driven GRC processes that optimize the value of audit and risk management activities, as well as the investments in supporting tools and techniques.

Functional Stakeholders of GRC Processes and Technology

The Institute of Internal Auditors’ (IIA) “Three Lines of Defense in Effective Risk Management and Control” model specifically addresses the “who and what” of risk management and control. It distinguishes and describes three role- and responsibility-driven functions:

  • Those that own and manage risks (management – the “first line”)
  • Those that oversee risks (risk, compliance, financial controls, IT – the “second line”)
  • Those functions that provide independent assurance over risks (internal audit – the “third line”)

The overarching context of these three lines acknowledges the broader role of organizational governance and governing bodies.

Technology Deficiencies in the Three Lines of Defense

Since the emergence of Sarbanes-Oxley, the use of technology in risk and control related processes has truly started to take meaningful shape in many organizations. However, when looking across the risk and control oriented functions in most organizations, technology is still typically used on a departmental or point solution basis.

Third Line (internal audit) use of risk & control technology

For the past decade, surveys of internal auditors have consistently identified the more effective use of technology as among the most pressing issues facing the profession. Specifically, the responses to the surveys also referred to the need for increased use of technology for audit analysis, fraud detection, and continuous auditing. Other surveys also highlight a shortage of sufficient technology and data analysis skills within audit departments.

Much of the driving force for improving the use of technology is based on the desire to make the audit process itself more efficient and more effective, as well as to deliver more tangible value to the rest of the organization.

During the past decade, the role of the internal audit function itself has changed considerably. Internal audit’s traditional focus on cyclical audits and testing internal controls is evolving into one in which internal audit is expected to assess and report on the effectiveness of management’s processes to address risk overall. This often includes providing guidance and consultation to the business on best practices for managing risk and compliance within business process areas and maintaining effective control systems. The use of technology is an increasingly critical component of these best practices and in some cases internal audit is able to champion the implementation of high-impact, high-value technology within the business’s risk management and compliance processes, based on their own experience in using technology for assurance purposes.

There is considerable variation in the extent to which internal audit departments leverage technology. However, it is certainly fair to say that for audit to be truly valuable and relevant within the context of organizational strategy, a significant improvement is required across the board. Internal audit as a profession simply is not moving forward at the pace of technology.

Some specific statistics from recent research reveal:

  • Only approximately 40% of internal audit departments use audit and documentation management systems from specialized vendors. The remainder use disorganized tools and processes, typically based on Microsoft Office® & shared folders.
  • Audit programs for specific business process areas and industries are usually developed through a combination of previously used programs and those shared on various audit-related websites. This approach does not address organization-specific risk.
  • Next generation testing techniques, especially data analytics, are overwhelmingly underutilized.

Second Line (risk, compliance, financial controls, IT) use of risk & control technology

Outside of audit, in other areas of risk and compliance, some organizations have acquired specialized departmental software, but the majority use only basic Office tools to maintain inventories of risks, document controls and perform risk assessments. In larger enterprises, it is not unusual to have a variety of different technologies and approaches applied in different operational entities or in different functional areas. This approach is usually more costly and less effective than one based on a common platform. Effective testing methods using technology are usually unavailable or left unconsidered.

In fact, second line of defense functions often rely heavily on inquiry-based methods such as surveying, which have proven ineffective at identifying the actual manifestations of risk in the organization. Where analytical software is used in the business for investigations or for monitoring transactions, it often involves standard query tools or some form of generic business intelligence (BI) technology. Although good for providing summary-level information or high-level trends, BI tools struggle to show the root cause of problems. And while they may have certain capabilities to prevent fraud and errors from occurring, or to flag exceptions, they are not sufficient to effectively trap the typical problem transactions that occur.
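
As a hypothetical illustration of what trapping problem transactions means in practice (plain pandas; not ACL's software), simple rule-based tests can flag individual suspect records that a summary-level dashboard would never surface:

```python
# Two simple transaction tests on an illustrative payments file: candidate
# duplicate payments, and amounts just under a $10,000 approval threshold
# (a hypothetical control limit assumed for this example).
import pandas as pd

payments = pd.DataFrame({
    "vendor":  ["Acme", "Acme", "Globex", "Acme"],
    "invoice": ["INV-001", "INV-001", "INV-077", "INV-002"],
    "amount":  [5_000.0, 5_000.0, 1_200.0, 9_950.0],
    "date": pd.to_datetime(["2018-03-01", "2018-03-04",
                            "2018-03-02", "2018-03-05"]),
})

duplicates = payments[payments.duplicated(["vendor", "amount"], keep=False)]
near_threshold = payments[payments["amount"].between(9_500, 9_999.99)]

print("Candidate duplicates:", duplicates, sep="\n")
print("\nJust under approval threshold:", near_threshold, sep="\n")
```

Real audit analytics add fuzzy matching and date windows, but even these two rules operate at the transaction level that BI summaries abstract away.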

First Line (management) use of risk & control technology

While in some cases first-line management has access to better technology for use on specific pain-point areas (e.g., continuous transaction monitoring technology used within finance departments), there is a common tendency for management to place far too much reliance on core business systems for effective control. While the large ERP and other system vendors seem to have extensive capabilities for preventing control deficiencies, the reality is that these are extremely extensive and complex systems, and internal controls are usually an afterthought for those implementing them, not a core focus. For example, in many cases certain control settings are turned off to enable the ERP system to run more efficiently.

An integrated and collaborative approach to managing risks and monitoring controls in collaboration with the second and third lines of defense, using a common, independent methodology and technology platform, typically proves the most effective in accomplishing management’s key risk mitigation strategies.

Click here to access ACL’s White Paper

By investing heavily in start-ups and technology, (re)insurance companies appear to have assumed a semblance of control over the InsurTech revolution

Who Benefits from Modularization?

With technology moving forward at an unprecedented pace, incumbents are increasingly electing to outsource functions to highly specialized new entrants, renting evolving modules of technology that can be tailored to suit their individual needs. Though this approach may be more cost effective, it further fuels the question of whether incumbents will allow value in the industry to shift towards new entrants. In time, market participants will come to understand which module in the chain generates the most value. It is plausible that automation in distribution will shift value towards efficiency of internal processes that support cutting-edge modeling and underwriting engines.

The State of InsurTech

InsurTech funding volume increased 36% year-over-year in 2017, demonstrating that technology-driven innovation remains a core focus area for (re)insurance companies and investors heading into 2018. However, perhaps contrary to many of the opinions championed in editorial and press coverage of the InsurTech sector, further analysis of the growing number of start-ups successfully attracting capital from (re)insurers and financial investors reveals that the majority of InsurTech ventures are not focused on displacing incumbents by disrupting the pressured insurance value chain. According to research from McKinsey & Company,

  • 61% of InsurTech companies aim to enable the value chain,
  • 30% are attempting to disintermediate incumbents from customers,
  • and 9% are targeting full-scale value chain disruption.

Has the hype surrounding InsurTech fostered unjustified fear from overly defensive incumbents?

We have taken this analysis a step further by tracking funding volume from strategic (re)insurers versus financial investors for InsurTechs focused on enabling the value chain, relative to their counterparts attempting to disintermediate customers from incumbents or to disrupt the value chain altogether. We found that 65% of strategic (re)insurer InsurTech investments have been concentrated in companies enabling the value chain, with only 35% of incumbent investments going to start-ups with more disruptive business models. What does it mean? While recognizing the subjective nature of surmising an early-stage company’s ultimate industry application at maturity from its initial focus, we attribute this phenomenon to the tendency of incumbents to, consciously or subconsciously, encourage development of less perceptibly threatening innovation while avoiding more radical, potentially intimidating technologies and applications.

While this behavior may allow incumbents to preserve a palatable status quo, it should be considered in the context in which individual investments are evaluated – on the basis of expected benefits relative to potential risk. We have listed several benefits that InsurTechs offer to incumbents:

Segmenting the InsurTech Universe

As InsurTech start-ups continue to emerge across the various components of the insurance value chain and business lines, incumbents and investors are evaluating opportunities to deploy these applications in the insurance industry today and in the future. To simplify the process of identifying useful and potentially transformational technologies and applications, we have endeavored to segment the increasingly broad universe of InsurTech companies by their core function into four categories:

  1. Product & Distribution
  2. Business Process Enhancement
  3. Data & Analytics
  4. Claims Management

This exercise is complicated by the tendency of companies to operate across multiple functions, so significant professional judgment was used in determining the assignment for each company. A summary of the criteria used to determine placement is listed below. On the following pages, we have included market maps to provide a high level perspective of the number of players in each category, as well as a competitive assessment of each subsector and our expectations for each market going forward. Selected companies in each category, ranked by the amount of funding they have raised to date, are listed, followed by more detailed overviews and Q&A with selected representative companies from each subsector.

Click here to access WTW’s detailed briefing

Keeping up with shifting compliance goalposts in 2018 – Five focal areas for investment

Stakeholders across the organization are increasingly seeking greater compliance effectiveness, efficiency, cost cutting, and agility in compliance activities to further compete in the expanding digital and automated world.

Organizations are thus being pushed to continuously improve their compliance activities, because in the future the integration and automation of compliance activities will be an imperative. To prepare for tomorrow, organizations must invest today.

When positioning your organization for the future, keep in mind the following five areas for investment:

1. Operational integration

Regulators are increasingly spotlighting the need for operational integration within a compliance risk management program, meaning that compliance needs to be integrated in business processes and into people’s performance of their job duties on a day-to-day basis.

When approaching the governance of managing compliance efforts, a more centralized or a hybrid approach strengthens the organization’s overall compliance risk management control environment.

2. Automation of compliance activities

The effectiveness of compliance increases when there is integration across an enterprise and successful automation of processes. Compliance leaders are turning toward intelligent automation as an answer for slimming down compliance costs and becoming more nimble and agile in an increasingly competitive world. When intelligent automation is on the table to support possible compliance activities, some important considerations must be made:

  • Compliance program goals for the future
  • Implementation dependencies and interdependencies
  • Determining how automation will and can support the business
  • Enhancing competitiveness and agility in executing its compliance activities

Automating compliance activities can also help improve resource allocation and achieve greater accuracy by handing repetitive tasks over to the automation.

3. Accountability

Regulators increasingly expect organizations to implement performance management and compensation programs that encourage prudent risk-taking. In fact, in the KPMG CCO Survey, 55% of CCOs identified “enhancing accountability and compliance responsibilities” as a top-three priority for 2017.

It is essential that disciplinary and incentive protocols be consistently applied to high-level employees. To do so sends a message that seniority and success do not exempt anyone from following the rules.

4. Formalized risk assessments

Regulatory guidelines and expectations released in 2017 set forth specific focal areas that compliance leaders should ensure are covered in their risk assessments.

  • Evaluating the data needs of the compliance program can help the organization migrate to a more data-driven metrics environment in a controlled way.
  • Availability, integrity, and accuracy of data is needed to understand and assess compliance risks enterprise-wide. The use of data quality assessments to evaluate the compliance impact can help address this challenge.
  • Implementing a data governance model to share data across the 3 lines of defense is a good way of reassuring data owners and stakeholders that the data will be used consistent with the agreed upon model.
  • Further integration and aggregation of data is needed to avoid unintentionally “underestimating” compliance risks because of continuous change in the measurement of compliance programs and data & analytics.
  • To maximize the benefits of data & analytics, leading organizations are building analytics directly into their compliance processes in order to identify risk scenarios in real time and to enhance their risk coverage in a cost-effective way; a minimal sketch follows this list.
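
To make the data-quality point concrete, here is a minimal sketch (illustrative fields, domains and data; not KPMG's methodology) of a data quality assessment that scores a compliance dataset on completeness and validity so the metrics can be tracked over time:

```python
# Score an illustrative KYC extract on two data-quality dimensions:
# completeness (non-null share per field) and validity (values within an
# allowed domain). Fields and domains are assumptions for demonstration.
import pandas as pd

kyc = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "country":     ["DE", "FR", None, "US"],
    "risk_rating": ["low", "medium", "high", "unknown"],
})

completeness = (1 - kyc.isna().mean()).rename(lambda c: f"{c}_complete")
validity = pd.Series({
    "risk_rating_valid": kyc["risk_rating"].isin(["low", "medium", "high"]).mean()
})

print(pd.concat([completeness, validity]).round(2))
```

Tracked period over period, scores like these support the move to the more data-driven metrics environment described above.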

5. Continuous improvement

Compliance efforts need to continuously evolve so that the control environment remains firm as risk trends shift, new risks emerge, and regulatory expectations change.

Compliance and business leaders must continuously improve their compliance activities in pursuit of greater effectiveness, efficiency, agility, and resiliency. By continuously improving, they can methodically position their organizations for the future.

Click here to access KPMG’s detailed White Paper

State of Digital Analytics: The Persistent Challenge of Data Access & Governance

Disjointed, inaccessible data is a major productivity inhibitor for analytics teams, diverting skilled resources from contributing to valuable business intelligence.

Analytics teams struggle with data access. In addition to listing data silos and data access among their top data and analytics challenges, nearly three in five said it takes days or weeks to access all the data needed for their work or the work of the teams they manage. Only a third were able to access all their data in a day or less.

Chart: Amount of time for analysts and analytics teams to access data

Nearly two in five analytics professionals are spending more than half of their work week on tasks unrelated to actual analysis. Forty-four percent of managers reported that more than half of their team’s work week is spent accessing, blending, and preparing data rather than analyzing it, while 31 percent of analysts said they spend more than half of their work week on data housekeeping.

Chart: Time spent prepping data, rather than analyzing it

As a result, the majority of analysts have found it necessary to learn programming languages specifically to help them access and/or prepare data for analysis. Outside of mandates from their employers, a full 70 percent of analysts reported taking it upon themselves to learn to code for this reason, and more than a quarter of those analysts have spent 80 or more hours learning to program.
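
The coding in question is typically modest data access and blending work. As a purely hypothetical sketch (illustrative sources and fields), an analyst might join web-analytics sessions to CRM revenue before any analysis can start:

```python
# Blend web-analytics sessions with CRM revenue on a shared customer key —
# typical of the access-and-preparation work described above. Illustrative.
import pandas as pd

sessions = pd.DataFrame({"customer_id": [1, 2, 3], "visits": [12, 3, 7]})
crm = pd.DataFrame({"customer_id": [1, 2, 4], "revenue": [540.0, 80.0, 220.0]})

blended = sessions.merge(crm, on="customer_id", how="outer").fillna(0)
print(blended)
```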

Chart: Analysts learning programming skills to overcome data issues

It should go without saying that data professionals tasked with analyzing organizational information meaningfully and actionably cannot adequately perform their core job function without accurate data. Yet in addition to raising the data access challenges above, the industry is also split in terms of confidence in data accuracy. Nearly half reported that they question the accuracy of the data they or the teams they manage use regularly, while a little more than half said they are confident about their data.

Click here to access TMMData’s detailed Survey Results

What’s now and next in Analytics, AI, and Automation

Over the past few years, rapid technological advances in digitization and data and analytics have been

  • reshaping the business landscape,
  • supercharging performance
  • and enabling the emergence of new business innovations, new forms of competition and business disruption.

Yet progress has been uneven. While many companies struggle to harness the power of these technologies, companies that are fully leveraging the capabilities are capturing disproportionate benefits, transforming their businesses and outpacing—and occasionally disrupting—the rest.

At the same time the technology itself continues to evolve rapidly, bringing new waves of advances in

  • robotics,
  • analytics,
  • and artificial intelligence (AI), especially machine learning.

Together they amount to a step change in technical capabilities that could have profound implications for business, for the economy, and more broadly for society as a whole. Machines today increasingly match or outperform human performance in a range of work activities, including ones that require cognitive capabilities, learning, making tacit judgments, sensing emotion, and even driving—activities that used to be considered safe from automation. Adoption of these technologies could bring significant new performance and transformational benefits to companies that go beyond simply substituting labor and lead to previously unimagined breakthrough performance and outcomes. Moreover, they have the potential to boost the productivity of the global economy at a time when it is sorely needed for growth and the share of the working-age population is declining.

Yet their advent raises difficult questions about how companies can best prepare for and harness these technologies, the skills and organizational reinvention that will be required to make the most of them, and how the leaders in the private and public sector as well as workers will adapt to the impact on jobs, capability-building and the nature of work itself.

Click here to access McKinsey Global Institute’s detailed Briefing Note on Automation