Data Search and Discovery in Insurance – An Overview of AI Capabilities

Historically, the insurance industry has collected vast amounts of data about its customers, claims, and related operations. This can be unstructured data in the form of PDFs, text documents, images, and videos, or structured data that has been organized for big data analytics.

As with other industries, the existence of such a trove of data in the insurance industry led many of the larger firms to adopt big data analytics techniques to find patterns in the data that might reveal insights and drive business value.

Any such big data applications may require several steps of data management, including collection, cleansing, consolidation, and storage. Insurance firms that have worked with some form of big data analytics in the past might have access to structured data which can be ingested by AI algorithms with little additional effort on the part of data scientists.

The insurance industry might be ripe for AI applications due to the availability of vast amounts of historical data records and the existence of large global companies with the resources to implement complex AI projects. The data being collected by these companies comes from several channels and in different formats, and AI search and discovery projects in the space require several initial steps to organize and manage data.

Radim Rehurek, who earned his PhD in Computer Science from Masaryk University in Brno and founded RARE Technologies, points out:

“A majority of the data that insurance firms collect is likely unstructured to some degree. This poses several challenges to insurance companies in terms of collecting and structuring data, which is key to the successful implementation of AI systems.”

Giacomo Domeniconi, a post-doctoral researcher at the IBM T.J. Watson Research Center and Adjunct Professor for the course “High-Performance Machine Learning” at New York University, mentions structuring the data as the largest challenge for businesses:

“Businesses need to structure their information and create labeled datasets, which can be used to train the AI system. Yet creating this labeled dataset might be very challenging, and in most cases it would involve manually labeling a part of the data using the expertise of a specialist in the domain.”

Businesses face many challenges in terms of collecting and structuring their data, which is key to the successful implementation of AI systems. An AI application is only as good as the data it consumes.

Natural language processing (NLP) and machine learning models often need to be trained on large volumes of data. Data scientists tweak these models to improve their accuracy.

This is a process that might last several months from start to finish, even in cases where the model is being taught relatively rudimentary tasks, such as identifying semantic trends in an insurance company’s internal documentation.
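
To make the scale of this effort concrete, the sketch below shows the core of such a training loop using scikit-learn. The claim notes, labels, and categories are invented for illustration; a real project would involve thousands of expert-labeled documents and many tuning iterations.

```python
# Minimal sketch of training a text classifier on expert-labeled documents.
# All data below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical labeled dataset: free-text claim notes plus expert-assigned labels.
documents = [
    "Rear-end collision on the highway, bumper and trunk damage",
    "Burst pipe flooded the kitchen and damaged the flooring",
    "Hail storm dented the car roof and cracked the windshield",
    "Storm water entered the basement and ruined stored furniture",
]
labels = ["motor", "property", "motor", "property"]

X_train, X_test, y_train, y_test = train_test_split(
    documents, labels, test_size=0.25, random_state=0
)

# TF-IDF features plus a linear classifier: a common first baseline for NLP.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Held-out accuracy is the signal data scientists watch while tweaking the model.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In practice, most of the calendar time goes into assembling and labeling the corpus rather than into the modeling code itself.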

Most AI systems require data to be input in a structured format. Businesses would need to collect, clean, and organize their data to meet this requirement.

Although creating NLP and machine learning models to solve real-world business problems is by itself a challenging task, this process cannot be started without a plan for organizing and structuring enough data for these models to operate at reasonable accuracy levels.

Large insurance firms might need to think about how their data at different physical locations across the world might be affected by local data regulations or by differences in legacy data storage systems at each location. Even with all the data made accessible, businesses would find that data might still need to be scrubbed to remove any incorrect, incomplete, improperly formatted, duplicate, or outlying records. Businesses would also find that in some cases regulations might mandate the signing of data sharing agreements between the involved parties, or that data might need to be moved to locations where it can be analyzed. Since the data is highly voluminous, moving it accurately can be a challenge in itself.
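
As a minimal sketch of what such scrubbing can look like in practice (the file name and column names below are assumptions chosen only to mirror the steps described above):

```python
# Minimal data-scrubbing sketch with pandas; file and column names are
# hypothetical.
import pandas as pd

df = pd.read_csv("claims_extract.csv")  # hypothetical consolidated extract

# Remove exact duplicates introduced when merging data from several locations.
df = df.drop_duplicates()

# Drop records that are incomplete in mandatory fields.
df = df.dropna(subset=["policy_id", "claim_amount"])

# Repair improperly formatted dates; unparseable values become NaT and are dropped.
df["claim_date"] = pd.to_datetime(df["claim_date"], errors="coerce")
df = df.dropna(subset=["claim_date"])

# Remove outlying claim amounts, here defined as more than three standard
# deviations from the mean.
mean, std = df["claim_amount"].mean(), df["claim_amount"].std()
df = df[(df["claim_amount"] - mean).abs() <= 3 * std]
```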


Click here to access Iron Mountain – Emerj’s White Paper

 

EIOPA reviews the use of Big Data Analytics in motor and health insurance

Data processing has historically been at the very core of the business of insurance undertakings, which is rooted strongly in data-led statistical analysis. Data has always been collected and processed to

  • inform underwriting decisions,
  • price policies,
  • settle claims,
  • and prevent fraud.

The sector has long pursued more granular datasets and predictive models, so the relevance of Big Data Analytics (BDA) to insurance comes as no surprise.

In view of this, and as a follow-up to the Joint Committee of the European Supervisory Authorities (ESAs) cross-sectorial report on the use of Big Data by financial institutions, the European Insurance and Occupational Pensions Authority (EIOPA) decided to launch a thematic review on the use of BDA specifically by insurance firms. The aim is to gather further empirical evidence on the benefits and risks arising from BDA. To keep the exercise proportionate, the focus was limited to motor and health insurance lines of business. The thematic review was officially launched during the summer of 2018.

A total of 222 insurance undertakings and intermediaries from 28 jurisdictions have participated in the thematic review. The input collected from insurance undertakings represents approximately 60% of the total gross written premiums (GWP) of the motor and health insurance lines of business in the respective national markets, and it includes input from both incumbents and start-ups. In addition, EIOPA has collected input from its Members and Observers, i.e. national competent authorities (NCAs) from the European Economic Area, and from two consumer associations.

The thematic review has revealed a strong trend towards increasingly data-driven business models throughout the insurance value chain in motor and health insurance:

  • Traditional data sources such as demographic data or exposure data are increasingly combined (not replaced) with new sources like online media data or telematics data, providing greater granularity and frequency of information about consumers’ characteristics, behaviour and lifestyles. This enables the development of increasingly tailored products and services and more accurate risk assessments.


  • The use of data outsourced from third-party data vendors, and of their corresponding algorithms used to calculate credit scores, driving scores, claims scores, etc., is relatively widespread, and this information can be used in technical models.


  • BDA enables the development of new rating factors, leading to a larger number of smaller risk pools. Most rating factors have a causal link to risk, while others are perceived as proxies for other risk factors or for wealth / price elasticity of demand.
  • BDA tools such as artificial intelligence (AI) and machine learning (ML) are already actively used by 31% of firms, and another 24% are at a proof-of-concept stage. Models based on these tools are often correlational rather than causative, and they are primarily used in pricing and underwriting and in claims management.


  • Cloud computing services, which reportedly represent a key enabler of agility and data analytics, are already used by 33% of insurance firms, with a further 32% saying they will be moving to the cloud over the next 3 years. Data security and consumer protection are key concerns of this outsourcing activity.
  • Uptake of usage-based insurance products will gradually continue in the coming years, influenced by developments such as increasingly connected cars, health wearables, or the introduction of 5G mobile technology. Robo-advisors and especially chatbots are also gaining momentum within consumer product and service journeys.


  • There is no evidence as yet that an increasing granularity of risk assessments is causing exclusion issues for high-risk consumers, although firms expect the impact of BDA to increase in the years to come.

In view of the evidence gathered from the different stakeholders, EIOPA considers that there are many opportunities arising from BDA, both for the insurance industry and for consumers. However, although insurance firms generally already have in place, or are developing, sound data governance arrangements, there are also risks arising from BDA that need to be further addressed in practice. Some of these risks are not new, but their significance is amplified in the context of BDA. This is particularly the case regarding ethical issues with the fairness of the use of BDA, as well as regarding the

  • accuracy,
  • transparency,
  • auditability,
  • and explainability

of certain BDA tools such as AI and ML.

Going forward, in 2019 EIOPA’s InsurTech Task Force will conduct further work in these two key areas in collaboration with the industry, academia, consumer associations and other relevant stakeholders. The work being developed by the Joint Committee of the ESAs on AI, as well as in other international fora, will also be taken into account. EIOPA will also explore third-party data vendor issues, including transparency in the use of rating factors in the context of the EU-US insurance dialogue. Furthermore, EIOPA will develop guidelines on the use of cloud computing by insurance firms and will start a new workstream assessing new business models and ecosystems arising from InsurTech. EIOPA will also continue its ongoing work in the area of cyber insurance and cyber security risks.

Click here to access EIOPA’s detailed Big Data Report

The Innovation Game – How Data is Driving Digital Transformation

Technology waits for no one. And those who strike first will have an advantage. The steady decline in business profitability across multiple industries threatens to erode future investment, innovation and shareholder value. Fortunately, the emergence of artificial intelligence (AI) can help kick-start profitability. Accenture research shows that AI has the potential to boost rates of profitability by an average of 38 percent and to generate an economic boost of US$14 trillion across 16 industries in 12 economies by 2035.

Driven by these economic forces, the age of digital transformation is in full swing. Today we can’t be “digital to the core” if we don’t leverage all new data sources – unstructured data, dark data, and third-party sources. Similarly, we have to take advantage of the convergence of AI and analytics to uncover previously hidden insights. But, with the increasing use of AI, we also have to be responsible and take into account the social implications.

Finding answers to the biggest questions starts with data, and with ensuring you are capitalizing on the vast data sources available within your own business. Thanks to the power of AI/machine learning and advanced algorithms, we have moved from the era of big data to the era of ALL data, and that is helping clients create a more holistic view of their customers and greater operational efficiency.

Embracing the convergence of AI and analytics is crucial to success in our digital transformation. Together, AI-powered analytics

  • unlock tremendous value from data that was previously hidden or unreachable,
  • change the way we interact with people and technology,
  • and improve the way we make decisions, giving way to new agility and opportunities.

While businesses are still in the infancy of tapping into the vast potential of these combined technologies, now is the time to accelerate. But to thrive, we need to be pragmatic in finding the right skills and partners to guide our strategy.

Finally, whenever we envision the possibilities of AI, we should consider the responsibility that comes with it. Trust in the digital era, or “responsible AI”, cannot be overlooked. Explainable AI and AI transparency are critical, particularly in such areas as

  • financial services,
  • healthcare,
  • and life sciences.

The new imperative of our digital transformation is to balance intelligent technology and human ingenuity to innovate every facet of business and become a smarter enterprise.

The exponential growth of data underlying the strategic imperative of enterprise digital transformation has created new business opportunities along with tremendous challenges. Today, we see organizations of all shapes and sizes embarking on digital transformation. As uncovered in Corinium Digital’s research, the primary drivers of digital transformation are rising customer expectations and the push for more efficient internal processes.

Data is at the heart of this transformation and provides the fuel to generate meaningful insights. We have reached the tipping point where all businesses recognize they cannot compete in a digital age using analog-era legacy solutions and architectures. The winners in the next phase of business will be those enterprises that obtain a clear handle on the foundations of modern data management, specifically the nexus of

  • data quality,
  • cloud,
  • and artificial intelligence (AI).

While most enterprises have invested in on-premises data warehouses as the backbone of their analytic data management practices, many are shifting their new workloads to the cloud. The proliferation of new data types and sources is accelerating the development of data lakes, with aspirations of gaining integrated analytics that can accelerate new business opportunities. We found in the research that over 60% of global enterprises are now investing in a hybrid, multi-cloud strategy that combines data from cloud environments such as Microsoft Azure with existing on-premises infrastructures. Hence, this hybrid, multi-cloud strategy will need to correlate with their investments in data analytics, and it will become imperative to manage data seamlessly across all platforms. At Paxata, our mission is to give everyone the power to intelligently profile and transform data into consumable information at the speed of thought: to empower everyone, not just technical users, to prepare their data and make it ready for analytics and decision making.

The first step in making this transition is to eliminate the bottlenecks of traditional IT-led data management practices through AI-powered automation.

Second, you need to apply modern data preparation and data quality principles and technology platforms to support both analytical and operational use cases.

Third, you need a technology infrastructure that embraces the hybrid, multi-cloud world. Paxata sits right at the center of this new shift, helping enterprises profile and transform complex data types in high-variety, high-volume environments. As such, we’re excited about partnering with Accenture and Microsoft to accelerate businesses with our ability to deliver modern analytical and operational platforms that address today’s digital transformation requirements.
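
As an illustration of the profiling step (a generic sketch, not Paxata’s actual product API), the function below computes per-column quality metrics on a hypothetical extract, which is often the first automated pass in a data preparation workflow:

```python
# Generic data-profiling sketch; the input file is a hypothetical extract.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column quality metrics that reveal where preparation work is needed."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),   # inferred type per column
        "null_rate": df.isna().mean(),    # completeness
        "distinct_values": df.nunique(),  # cardinality
    })

df = pd.read_csv("customer_extract.csv")  # hypothetical file
print(profile(df))
```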

Artificial intelligence is causing two major revolutions simultaneously among developers and enterprises. These revolutions will drive the technology decisions for the next decade. Developers are massively embracing AI. As a platform company, Microsoft is focused on enabling developers to make the shift to the next app development pattern, driven by the intelligent cloud and intelligent edge.

AI is the runtime that will power the apps of the future. At the same time, enterprises are eager to adopt and integrate AI. Cloud and AI are the most requested topics in Microsoft Executive Briefing Centers. AI is changing how companies serve their customers, run their operations, and innovate.

Ultimately, every business process in every industry will be redefined in profound ways. If it used to be true that “software was eating the world,” it is now true to say that “AI is eating software”. A new competitive differentiator is emerging: how well an enterprise exploits AI to reinvent and accelerate its processes, value chain and business models. Enterprises need a strategic partner who can help them transform their organization with AI. Microsoft is emerging as a solid AI leader as it is in a unique position to address both revolutions. Our strength and differentiation lie in the combination of multiple assets:

  • Azure AI services that bring AI to every developer. Over one million developers are accessing our pre-built and customizable AI services. We have the most comprehensive solution for building bots, combined with a powerful platform for custom AI development with Azure Machine Learning that spans the entire AI development lifecycle, and a market-leading portfolio of pre-built cognitive services that can be readily attached to applications.
  • A unique cloud infrastructure, including CPU, GPU, and soon FPGA, that makes Azure the most reliable, scalable, and fastest cloud for running AI workloads.
  • Unparalleled tools. Visual Studio, used by over 6 million developers, is the most preferred tool in the world for application development. Visual Studio and Visual Studio Code are powerful “front doors” through which to attract developers seeking to add AI to their applications.
  • Ability to add AI to the edge. We enable developers, through our tools and services, to develop an AI model and deploy that model anywhere. Through our support for ONNX – the open-source representation for AI models, developed in partnership with Facebook, Amazon, IBM and others – as well as for generic containers, we allow developers to run their models on the IoT edge and leverage the entire IoT solution from Azure (see the export sketch after this list).
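
As a minimal illustration of the develop-once, deploy-anywhere idea, the sketch below exports a toy PyTorch model to the ONNX format; the model, shapes, and file name are placeholders rather than part of any specific Azure workflow:

```python
# Minimal sketch: export a toy PyTorch model to ONNX so it can run on other
# runtimes (for example, ONNX Runtime on an edge device).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
model.eval()

dummy_input = torch.randn(1, 10)  # example input that fixes the expected shape
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",                 # placeholder output path
    input_names=["features"],
    output_names=["logits"],
)
```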

But the competition to win enterprises is not played only on the platform battlefield: enterprises are demanding solutions. Microsoft AI solutions provide turnkey implementations for customers who want to transform their core processes with AI. Our unique combination of IP and consulting services addresses common scenarios such as business agents, sales intelligence or marketing intelligence. As our solutions are built on top of our compelling AI platform, unlike our competitors’, our customers are not locked in to any one consulting provider; they remain in full control of their data and can extend the scenarios or target new scenarios themselves or through our rich partner ecosystem.


Click here to access Corinium’s White Paper

Technology Driven Value Generation in Insurance

The evolution of financial technology (FinTech) is reshaping the broader financial services industry. Technology is now disrupting the traditionally more conservative insurance industry, as the rise of InsurTech revolutionises how we think about insurance distribution.

Moreover, insurance companies are improving their operating models, upgrading their propositions, and developing innovative new products to reshape the insurance industry as a whole.

Five key technologies are driving the change today:

  1. Cloud computing
  2. The Internet of Things (including telematics)
  3. Big data
  4. Artificial intelligence
  5. Blockchain

This report examines these technologies’ potential to create value in the insurance industry. It also examines how technology providers could create new income streams and take advantage of economies of scale by offering their technological backbones to participants in the insurance industry and beyond.

Cloud computing refers to storing, managing, and processing data via a network of remote servers, instead of locally on a server or personal computer. Key enablers of cloud computing include the availability of high-capacity networks and service-oriented architecture. The three core characteristics of a cloud service are:

  • Virtualisation: The service is based on hardware that has been virtualised
  • Scalability: The service can scale on demand, with additional capacity brought online within minutes
  • Demand-driven: The client pays for the services as and when they are needed


Telematics is the most common insurance application of the broader Internet of Things (IoT). The IoT refers to the combination of physical devices, vehicles, buildings and other items embedded with electronics, software, sensors, actuators, and network connectivity that enable these physical objects to collect and exchange data.

The IoT has evolved from the convergence of

  • wireless technologies,
  • micro-electromechanical systems,
  • and the Internet.

This convergence has helped remove the walls between operational technology and information technology, allowing unstructured, machine-generated data to be analysed for insights that will drive improvements.


Big data refers to data sets that are so large or complex that traditional data processing application software is insufficient to deal with them. One common definition refers to the “five Vs” as the key challenges for big data in insurance:

  • Volume: As sensors cost less, the amount of information gathered will soon be measured
    in exabytes
  • Velocity: The speed at which data is collected, analysed, and presented to users
  • Variety: Data can take many forms, such as structured, unstructured, text or multimedia. It can come from internal and external systems and sources, including a variety
    of devices
  • Value: Information provided by data about aspects of the insurance business, such as customers and risks
  • Veracity: The need for insurance companies to ensure the accuracy of the large volumes of data they hold

Modern analytical methods are required to process these sets of information. The term “big data” has evolved to describe the quantity of information analysed to create better outcomes, business improvements, and opportunities that leverage all available data. As a result, big data is not limited to the challenges thrown up by the five Vs. Today there are two key aspects to big data:

  1. Data: This is more widely available than ever because of the use of apps, social media, and the Internet of Things
  2. Analytics: Advanced analytic tools mean there are fewer restrictions to working with big data

BigData

The understanding of artificial intelligence (AI) has evolved over time. In the beginning, AI was perceived as machines mimicking the cognitive functions that humans associate with other human minds, such as learning and problem solving. Today, the term rather refers to the ability of machines to mimic human activity in a broad range of circumstances. In a nutshell, artificial intelligence is the broader concept of machines being able to carry out tasks in a way that we would consider smart or human.

Therefore, AI combines the reasoning already provided by big data capabilities such as machine learning with two additional capabilities:

  1. Imitation of human cognitive functions beyond simple reasoning, such as natural language processing and emotion sensing
  2. Orchestration of these cognitive components with data and reasoning

A third layer is the pre-packaging of generic orchestration capabilities for specific applications. The most prominent such applications today are bots. At a minimum, bots orchestrate natural language processing, linguistic technology, and machine learning to create systems that mimic interactions with human beings in certain domains. This is done in such a way that the customer does not realise that the counterpart is not human.
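
At toy scale, that orchestration can be sketched as below: a message is routed through intent detection and then to a canned response. The intents, phrases, and string-similarity matching are illustrative stand-ins for the trained NLP components a production bot would use.

```python
# Toy bot sketch: intent detection plus canned responses. String similarity
# stands in for the NLP / ML components a real bot would orchestrate.
from difflib import SequenceMatcher

INTENTS = {
    "file_claim": ["I want to file a claim", "report an accident"],
    "get_quote": ["how much is car insurance", "I need a quote"],
}
RESPONSES = {
    "file_claim": "I can help you file a claim. What happened, and when?",
    "get_quote": "Happy to help with a quote. What vehicle do you drive?",
}

def detect_intent(message: str) -> str:
    """Return the intent whose example phrases best match the message."""
    best_intent, best_score = "file_claim", 0.0
    for intent, phrases in INTENTS.items():
        for phrase in phrases:
            score = SequenceMatcher(None, message.lower(), phrase.lower()).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

print(RESPONSES[detect_intent("I'd like to report an accident")])
```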

Blockchain is a distributed ledger technology used to store static records and dynamic transaction data distributed across a network of synchronised, replicated databases. It establishes trust between parties without the use of a central intermediary, removing frictional costs and inefficiency.

From a technical perspective, blockchain is a distributed database that maintains a continuously growing list of ordered records called blocks. Each block contains a timestamp and a link to a previous block. Blockchains have been designed to make it inherently difficult to modify their data: once recorded, the data in a block cannot be altered retroactively. In addition to recording transactions, blockchains can also contain a coded set of instructions that will self-execute under a pre-specified set of conditions. These automated workflows, known as smart contracts, create trust between a set of parties, as they rely on pre-agreed data sources and require no third party to execute them.
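
The hash-linking mechanism behind this immutability can be sketched in a few lines of standard-library Python; the block contents are illustrative:

```python
# Minimal sketch of hash-linked blocks: each block stores a timestamp, its
# data, and the hash of the previous block, so altering any past record
# invalidates every later link.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    payload = json.dumps(
        {k: block[k] for k in ("timestamp", "data", "prev_hash")}, sort_keys=True
    ).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: dict, prev_hash: str) -> dict:
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

genesis = make_block({"note": "genesis"}, prev_hash="0" * 64)
block_1 = make_block({"claim_id": 42, "amount": 1500}, prev_hash=genesis["hash"])

# Tampering with the genesis data breaks the link stored in block_1.
genesis["data"]["note"] = "tampered"
print(block_hash(genesis) == block_1["prev_hash"])  # False
```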

Blockchain technology in its purest form has four key characteristics:

  1. Decentralisation: No single individual participant can control the ledger. The ledger
    lives on all computers in the network
  2. Transparency: Information can be viewed by all participants on the network, not just
    those involved in the transaction
  3. Immutability: Modifying a past record would require simultaneously modifying every
    other block in the chain, making the ledger virtually incorruptible
  4. Singularity: The blockchain provides a single version of a state of affairs, which is
    updated simultaneously across the network


Oliver Wyman, ZhongAn Insurance and ZhongAn Technology – a wholly owned subsidiary of ZhongAn Insurance, China’s first online-only insurer – are jointly publishing this report to analyse the insurance technology market and answer the following questions:

  • Which technologies are shaping the future of the insurance industry? (Chapter 2)
  • What are the applications of these technologies in the insurance industry? (Chapter 3)
  • What is the potential value these applications could generate? (Chapter 3)
  • How can an insurer with strong technology capabilities monetise its technologies?
    (Chapter 4)
  • Who is benefiting from the value generated by these applications? (Chapter 5)

 

Click here to access Oliver Wyman’s detailed report