
Quick claims

Could artificial intelligence replace insurance claims adjusters?


LaymanZoom/iStock


Your insurance agent may not have been replaced by a computer, but artificial intelligence (AI) may soon take over one of the most critical jobs in the insurance business: accident and disaster appraisal.

Tractable, a London-based startup, is using machine learning to train an AI system to conduct visual damage assessment with the goal of speeding up insurance payouts and access to governmental disaster relief funds.

“Our belief is that when accidents and disasters hit, the response could be 10 times faster thanks to AI,” Alexandre Dalyac, Tractable CEO and co-founder, told TechCrunch. “Everything from road accidents, burst piping to large-scale floods and hurricanes.”

The traditional process of releasing claims funds after an automobile accident, for example, begins with a visual damage appraisal by an experienced claims adjuster, a process that can take days or weeks. Tractable claims its AI, which was trained on millions of images of vehicle damage, can do the same job in minutes: policyholders send photos of the damage to their insurance company, which then runs Tractable’s program to estimate the repair cost almost instantly.
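Tractable has not published how its system is built, but the workflow the company describes (photos go in, a repair estimate comes out) can be sketched in a few lines of Python. Everything below is illustrative: the classify_damage stub stands in for a trained vision model, and the parts-cost table is invented for the example.

# Illustrative sketch only: Tractable's actual model and pricing data are proprietary.
# classify_damage stands in for a vision model trained on millions of damage photos;
# the cost table is invented for this example.

from pathlib import Path

PART_REPAIR_COSTS = {          # hypothetical average repair costs (USD)
    "front_bumper": 450.0,
    "hood": 600.0,
    "windshield": 350.0,
}

def classify_damage(photo: Path) -> list[str]:
    """Placeholder for a trained model that maps a photo to damaged parts."""
    return ["front_bumper", "hood"]

def estimate_claim(photos: list[Path]) -> float:
    """Combine per-photo predictions into a single repair estimate."""
    damaged_parts = set()
    for photo in photos:
        damaged_parts.update(classify_damage(photo))
    return sum(PART_REPAIR_COSTS.get(part, 0.0) for part in damaged_parts)

if __name__ == "__main__":
    print(f"Estimated repair cost: ${estimate_claim([Path('crash_front.jpg')]):,.2f}")

A real deployment would presumably also grade the severity of each part’s damage and route ambiguous cases back to a human adjuster.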

Dalyac conceded that it’s a difficult machine learning problem to correlate exterior photos with internal damage, something an experienced human adjuster can do easily.

“Our AI has already been trained on tens of millions of these cases, so that’s a perfect case of us already having distilled thousands of people’s work experience,” he said. “That allows us to get hold of some very challenging correlations that humans just can’t do.”

University Medical Center Göttingen

Bright noise

Flashes of light could be the key to future high-resolution hearing aids. German scientists have successfully restored hearing in gerbils using a technique called optogenetics, in which auditory nerves are genetically modified so they respond to light.

According to IEEE Spectrum, the researchers injected normal gerbils’ cochleae with a virus that changed the genetic code of their auditory nerves so that they became sensitive to light. They then implanted optical fibers to deliver the light to those neurons. In subsequent tests, the scientists trained the gerbils to jump to another compartment after optical stimulation of the ear. When they played an audible alarm (without the light flashes), the gerbils still jumped, indicating the animals’ brains had interpreted the sound and light cues similarly. The optical cues also worked with deaf gerbils.

The German researchers are attempting to improve on current cochlear implant technology, in which sounds are converted to electrical impulses delivered to the inner ear via electrodes.
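The electrical approach Moser’s team wants to improve on works, at a high level, like a bank of band-pass filters: incoming sound is split into a handful of frequency channels, and the energy in each channel drives one electrode, so frequency resolution is limited by the channel count. The NumPy/SciPy sketch below illustrates that filterbank-and-envelope idea in miniature; the eight channels and band edges are arbitrary examples, not any manufacturer’s specification.

# Minimal illustration of splitting sound into a few frequency channels,
# one per electrode. The 8-channel count and band edges are arbitrary.

import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 16_000                               # sample rate (Hz)
BAND_EDGES = np.geomspace(200, 7000, 9)   # 8 log-spaced channels

def channel_envelopes(audio: np.ndarray) -> np.ndarray:
    """Band-pass the signal per channel and extract the slow amplitude envelope."""
    envelopes = []
    for lo, hi in zip(BAND_EDGES[:-1], BAND_EDGES[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
        band = sosfiltfilt(sos, audio)
        envelopes.append(np.abs(hilbert(band)))   # envelope would drive one electrode
    return np.array(envelopes)

if __name__ == "__main__":
    t = np.arange(FS) / FS
    tone = np.sin(2 * np.pi * 1000 * t)           # 1 kHz test tone
    env = channel_envelopes(tone)
    print("Channel with most energy:", int(env.mean(axis=1).argmax()))

Because neighboring pitches fall into the same coarse channel, listeners get limited spectral detail, which is the complaint Moser cites.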

“The chief complaint of people who depend on cochlear implants is that it’s hard to understand speech in noise,” lead researcher Tobias Moser of the University Medical Center Göttingen told IEEE Spectrum. “Optical stimulation may be the breakthrough to increase frequency resolution.”

The research is in early stages, so it’s not yet clear what sort of sound the gerbils perceived. —M.C.

Abhishek Singh Handout

Smarter with signs

Software developer Abhishek Singh has built a computer application that translates sign language into speech that Alexa, Amazon’s popular voice assistant, can understand.

Singh’s program uses the computer’s camera to capture sign language gestures and translate them into words spoken by a computer-generated voice. A nearby Amazon Echo device with the Alexa voice assistant hears the commands, and the program translates Alexa’s audible response into visible text on the screen.
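Singh has not published the exact code behind the demo, but the pipeline described above (camera to gesture classifier, classifier to synthesized speech, Alexa’s spoken reply back to on-screen text) can be sketched as a simple loop. Every function here is a hypothetical stand-in, not Singh’s implementation.

# Rough sketch of the pipeline described above. Each function is a placeholder:
# Singh's actual gesture model, text-to-speech engine, and speech recognizer
# are not published with this article.

def recognize_sign(frame) -> str:
    """Placeholder for a camera-based sign-language gesture classifier."""
    return "Alexa, what is the weather?"        # canned example from the video

def speak(text: str) -> None:
    """Placeholder for text-to-speech; a nearby Echo would hear this audio."""
    print(f"[speaker] {text}")

def listen_for_reply() -> str:
    """Placeholder for speech-to-text applied to Alexa's spoken response."""
    return "Right now in New York it's 29 degrees Celsius with partly sunny skies."

def main() -> None:
    frame = None                                # stand-in for a webcam frame
    command = recognize_sign(frame)
    if command:
        speak(command)                          # the Echo hears the generated voice
        print(f"[screen] {listen_for_reply()}") # Alexa's reply shown as text

if __name__ == "__main__":
    main()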

In a recent YouTube video, Singh signed the phrase, “Alexa, what is the weather?” Alexa’s instant reply: “Right now in New York it’s 29 degrees Celsius with partly sunny skies.” —M.C.


Michael Cochrane
Michael is a World Journalism Institute graduate and a former WORLD correspondent.
