October 6, 2021

Can We Trust Insurance Companies With Big Data?

BY: TARTLE

Insurance. No one likes it. No one really wants it. We definitely hate paying for it. And why wouldn’t we? Insurance companies are notorious for not wanting to pay out any money on a claim and sometimes dropping people if they do successfully collect on one. After all, insurance companies aren’t really about protecting you, they’re about making money. As the sniveling weasel in The Incredibles put it, “What about our shareholders? Who’s looking out for them, huh?” 

As one would expect, insurance companies are always looking to cut their costs. For that, they have turned to data collection and analysis. TARTLE is of course big on data and what we can learn from it. However, we are not fans of the way insurance companies, and pretty much everyone else, tend to make use of third party data. Not only is the sourcing of that data unethical in itself, it can also wind up being discriminatory. Not intentionally; rather, assumptions sometimes get written into the algorithms that analyze the data. Those assumptions may seem like no big deal at first, but they can end up excluding far more people than intended, people who seem to fit a given profile but differ in important ways the algorithm was never designed to look for. That's one of the dangers of completely automating everything. When an AI is running the show, it doesn't second-guess any programmed biases; it just does what it is told, and does it completely ruthlessly. That is why Connecticut recently reminded insurers in the state that they need to be careful to avoid any sort of discrimination in their use of data. Easier said than done.

To illustrate that, let’s say an insurance company offered a discount to anyone who linked a Whoop or a Fitbit to their insurance account. That might seem innocuous. Certainly, they are sourcing the data in a better-than-normal way, since people have to opt in to share it. However, those things on your wrist cost money that not everyone can spare. A Whoop subscription alone runs around $30 a month. How many people are going to pay that just so they can opt into a discount program? Not many, especially since the discount will probably not even cover the cost of the subscription.
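
To put rough numbers on that last point, here is a quick back-of-the-envelope sketch. The premium and discount figures are hypothetical, chosen purely for illustration; only the $30 subscription comes from above.

```python
# Back-of-the-envelope: does a wearable discount cover the subscription?
# All figures except the $30 Whoop subscription are hypothetical illustrations.

WHOOP_MONTHLY = 30            # approximate Whoop subscription, USD/month
ANNUAL_PREMIUM = 1_500        # hypothetical annual premium, USD
DISCOUNT_RATE = 0.05          # hypothetical 5% wearable discount

subscription_cost = WHOOP_MONTHLY * 12            # $360/year
discount_value = ANNUAL_PREMIUM * DISCOUNT_RATE   # $75/year

print(f"Subscription: ${subscription_cost}/yr, discount: ${discount_value:.0f}/yr")
print(f"Net cost of opting in: ${subscription_cost - discount_value:.0f}/yr")
# Under these assumptions the customer pays $285/year net to qualify,
# so only those with disposable income are likely to opt in.
```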

On one hand, it seems perfectly reasonable to grant a discount to people who are willing to share more of their health data. Why wouldn’t an insurance company want to incentivize that behavior? Of course they would. On the other hand, not everyone can afford it, as stated above. Which makes this a case of exclusion based on economics. Intentional? Probably not. Not too many people actually wake up in the morning and ask themselves how they can screw over poor people today. Not even people working for an insurance company. 
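
One way to catch this kind of unintentional, economics-based exclusion is to audit opt-in rates across groups before shipping the program. Below is a minimal sketch; the income brackets, the rates, and the four-fifths (0.8) threshold, a common rule of thumb for adverse impact, are all illustrative assumptions rather than anything insurers are required to use.

```python
# Minimal opt-in audit for a wearable-discount program.
# Brackets, rates, and the 0.8 "four-fifths" threshold are illustrative.

optin_rates = {
    "high_income": 0.42,   # fraction of each bracket that linked a wearable
    "mid_income":  0.18,
    "low_income":  0.05,
}

best = max(optin_rates.values())
for group, rate in optin_rates.items():
    ratio = rate / best
    flag = "OK" if ratio >= 0.8 else "possible adverse impact"
    print(f"{group:12s} opt-in {rate:.0%}  ratio vs best {ratio:.2f}  {flag}")
# low_income opts in at ~12% of the high_income rate: the "neutral"
# discount rule is effectively excluding customers by economics.
```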

So, what is the solution? How can an insurance company reward customers for sharing their health data without excluding those who can’t afford the necessary devices? TARTLE has exactly the right solution. We offer these companies the chance to reach out directly to their customers. The company can ask its customers on TARTLE to share whatever data it would like, and when someone chooses to do so, the company simply pays that person for the data. That is something virtually anyone can take advantage of. Yes, there are people who can’t afford any sort of device to work with TARTLE on, but if we are being honest, they likely don’t have insurance anyway. The solutions to that problem are on a whole other level (though there are ways other organizations can use us to tackle that one too). What we offer is the chance for insurance and other companies to interact directly with their customers to get the information they need, and for those people to be paid for providing it. It’s a win/win for everyone willing to take advantage of it.
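
In rough terms, the exchange described above looks like the sketch below. Every name and type here is a hypothetical illustration of the flow, not TARTLE’s actual API.

```python
# Hypothetical sketch of the direct, consent-based exchange described above.
# None of these names come from TARTLE's real API; they only illustrate the flow.

from dataclasses import dataclass

@dataclass
class DataRequest:
    buyer: str             # e.g., an insurance company
    fields: list           # what it wants to know
    price_usd: float       # what it offers per respondent

def fulfill(request: DataRequest, seller: str, answers: dict) -> float:
    """Seller opts in, shares first-party answers, and is paid directly."""
    missing = [f for f in request.fields if f not in answers]
    if missing:
        raise ValueError(f"unanswered fields: {missing}")
    print(f"{request.buyer} pays {seller} ${request.price_usd:.2f} for {list(answers)}")
    return request.price_usd

req = DataRequest(buyer="AcmeInsure",
                  fields=["smoker", "workouts_per_week"],
                  price_usd=5.00)
fulfill(req, seller="customer_123",
        answers={"smoker": "no", "workouts_per_week": "4"})
```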

What’s your data worth? Sign up for the TARTLE Marketplace through this link here.


For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Alexander McCaig (00:08):

Okay. We are back again, live on other streams too. Unfortunately for you all, you have to deal with us. Can't get us out of your hair.

Jason Rigby (00:16):

World domination.

Alexander McCaig (00:18):

Yeah.

Jason Rigby (00:19):

On social media.

Alexander McCaig (00:21):

What did they have? They had InfoWars, right?

Jason Rigby (00:24):

Yeah.

Alexander McCaig (00:25):

Data wars.

Jason Rigby (00:26):

Oh, there went this video.

Alexander McCaig (00:27):

Yes.

Jason Rigby (00:29):

Oh, here we go. We are against InfoWars. We do not condone anything that InfoWars is a part of.

Alexander McCaig (00:35):

No. I have no ...

Jason Rigby (00:37):

Okay. There we go.

Alexander McCaig (00:37):

No interest right now. Thank you.

Jason Rigby (00:38):

Wait, now we just saved the video.

Alexander McCaig (00:40):

All right. That cleaned that up. Connecticut reminds insurers to avoid discriminatory practices with big data.

Jason Rigby (00:47):

So don't be gone. Every time we talk about insurance, people just leave the episode.

Alexander McCaig (00:53):

Insurance is so funny, right? There was a time I used to sell insurance, and ...

Jason Rigby (00:58):

You look like that guy. Did you wear a suit?

Alexander McCaig (01:01):

No. No, I wouldn't do that.

Alexander McCaig (01:04):

I sold insurance in some pretty rough areas.

Jason Rigby (01:08):

Oh wow. That's tough.

Alexander McCaig (01:10):

I'll tell you right now, people would rather pay their Xfinity or cell phone bill than buy an insurance policy.

Jason Rigby (01:16):

Yeah.

Alexander McCaig (01:17):

Priorities are backwards. And on top of that, everybody needs it. Nobody wants to pay for it. And every time you do pay for it and you issue a claim, it's a hassle. So everyone's like, "F this scam." That's what they feel like. But that's my side point. The thing I want to focus on here is that Connecticut, the government had to remind people to not discriminate.

Jason Rigby (01:53):

So let's get into this.

Alexander McCaig (01:54):

Okay.

Jason Rigby (01:55):

Whether you're an insurance company or not, we're not just complaining, because this is corporate transparency, part of our big seven. Why does it behoove a company, when you're looking at big data and AI, to discriminate?

Alexander McCaig (02:10):

If you discriminate, then you can control things better. But if you let everybody in, you got to deal with the free flow. And free flow for a lot of people is way too much risk.

Jason Rigby (02:20):

So and insurance companies are all about what?

Alexander McCaig (02:22):

They're about decreasing risks.

Jason Rigby (02:25):

Yes.

Alexander McCaig (02:25):

So of course, they're going to use more big data to isolate on a very specific subset of their markets that they want to speak to.

Jason Rigby (02:32):

It's like, if I can find Whoop owners and Fitbit owners, and then I know their heart rate variability is this number and they're working out five times a week ...

Alexander McCaig (02:42):

We're only going to get policies-

Jason Rigby (02:45):

Low risk.

Alexander McCaig (02:46):

Yeah, it's low risk for us. Here's an interesting point. The availability of data increases. Some people choose to participate in the sharing of that information. The others that want to participate in the receiving of that shared information can make a better decision. So let me ask you something. Is it discrimination that you give someone a better price because ....

Jason Rigby (03:20):

Of life choices?

Alexander McCaig (03:22):

Of life choices, because they have essentially more data to share that decreases their risk profile because they opt in to sharing that information.

Jason Rigby (03:31):

As long as they know that.

Alexander McCaig (03:32):

Here's the other problem. Here's the other half of that coin. For insurance, well then, think about the Whoop. How many people can afford a $30 a month subscription?

Jason Rigby (03:43):

Yeah.

Alexander McCaig (03:43):

Just on a band that goes on your arm. Not many. So then you're only going to give better insurance rates to those that sit in a certain demographic profile because they're sharing more data. It's a problem. So what you really need to look at is insurance companies, if they want to avoid discriminatory practices ...

Jason Rigby (04:05):

They need to sign up for TARTLE.

Alexander McCaig (04:07):

They need to sign up for TARTLE because everyone can participate in the sharing.

Jason Rigby (04:13):

Mm-hmm (affirmative). But you can also go directly to them and ask them. If you want to know what their life choices are, pay them for that data.

Alexander McCaig (04:20):

You will be paying for-

Jason Rigby (04:20):

Then it's ethical.

Alexander McCaig (04:22):

Yeah, of course, it's ethical.

Jason Rigby (04:23):

How simple is that?

Alexander McCaig (04:24):

Well, that was easy to source that.

Jason Rigby (04:25):

You just solved the whole insurance issues.

Alexander McCaig (04:27):

Stop discriminating. Just buy it from everybody. And all they want to do is standardize a curve. The best way to standardize a curve is to have a very diverse population.

Jason Rigby (04:36):

The best way to have the right type of data instead of these insurance companies trying to figure out all this third-party data is go directly to the person and ask them if they smoke, ask them how many times they work out a day. You can ask them that. And if a person in their freewill chooses to give you that information, then you pay them for that. Now, guess what you have?

Alexander McCaig (04:54):

Do you get road rage often?

Jason Rigby (04:56):

Oh.

Alexander McCaig (04:56):

Have you felt suicidal-

Jason Rigby (04:57):

I'm feeling road rage right now-

Alexander McCaig (04:58):

Have you felt suicidal in the past week-

Jason Rigby (04:59):

Big data. I don't like ... You know why. Okay, let's talk about this.

Alexander McCaig (05:04):

Okay.

Jason Rigby (05:04):

Because now it's got me pissed off.

Alexander McCaig (05:05):

I'm sorry.

Jason Rigby (05:06):

Discrimination, this ... And you know as well as I this, and how passionate we are about this. It does not fucking matter where you live, who you are.

Alexander McCaig (05:18):

Nope.

Jason Rigby (05:20):

The decisions that you make for you as a human.

Alexander McCaig (05:22):

Yep.

Jason Rigby (05:23):

You are valued 100%. On a needle from zero to 100%, at TARTLE every single person-

Alexander McCaig (05:31):

Pegged out.

Jason Rigby (05:31):

Is pegged out no matter what country you live in.

Alexander McCaig (05:34):

You want to know why? Because they're a human being.

Jason Rigby (05:37):

Yes.

Alexander McCaig (05:39):

They have ...

Jason Rigby (05:39):

Discrimination is anti-human.

Alexander McCaig (05:41):

It's anti-human.

Jason Rigby (05:43):

It's anti-

Alexander McCaig (05:43):

It's anti-life-

Jason Rigby (05:45):

... evolves. Yeah, anti-life.

Alexander McCaig (05:47):

It's anti-life. It's a de-evolutive thought process to discriminate. Insurance is in the game. This is where this boils down to.

Jason Rigby (05:55):

Now we're getting somewhere.

Alexander McCaig (05:56):

Insurance companies got in the game to profit.

Jason Rigby (05:59):

Yes.

Alexander McCaig (06:00):

That's it.

Jason Rigby (06:01):

Off a death, in car accidents.

Alexander McCaig (06:05):

Yeah. But why do you think they fight so hard not to give you your claim back. They don't want to pay out. It's not good for them.

Jason Rigby (06:14):

No.

Alexander McCaig (06:14):

And every time they pay out, they don't eat the cost. They shove it off to everybody else who's insured by them, and they make you pay more.

Jason Rigby (06:23):

Yes.

Alexander McCaig (06:23):

Okay. And I understand that's an aspect of insurance. But guess what? Open up the broad spectrum availability. And if you become profitable, share that money back to the people so if we all want to share in the risk pool, at least you've been paying us back in the mutual sense so we're okay with a little bit of a price increase. But if you're increasing prices and you don't want to pay people out on absolutely anything, and you're constantly fighting them to the hill, you're going to have no face. That's why every insurance company is an insurance company. When you say insurance, do you have just some like bright shining armor and one that you're so happy to talk about? No.

Alexander McCaig (06:59):

When I talk about big box retailers, people have a preference to Target, right? They're like, "Target is like, they're my gold standard when I go to do my Target run." If I'm doing an insurance run, what's your gold standard? I don't have one. I'm just going to the cheapest price.

Jason Rigby (07:13):

Yes.

Alexander McCaig (07:13):

Because you don't trust any of them.

Jason Rigby (07:15):

Yeah, me personally, I don't give any investing advice or financial advice at all. I'm not certified in that. But as for me personally, I have enough money to pay for my funeral sitting there in cash.

Alexander McCaig (07:24):

I'm happy for you.

Jason Rigby (07:25):

So the amount of money that the $150, $170, $200 I pay a month for life insurance, I just throw it in Bitcoin. Every month that much just throw it into Bitcoin.

Alexander McCaig (07:35):

I'm so happy for you.

Jason Rigby (07:36):

What happens if I die in five or 10 years, what's going to happen?

Alexander McCaig (07:39):

I'll drain the Bitcoin account.

Jason Rigby (07:41):

There you go.

Alexander McCaig (07:41):

I'll go pay for your funeral.

Jason Rigby (07:44):

No, I have the cash sitting there.

Alexander McCaig (07:45):

I'm going to get you the cheapest pine box around.

Jason Rigby (07:47):

Well, you know what? It's actually free for me to die because I'm a disabled veteran and I don't care. And Santa Fe has an amazing veterans' cemetery. And I'm not going to care anyway.

Alexander McCaig (07:58):

Here's some things that are messed up. You got to pay to die. You got to pay to have a baby.

Jason Rigby (08:02):

Yes.

Alexander McCaig (08:04):

Dude, all this stuff is upside down.

Jason Rigby (08:06):

It's all upside down. And when we discriminate, we cause division, yeah. And when we cause division, it causes the biggest issue that we have right now. And that's number four in our big seven, and that's global peace.

Alexander McCaig (08:22):

Right.

Jason Rigby (08:22):

And TARTLE is all about global peace.

Alexander McCaig (08:25):

We want global peace. Global peace happens through global understanding that a human being is a human being.

Jason Rigby (08:30):

Regardless of their culture. I don't care if our-

Alexander McCaig (08:33):

Muslim, Catholic, Christian, Scientologist, black, white, yellow, red, brown, blue. I don't even care.

Jason Rigby (08:40):

No. You're a human being.

Alexander McCaig (08:41):

I'm going to pull that skin off. Right? And underneath is nerves, blood, bone, muscle, lymphatic system. And we're all sharing that.

Jason Rigby (08:52):

Don't pull skin off, bro. It's like put the lotion in the basket.

Alexander McCaig (08:54):

I know. You understand my point. Right?

Jason Rigby (08:57):

No, I get your point. Yeah. I'm just-

Alexander McCaig (08:58):

The point is, when you discriminate, you slow. It's not efficient. It's also illogical.

Jason Rigby (09:04):

Yes.

Alexander McCaig (09:05):

So if you're looking at algorithms or practices that you think are decreasing your risk models, but essentially uplift a certain group that maybe get some sort of false priority because of the amount of money they make or their ability to offer more information, that's not proper.

Jason Rigby (09:20):

It's also with these artificial intelligence machines and we're seeing this, is when we're throwing our biases of discrimination in there, it's becoming extremely ruthless.

Alexander McCaig (09:28):

Yeah.

Jason Rigby (09:29):

So ...

Alexander McCaig (09:30):

What does it care?

Jason Rigby (09:31):

Why don't you let it? Why don't you let it decide? If you put a metric, if you put AI, which is just so amazing the things that they're learning and as it continues to grow, if you put a metric that humans, every single life was at 100%, all the globe, but we have this problem.

Alexander McCaig (09:49):

All of our current algorithm-

Jason Rigby (09:50):

Let it figure itself out.

Alexander McCaig (09:52):

Of course, yeah.

Jason Rigby (09:53):

And then you can take that value and say human life is at 100%.

Alexander McCaig (09:56):

Yep.

Jason Rigby (09:56):

And no one is ... Everyone is equal. If it's a human, it's equal, but there's all these other variables. And then let it decide. It will get smarter and smarter and smarter, decide on that. But you know as well as I do with all these marketing firms and these big data companies, it's all about discrimination.

Alexander McCaig (10:13):

Right. And the third party data is already biased.

Jason Rigby (10:15):

And very discriminatory.

Alexander McCaig (10:16):

Yeah. Because you're not getting it from the source. So some party who thought it was beneficial to them to acquire this data and then resell it, they're only going to choose specific groups to take-

Jason Rigby (10:26):

Oh yeah. We want people that make over 100,000 a year, that are moms, that are 32 to 41, you know what I mean? It's like, come on. But they won't tell you this, but the algorithms are even going off of color.

Alexander McCaig (10:42):

Oh, of course they are.

Jason Rigby (10:43):

And race.

Alexander McCaig (10:43):

All the time.

Jason Rigby (10:44):

Yeah.

Alexander McCaig (10:45):

They won't share that with you in their matrices.

Jason Rigby (10:47):

Yeah. These guys are creating racist-

Alexander McCaig (10:49):

The actuary tables-

Jason Rigby (10:51):

... artificial intelligence.

Alexander McCaig (10:52):

Yeah. It's very racist, even if they don't even realize it. Some people don't even realize they might be a little racist.

Jason Rigby (10:59):

Oh, yeah. I would say the majority of the world is ...

Alexander McCaig (11:01):

I think-

Jason Rigby (11:02):

At some point.

Alexander McCaig (11:06):

When you're programming your algorithms, have a checks and balance in there. How discriminating is this thing before it processes, right? Have an algorithm that checks other algorithms. Is this anti-human? Is this anti-life? Is this anti-peace? Is this anti-climate? If it's pegged out on those things, well, you're not using that algorithm.

Jason Rigby (11:30):

Right.

Alexander McCaig (11:30):

You have an algorithm that checks out other algorithms. That's what we should be doing.
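
A minimal sketch of what Alexander describes here, an algorithm that audits other algorithms before they run. The metric (approval-rate gap across groups) and the threshold are illustrative assumptions, not anything the hosts or Connecticut prescribe.

```python
# Hypothetical "algorithm that checks other algorithms": audit a scoring
# model's outputs for group-level disparities before it is deployed.
# Thresholds and group labels are illustrative assumptions only.

from statistics import mean

def audit(model, applicants, group_of, max_gap=0.10):
    """Flag `model` if its approval rates differ too much across groups."""
    scores_by_group = {}
    for person in applicants:
        scores_by_group.setdefault(group_of(person), []).append(model(person))
    approval = {g: mean(s) for g, s in scores_by_group.items()}
    gap = max(approval.values()) - min(approval.values())
    return {"approval_by_group": approval, "gap": gap, "deploy": gap <= max_gap}

# Toy model that (perhaps unintentionally) keys on income.
model = lambda p: 1.0 if p["income"] > 80_000 else 0.0
applicants = [
    {"income": 95_000, "zip": "A"}, {"income": 30_000, "zip": "B"},
    {"income": 120_000, "zip": "A"}, {"income": 25_000, "zip": "B"},
]
print(audit(model, applicants, group_of=lambda p: p["zip"]))
# {'approval_by_group': {'A': 1.0, 'B': 0.0}, 'gap': 1.0, 'deploy': False}
```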

Jason Rigby (11:34):

No, but whether it's an insurance company or whatever, everybody wants ROI.

Alexander McCaig (11:39):

Yeah, here, this, and here's the final quote, and speaking to your ROI. This notice acknowledges the changing competitive and innovative business environment Connecticut insurance companies must operate in. Why are insurance companies competing with one another? You're not in the business to make money. Just be happy with the people that sign up. If that's the limit of your growth, you're capped out at that. You can't create growth out of insurance. Insurance is a business for loss. That's the whole design of it. But they're trying to make money from insurance. It's so stupid. So they shuffle each other around, create fake growth every year, and try and get a little bump. But in the end, you're just harming yourself and others.

Jason Rigby (12:24):

Discrimination.

Alexander McCaig (12:25):

Discrimination. It doesn't work for insurance. Don't work for me. Don't work for you. Don't work for anybody else.

Jason Rigby (12:29):

Not a part of TARTLE.

Alexander McCaig (12:30):

Yeah. See you.

Announcer (12:30):

Thank you for listening to TARTLEcast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future, and source data defines the path. What's your data worth?