Tartle Best Data Marketplace
June 18, 2021

Big Data, Big Problem, Big Bias

BY: TARTLE

Big Data Bias

Data can do, and has done, great things to improve life for people everywhere. That has accelerated in the digital age (yes, data existed before computers; we just gathered it with our senses) as we've been able to gather more data, faster, from more people. Learning how to sort and analyze that data quickly has also been a game-changer in forming policy and developing new products. We've been better able to target where a problem may lie in a company, which roads need to be fixed, how to distribute medicine more effectively, and the list goes on.

Of course, as with so many things, there is a downside. We've often looked at how companies profit off the data you generate for them without your consent to gather it in the first place. Even if you have consented to the gathering, you might not appreciate some of the ways that data gets used; many people would withdraw their consent if they knew how it was being utilized. However, there is another downside to the way data is currently handled that we haven't discussed nearly as much: bias in how data gets sorted, and in the fact that it gets sorted in certain ways at all.

The government, companies and other organizations often sort data into different categories based on race, cultural background, income, and shopping habits. What do you notice about all of that? Those are all attributes of people. Yes, it is often useful to classify and sort information into different categories. Yet, aren’t people more than the sum of a few superficial attributes? Aren’t people more than their race? More than their paycheck? TARTLE would like to think so.

What are some examples? Some universities sort applicants into these kinds of categories and then run them through predictive algorithms to determine whom they should and shouldn't admit. So you have a kid from the inner city: low income, no father, a couple of petty robberies on his rap sheet. The algorithm rejects him. It's easy to see why. Yet what if this kid is eager to turn his life around and do better, to get out of a crappy cycle? What if all he needs is a chance? The algorithm won't catch that. It doesn't care that the kid is a human being and not a collection of attributes.
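To make that failure mode concrete, here is a minimal, hypothetical sketch. The features, weights, and cutoff are all invented for illustration; no real admissions system works exactly this way. The point is structural: whatever matters about a person but isn't in the feature set simply cannot affect the outcome.

```python
# Hypothetical attribute-based admissions score: a weighted sum of a few
# demographic features compared against a fixed cutoff. Everything the
# applicant *could* become is absent from the feature set entirely.

def admit_score(applicant: dict) -> float:
    # Invented weights: the model only "sees" these four attributes.
    weights = {
        "household_income": 0.5,      # scaled 0-1
        "neighborhood_rating": 0.3,   # scaled 0-1
        "two_parent_home": 0.1,       # 0 or 1
        "clean_record": 0.1,          # 0 or 1
    }
    return sum(weights[k] * applicant[k] for k in weights)

applicant = {
    "household_income": 0.1,     # low income
    "neighborhood_rating": 0.2,  # inner city
    "two_parent_home": 0,        # no father at home
    "clean_record": 0,           # petty offenses on record
    # "eager to turn his life around" has no key: the model can't express it.
}

CUTOFF = 0.5
decision = "admit" if admit_score(applicant) >= CUTOFF else "reject"
print(decision)  # -> reject, no matter how motivated the applicant is
```

No amount of tuning the weights fixes this; the kid's determination was never an input in the first place.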

Another example: at least one town used predictive analytics to determine who in the area was likely to become a criminal. That led to a lot of harassment as police officers took the information their algorithm spit out and started trying too hard to catch those people in a crime. Beyond the obvious injustice of being treated like a criminal before ever committing one, it also meant that resources weren't being directed where they needed to be. A number of crimes might have been prevented if the police hadn't been focused on people their computers flagged as likely criminals. Not to mention that by repeatedly agitating people, you might actually create a few criminals when they lash out.

This has of course infected the corporate world as well. Some companies grade their employees' productivity based in part on how much digital interaction they engage in. People at these companies can be considered productive simply because they send a lot of emails and participate in the company's group chat. Of course, the emails could be a series of memes saved on your phone, and the group chat could be about your new car or any number of silly things that have absolutely nothing to do with productivity. It may be the worst metric ever.
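The absurdity is easy to demonstrate. Here is a hypothetical sketch of such a metric (the function name and the numbers are made up for illustration): the score is pure message volume, so noise beats focused work every time.

```python
# Hypothetical "productivity" metric that only counts digital activity,
# as described above. Names and numbers are invented for illustration.

def productivity_score(emails_sent: int, chat_messages: int) -> int:
    # The metric is blind to what the messages contain or accomplish.
    return emails_sent + chat_messages

# An engineer who spent the week heads-down on difficult work:
focused_worker = productivity_score(emails_sent=3, chat_messages=5)

# A colleague forwarding memes and chatting about their new car:
chatty_worker = productivity_score(emails_sent=40, chat_messages=120)

print(chatty_worker > focused_worker)  # -> True: the metric rewards noise
```

Any metric that measures activity rather than outcomes invites exactly this inversion, and, as the transcript below notes, employees quickly learn to game it.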

All of these examples point to a central and significant problem that pervades Big Data: forgetting that behind all of those data points is a person, a person who probably will not fit neatly into the box an algorithm tries to shove them into. What is the solution? How can we gather and analyze data without losing sight of the people behind it? By going to the people themselves: getting to know them, asking them questions, and learning what their goals really are, instead of letting algorithms decide for everyone. That is the mission of TARTLE: to get organizations to go to the source of the data, to go to you, so they can get real information, information that actually contributes to understanding what is going on in the world. Something no algorithm will ever be able to do.

What’s your data worth?


Feature Image Credit: Envato Elements

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Speaker 1 (00:07):

Welcome to Tartle Cast, with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future and source data defines the path.

Alexander McCaig (00:25):

Guten Morgen my friends. Welcome to Tartle Cast.

Jason Rigby (00:29):

Yes.

Alexander McCaig (00:30):

No, Yah.

Jason Rigby (00:31):

Yah. We're going to be talking about secret sauce and recipes.

Alexander McCaig (00:35):

Secret sauce.

Jason Rigby (00:36):

That's what's in the article. This person must've been hungry when they wrote it.

Alexander McCaig (00:39):

It's the thing that goes on your Big Mac, gets on your Chick-fil-A sandwich.

Jason Rigby (00:44):

All Big Mac sauce is, is thousand Island and Ketchup, pretty much. With, it looks like it has pickles ground up in there or something.

Alexander McCaig (00:52):

And who knows what else they put in it.

Jason Rigby (00:53):

Yeah. Who knows? McDonald's. McDonald's is, if I'm going to go bad, that's my go-to.

Alexander McCaig (01:02):

Really?

Jason Rigby (01:04):

I haven't eaten at McDonald's in, I don't know when, but it's been months. But I do occasionally partake and I will go... I crave, if I'm going to be super depressed and I'm not feeling good for the day-

Alexander McCaig (01:19):

Oh, that'll probably help it.

Jason Rigby (01:20):

Yeah. It helps a lot. I'll put all the shades, make them really dark, go get McDonald's and then watch horror movies, one after the other and it cures it.

Alexander McCaig (01:30):

You have a problem.

Jason Rigby (01:31):

No, we're fine bro. We are fine.

Alexander McCaig (01:35):

Yeah. We're fine here.

Jason Rigby (01:39):

It's not deep rooted, only since I've been a child.

Alexander McCaig (01:44):

Kleiner problem. Tiny.

Jason Rigby (01:47):

Just a little bit. Why do you think I'm single bro? Exactly.

Alexander McCaig (01:52):

It's all the horror films and McDonald's. I love wasting time here on this podcast. This is an interesting behavior and we'll segue, if I'm going to choose something it's Domino's Pizza.

Jason Rigby (02:07):

Yeah. I like Domino's Pizza. It's pretty good. Yeah.

Alexander McCaig (02:09):

No, but if I want it, you can send over a couple of those.

Jason Rigby (02:12):

Do they make a vegan one?

Alexander McCaig (02:14):

I just get it without the cheese and the butter.

Jason Rigby (02:16):

Oh yeah, yeah.

Alexander McCaig (02:17):

So I'm pretty much just getting bread with a bunch of stuff on it.

Jason Rigby (02:20):

Yeah. But you can get, they have tons of vegetables. They have the little Domino's tracker on the phone.

Alexander McCaig (02:26):

Here's what I'll do, I'll get the pizza and I have my own cheese, my vegan cheese at the house and I'll put it on there.

Jason Rigby (02:30):

Oh, just put them on there?

Alexander McCaig (02:31):

Put it back in the oven.

Jason Rigby (02:32):

Oh, there you go. That's perfect. Yeah, see.

Alexander McCaig (02:34):

I have them do all the real legwork and I'm essentially just heating it up.

Jason Rigby (02:37):

Yeah. I like the vegan pizza. We have a vegan pizza place in the mountains.

Alexander McCaig (02:42):

Oh, Trail Rider?

Jason Rigby (02:43):

Yeah.

Alexander McCaig (02:43):

That place is unreal.

Jason Rigby (02:45):

Yeah. So they have some pretty amazing food there. I love it.

Alexander McCaig (02:49):

Can we talk about something-

Jason Rigby (02:50):

Yeah well, you're talking about, I'm a fat kid, bro.

Alexander McCaig (02:52):

So am I.

Jason Rigby (02:54):

Whenever it comes to food, bro, I'm in. [crosstalk 00:02:56] I was raised in the South where it was like a privilege to eat fast food. And my dad loved fast food. My mom ate it. And then at home, everything was like junk cereals. And then my mom was always making cakes.

Alexander McCaig (03:15):

Wow.

Jason Rigby (03:15):

So I just constantly ate peanut butter and jelly.

Alexander McCaig (03:18):

I'd love to see you as a little butter ball.

Jason Rigby (03:20):

Like all sugar all the time.

Alexander McCaig (03:21):

I'd love to see you as a little butter ball.

Jason Rigby (03:23):

Luckily I was active enough, I was thin. Actually I looked like you. I was six-two-

Alexander McCaig (03:29):

I'm going to tell you, I didn't always look like this.

Jason Rigby (03:31):

I was six-two at 12.

Alexander McCaig (03:34):

Yeah. You and I are about the same then.

Jason Rigby (03:35):

And then I stopped at 12 years old. I didn't grow anymore.

Alexander McCaig (03:43):

[crosstalk 00:03:43] inches all the way up to 18.

Jason Rigby (03:44):

And then, but I weighed 170 at six, two, all through junior high and high school. And so I think because I was just so busy, I just burned the sugar out, but it was just... But now I'm having to eat super healthy just to counteract probably-

Alexander McCaig (04:00):

All the chaos you did in the past.

Jason Rigby (04:02):

Yes. All the kilos.

Alexander McCaig (04:04):

Yeah. All the moments. So this person got really heated in this Barron's article. That's probably about as heated as a journalist would get in an article.

Jason Rigby (04:14):

Especially in a Barron's article. Barron's is usually pretty conservative. This is about big data bias. And then-

Alexander McCaig (04:25):

Oh never heard of it, Jason.

Jason Rigby (04:26):

Yeah. We need to see the recipe he says. So there's big problems-

Alexander McCaig (04:30):

I think it's a she.

Jason Rigby (04:31):

There's big data, big problems, there's privacy and bias. Yeah. Let me make sure I'm not, you've got me thinking about-

Alexander McCaig (04:36):

Don't be some sort of chauvinist pig. Figure it out.

Jason Rigby (04:37):

Yes. Research professor, cross-disciplinary fellow, and director of the Digital Trade and Data Governance Hub at George Washington University, Susan Ariel Aaronson.

Alexander McCaig (04:48):

Here we go Susan Aaronson.

Jason Rigby (04:50):

Yes. So that's me automatically assuming it was a dude. Because I'm thinking Barron's, so I was thinking of Wall Street guy-

Alexander McCaig (04:57):

Big cigar, he's got his tuxedo penguin suit on.

Jason Rigby (05:04):

"We're making billions."

Alexander McCaig (05:07):

"Keep squeezing the people dry. They're our free labor." So how does this thing kick off here?

Jason Rigby (05:14):

American policy and economy. When they're talking about big data analytics and they're talking about the economy and how it's not, understand... And I mean, we'll get into the article in a second, but I want you to speak to this. We can see this with how far behind the times the government is... I always laugh because people always say conspiracy theories, and they're talking about this. I was in the military, and just some of the missions that we were doing, and some of this was high level stuff, it was so unorganized.

Jason Rigby (05:48):

I mean, just to have cross agency people speaking to each other, is such an issue because there's so many big silos. So to keep 30 or 40,000 people quiet over something with different departments at different agencies, and to coordinate something nefarious, I know that will never happen in the government. They're too incompetent.

Alexander McCaig (06:09):

No. What they don't understand is that they think it's the government, but it was people that were in the government that left the government to create their own businesses that don't operate under the government umbrella anymore.

Jason Rigby (06:17):

Yes. There's those guys.

Alexander McCaig (06:19):

It's, here's government and then these people that do whatever the hell they want.

Jason Rigby (06:22):

Yeah. So whenever we look at American policy and we see data right now, they're just trying to react.

Alexander McCaig (06:29):

It's completely reactive. And with big data, it's really about predictions. That's all the world is built on right now: predictions. How well can we predict? But they're using historical data and old lagging behaviors and all this other third party observational stuff about who you are. It's just this bucket that they pieced together online, and they just assume that here's a profile of Jason. We've said this a million times before. Okay. It's just bad predictions, over and over and over.

Alexander McCaig (07:03):

And government is looking at it from a policy standpoint and they're still lagging in this factor. There's no reason you wouldn't know ahead of time who everyone is going to decide for who's going to be the president. You would know before the vote comes in. It's ridiculous.

Jason Rigby (07:18):

Yeah. This is funny, it's like I live in an apartment complex, this happened yesterday, serving me an ad to get my air conditioning unit fixed is a waste of time. So this company, a local company, that is doing this, whoever did your ad for you, they should have took the apartment complexes off. They can do that. So you're just losing money.

Alexander McCaig (07:42):

Three grand a month for a small business, sending stuff to people that aren't even responsible for it.

Jason Rigby (07:46):

Yeah. You can target people that own a home.

Alexander McCaig (07:50):

But what happened was they went to some online company to buy a mailing list and there was no sort of curating to be like, "Well, is this an apartment complex? Are the people responsible for it?" They have no idea unless you go to the person at the apartment complex and ask them.

Jason Rigby (08:04):

Yeah, exactly.

Alexander McCaig (08:05):

Am I wrong?

Jason Rigby (08:05):

Yeah. They don't know. "Do you need air conditioning services?" "No."

Alexander McCaig (08:10):

When I walked through Costco, okay, and they hammer me and they're like, "Do you want solar panels on your house?" I'm like, "I am not responsible for the domicile that I live in."

Jason Rigby (08:20):

Yes, exactly.

Alexander McCaig (08:21):

It's not my liability. I'm a renter.

Jason Rigby (08:23):

Yes, exactly. None. So whenever we look at the big data analytical process, and he used these words and I love it.

Alexander McCaig (08:30):

She.

Jason Rigby (08:30):

I mean she. Gosh.

Alexander McCaig (08:32):

You are a chauvinist.

Jason Rigby (08:35):

Call me out, bro.

Alexander McCaig (08:37):

Keep going.

Jason Rigby (08:38):

Divisive, Discriminatory. These are all me. Divisive, discriminatory, inequitable and dangerous outcomes.

Alexander McCaig (08:47):

Now we know why you're single.

Jason Rigby (08:50):

Exactly. Sorting people into groups, that needs to change. So "Big data analytics often requires a huge supply of anonymized..."

Alexander McCaig (08:58):

Before we get to the anonymized part, sort people into groups. Why are you sorting me into a group? Why can't I tell you what group I want to go in?

Jason Rigby (09:08):

Yes, exactly.

Alexander McCaig (09:09):

Can we think about just that right there.

Jason Rigby (09:11):

I automatically think of bad things when government starts sorting you, "You need to go to over here."

Alexander McCaig (09:17):

Yeah. "You go here. You go here. You go here. You, there." That's what it is. And we're just doing the same thing with technology again. Don't put me in a group I don't belong in just because you're taking a best-guess approach to saying I belong in this box, and you're going to funnel me this way and shovel me these ads that make no sense. Or beyond ads, products and services. Send me to a doctor I shouldn't be going to, whatever it might be, that's not me. We need to invert that foundational piece of big data. Big data is making the choice for people rather than people making the choice for big data.

Jason Rigby (09:56):

Yes. And like he says, that needs to change.

Alexander McCaig (09:59):

She.

Jason Rigby (10:00):

She says.

Alexander McCaig (10:01):

Strike three.

Jason Rigby (10:03):

Okay. I'm out. This podcast...

Alexander McCaig (10:06):

Just say the author.

Jason Rigby (10:07):

Everyone, I want to apologize. First of all, I want to apologize to Susan Ariel Aaronson.

Alexander McCaig (10:14):

You apologized to her three times.

Jason Rigby (10:15):

I know. Yeah. This is terrific. This is funny. Okay. So "Big data analytics often requires a huge supply of anonymized personal data. The process..." You're still laughing.

Alexander McCaig (10:26):

Oh man. I've learned my lesson, Alex. Apparently not.

Jason Rigby (10:31):

No I've done it two more times. It's like a little kid, you know how the little kids are like-

Alexander McCaig (10:36):

"I won't do it again."

Jason Rigby (10:39):

And then 30 minutes later they're like sneaking behind the couch.

Alexander McCaig (10:42):

What's with the power drill?

Jason Rigby (10:42):

And a plugin.

Alexander McCaig (10:45):

Okay. Let's focus here.

Jason Rigby (10:47):

Yeah. "Researchers glean and anonymize this data and then separate this data into groupings based on attributes, such as behavior, preferences, income, race, and ethnicity."

Alexander McCaig (10:56):

And who defines the attributes? Again, one dude, or a couple of researchers-

Jason Rigby (11:02):

It could be a girl.

Alexander McCaig (11:02):

It could be a girl. A dude or a gal. And I say dude for either gender. But there are these small groups of researchers designing what bracket 1,000 million people go into. Does that seem wrong? Do you like going into a conversation and having someone just define who you are with blanket statements?

Jason Rigby (11:24):

No. I would never.

Alexander McCaig (11:26):

Nobody likes that.

Jason Rigby (11:28):

That's annoying.

Alexander McCaig (11:29):

Of course, it's annoying. Why are we not annoyed with what's going on here? And then we just live in this false reality constantly. And it's just this massive state of falseness. And it's defined by the few to control the many.

Jason Rigby (11:44):

Like she was talking about, we're on the right track, my brain is, the wheels are turning, but she is talking about these sophisticated analytical techniques on artificial intelligence and machine learning. But what scares me is they're making assertions on these groups with bad data. Think about that.

Alexander McCaig (12:08):

No, I know that.

Jason Rigby (12:09):

So now we have millions of dollars being spent when we've grouped these people incorrectly. And now it's like the Flat Earthers. It's like-

Alexander McCaig (12:22):

We're telling you, this is how the earth is. Well, all the data from seven million say otherwise.

Jason Rigby (12:26):

They make really great documentaries like on YouTube and stuff if you look at it, I mean, they're high quality, but science has proven you over and over and over again.

Alexander McCaig (12:35):

If you take a rocket and you fly off the planet.

Jason Rigby (12:40):

It's not going to smash into ice.

Alexander McCaig (12:41):

Is the moon round? Oh, interesting. Is it a smaller planetary body? Yes. Well then how is ours possibly fricking flat when everything else is round.

Jason Rigby (12:49):

Yeah. I mean, instruments are measured off of it.

Alexander McCaig (12:52):

And if you look up into the night sky, do stars shine in one direction?

Jason Rigby (12:57):

Exactly.

Alexander McCaig (12:58):

No, they radiate.

Jason Rigby (12:59):

Yeah. And then there's the rotation of the earth. And then the circumference, whenever you're looking across the sea, you can actually... I mean, no one would have made it anywhere in the ships if they would have based it off of being flat. All the sextants and stuff.

Alexander McCaig (13:13):

Yeah. Columbus, right. Yeah, the sextants wouldn't have worked.

Jason Rigby (13:18):

Yeah. None of that old-school calculations. Because they had to do it old-school ways.

Alexander McCaig (13:22):

That's so cool. The big brass thing.

Jason Rigby (13:24):

Yeah. A Signal guy did that. The guy that owns Signal.

Alexander McCaig (13:27):

Really?

Jason Rigby (13:28):

Yeah. He got a ship with a bunch of teenagers on it...

Alexander McCaig (13:32):

Going to who's island?

Jason Rigby (13:35):

Yeah. No. And he got old school mechanical things. These were kids that needed help or whatever.

Alexander McCaig (13:41):

Oh I like that.

Jason Rigby (13:42):

Yeah. But he's like, come to think of it I'm surprised their parents let them go on the ship with me. Because he's like, "We're going to go on a sailboat, a big sailboat. I have all mechanical instruments, no radar, no nothing. And we're just going to go on the open sea." And he started Signal.

Alexander McCaig (14:01):

I'll stand on the shore and wave you guys goodbye.

Jason Rigby (14:04):

Exactly. No, he sailed since he was a little kid.

Alexander McCaig (14:08):

Oh cool.

Jason Rigby (14:08):

So he had that master's license or whatever. The master sailor type license or whatever. But that's still crazy, to have a license to navigate like that.

Alexander McCaig (14:15):

Heavy tonnage license.

Jason Rigby (14:16):

Yeah. I mean, so he has pictures on his Instagram of him out in the sea with the sextants and all that stuff.

Alexander McCaig (14:24):

What does that celestial positioning have to do with?

Jason Rigby (14:26):

Well, I think it's important because these people will create a whole bias off of a false narrative. And then they will double down on it and then next thing you know, there's millions and millions of dollars being spent, a doctrine being established-

Alexander McCaig (14:42):

On a bias.

Jason Rigby (14:43):

On a bias, yes, that's totally false. And that's what's happening here when you group people.

Alexander McCaig (14:48):

And you deliver it back to the public and they're only getting the same things. And so what happens when you inundate someone with the same imagery, video, audio all the time? It's naturally going to sway them in that direction. And now these people are going to be like, "Oh, what we're doing is working." No, it's really not working. People don't have any other option because this is all you feed them because you think it's the right thing. And they have no other choice because you've removed choice out of the matter. You've decided for them.

Jason Rigby (15:15):

Yeah. And it's funny, she uses an example of Microsoft 365 that used the software to monitor their workers productivity-

Alexander McCaig (15:22):

Yeah, how many emails you're sending.

Jason Rigby (15:24):

By scoring them on participation in group chats and by the number of emails sent by employees. So if you would think about that-

Alexander McCaig (15:31):

You're telling me I'm productive because I'm sending more emails? Because I'm group chatting more that makes me productive?

Jason Rigby (15:41):

Think about that.

Alexander McCaig (15:42):

It's so stupid. I'll talk about this as a real example, I got a buddy of mine and I'm over at his place, he's got the laptop open working from home. And I'm like, "What is this?" I'm like, "All these people are on yellow and you're the only person that's green." He's like, "Yeah, I keep it green. Because the system monitors, how many times we go away in the chat system." And he said, "And I always have my email up. Because it's always watching how long I have my email up for. And so what I do is I just ping small stuff back and forth. So they're just...

Jason Rigby (16:19):

Everybody's going to find a way around it.

Alexander McCaig (16:22):

Of course, they do.

Jason Rigby (16:23):

I mean, they have those refresher pages, that it'll refresh every five seconds or whatever.

Alexander McCaig (16:29):

What a stupid algorithm to define productivity. But that is like a metaphor for everything else we're doing. And I think she did a great job highlighting that.

Jason Rigby (16:37):

I liked the Sheriff's office one in Pasco County, Florida. So machine learning created this list of individuals that were going to be potential criminals.

Alexander McCaig (16:53):

And you could harass them.

Jason Rigby (16:54):

A system. And it would monitor and constantly harass them. Big government.

Alexander McCaig (16:58):

It forced them into a position of you're agitating them so you're wanting them to... Do you remember the bait car?

Jason Rigby (17:04):

Yes, yes.

Alexander McCaig (17:05):

It's like a digital bait car. And that's what they're doing. And then the IRS takes it a step further, you want to dodge taxes, how about we track you all the time wherever you're going and we'll see all of your social media feeds.

Jason Rigby (17:20):

And location data.

Alexander McCaig (17:20):

Why does the IRS, why should a private entity founded in the early 1900s, a private entity, why should they be allowed to track me, and then also look at my social media data, just for some sort of assumption I may be dodging taxes? They'll just say, "Well, everybody's dodging taxes. Let's track everybody."

Jason Rigby (17:42):

Yeah. I don't want to...

Alexander McCaig (17:45):

Did I ever tell the IRS it was okay to do something like that?

Jason Rigby (17:48):

I don't want to get into that whole mess because that's a... Put your accent out when you say IRS.

Alexander McCaig (17:54):

IRS. The IRS.

Jason Rigby (17:59):

Yeah. There we go. Yes. But this is the crazy part, whenever we use big data to create big government, I'm out. So I mean-

Alexander McCaig (18:08):

When big data becomes big brother, we got to nix that.

Jason Rigby (18:10):

Yeah. We got to nix that.

Alexander McCaig (18:11):

Bye. We're going to lop that down with our sovereign sword of data freedom. We're going to cut it down. Our data champions are going to be like, "No, that model does not work like that. You're going to come to Tartle, and we're going to tell you who we are." Does that make sense?

Jason Rigby (18:26):

Yeah. I'd like the one that she talks about, not just Cambridge Analytica, but the athletic network, Strava-

Alexander McCaig (18:32):

Oh at NATO.

Jason Rigby (18:33):

That released a global heat map of user activity.

Alexander McCaig (18:36):

Don't you remember that? The images. People were like, "What is this up here? Oh, there's a training base in the Arctic? That's interesting." It shows all the guys going for jogs. And it has all the internal parts of the building.

Jason Rigby (18:49):

Yeah. The NATO military personnel.

Alexander McCaig (18:52):

Yeah. It's great.

Jason Rigby (18:52):

Yeah. So these things that, "The algorithm designed made it less likely to refer black people than white people who were equally sick, to programs that aim to improve care for patients with complex medical needs." That's machine learning. So it's-

Alexander McCaig (19:03):

Who's the fricking racist that put that algorithm together?

Jason Rigby (19:06):

Exactly.

Alexander McCaig (19:10):

If you have a complex medical need, why is the algorithm defining who gets a complex medical need first? Why can't me saying, "There's a complex issue going on internally within my body, I need to go see a doctor." Why are you telling me who and when I should go see them? This is complex enough for me.

Jason Rigby (19:31):

This is the same I'm seeing right now where Tartle could have helped dramatically with the vaccine rollout. This is a prime example of where you need to speak to the people. And then if we would have had their data packet, their health records, data packet-

Alexander McCaig (19:52):

Behaviors, jobs.

Jason Rigby (19:53):

And then asked them direct questions to qualify for them, then we could have made a proper list-

Alexander McCaig (20:00):

In 24 hours.

Jason Rigby (20:01):

Yeah. Of what, with every American, or whatever country you're in.

Alexander McCaig (20:07):

Whoever wants a vaccine.

Jason Rigby (20:08):

Yeah. To get this vaccine rolled out to the right people at the right time. Because now there's this big controversy, Hollywood people are getting it. Super healthy people are getting it.

Alexander McCaig (20:19):

I'm so tired of hearing that crap. I'm fed up with just people pointing fingers and blaming all this stuff. We have the tool available to cut all that up. What are you going to have to talk about when you can't complain anymore?

Jason Rigby (20:32):

I don't know now because everybody's in 2020. I get so conditioned to being off all the time.

Alexander McCaig (20:36):

All the time.

Jason Rigby (20:39):

And now you're going to have the president's going to be in. Nothing's going to happen. It's going to be the same as it always is. I always tell people this, your day-to-day life, every day, I tell people this all the time. I was saying this in 2020, constantly. Alex, whether Trump gets into office or Biden gets into office, your day-to-day life every day, will it change?

Alexander McCaig (21:01):

No.

Jason Rigby (21:01):

No.

Alexander McCaig (21:01):

I don't care, I want to be pissed off.

Jason Rigby (21:06):

And if you're that into it, to the point to where you let it consume-

Alexander McCaig (21:12):

Go get a job in politics.

Jason Rigby (21:14):

Yeah. Either go get a job in politics and change the world or, number two, your priorities are all wrong.

Alexander McCaig (21:20):

Your priorities are all wrong or you can join Tartle and just talk about it through there.

Jason Rigby (21:22):

Yes, exactly.

Alexander McCaig (21:23):

Get paid to be upset.

Jason Rigby (21:25):

Yeah. Get paid to be upset. But it's your responsibility 100%. I don't care who the puppet is.

Alexander McCaig (21:31):

Oh man. Puppet, that's a good word.

Jason Rigby (21:32):

Yeah. I don't care who the puppet is on whatever country I live in. We've talked about-

Alexander McCaig (21:38):

Whether it's Luxembourg or Guadalajara, Mexico.

Jason Rigby (21:40):

Or Melbourne or-

Alexander McCaig (21:41):

Or Cebu in the Philippines.

Jason Rigby (21:42):

Or Cebu in the Philippines. I don't care. Maybe Kazakhstan.

Alexander McCaig (21:45):

Or maybe Pyongyang in North Korea.

Jason Rigby (21:48):

Hey, do I not pay taxes there? Perfect. Well, no, I don't have any hair. Because all the guys have to get the same. Kim Jong-un. You got to get the Un haircut.

Alexander McCaig (22:03):

They'd have a life tax on you. Welcome to North Korea. Here's your life tax.

Jason Rigby (22:08):

Yeah. And you're a CMO of a tech company, that's not going to work here, buddy.

Alexander McCaig (22:12):

No, that's not going to work.

Jason Rigby (22:13):

You speak English. That's not going to work.

Alexander McCaig (22:14):

That's also not going to work.

Jason Rigby (22:14):

We have all these things that we teach our children with these big missiles. They have missiles in the school, and they do plays on them.

Alexander McCaig (22:22):

It's empty. Oh, don't they only have one nuclear warhead? Literally just one. They've only acquired the resources for one.

Jason Rigby (22:31):

North Korea reminds me of the Wizard of Oz. It's just a silly guy behind the curtain. Yeah. It seems all scary. And then you look and it's...

Alexander McCaig (22:42):

It's just a little, "Hey. Hello. Welcome to the Emerald City."

Jason Rigby (22:46):

Yeah, exactly. It's so funny, but those people are living in a whole different matrix. I mean, they are all in.

Alexander McCaig (22:53):

It's the Matrix.

Jason Rigby (22:54):

I mean, they have to watch commercials in their planes when they fly.

Alexander McCaig (22:59):

I don't think they have a choice.

Jason Rigby (23:00):

They have no choice.

Alexander McCaig (23:01):

Yeah. But you know who does have a choice?

Jason Rigby (23:03):

Here we go. This is what I want to talk about. Data and choice.

Alexander McCaig (23:06):

All the other 180 plus countries across the globe and those individuals that have access to the internet can all join Tartle, and have a choice. They can have a choice in the big data algorithms. They can have a choice in what they want their future to look like. They can have a choice in how they choose to define themselves.

Jason Rigby (23:24):

Yeah. I do want to end on this, and you can speak to this. She mentioned something about, "The Securities and Exchange Commission should ask all publicly traded companies to disclose when and how they use data analytics to make decisions that affect our customers' human rights, such as access to credit, education, and healthcare." So here's, maybe this is my libertarian viewpoint or whatever, but whenever I'm saying, "Okay, now I'm going to ask the Securities and Exchange Commission," why can't we as consumers go to this company and demand that? And shareholders should demand that. We don't need to get the government involved.

Alexander McCaig (23:56):

The government doesn't need to be involved. We need to just work with the people that we're working with. Why do we need to have a third party come in and define what needs to go on? Why can't we as people do that?

Jason Rigby (24:06):

And don't support a company that's nefarious.

Alexander McCaig (24:10):

We're just being Socratic here.

Jason Rigby (24:12):

Yeah. It's very, very simple.

Alexander McCaig (24:14):

Support yourself and support some global causes, the big seven. Thank you very much.

Speaker 1 (24:26):

Thank you for listening to Tartle Cast with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future and the source data defines the path. What's your data worth?