Tartle Best Data Marketplace
June 10, 2021

MIT Fair Data Value

BY: TARTLE

Data and Digital Transformation   

Breaking news! Digital transformation needs data! So sayeth the sages at Forbes. In other news, water is wet and the sky is still blue. Now that you’ve had a moment to recover from that shock, let’s actually spend some time thinking about this. 

The first thing to note is that digital transformation is something that’s been going on for decades at this point, just not by that name. Digital transformation is just another buzzword the corporate world likes to invent and throw around to sound smarter than it is. So, what is it? When we strip away the corporate chic and the buzzwords, what is this process that has been going on for so long, and how is data involved?

The process is simply that of moving from an analog to a digital world. The first calculators were part of that transformation, as was email, as well as the move to HDTV. One of the most impressive parts of society’s ongoing digital transformation is the rise of the humble mp3. Thanks to the digitization of music, it is now available virtually anywhere, streaming over your phone. A whole generation has already grown up never having heard, and maybe never having seen, an analog cassette tape. About the only analog music you can find in stores now is vinyl, spinning on an old turntable.

Now, that all seems fairly mundane. But it only seems that way. Every one of those innovations reveals another or was made possible by another. Streaming mp3s are only possible due to the internet and massive servers storing music files, which represents a significant shift from physical to digital media. 

There are also a variety of corporations still working on making better use of their digital assets, using sensors throughout their facilities to track the production and movement of products and make the process more efficient. Naturally, every single sensor is putting out data that is stored in a server and then analyzed. The process of analysis has also undergone a digital transformation. Once, it was all done by people. Now, there are complex algorithms that can handle the simpler kinds of statistical analysis. 

The digital world is taking over in other ways as well. There are already automated semitrucks on the road, bringing loads of goods and materials to locations around the country. Some of the simpler articles online are now written by a computer rather than a reporter. In some places, even retail stores are getting automated. Amazon, for example, has set up retail stores in New York where you never have to go through a checkout line. The store tracks what you take off the shelf and bills your account accordingly. In Japan, convenience stores are moving in the same direction. And even in small towns in the US, department stores are letting people scan their purchases with their phones and head out of the store. 

To bring things back to data, all these moves were data driven, driven by data that at least suggested products could be delivered more efficiently by moving further into the digital world. An unfortunate aspect of the transformation is that it is largely driven by a desire to increase profits. Of course, numerous benefits still derive from that motivation, but what if the desire to help people were the real driver? What if we gathered and analyzed data and then applied it with creating a better world as the first goal and profit as the second? Is that even possible?

The answer to that is ‘yes’. As someone wise once said, “Put first things first and the secondary things will follow.” This is what TARTLE is trying to accomplish: to harness our continual digital transformation not merely to drive the bottom line but to help people and give them the freedom to develop themselves and reach their full potential. That is why we want to give you control of your data. By being in control again, you become an active part of the process, with the freedom to decide where and how much you even want to participate in the digital transformation. You get to decide how involved you and your data will be, not a faceless government or corporation. 

What’s your data worth? 


Feature Image Credit: Envato Elements

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Speaker 1 (00:07):

Welcome to TARTLE Cast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future and source data defines the path.

Alexander McCaig (00:25):

Welcome back everybody to TARTLE Cast. This episode-

Jason Rigby (00:29):

The Shiny Lips Episode.

Alexander McCaig (00:30):

The Shiny Lips Episode, Jason. It's good. We got so much coconut oil in our coffee.

Jason Rigby (00:36):

Yeah, it's what made our lips shiny.

Alexander McCaig (00:37):

Yeah. I prefer coconut oil because it's the only thing that when you thermally process it, cook with it, it doesn't change its chemical structure.

Jason Rigby (00:45):

That's what I've heard, up to a certain... What if you get high heat on it?

Alexander McCaig (00:49):

No, it's good stuff. It never changes.

Jason Rigby (00:50):

I use it all the time.

Alexander McCaig (00:51):

That's the best part about coconut. That's a side coconut point. There's an article that MIT put out. It was a tech privacy article on the kind of like the fair value of the data market. And I thought this was actually a fantastic article and I wanted to touch on it and there's some good quotes in here. So, if you might mind-

Jason Rigby (01:15):

Whoever wrote this, I'd love to have them on the podcast. We need to reach out to them.

Alexander McCaig (01:19):

Yeah. This is truly like a fabulously written episode.

Jason Rigby (01:22):

Yeah. I mean, it's called Fair Value? Fixing the Data Economy, which you know I love the word fixing.

Alexander McCaig (01:27):

Fixing.

Jason Rigby (01:28):

Yeah.

Alexander McCaig (01:29):

When they talk about fixing the data economy, the idea is that there's no fair payment or compensation for the people that support the system. The data economy is supported by people that generate data. Okay, great. So, why is it that only the select few again are the ones that are the only ones benefiting, right? It's like another Occupy Wall Street situation, but on data. Who was that guy? The Occupy Wall Street guy?

Jason Rigby (01:53):

Yeah, Tim Pool.

Alexander McCaig (01:54):

Yeah, Tim Pool, right? So, if Tim Pool was into data, this is what he would be talking about. And it's not the fact that it's about writing. It's just about, there should be transparency and a placement of fair value on the people that are generating that data, and they should be compensated for the value they're creating in that system. There's some interesting points in there.

Jason Rigby (02:18):

How the article starts off is, "The data economy is facing a social reckoning," say leading voices in industry, government, and academia. And I like the two questions it starts off with because, to me, questions are very important. It says, "But what went wrong?" That's a beautiful question.

Alexander McCaig (02:32):

Well, that's a great thing.

Jason Rigby (02:33):

Yeah. And then how can we fix it? But we need to know the first question. It snowballs to the second one. And so they talk about, when they get into it, it's the inhumane working conditions, the creation of unions, labor laws, foundations of political parties.

Alexander McCaig (02:49):

It's that development, since the development from the industrial revolution to where we are now, but what went wrong. Again, we started with the approach of how do we make things efficient. Right? How do we make it economically beneficial? And so when that was applied to the tech industry, no one ever thought, "Oh, yeah, how are they offering this to me for free?" Like, what is that? Why didn't Google start off in the first place with just charging people? They figured, "Oh, we can just make it look pseudo-free, but pass the cost off to advertisers and businesses." Where we went wrong is that companies found that they could continue to run this black box approach and falsely offer people things for free, but what the user didn't realize on the back-end because nobody wanted to go through the terms and conditions, or understand how the consent worked in the whole approach, or what was even going on with their data.

Alexander McCaig (03:47):

They were just more interested in things like, "I just want to use this tool." And when you think about free tools on the internet, and I know they earmarked this, it's like Wikipedia. That's the one thing that everybody uses and it's free. That's how open source the internet should be.

Jason Rigby (04:03):

Yeah, and then you can choose to donate.

Alexander McCaig (04:04):

And then they ask for a donation. Right? And that's beneficial. And what did Wikipedia do? It allows people to research, to learn, to evolve themselves. But what happens is we've taken a de-evolutive approach with where we went wrong with big tech. And that's the fact is we did everything black box and falsely told people that they can receive this product or service for free, but they didn't realize how much they were giving up on the back end, so much sovereignty and anonymity that they had that was passed out the door and then-

Jason Rigby (04:40):

Quickly.

Alexander McCaig (04:41):

Yeah, very quickly.

Jason Rigby (04:41):

Here's the privacy agreement, scroll down, hit I agree.

Alexander McCaig (04:44):

Be done with it, right? And then go on to be using the system. But on the back end, 23andMe is selling all your genetic data to GlaxoSmithKline. They made a boatload of cash and you didn't get a penny for it. It's your genetic information.

Jason Rigby (04:58):

In fact, you paid them to-

Alexander McCaig (05:00):

You paid... Let's think about this. You paid 23andMe to buy one of their kits.

Jason Rigby (05:05):

$99.

Alexander McCaig (05:06):

$99 or 130 bucks, when it first came out. They swab your mouth. You send it over to them. So, they've collected 130 bucks. Now they're going to process this, and they're going to have this massive genomic sequencing data set on a huge populace of people that have used their product. And then they're going to go back and they're going to take all that data and sell it to a biopharmaceutical company.

Alexander McCaig (05:25):

Are you going to give me a discount on my 23andMe? Was it free? Apart from the insights you're giving me, you just went back and you double-dipped off of who I am and all, truly-

Jason Rigby (05:37):

Literally, who I am.

Alexander McCaig (05:38):

... my most personal information, you sold off. If you look at that as like a metaphor for the approach that the technology industry has taken, that's where we've gone wrong. It's that black box, double-dipping, not telling people what you're doing, and just trying to look at people as a number and take as much value for just the people who run the company and not for the people who truly support it, which are its users.

Jason Rigby (06:00):

Yes, a hundred percent. And in this article, it went into when innovations lead to disaster. And then it says much about societal context and it uses Chernobyl nuclear disaster and the US opioid crisis.

Alexander McCaig (06:15):

Yeah, so just thinking about Chernobyl, right, or nuclear technology. Nuclear technology's great. It's very powerful. It allows us to do a lot of things-

Jason Rigby (06:25):

Clean.

Alexander McCaig (06:26):

... frankly, quite efficiently. Right? The only thing that's unclean about it is the waste. What do you do with it? Do you bury it in the ground here in New Mexico? But the thing is that most good things in life are a double-edged sword. But we've only been blindly looking at it as just a single-edged sword and assuming that nothing is happening on the other side. Well, with Chernobyl, you have the possibility or the risk that you do have some sort of nuclear leak. Right? And people die and it destroys an area for a large period of time. Right? Or if you do create a new tool that affords people anonymity, well, it's like, well, how do you prevent bad actors from coming into a system and misusing it?

Jason Rigby (07:08):

Yes, that's the big one.

Alexander McCaig (07:09):

It's a lot of the trouble you see with things like Silk Road, and that's why the Feds are always trying to shut it down.

Jason Rigby (07:15):

I like how they talked about the opiate crisis, and then they explained, of course, it killed millions and then became an incredibly addictive drug. And then it fueled the black market of pills. And I've known people-

Alexander McCaig (07:28):

I don't want to say that the CIA funds it.

Jason Rigby (07:30):

No, no, but I know what you're getting at there.

Alexander McCaig (07:36):

What does the Department of Justice do with all that money that they seize? I don't know.

Jason Rigby (07:41):

Where does it go?

Alexander McCaig (07:42):

Where does it go?

Jason Rigby (07:43):

But, they were talking about the fractures and the problems of modern America, especially with the lobbyists for the pharmaceutical industry and then our fragmented health system.

Alexander McCaig (07:52):

Yeah. Well, our system is fragmented. It's fragmented because it's not about quality of care. It's about economics. Health has turned into economics. It's not turned into quality of life. That's just the perspective. And that sort of perspective is what's hurting us, that fair value, like the value of life, it's not treated fairly. And then for the people that support all these technological systems and the growth that we're seeing here in the United States and around the world, no one's valuing the individual for supporting that growth. They're just saying, "Oh, look at this company. Look how fantastic they are. They're the champions." No, they're truly not. They wouldn't exist if it wasn't for people using their systems.

Jason Rigby (08:33):

Yeah, and I think also when you look at this digital technology revolution, and they speak to this and I think it's really interesting. They call it, right now, it's facing its own societal reckoning, and as its benefits are eclipsed by the harmful practices and business models it has unleashed. And I like the word unleashed. That was a perfect word.

Alexander McCaig (08:52):

Well, unleashed is a good thing because there's very little regulation on a lot of the stuff. And so it's easy for a lot of these companies to take rather somewhat unethical approaches. And they're doing a lot more harm than good, but I don't know.

Jason Rigby (09:09):

Well, I mean, they talk about this, insurance companies harnessing data to exclude certain customers unfairly.

Alexander McCaig (09:15):

Okay. That's a tough call, right? Because the more information an underwriter gets, the better they can go into their own predictive matrices to say that the person might perish at this point. It's like people want to turn the lights. Well, how much information do I want to share before it starts harming me? So, I guess, just over time, insurance companies will have to evolve as they receive total or perfect information. It's not so much about excluding people, but now accurately placing them in a better statistical bucket than before.

Jason Rigby (09:45):

To me, the health insurance is the same exact as everything medical. I mean, health insurance is looking at how do we mitigate risk and how do we make the most profit.

Alexander McCaig (09:57):

That's all it looks at. Health insurance is not like, "Okay, how is it that we get people to pay as little money as possible to us and do whatever we can to increase their quality of care and their quality of life?" When's the last time you heard insurance companies say that? Hey, we want you to pay as little as possible.

Jason Rigby (10:13):

And I love that the health insurance companies now are buying the... They're buying the pharmaceutical rights, the things. What these pharmaceutical companies are doing is they're buying the middleman out and the insurance when they go to HR departments and negotiate. When you own the whole tree, and then you can charge each one.

Alexander McCaig (10:32):

Yeah. If I own the tree, I'm the guy making the pills. I'm the guy sending the pills to CVS Pharmacy. And then I'm the one that says you can get the prescription.

Jason Rigby (10:39):

Yes, and I'm the one that is going to sell the health insurance to the company. And then I'm not only going to be the underwriter of the health insurance, I'm going to also have a middleman in there that goes in there and he's going to get a cut.

Alexander McCaig (10:54):

Well, that's all this double-dipping that's going on. Right? If you don't treat people as human beings, and if you are not clear and transparent with them without trying to be naturally convoluted or confusing in your terms and conditions, then you start to move over to the fair value. But right now we've been doing all these, frankly, ugly black box practices, and it hasn't truly come back to say that the internet has become a thing of value. It became a thing of abuse.

Jason Rigby (11:26):

Yes. Yeah. A hundred percent.

Alexander McCaig (11:28):

Just look at the NSA.

Jason Rigby (11:30):

I think abuse is perfect example in the article where it talks about reselling data without permission.

Alexander McCaig (11:34):

Yeah, they've been reselling it without our permission all the time. It was like 23andMe.

Jason Rigby (11:39):

Think about that. So, what if somebody walks to your house, takes your car, and then sells it to somebody else? And then you walk outside.

Alexander McCaig (11:46):

It was in my driveway. What are you doing selling my car?

Jason Rigby (11:48):

You would say, "Somebody stole my car."

Alexander McCaig (11:50):

Yeah. You would say, "Someone stole my car." Or like, "Why is my car getting repossessed?" Well, you didn't pay your bills. Oh, so I never truly owned it in the first place. Well, I understood that because I signed that contract and that was very clear. I didn't pay my bills, I don't own this anymore. Right? But you're the one that generated the data. You own it.

Jason Rigby (12:07):

Yeah, exactly. So, I mean, if you think about it, it's even more of theft.

Alexander McCaig (12:15):

It's like a weird digital theft that's going on.

Jason Rigby (12:16):

Yes, yeah. And then they talked about dishing out criminal sentencing and predicting student grades. When you get into, and we've talked about this before, when you get into machine learning, and you get into race and ethnicity, it can get really sketchy.

Alexander McCaig (12:32):

There's already a lot of biases. We talked about this in a couple of episodes with the algorithms. And that can put people in a box they don't belong in. I'd love to see an algorithm try and put me in a box.

Jason Rigby (12:43):

Yes. Think about it. So, if 90% of the prisons in the United States are full with black Americans, and we can talk about the injustice of that and that'll piss me off so I'm not going to talk about that because you've got-

Alexander McCaig (12:55):

You got minor criminal sentences and stuff like that.

Jason Rigby (12:57):

... weed, guys in prison for selling weed. Let's not even go there.

Alexander McCaig (13:00):

No, [crosstalk 00:13:01].

Jason Rigby (13:00):

But I'm just telling you, if you should put that into machine learning and said 90% of the people that are-

Alexander McCaig (13:07):

The thing's going to be like, "Okay. I know exactly who to target."

Jason Rigby (13:09):

Any person that has black skin is a criminal.

Alexander McCaig (13:12):

Yeah. And so all the facial recognition software that's on the streets and all over the place, you even see them with cameras in London, they're going to be targeting people that they shouldn't be targeting.

Jason Rigby (13:21):

Yes. Unfairly.

Alexander McCaig (13:23):

Totally unfairly, totally unjustly.

Jason Rigby (13:25):

Or you have issues with, and I love the higher learning trying to adapt to data, but where you have Asian students coming over from abroad and coming here, and then they're like, "Well, what do we do? 90% of our schools would be full of Asian students." So, how do we do this fairly, where we can bring people in from all? So, I think we're at this crossroads in this tipping point of are we going to continue down the road of where data is going to be all about efficiency, profits-

Alexander McCaig (13:59):

Profit sharing, yeah.

Jason Rigby (14:01):

Yeah, like we've done in the past and we've turned... Anything good, we've soured it. It's become ripe. I had an avocado I left in my refrigerator, and it was in there for a month, and I pull it out, and it's all shriveled up and nasty and rotten, and you can poke your thumb through it. Disgusting.

Alexander McCaig (14:19):

What we find is that capitalism and competition tend to sour really nice things. They take life out of things that truly would work well. Wikipedia, as an example again, is not this big capitalist enterprise. It's a bunch of people coming together on the internet to contribute towards knowledge, and there's no cost involved.

Jason Rigby (14:43):

I love that. Yeah.

Alexander McCaig (14:45):

That is a fantastic model for how things should be used. That's why the open-source community says that if the internet should be free, so should all the software on it. People shouldn't be limited by their creativity in what they can use.

Jason Rigby (15:01):

Yeah. And when we get into the article, it talks about whistleblowers, economists, historians, anthropologists, and they're calling for reform. And so I kind of want to take a approach, a proactive approach right now because we've been going just slamming everything and being kind of negative with it, and I like how the article started off that way to realize like...

Alexander McCaig (15:22):

You got to shake people up a little bit. Look at the common themes. Like, the same thing happened in The Industrial Revolution, and now it's happening now with our data.

Jason Rigby (15:30):

So, Paul Romer, this is great. He was a former chief economist of the World Bank and winner of the Nobel Prize for economics. He said, "I don't think anything has come out of the internet revolution that was actually a net positive for society, except maybe Wikipedia." Except. He's saying out of all the trillions of data packets that have been put out there, all the... Isn't there like... I forgot what they said. It's like 9 million videos uploaded every second or something on YouTube.

Alexander McCaig (15:57):

That's freaking crazy.

Jason Rigby (15:58):

And when you start seeing and how many posts are being... I mean, it's absolutely nuts. And he's saying, "The only thing I can think of is Wikipedia." This is a Nobel Prize winner.

Alexander McCaig (16:06):

Yeah, but that is a net positive because he realizes the value of education and a free open source tool. The internet was designed to be free and open source. I know that DARPA had originally worked on this for a defense thing, but now when you open it up to the general public, don't use it as a system to attract people or abuse the information they're creating. Do it to help elevate us collectively as a society.

Jason Rigby (16:28):

Dude, I guarantee you, if you did a study on this, I would bet my life on this.

Alexander McCaig (16:33):

Your life.

Jason Rigby (16:34):

I would bet my life.

Alexander McCaig (16:35):

I'd have to put you down.

Jason Rigby (16:36):

Yes. You can put me down with my .44 Magnum or something. Boy, I got to put you down.

Alexander McCaig (16:42):

Excuse me, I have to go find a farmer. Can I borrow this weapon for a minute? I got to put him down. He lost a bet.

Jason Rigby (16:47):

So, whenever you think about people in learning, I guarantee you, majority of the learning that's going on now, I would say from 35 and under is from YouTube watching somebody do something and then copying that being done on YouTube.

Alexander McCaig (17:03):

We're good at doing that whether it's something that is truly beneficial or not. I mean, there's a lot of that learning that happens, and I know that like Khan Academy took that approach. That is a fantastic thing. That's online school free for everybody.

Jason Rigby (17:17):

Yes. That's amazing.

Alexander McCaig (17:18):

Teaches on economics, health, science, chemistry, physics, whatever, you name it. It's all there. That being-

Jason Rigby (17:24):

And I know MIT and a couple of others have put just their classes out.

Alexander McCaig (17:27):

MIT has all their open coursework you can find online. The benefit, the net positive benefit beyond Wikipedia that that economist was talking about from the World Bank is that it's the ability to share. And the double-edged sword of sharing is tracking. Right? If you're going to share, things are going to be transparent. They're going to know where it's coming from. So, just share better.

Jason Rigby (17:57):

Yes. And I think be proactive with your data. Understand. Take the time to read the privacy statement. You'll be shocked.

Alexander McCaig (18:04):

You'd be shocked how much stuff you don't want to use.

Jason Rigby (18:06):

And the consent for this, and the consent for that, and the consent for this, and you're just going click, click, click, I agree.

Alexander McCaig (18:12):

Consent should be clear. Right? Do you want to do this? Yes or no. Oh, okay. And there's like, why is it that I couldn't go and use a system, some sort of online system, and if I say, "I don't want to use that way," why does it prevent me totally from using their system? It should prevent the people who created the system from doing what I said not to do.

Jason Rigby (18:36):

Consent has to be clear in everything in your life. What if we said, "Well, I'm not sure if you come into work today you're going to get COVID or not because we don't really test people here and we don't check temperatures or nothing... But, you know, I would really like for you to come into work now."

Alexander McCaig (18:51):

I proposed to Amanda. What if she was like, "I'm not really sure I consent to that."

Jason Rigby (18:56):

Yeah, exactly.

Alexander McCaig (18:57):

I'd be devastated, you know?

Jason Rigby (18:59):

Yeah, when you propose. You're on a mountain, you're sitting there.

Alexander McCaig (19:03):

Yeah, like, "I'm unsure." And then you're always sitting around wondering, "Wow, why is she unsure?" Play that into data, and it's like, "Well, what's going on with the data? What is all this stuff I'm actually... Why is this necessary? What is actually happening in the background?"

Jason Rigby (19:16):

Why is there nine pages for me to sign up for this website that I agreed to without reading?

Alexander McCaig (19:20):

That's exactly right. Just look at all the fake outrage that's going on between Apple and Facebook right now. It's ridiculous. You guys drove yourselves into the corner. What do you expect? And now you're bickering at one another from a problem you both created? And then you're trying to look like the good guy coming out on top? You both screwed up.

Jason Rigby (19:42):

Then you get politics involved, and these guys don't even know how to put on a computer.

Alexander McCaig (19:45):

No, I know, that's exactly-

Jason Rigby (19:46):

And you're yelling at them and bringing them up to Capitol Hill. So, it's like, I say this, the people need to go back to these companies and say, "You broke this, you better freaking fix it."

Alexander McCaig (19:57):

You need to fix this and not for your benefit, but for us.

Jason Rigby (20:00):

Yes. As humanity. We get that you want to serve us ads. I'm fine with that. You need to make money. You're a company. But you have taken from us, pillaged from the village.

Alexander McCaig (20:11):

Pillaged from the village of data. The data village.

Jason Rigby (20:14):

Yes, the data village.

Alexander McCaig (20:15):

You pillaged the data village.

Jason Rigby (20:15):

You've come in here, took us over. We've been subject to you, but no longer is this going to happen.

Alexander McCaig (20:21):

No. And honestly, you shouldn't have designed it like that in the first place. And that's why, if you want to do something that's fair, egalitarian, humanitarian, high ethical, high moral value, and make sure that you are evolving yourself and your wallet, go to tartle.co.

Jason Rigby (20:37):

It's that simple.

Alexander McCaig (20:38):

It's that simple.

Jason Rigby (20:40):

Start getting paid.

Alexander McCaig (20:41):

Yeah. Share your data. Earn money. Change your world. It's a three-step process.

Jason Rigby (20:45):

It's that simple. Love it.

Alexander McCaig (20:47):

Love it. Thanks everybody.

Jason Rigby (20:55):

[inaudible 00:20:55]

Speaker 1 (20:58):

Thank you for listening to TARTLE Cast with your hosts, Alexander McCaig, and Jason Rigby, where humanity steps into the future and source data defines the path. What's your data worth?

Jason Rigby (21:17):

[crosstalk 00:21:17] like a car share company.