Tartle Best Data Marketplace
April 5, 2022

It's Time: Stop Big Tech from Abusing You Today!
BY: TARTLE

In the previous episode, we discussed how using TARTLE can help save your life. Now, let’s talk about how we can improve your quality of life.

One of the most pressing problems we have today is that people don’t know the value of their data. We find ourselves giving away a treasure trove of information in exchange for meager services, like access to our ancestry or participation in a social network.

Big tech and other companies make enormous amounts of money from our data. It’s time we take back what’s rightfully ours.

You Are A Modern-Day Slave

We are allowing big tech to create a captain’s log of every human being there is. We are giving them the power to look over our shoulders as we write down our most secret desires and report the outcomes of our day-to-day activities.

It’s a violation of our right to privacy and it happens on a daily basis. We need to wake up now, before it gets even worse.

At the rate we’re going, we are letting companies extract data straight from our own bodies. They conceal their intentions behind 200 pages of legal mumbo jumbo, then turn around and make $600 billion off of our human work.

We’ve got a term for it here: data slavery.

Living in the Wild West of Data

What’s extra insidious about this is that it’s happening right under our noses, as we speak. We wouldn’t let a company like Walmart get away with slave labor, so why do we let it happen with our personal information?

These companies may not be stealing from you to your face, but that doesn’t make it okay. You deserve to know that you are being farmed for your data and that other people are making money off of you.

You are a unique human being. Throughout your existence, you’ve created thoughts, actions, and preferences that are valuable to the evolution of humanity. You do not deserve to become a pawn in someone else’s convoluted gamble for profit and power.

Conclusion

TARTLE is designed to give people a sense of reciprocity: the feeling that they’re actually getting what they deserve. The platform is a more ethical and sustainable way of sharing data.

On the TARTLE Marketplace, you get to choose how much data you share. You can capture your ancestry, genomic sequencing, social determinants of health, and more. Once you’re done uploading that information, you can choose who you sell that data to. That’s how important your consent is to us. You are a part of the process every step of the way.

You also get to keep all of the money you earn from TARTLE. We do not take a cut from your hard work.

What’s your data worth? Sign up for the TARTLE Marketplace through this link here.


For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Alexander McCaig (00:07):

We all need to get our cut, and I'm not talking about CRISPR.

Jason Rigby (00:11):

CRISPR babies.

Alexander McCaig (00:12):

CRISPR babies.

Jason Rigby (00:14):

How much of our cut should we get? How much did they make?

Alexander McCaig (00:19):

These companies-

Jason Rigby (00:25):

23andMe.

Alexander McCaig (00:25):

... or Ancestry or any of those people that do genomic testing for the public, made around $600 billion selling the research, which you paid them to do. You paid a hundred bucks swabbing your mouth.

Jason Rigby (00:41):

Or they have companies that do lab testing. They'll do testosterone testing. They'll do STI testing.

Alexander McCaig (00:51):

Hold on. They couldn't do the testing if it wasn't for your data.

Jason Rigby (00:57):

Your work.

Alexander McCaig (00:58):

Your work.

Jason Rigby (00:59):

Yes.

Alexander McCaig (00:59):

Your life. You've showed up, and then they're making a mint. They think they own that whole thing. You don't own me. You don't own any of this. I get you're doing the research, but the fundamentals, you couldn't do the research if it wasn't for people.

Jason Rigby (01:15):

But I want people to realize this because I think this is why it's so important. This is why TARTLE is so important. And this is really macro. But I really want people to understand this. You're doing something with your human body. They're taking data out of your human body. They're turning around, and you signed a little data privacy, you know not everybody reads it, it's 200 pages or whatever.

Alexander McCaig (01:38):

Nobody read it.

Jason Rigby (01:38):

It's not clear what's happening. They don't make it clear. They purposely put in legal mumbo jumbo. And they make $600 billion for free off of human work from... It's you, it's your data, it's your body.

Alexander McCaig (01:58):

It's like data slavery.

Jason Rigby (01:59):

Yes, a hundred percent.

Alexander McCaig (02:00):

Is it not?

Jason Rigby (02:02):

It is.

Alexander McCaig (02:02):

You put in this work. And the way I coin this is, if you're paying 100 bucks, that should be buying you a share of their company so that you earn in those profits. You're the one fundamentally supporting them.

Jason Rigby (02:17):

Yeah, we're not opposed to the company. We're not opposed to them, anything that they're doing.

Alexander McCaig (02:21):

I love the fact that there's research going on.

Jason Rigby (02:24):

Yes.

Alexander McCaig (02:24):

That's not the issue. The issue is ethically, you are not reciprocating the value you have received as a business, and people are getting screwed. They're like, "Oh yeah, but look at all the good we're doing. You know, telling them outcomes." No, no, no, no, no, no, no. Tell people all the outcomes you want. People are trying to find out where their next meal is.

Jason Rigby (02:45):

You made $600 billion off of people's work. You were a company that got that. You actually got people to pay you, and then you took their data for free, and you made money. If there was a company, if Walmart turned around and said right now, for $600 billion worth of our wages, we're going to make people work for free. We're going to enslave them and make them work for free.

Alexander McCaig (03:10):

Could you imagine?

Jason Rigby (03:11):

Could you imagine the uproar?

Alexander McCaig (03:13):

It'd be insane. They are the largest employer of the United States population. Could you imagine the riots?

Jason Rigby (03:19):

Yes.

Alexander McCaig (03:20):

But the thing is, the public is not aware of what is going on.

Jason Rigby (03:24):

No.

Alexander McCaig (03:25):

They need to be aware. This is not... This is a systemic abuse. It's frankly fucking abusive. We've designed TARTLE so that people could actually feel the reciprocity. They could actually get what they deserve.

Jason Rigby (03:39):

It's the fake bead scenario.

Alexander McCaig (03:42):

It's fake beads.

Jason Rigby (03:43):

It's fake beads.

Alexander McCaig (03:44):

Yeah, it's like, "Oh no, you want these beads? We'll trade with you." Wait a minute. This stuff's junk.

Jason Rigby (03:49):

We want people, slaves. They're going to come in our ship, and we don't need to get into the whole story, but this is in the slave trade, they were doing this, and they were going to different continents. They were coming in there, they realized beads was money. So they were coming in there, and they're like, "Hey, we have a factory that can make all these glass beads." They were oversaturating the market with these glass beads, which were fake glass beads, and then getting humans for those beads, and then turning around and putting them into slavery. It is no different than what is happening now.

Alexander McCaig (04:17):

So if I look at these people doing this research, what are you doing? You're getting real human work, real people involved in this, and what do you give them in return? Jack. For what? So they can wait 40 years for the research to get vetted? No, absolutely not. So you can make your mint, tell your shareholders, "We're doing that. We're doing great. Look at us. We're scamming the public." Cut it out. Give them a damn share.

Jason Rigby (04:43):

Well, we talked about this before, but Geico has a thing, because I have Geico. They have a thing where they're getting your location data and all that stuff. They have an app, but they made it very clear, "Hey, we're going to take your data and then we're going to give you a discount on your policy." Now, I have a choice. They gave me a choice to say, do you want to do that or not?

Alexander McCaig (05:02):

Oh, a choice. Your choice. Oh, there was consent in this.

Jason Rigby (05:04):

Yes.

Alexander McCaig (05:05):

Oh.

Jason Rigby (05:05):

That's totally fine. Now, whether they're getting all... If they're getting all my location data and everything and I want to get $40 off my policy, that's for me to decide whether $40 is worth it or not.

Alexander McCaig (05:15):

Correct.

Jason Rigby (05:18):

But that doesn't change the outcome.

Alexander McCaig (05:21):

No.

Jason Rigby (05:22):

It does not change the outcome. And if they're making $4,000 off my data and I choose to save 40, that's on me. We have a choice to make bad investments.

Alexander McCaig (05:31):

And that's a bad investment.

Jason Rigby (05:32):

Every day, we do that.

Alexander McCaig (05:33):

That's not right.

Jason Rigby (05:34):

No.

Alexander McCaig (05:34):

It's backwards.

Jason Rigby (05:35):

Yeah.

Alexander McCaig (05:36):

And we're going to call out individuals that do not strike a balance that needs to be there. They are not respecting the people who are supporting their products or services. There is absolutely zero respect for the human being, zero.

Jason Rigby (05:54):

Yeah, and I think it's... I'm going to use this terminology in a good way. I think it's the wild west of data right now.

Alexander McCaig (06:00):

Yeah, they just think how much can I get? How quickly? I want [crosstalk 00:06:04]-

Jason Rigby (06:03):

Who has the bigger guns?

Alexander McCaig (06:04):

Yeah, of course. I got bigger service.

Jason Rigby (06:07):

Yeah, I come in there, and I'm just going to... I can get my little group together, and then we can go rape and pillage and salvage cities and towns or whatever. But they're doing the same thing with our data.

Alexander McCaig (06:19):

Who's got the better algorithm?

Jason Rigby (06:21):

Yes.

Alexander McCaig (06:21):

And who can get data faster? To feed it.

Jason Rigby (06:23):

Yeah, and how much of the data? We have a meme up right now where it shows a picture, but it's the meta, like Facebook's talking about meta, but their meta now is their name. But I'm like, the meta universe just means they're going to be taking metadata from you.

Alexander McCaig (06:38):

All the time.

Jason Rigby (06:38):

Until we fix that.

Alexander McCaig (06:39):

Which is, frankly, some of the most important data.

Jason Rigby (06:41):

Yes, which is some of the most important data because this is going to see... The metaverse, if we don't fix this now, we're in trouble because the metaverse is going to cause us all our entertainment, our deepest desires, we're going to be dating on there. We're going to be going to school on there. We're going to be working on there.

Alexander McCaig (06:57):

Ready Player One.

Jason Rigby (06:58):

They're taking... Yeah, Ready Player One. They're taking all of that data, and they're very personal data. If you're dating on there, that's extremely personal. People are going to want to have that opportunity to be able to do that. And they're being able to take all that data for free?

Alexander McCaig (07:12):

They're going to sell your outcomes.

Jason Rigby (07:14):

Yes.

Alexander McCaig (07:14):

To other people.

Jason Rigby (07:14):

They're going to be like, oh, well, people in this area do this and do this and do this, this outcome happens. So how can we create that outcome more? How can we make it easier for them to get to first base?

Alexander McCaig (07:25):

Yeah, how can I sell them a date night at this restaurant? I can tell the restaurant you need to advertise to this person specifically.

Jason Rigby (07:30):

Yeah, yeah, exactly. And then it's only going to get worse and worse and worse. We have to make a stand on data, and people don't realize the importance of it. But it is who we are. You have a company extracting your body data.

Alexander McCaig (07:45):

Yeah, this is-

Jason Rigby (07:48):

I mean, I hope people get this.

Alexander McCaig (07:49):

This is the log. This is the captain's log of every human being.

Jason Rigby (07:53):

A hundred percent.

Alexander McCaig (07:55):

That's all it is. That's like someone walking in your house, grabbing your diary, reading it, and then saying, "Oh my God, can you believe what this guy wrote?" And then sharing it with everybody else.

Jason Rigby (08:06):

So explain, because I want to get into this, explain the whole idea of us working, trying to work with these companies and what TARTLE's doing with just the genome part. Because you can download that.

Alexander McCaig (08:19):

Correct.

Jason Rigby (08:19):

I did. I went in there and downloaded, and I have this huge file, and it's, this is me.

Alexander McCaig (08:24):

Yeah, that's just who you are. Here's what we're doing. We are showing up with the tool to these people that were otherwise abusing a system and saying, okay, we're not asking you to end what you're doing. We're just offering you a better way to go about it. More ethical, more sustainable way to go about it, one that really brings in those people that are helping you do your processes. That's what TARTLE's doing.

Alexander McCaig (08:56):

So as we capture ancestry data, as we allow individuals to capture their genomic sequencing, their social determinants of their health, they are a part of that process. By using TARTLE as a business, you are becoming a part of that process with them. You're inviting them into what you're doing, as you should.

Jason Rigby (09:18):

But through the TARTLE marketplace, I can download my genome data that I got from... I'm going to use it from 23andMe, that's where I got it from, because you can download that. I can put that into the marketplace and get paid for it.

Alexander McCaig (09:30):

You can get paid to share that so others can do research, and not just have it all, all the money piped into 23andMe so that they can rake you twice, first for the 99 bucks, and then they're going to rake you again because you're not going to get any of the cash from them selling that research.

Jason Rigby (09:46):

And then they're going to rake you again because they hit you up with emails to fill out these surveys. So then you're doing more work every month. You're filling out-

Alexander McCaig (09:54):

You're not getting anything in return.

Jason Rigby (09:55):

... They're looking and saying, how many surveys have we got people to fill out? And then they're taking those surveys and they're selling that data continually.

Alexander McCaig (10:00):

Yeah, you're not getting anything from that.

Jason Rigby (10:01):

So you're working for them every month.

Alexander McCaig (10:02):

You work for them. It's wrong.

Jason Rigby (10:05):

And they give you a report saying you may have cancer, you may have this, which the FDA got onto them for that. So, it's just all around a bad system.

Alexander McCaig (10:14):

It's a bad system.

Jason Rigby (10:15):

It's not ethical.

Alexander McCaig (10:17):

Nothing about it's ethical, so we would like to offer this opportunity to these businesses, to de-risk what they are doing, and pump some ethics back into their practices. Actually be a little bit more fair by using TARTLE, by respecting the human beings that are working with you. You have to do that. You are living in an unsustainable world. You are on a collision course to chaos at your companies. What you are doing is you are fundamentally driving the market to go talk to their politicians, to write legislation against you because you're screwing up.

Jason Rigby (10:59):

It's going to happen.

Alexander McCaig (11:00):

It is happening right now.

Jason Rigby (11:02):

It's just a matter of time. You're going to get in trouble.

Alexander McCaig (11:04):

You're going to get in trouble.

Jason Rigby (11:06):

The risk is there.

Alexander McCaig (11:06):

So let's offer you an option to be fair with people, to treat them as they should be treated, respect them for their work and respect them as human beings by coming to them and meeting them on TARTLE.

Jason Rigby (11:18):

So, what would be the best way for one of these companies? Could we challenge any C-suite person to get ahold of us, contact us? If you're in a leader position at one of these companies, we'd love to talk to you.

Alexander McCaig (11:28):

Good talk.

Jason Rigby (11:29):

What would be the best way they could do that?

Alexander McCaig (11:31):

To reach out to us?

Jason Rigby (11:31):

Mm-hmm (affirmative).

Alexander McCaig (11:32):

Social media. You can hit us on LinkedIn. You can send an email to contact@tartle.co. You can talk to our press manager. Any of them. Just reach out. We are always listening. We're not looking at you as the enemy. That's not what this is about. We're just trying to offer you a better bridge to cross.

Jason Rigby (11:53):

Yeah, and that's something that people always, especially nowadays, we don't understand, and I want to close in this. Just because I disagree with you doesn't mean that you're an evil person.

Alexander McCaig (12:03):

No, no, no, no. Correct, thank you.

Jason Rigby (12:05):

You see what I'm saying?

Alexander McCaig (12:05):

Yes.

Jason Rigby (12:06):

But we have this, like, you can go on Twitter and you see this, there's this huge divide of people throwing stones at each other, hiding in their little tribes, and then saying this person's bad or evil or they need to be canceled or whatever. We're not saying 23andMe-

Alexander McCaig (12:20):

Go away.

Jason Rigby (12:21):

... is evil or bad or go away. We're just wanting to offer a solution to understand, to better humanity. That's it.

Alexander McCaig (12:29):

That's it.

Jason Rigby (12:30):

Ethics.

Alexander McCaig (12:30):

We're going to call you out on your practice.

Jason Rigby (12:32):

Yes.

Alexander McCaig (12:32):

Because it's imbalanced.

Jason Rigby (12:34):

Well, it's unethical.

Alexander McCaig (12:35):

So we're going to call you out on it.

Jason Rigby (12:36):

Yes.

Alexander McCaig (12:37):

We're talking about human beings, and we will always be here to stand, put that fucking stake on the moon, on the earth. I don't care where it is, to say-

Jason Rigby (12:45):

Mars?

Alexander McCaig (12:46):

Yeah, for Mars. We're going to do it for human beings. We got to support them because resource holders will abuse it. They will take the path of least resistance. And that one that we're currently taking now is not sustainable.

Speaker 3 (13:06):

Thank you for listening to TARTLE Cast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future and resource data defines the path. What's your data worth?