If you’ve been here a while, you know TARTLE is all about focusing on the importance of source data, of going to the individual for their data. Well, at long last it looks like the rest of the world is starting to catch up to what we have been talking about for years.
This is in large part due to the way the public has begun to push back on the massive invasions of privacy that most of the big tech companies have been engaging in, without even thinking about it, for well over two decades now. At first, that concern was brought to the fore by a few lone voices crying out in the digital wilderness. Now that the message has filtered into the mainstream, governments have begun to respond, proposing and enacting a whole raft of fresh privacy laws around the developed world. Thanks to the laws of cause and effect, the tech companies are starting to respond as well, spending a lot of time and money figuring out how best to deal with all of these new laws. Unfortunately, as one might expect, they are not always responding in good faith. Instead of trying to figure out how to change their business models in order to meet the needs of their customers, they are often trying to figure out how to work around the laws, to find the technicalities that will still allow them to do business however they wish. Why?
The answer is that they grew up at the beginning of the digital age. The lack of laws and regulations led to a ton of innovation, but it also allowed for various abuses. A good analogy is the beginning of the Industrial Revolution, when there was a flood of new products, technologies, and processes that completely transformed society. However, little concern was given to the long-term consequences of all that activity. My favorite example is how the shoreline of Chicago is basically built on all the crap that collected there after people threw trash in the Chicago River for years. It got so bad that part of the clean-up effort involved burying the piles of trash and the Army Corps of Engineers literally reversing the flow of the river, sending a bunch of trash down to St. Louis.
The tech companies treated their customers’ privacy with the same disregard, tracking buying habits, browsing history, physical location, and, most notoriously, listening in on conversations and adjusting ads and news stories based on them. They did and still do all of this without anyone’s meaningful consent. Sure, it might have been in the fine print of the terms of service, but everyone knows that no one reads those. It’s just an annoying legal formality that everyone pretends accomplishes something. The point is, they got used to it, and their whole business model is built on being able to do whatever they want to collect, use, and resell data. Their business models treat people like cogs in the machine rather than sovereign individuals. Now, just as we’ve spent the last few decades cleaning up the mess of the early Industrial Revolution, we’re starting to do the same thing with the privacy abuses of the digital age.
TARTLE is, of course, already ahead of the curve. We started off wanting to return control to the individual, to let people, not corporations or even governments, decide what they should or should not do with their data. When you sign up with us and connect your various accounts, you are the one in control, able to share as much or as little as you wish. Yet your data is still valuable to many companies around the world, which is exactly why there is a financial incentive for you to sell that data if you choose. We’re out there working with companies and governments, trying to impress on them the importance of not just the source data but the humanity of its source, and to remind them that you are more valuable than ones and zeroes.
What’s your data worth?
Speaker 1 (00:07):
Welcome to TARTLE Cast with your hosts Alexander McCaig and Jason Rigby, where humanity steps into the future, and source data defines the path.
Jason Rigby (00:26):
What is going on? How are you enjoying the weather outside? It's freezing now here.
Alexander McCaig (00:33):
There's an interesting balance when you walk outside in New Mexico: the sun is still hot.
Alexander McCaig (00:38):
But the air is ice crisp cold.
Alexander McCaig (00:40):
So you're like, should I wear a jacket, short sleeves, long sleeves? I'm completely unsure of myself.
Jason Rigby (00:44):
Yeah. It's crazy in the morning when you run because you get the cold air in.
Jason Rigby (00:48):
But then you can feel the heat coming from the sun.
Jason Rigby (00:50):
So you could almost not have a jacket on or anything. Have shorts and a t-shirt on because once you start sweating, but you're still... when you suck in that cold air.
Alexander McCaig (01:01):
That's an interesting example of what other people would say is very first-party data.
Alexander McCaig (01:07):
Is knowing how dynamic I am with my behaviors as the weather changes very dynamically.
Jason Rigby (01:13):
Right. So I want to... why should we care about first-party data?
Alexander McCaig (01:17):
Well, you and I have been touting this for years, and there are articles talking about it right now. It's source data, right?
Alexander McCaig (01:24):
It's going to the primary source. The reason you do that, the reason you should care about that first-party data, is because, first of all, from a legal standpoint, all the governments are pushing to give people back control of that information, where secondary and tertiary parties are not going to have the ability to move that data for you. You're going to end up taking all the control. I mean, that's the model we've designed for TARTLE.
Alexander McCaig (01:48):
Right. And many companies have said, "Oh, it's a big priority for us to focus on data privacy laws." Well, of course it is. You have to figure it out because the entire model of your business is the opposite of what these governments are pushing. So yes, it's an obvious priority for you. And you have to rework how your company fundamentally operates to move into this new paradigm.
Jason Rigby (02:13):
Well, I know companies are very... because we talked last week about that proposition in California. And they had passed some laws, and then, of course, they talked about Spot-
Jason Rigby (02:24):
Yeah. They talked about Spotify and Google.
Jason Rigby (02:27):
Being able to maneuver around it.
Jason Rigby (02:30):
So, I imagine there's legal teams that are saying, "Okay, what's the new law? What's the new threat? How can we move around this to be able to still collect..."
Alexander McCaig (02:39):
That's the thing.
Jason Rigby (02:39):
"... gold data."
Alexander McCaig (02:40):
How do we bend around the rules?
Alexander McCaig (02:43):
Right. The problem is all these other firms are so focused on it because they know it's going to cripple their businesses.
Alexander McCaig (02:50):
Because at the fundamental base level, they weren't designed to operate with that sort of sovereignty to the individual. Their idea is like, "Oh, we're going to have a very siloed approach. We're going to collect everything and hold it. And we're going to be the one that reaps the benefit." Now, what we're seeing is that control over data is becoming decentralized.
Alexander McCaig (03:08):
And in doing so, as you decentralize control, you move it away from those businesses and give it back to the individual. They are then required to be responsible for that information. And that responsibility is going to require a tool for its management. And the tool for that management is something called TARTLE. I don't know if you've ever heard of it.
Alexander McCaig (03:25):
Right. But that's the transition that's happening. And legal changes are... policy is pushing things in that direction, but it's not very difficult to adopt or work with those policies if you're working at a very human level in the first place.
Jason Rigby (03:41):
So why can't capitalism figure this out on its own without government?
Alexander McCaig (03:44):
Because capitalism isn't designed in itself to benefit everybody. There's socialism.
Alexander McCaig (03:51):
There's capitalism, right. But then you have this weird middle ground that you're looking at with decentralization. Now, everybody themselves can be their own best capitalist, right. But that's going to require responsibility. And that means it's not about giving the power just to a 1% profile.
Alexander McCaig (04:11):
It's about giving the power back to the 99%. So how is it that you can still create economic gain but giving power and sovereignty back to individuals to do that themselves? So how do you still keep those economic models actually going? And that's a new sort of shift that people need to figure out, and that's going to be happening with that data. Data is the first thing.
Alexander McCaig (04:33):
It's such a valuable market for so many companies because they want insights on what we do. They want that first-party information.
Alexander McCaig (04:40):
Because that can tell me about those behaviors if I'm going on that run. The weather's changing. Do I rip my jacket off? Am I sweating?
Jason Rigby (04:45):
Am I in Pennsylvania? And did I vote for Biden or Trump?
Jason Rigby (04:50):
And go directly to that person and ask them that.
Alexander McCaig (04:52):
You can go in. Now, the conversation is not, I'm going to talk to a business or a business of a business about a person.
Alexander McCaig (04:58):
I'm going to go have a conversation with the person.
Alexander McCaig (05:00):
Right. So don't go to these other places to buy the info. Go directly through TARTLE to purchase information from that individual. You get that first-party data. We call it source data. It's a very primary thing. So when you go to do research, right, you always want to find out what the source is.
Alexander McCaig (05:15):
Like you're citing your sources. Well, that source is the person. They are the citation of all that data. They are the provenance of its lineage, right. So that's what the focus is going to be on. That's the shift. And that can only occur with that decentralization, but people also taking responsibility over the things they create. And we might see something interesting: that policy will push so hard that all the control is given to people, and then there's no momentum. People are like, "Well, now I have control of my information. Where does it go? What do I do with it?"
Jason Rigby (05:44):
Right. And what would cause tech companies... that would cause them to sputter too.
Alexander McCaig (05:50):
It would cause them to sputter.
Jason Rigby (05:51):
Because now, you're looking at profits that we've been used to.
Alexander McCaig (05:54):
They can't... from a profiteering standpoint, it would slow them down.
Alexander McCaig (06:00):
From a business operations standpoint, it would slow them down because their analysis would slow. They otherwise couldn't get information for their product or service at the same rate they had it before. And it's not like that... here's an interesting dynamic to this. They are getting a lot of information very quickly, but it's not primary information.
Alexander McCaig (06:21):
So even if you did slow it down, you're getting source information, that first-party data. And because you get that source information, it's far more valuable. So the rate at which you receive it doesn't have to be as instantaneous as you're used to.
Alexander McCaig (06:34):
You can actually receive it at a slower rate, but you know that it is an absolute goldmine.
Jason Rigby (06:39):
Yes, yes, exactly because it would be... let's say somebody is wanting to buy a Ford truck.
Jason Rigby (06:47):
And you've used the example before. Maybe they were changing a baby's diaper or something.
Jason Rigby (06:51):
And the Ford truck commercial went on from YouTube, and they didn't hit skip because they were changing a diaper. And now, Oh-
Alexander McCaig (06:57):
He's watching it for 90% of the ad.
Jason Rigby (06:59):
Yeah. Oh wow. So...
Alexander McCaig (07:00):
Let's push some dollars to that guy.
Jason Rigby (07:01):
Yeah. Or he had it on his phone, and he was watching an Amazon show, but he was on his phone on YouTube too. So he went to set that down.
Jason Rigby (07:09):
Because I know these marketing companies want us to have three screens.
Jason Rigby (07:13):
A laptop, a phone, and a TV, so they can serve ads to all three.
Jason Rigby (07:19):
So you're getting this false data.
Alexander McCaig (07:21):
Yeah. When we look at people, this is how I simplify this current model. Right now, you are the annuity stream for companies that use data.
Jason Rigby (07:34):
I love that.
Jason Rigby (07:35):
That's a really good analogy.
Alexander McCaig (07:36):
You are the pipe that gives them the revenue. You are nothing but an asset to them.
Jason Rigby (07:44):
That Tesla cannot run without that battery.
Alexander McCaig (07:46):
Correct. You are not an asset to yourself right now with your data. You are not your own annuity stream. Okay. That is being taken from you. There's no true value, no ethical, human, egalitarian value, [inaudible 00:08:03] value, that you receive in the current model. So that shift is going to be a little bit harsh. But in order for you to take that annuity stream away from them and bring it back to yourself, you need to claim that ownership, and you have to become very responsible. And you need a tool that is also responsible enough to match, fundamentally, philosophically, and legally, those changes that are happening in society.
Jason Rigby (08:32):
I like that.
Alexander McCaig (08:32):
And can you think of a tool that would be able to do that?
Alexander McCaig (08:37):
Only one, right. And it's called TARTLE because that's the design, and that's... we may have been ahead of the game. Actually, we have been ahead of the game for quite some time, for many years now, but it's catching up.
Alexander McCaig (08:49):
And now we're here with this tool, delivering it to people and educating them on it, so that they have that option available. Have you watched the movie The Social Dilemma?
Alexander McCaig (09:00):
How much time you look at a picture, again, it's... what is the ethical approach? Do you look at someone as a human being or as a number, right? Do you look at them as a revenue model, or do you look at them as a living, conscious, breathing being? And right now, they've just looked at them as revenue models, right. And these companies don't exist unless they have advertisers paying into their models. That's [inaudible 00:09:20]. Social media is not social media. Okay. It's just a very good billboard. It's a billboard for people to billboard themselves, to advertise themselves, and for companies to advertise themselves. Or companies to advertise themselves through other people's billboards.
Alexander McCaig (09:35):
Okay. It's not truly a social interaction. It's a very false perspective of what society is and how transactions actually occur.
Jason Rigby (09:45):
And you mentioned the documentary, and I don't want to get too heavy into that. But I can look at the documentary, and yeah, people will understand it, but the draw is too big. It's only going to get worse when you take a child that has been raised on the internet. Raised... for me, if you missed a show, you had to wait till the...
Alexander McCaig (10:05):
The reruns started [inaudible 00:10:06].
Jason Rigby (10:05):
Yeah, the reruns started.
Alexander McCaig (10:06):
You literally have a three-year-old that knows how to operate an iPad very well, on Amazon Fire or whatever. You know the little kids when they have... I watched somebody in the store the other day, and it was just on-demand, on-demand, on-demand.
Alexander McCaig (10:20):
Constantly. People recognize that there's a problem with social media, but I think it's too big and too much... There's too much addiction and there are too many dopamine hits for people to give it up.
Alexander McCaig (10:35):
That's all it is. And I think they made a quote in the film that a tool is only a tool if it sits there at rest and does not agitate you or cause you to have some sort of emotional response to use it. A tool only works when you want it to go work.
Alexander McCaig (10:57):
When you choose to make that choice. And so, at a psychological level, they're removing choice. And so, when you look at primary data, with what we're talking about in this article, that should be the focus. What companies should be focused on is getting that source information, that first-party data. You have to appreciate someone's choice. You have to willingly give them that control. And at first, it's going to be uncomfortable to give people that control over the information. But what you'll find is that you'll place a much deeper value on the individual.
Alexander McCaig (11:28):
Okay. They're going to be more trusting of you because you're coming to them and asking them directly for that information. It's not pulled from some black box and shared between independent parties that you don't know. In doing so, that makes people feel more comfortable. And they'll be more willing to give you more insightful information because they're releasing it to you.
Jason Rigby (11:46):
Yes. They're releasing it. Yeah.
Jason Rigby (11:48):
I want to kind of get [inaudible 00:11:49] close in this. I want to kind of get philosophical for a second. Whenever... because you brought up that documentary. I know a lot of you guys have listened to it. Whether it's going to the fridge because you're having a thought and in that thought, you don't like that thought or there's an uncomfortable feeling of guilt, worry, or something that's inside of you.
Jason Rigby (12:10):
So you disrupt that thought instead of dealing with the thought.
Jason Rigby (12:14):
And it is disruption: going to the fridge, grabbing something, watching show after show where it automatically just goes to the next episode. It causes us not to deal with the reality that's in us and with what we need to accomplish.
Alexander McCaig (12:27):
That's fantastic. You know they have the guy on there that invented the infinite scroll.
Alexander McCaig (12:30):
It's more of like the infinite distraction.
Alexander McCaig (12:33):
So it's [inaudible 00:12:34]. This is the dichotomy of technology, which I'm opposed to, where technology evolves at a rate drastically faster than society is evolving. Technology should be designed so that it elevates society, not distracts it from its own evolution. And that's what that technology is doing.
Alexander McCaig (12:56):
And that's why people have now begun to recognize the addiction. Someone who's an addict doesn't know they're a freaking addict.
Alexander McCaig (13:05):
They're just doing the drug all the time because it's what they do, until someone comes in and is like, "Hey, you're hopped up on drugs."
Alexander McCaig (13:13):
"All the time. You're whacked out of your mind," right. You have no control right now. You need to find that control within yourself. You need to recognize that this is not healthy for you, right. And the only way you can do that is to take responsibility for the situation, and that means finding technologies that actually elevate and push society forward. Don't distract yourself from your own evolution.
Jason Rigby (13:33):
Yeah, exactly, 100%.
Jason Rigby (13:35):
We got heavy into...
Alexander McCaig (13:36):
That was good.
Jason Rigby (13:37):
... some data.
Alexander McCaig (13:37):
Yeah. All that first-party data.
Jason Rigby (13:39):
I love it.
Alexander McCaig (13:40):
Don't distract yourself.
Jason Rigby (13:41):
Yeah. No partying guys, it's COVID.
Alexander McCaig (13:42):
Yeah. No partying. Be responsible. Be responsible. Peace out.
Jason Rigby (13:46):
That was a [inaudible 00:13:52].
Speaker 1 (13:56):
Thank you for listening to TARTLE Cast with your hosts Alexander McCaig and Jason Rigby, where humanity steps into the future and the source data defines the path. What's your data worth?