Tartle Best Data Marketplace
August 12, 2022

What Big Tech is Doing With Your Data and Why You Should Be Afraid

Big Tech
BY: TARTLE

Big tech is making some big moves on your personal information.

Now, they’re making it look good by dressing it up with some fun statistics. For example, Spotify Wrapped gives you a list of the music you’ve listened to the most in the past year. Google Maps has a similar function: it compiles a neat list of all the countries, cities, and places you’ve visited.

It’s time to call this out for what it is: they’re showing you all the personal information they’ve gathered from you, and are keeping it in their systems. 

The Crown Jewel of Data: Location

Your location data is critical information. We’ve seen journalists, politicians, and gamers doxxed for one thing or another on the internet.

In this episode, Alexander and Jason discuss how a Catholic priest was outed by a Christian publication. The publication tracked his location through data tied to the Grindr app and found that he had visited gay bars and private residences from 2018 to 2020. They also concluded that the device was his based on that location data.

“A mobile device correlated to Burrill emitted app data signals from the location-based hookup app Grindr on a near-daily basis during parts of 2018, 2019, and 2020 — at both his USCCB office and his USCCB-owned residence, as well as during USCCB meetings and events in other cities.” - The Pillar

That’s highly specific location data: three years of the priest’s whereabouts, logged and stored by the app. And a data vendor was all it took to ruin his entire life.
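To make the mechanics concrete, here is a minimal, hypothetical sketch (invented coordinates, IDs, and field names, not The Pillar’s actual method) of how “anonymous” app pings can be tied back to one person simply by checking which advertising ID keeps showing up at that person’s known office and residence.

```python
from dataclasses import dataclass

@dataclass
class Ping:
    ad_id: str   # "anonymous" advertising identifier attached to each location signal
    lat: float
    lon: float

# Hypothetical known locations for a target (latitude, longitude in degrees).
TARGET_PLACES = {
    "office":    (38.9072, -77.0369),
    "residence": (38.9101, -77.0300),
}

def near(ping, place, tolerance=0.001):
    """Crude proximity check; roughly 100 m at mid-latitudes."""
    lat, lon = place
    return abs(ping.lat - lat) < tolerance and abs(ping.lon - lon) < tolerance

def correlate(pings):
    """For each advertising ID, record which of the target's known places it pings from.
    An ID seen at both the office and the residence is, in practice, no longer anonymous."""
    hits = {}
    for p in pings:
        for name, place in TARGET_PLACES.items():
            if near(p, place):
                hits.setdefault(p.ad_id, set()).add(name)
    return {ad_id: places for ad_id, places in hits.items() if len(places) == len(TARGET_PLACES)}

# One ID shows up at both places; another only at the office.
sample = [
    Ping("id-123", 38.9072, -77.0369),
    Ping("id-123", 38.9101, -77.0300),
    Ping("id-456", 38.9072, -77.0369),
]
print(correlate(sample))  # {'id-123': {'office', 'residence'}}
```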

Getting Something for Free? Then You’re the Product

Should he have been a priest if he was a closeted homosexual? That’s not what we’re trying to answer here. The core of this issue is that someone tracked this individual’s private choices and exposed them without his consent, without him even being aware that he was being tracked in the first place.

Those fun end-of-the-year summaries of your app activity aren’t free. They are blatantly telling you that you are the product, and that your data can be weaponized against you without your knowledge. That’s the kind of chaotic world we can expect with the inevitable weaponization of data.

Protect Your Data. Join TARTLE.

This is a wake-up call for you to start being more vigilant about who you share your data with and where. You need to own the information you create on your devices. Those are your personal assets, and you worked hard to create them.

With TARTLE, you can take that information into your hands and choose to share it on your time, at your pace. Stop letting third parties and vendors take that away from you. Your choice, your time, your data.

Sign up for TARTLE here.



For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Jason Rigby (00:07):

The inevitable weaponization of app data is here.

Alexander McCaig (00:11):

You and I had spoken about this, and if I can get the microphone to stay where it belongs. You and I spoke about this in the past.

Jason Rigby (00:16):

Yes.

Alexander McCaig (00:16):

What did we say the crown jewel of data was? Location.

Jason Rigby (00:19):

Yes.

Alexander McCaig (00:21):

The Nazis knew it. If you prevent freedom of movement, then you have full control.

Jason Rigby (00:28):

Yes.

Alexander McCaig (00:28):

That's what it is. So what happens when you're tracked all the damn time? And here's the worst part. If you're tracked, someone knows your location data, and then they sell that to third parties?

Jason Rigby (00:39):

Mm-hmm (affirmative).

Alexander McCaig (00:39):

That's horrible. Say they don't know your location data specifically, like the GPS.

Jason Rigby (00:44):

Right.

Alexander McCaig (00:44):

But what happens when you use some sort of contactless payment option, or you walk into a building that sends off a ping to an advertiser? They would know immediately, at least in the general area, this is where you are, regardless of the GPS.

Jason Rigby (01:00):

It would be interesting if they could tell you, because obviously I have an Apple phone, so Apple knows my location.

Alexander McCaig (01:06):

All the time.

Jason Rigby (01:06):

I have T-Mobile, T-Mobile knows my location.

Alexander McCaig (01:08):

All the time.

Jason Rigby (01:09):

I have Google Maps, so Google knows my location.

Alexander McCaig (01:12):

All the time.

Jason Rigby (01:12):

They even send me emails about it, like, "Hey, you went to eight cities, and you walked 76 miles."

Alexander McCaig (01:16):

I went around the globe 1.3 times.

Jason Rigby (01:18):

Yeah, exactly, all that stuff. They're making it pretty sounding that they're super stalking you. But, whenever you think of location data, it's like every time you download an app-

Alexander McCaig (01:30):

They all want it.

Jason Rigby (01:31):

... you're giving consent to something. And if you don't take the time to look at the privacy statement and see what they're pulling out of your data, and Apple's kind of done a better job, but it's like the-

Alexander McCaig (01:43):

Nothing's changed except they made it more forward.

Jason Rigby (01:47):

Yeah.

Alexander McCaig (01:48):

It was in the background.

Jason Rigby (01:49):

Right.

Alexander McCaig (01:49):

And now they tell you it's in the foreground.

Jason Rigby (01:51):

But Apple and Google, I think, are number one?

Alexander McCaig (01:53):

Of course they're number one. Nobody would prevent your location data from being shared with them, in whatever format, if they didn't have another option for tracking.

Jason Rigby (02:04):

Yeah, and it's so funny, people are like, "Well, Apple got this for me for free," or, "Google Gmail's free." If a tech company gives you something for free...

Alexander McCaig (02:11):

You should be worried. You should be worried.

Jason Rigby (02:14):

If Gmail is free there's a reason. The Google calendars, there's a reason.

Alexander McCaig (02:21):

Well TARTLE's free. But we're not giving you anything, you have to do all the work yourself.

Jason Rigby (02:27):

Yes.

Alexander McCaig (02:28):

Different story.

Jason Rigby (02:28):

Yeah, totally different story.

Alexander McCaig (02:29):

"Oh, by the way, we can't see any of your data." And if we wanted it, we'd have to buy it from you.

Jason Rigby (02:34):

We do, TARTLE has to buy it.

Alexander McCaig (02:36):

Talk about smoking your own dope.

Jason Rigby (02:39):

So, let's talk about how the inevitable weaponization of app data is here, and there's a situation that happened. We've seen researchers, journalists, even governments use highly sensitive location data from a smartphone app to track and publicly harass a specific person. In this case, there was a Catholic Substack publication, The Pillar, and it said it used location data ultimately tied to Grindr, which is an app, to trace the movements of a priest and then out him publicly without his consent. This was reported in The Washington Post on Tuesday, and it ended up leading to his resignation.

Alexander McCaig (03:17):

I don't really, frankly care what the priest does. That's his thing.

Jason Rigby (03:21):

Yeah, he's human.

Alexander McCaig (03:23):

But for somebody else to talk about someone else, and that's essentially their private matter or their private choices, and for you to expose that without their consent, without them being a part of it, or even being aware of the fact that they were being tracked, that's a problem.

Jason Rigby (03:37):

Well, the problem is the trickle-down effect of what they talk about: corporations, intelligence agencies. Now the article says any sort of disgruntled, unscrupulous, or dangerous individual. And then it talks about-

Alexander McCaig (03:50):

They did the same thing with Europol.

Jason Rigby (03:52):

And this is where TARTLE gets involved with this article. It says, "A growing market of data brokers that collect and sell data from countless apps has made it so that anyone with a bit of cash and effort can figure out which phone in a so-called anonymized dataset belongs to a target, and abuse that information."

Alexander McCaig (04:07):

Think about how many tech startups there have been. They're not all making big revenue off of their product. Let's be clear about this. Their investors are pushing these guys for returns, unreasonable stuff. They try and push us to frankly, some unreasonable things, but we are not going to change our model and say, "Oh, we've got to find another revenue stream. We've got to start selling people's data." We can't do that, our model doesn't fundamentally work like that. We do one thing, we help people move their data. You know what I mean?

Jason Rigby (04:42):

We're a marketplace of data.

Alexander McCaig (04:43):

Yeah, we're a data marketplace. So, that's the problem. These data brokers are bloodthirsty to get this stuff moved, and they'll sell it to anybody. We have to be clear about this: even if these companies sold anonymized data sets, there is plenty of information in them to track it back to you. There's only a certain number of characteristics of a human being needed to actually find out who that person is among the roughly 8 billion people on this planet. That's the problem.

Jason Rigby (05:23):

And this is the crazy part. Senator Ron Wyden, in a statement responding to this incident, said, "Data brokers and advertising companies have lied to the public," this is what the senator's saying, "assuring them that the information they collected was anonymous. As this awful, awful episode demonstrates, these claims were bogus. Individuals can be tracked and identified."

Alexander McCaig (05:45):

So, I want to clarify this one more time. In an anonymized data set, and I hope we make a little short clip about this, if there are things that are materialistically characteristic of you as a human being-

Jason Rigby (06:01):

Right.

Alexander McCaig (06:02):

... you don't have to put the name in there. You don't have to put the address. You have very specific features, the geometry of your face, your body, tattoos, the things you buy, whatever it might be. Those things are highly identifiable of who you are. So even when someone says, "Oh, it's anonymized, they took the name off of it," with very few characteristics they're going to know exactly who that person is.

Alexander McCaig (06:22):

The only thing that's truly anonymous is thought. So if you remove the name and someone's just buying data, if brokers were just giving out data on people's thoughts, that stuff's inherently anonymous.
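A small, hypothetical sketch of the point Alexander is making: strip the names off a dataset, and a handful of ordinary attributes is often still enough to single out one record. The data below is invented for illustration.

```python
from collections import Counter

# Hypothetical "anonymized" records: no names, just ordinary attributes.
records = [
    {"zip": "87501", "age": 34, "car": "blue pickup", "tattoo": "forearm anchor"},
    {"zip": "87501", "age": 34, "car": "gray sedan",  "tattoo": None},
    {"zip": "87501", "age": 52, "car": "blue pickup", "tattoo": None},
]

def uniqueness(records, attrs):
    """Return the fraction of records that are unique on the chosen attributes.
    If a combination appears exactly once, whoever matches it is identifiable."""
    keys = [tuple(r[a] for a in attrs) for r in records]
    counts = Counter(keys)
    return sum(1 for k in keys if counts[k] == 1) / len(keys)

print(uniqueness(records, ["zip"]))                # 0.0 - everyone shares a ZIP code
print(uniqueness(records, ["zip", "age", "car"]))  # 1.0 - three attributes single each person out
```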

Jason Rigby (06:34):

Well, I mean, this is scary, because this person was the general secretary of the US bishops' conference in the church. And like we said, we don't care about any of this. So he had visited different bars, and was using this dating app, and then went to private residences, and they tracked all that. They tracked what bars he went to and the private residences he went to from 2018 to 2020. This is the crazy part: they got it from a data vendor.

Jason Rigby (07:00):

So whenever you're looking at something like this... this is a person's life that was ruined.

Alexander McCaig (07:08):

Absolutely destroyed. They've lived for 40, 50 years, just going about their merry way, and just wrecked them. If the shoe was on the other foot, what if I went to the CEO of Grindr and I was like, "Hey, I'm just going to track you for a little bit and everywhere you go. I want to know your relationships that you've been into with these people. Oh, and by the way, I'm going to tell every single person in the press about your sexual relationships. Sound good?" He wouldn't stand for that in a second. So why is it that it's okay to do this sort of analysis tracking and correlative research on other people using the app, and then brokering that information off for your material economic benefit?

Jason Rigby (08:02):

Well, it gets worse, because they asked Grindr about it. So now as a company you have a choice to make.

Alexander McCaig (08:07):

Own up to it or-

Jason Rigby (08:09):

Well, here's the spokesperson.

Alexander McCaig (08:10):

... dodge it.

Jason Rigby (08:10):

Here's what Grindr said. "Grindr's response is aligned with the editorial story published by The Washington Post, which describes the original blog post from The Pillar as homophobic and full of unsubstantiated innuendo. The alleged activities listed in that unattributed blog post are infeasible from a technical standpoint and incredibly unlikely to occur. There is absolutely no evidence supporting the allegation of improper data collection or usage related to the Grindr app as purported." So what I want people to understand is, this "infeasible from a technical standpoint"-

Alexander McCaig (08:38):

That's not true.

Jason Rigby (08:39):

But in January, the Norwegian Data Protection Authority fined Grindr $11.7 million for providing its user data to third parties, including their precise location data.

Alexander McCaig (08:49):

Correct. No, what do they do? Don't even talk about remotely what you did, bash the article, bash everything else except owning up to the fact that you created this problem.

Jason Rigby (09:01):

Yeah. This is horrific, but this is the sad part about it, there are countries that are out there, and this article talks about that, where homosexuality is illegal. Which you're going at somebody's free will and we're-

Alexander McCaig (09:16):

We don't want to get into that.

Jason Rigby (09:17):

Yeah, we're not even going to talk into that part, but it's horrific, it's an atrocity, that's a human rights issue.

Alexander McCaig (09:22):

It's a human rights issue.

Jason Rigby (09:24):

And that's where the Norwegian authorities were at the time, because now, let's say you're on the Grindr app, you're in one of these countries, now there's an issue with you being targeted with illegal activity.

Alexander McCaig (09:38):

Correct. So you're automatically a felon in a country, because some CEO thought it was a great idea to make an economic income stream off of brokering your private data to third parties, about your own personal life.

Jason Rigby (09:56):

Well, I don't mean to interrupt you, but this is Zach Edwards, a researcher, and he says it in a great way; it's exactly what you're saying. He says it this way, "No one should be doxed and outed for adult consenting relationships."

Alexander McCaig (10:09):

Correct.

Jason Rigby (10:09):

"But Grindr never treated their own users with the respect they deserve. And the Grindr app has shared user data to dozens of ad-tech and analytics managers over the years."

Alexander McCaig (10:20):

I was working for a consulting firm sometime ago up in the Northeast. And a company had reached out regarding a new tool for banks to understand their customers better, data insights. And this is a relatively new firm, but they had a good amount of money, they were up there in Boston.

Alexander McCaig (10:45):

So I went into their offices and they gave us a presentation. How little they need, and where they get the data from, and how it's triangulated, was shocking. When you call into a call center, you know how they say, "These calls are recorded"? The tone of your voice, the length of time you're speaking, regardless of the call center, all get roped into one singular algorithm. So it doesn't matter what company you call, wherever, they know exactly who it is. Doesn't matter what the phone number is, just off the tone of your voice or the cadence of your speech.
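As a rough illustration of the matching step described here, assuming some voice-embedding model has already turned each recording into a numeric vector (that model is the hard part and is not shown), the cross-company comparison can be as simple as a similarity check. The vectors and threshold below are invented.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two voice embeddings (1.0 = pointing the same way)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical embeddings: the same caller phoning two unrelated call centers,
# plus a different caller. Real embeddings would have hundreds of dimensions.
call_to_telecom  = [0.91, 0.10, 0.40]
call_to_retailer = [0.89, 0.12, 0.41]
different_caller = [0.10, 0.95, 0.05]

THRESHOLD = 0.98  # would be tuned on labeled recordings in a real system
print(cosine_similarity(call_to_telecom, call_to_retailer) > THRESHOLD)  # True: same voice
print(cosine_similarity(call_to_telecom, different_caller) > THRESHOLD)  # False: different voice
```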

Jason Rigby (11:29):

Well, the sad part too is, if you're calling AT&T, and then you're calling Comcast, and then you're calling Levi's, all with issues or orders, they're collecting your voice, they know who you are, they're collecting your voice data over and over. They're saying, "Okay, here's Jason Rigby. He's called these three people. He has this response to customer service." And then they can give you a social score, which is what the Chinese are doing.

Alexander McCaig (11:57):

Right.

Jason Rigby (11:58):

Social credit systems.

Alexander McCaig (12:00):

But here's the most difficult part, and I will share this with anyone who's listening. I asked them there, because I didn't agree with what was going on, and by the way, we didn't go through with using their products, I said, "What is the most difficult, or what is the highest point of friction for you really getting a full profile on someone?" And they said, "There's two things, location data and IP address; those are the crown jewels for all of these algorithms." So I would tell all of you: one, don't share your damn location data with anybody unless you know specifically how it's going to be used, and two, get a VPN.

Jason Rigby (12:42):

Well also, there's this new industry coming, the identity resolution industry is what they call it; it's a nice way for customers to buy sensitive data. These companies "promise to match mobile advertising IDs, unique codes assigned to mobile phones by the operating system, and which tech companies have repeatedly assured consumers are anonymous, or at least pseudo-anonymous, to real-world identities." This makes unmasking people in data sets even easier, because you have the mobile ID number, so you can take the data that you have now, correlate it together, and then you can figure out which phone belongs to who just by buying that information. It's very simple.
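A hypothetical sketch of the "identity resolution" join Jason is describing: one purchased dataset keyed by mobile advertising ID, a second dataset mapping those same IDs to real-world identities, and a simple join between them. All names and fields below are invented for illustration.

```python
# Purchased "anonymous" app data, keyed by mobile advertising ID (MAID).
app_activity = {
    "maid-001": {"app": "dating-app", "last_seen": "bar on 5th St"},
    "maid-002": {"app": "fitness-app", "last_seen": "riverside trail"},
}

# Second purchased dataset from an identity-resolution vendor,
# mapping the same MAIDs to real-world identities.
identity_graph = {
    "maid-001": {"name": "J. Doe", "email": "jdoe@example.com"},
}

def unmask(app_activity, identity_graph):
    """Join the two datasets on the advertising ID. Anyone present in both
    is no longer anonymous, regardless of what the app promised."""
    return {
        maid: {**activity, **identity_graph[maid]}
        for maid, activity in app_activity.items()
        if maid in identity_graph
    }

print(unmask(app_activity, identity_graph))
# {'maid-001': {'app': 'dating-app', 'last_seen': 'bar on 5th St',
#               'name': 'J. Doe', 'email': 'jdoe@example.com'}}
```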

Alexander McCaig (13:20):

I've literally seen how the Feds use IMEI numbers. When you think you're connected to a cell tower, you don't understand that you are pre-hopped, which means it's going to hit essentially a false tower first to collect all those numbers, and track everybody wherever they're going before it goes to the real tower, they do it all the time. They put them on cars.

Jason Rigby (13:43):

Yeah. I mean, I would imagine the government is probably the worst at this.

Alexander McCaig (13:47):

They don't care.

Jason Rigby (13:48):

Governments.

Alexander McCaig (13:49):

What do they care?

Jason Rigby (13:50):

I mean, because you've got to think of it, so let's say the crown jewels of data privacy, what countries would be the best? Maybe some in the UK, we're probably one of the worst here in the United States, but think about countries like China, or Russia, or some of these other countries that do not give two fucks.

Alexander McCaig (14:10):

They don't care at all.

Jason Rigby (14:11):

Think about what they're getting, and what data information they have.

Alexander McCaig (14:13):

Everything.

Jason Rigby (14:13):

I mean, they have that. They have surveillance systems.

Alexander McCaig (14:15):

Edward Snowden, XKeyscore. They're just tapping into your webcam; they don't have to turn the light on. Why do you think they offer those webcam cover things? The light doesn't have to be on for it to work. You know the old-school TV receivers for cable, or DirecTV and stuff like that? They all had microphones in them.

Jason Rigby (14:33):

Well, I mean, two episodes ago or whatever it was, we talked about Europol collecting all this data. I mean it's just constant.

Alexander McCaig (14:39):

Nielsen, the metrics company? You think your cable box just magically has this data? "I thought it was just a cable. I thought it was just copper wire coming in here." There's so much more going on.

Jason Rigby (14:51):

Well, my girlfriend and I, we put one of the little Google speakers in the bedroom. And so then it was funny how all of a sudden, now I don't know, this could be a conspiracy theory, our ads were about stuff that we were talking about.

Alexander McCaig (15:05):

Does it all the time. We've seen it with Instagram, the mic's always on. They're always listening to your voice. Skype does this.

Jason Rigby (15:11):

Well, I know Apple got in trouble for that in some country in the EU.

Alexander McCaig (15:14):

But Skype still does it. They transcribe hundreds of languages all the time. Every time someone goes on Skype to do a call, you're just feeding their algorithms.

Jason Rigby (15:25):

Yeah, that's all it is. So like I said, we wanted to bring this up because this is the seriousness of this. You're taking somebody that had a career, that had been in the public limelight, not doing anything wrong.

Alexander McCaig (15:43):

No, nothing wrong.

Jason Rigby (15:44):

Doing nothing wrong. And in turn they get outed, have to resign, because somebody doesn't like this person and decides to-

Alexander McCaig (15:52):

... use the data that they were creating themselves, that they had rightful ownership of, to then take that from them and use it against them.

Jason Rigby (15:59):

This is Game of Thrones shit.

Alexander McCaig (16:00):

This is Game of Thrones in 2022. Jason, it's just wrong. It's just flat out wrong. There's no other way to describe it other than when it's just so inconsolably... It's just gross. No one cares. Think about how people act on Twitter. You think they're going to give a shit using your data against you?

Jason Rigby (16:20):

So, instead of them just taking this data for free, and then making us slaves to them because we're not getting paid for it, how can somebody sign up for TARTLE? How long does it take?

Alexander McCaig (16:31):

Literally 30 seconds. It's like 30 seconds.

Jason Rigby (16:36):

You've got a PayPal account? How quick is it to get your wallet set up to start getting paid for data?

Alexander McCaig (16:40):

Well, I mean, if you have your password manager, and it auto populates, I don't know, 10 seconds, if that.

Jason Rigby (16:46):

So then you have a wallet, you have PayPal.

Alexander McCaig (16:48):

You've got PayPal.

Jason Rigby (16:49):

So if you're in one of the 200-plus countries that work with PayPal, you're in.

Alexander McCaig (16:53):

Oh, and if you have social media data and stuff like that you want to pull back down and take ownership of, and you want to sell it to the advertising companies, do it.

Jason Rigby (17:01):

Yeah, that's what we're here for.

Alexander McCaig (17:02):

You want to participate in a health study, but have control over all the information, not some group of lawyers or somebody else? Do it, it's all you.

Jason Rigby (17:09):

So, when we say take control of your data, what do you mean by that?

Alexander McCaig (17:13):

Taking control of your data means that the information I create on my phone, computer, laptop, walking into a hospital, anytime I'm tracked, that information is my asset. I have worked to create it. So I'm going to take that thing which I own, put it in my hands, and I'm going to choose to share it on my time. Not some third party brokering it away like Grindr did. It's my choice, my time, my data, and I'll share it at tartle.co.

Speaker 3 (17:48):

Thank you for listening to TARTLE cast with your hosts, Alexander McCaig, and Jason Rigby, where humanity steps into the future, and source data defines the path. What's your data worth?