Tartle Best Data Marketplace
June 23, 2021

Government Abuse of COVID Data: Why We Can't Trust Governments with Big Data

BY: TARTLE

COVID Contact Tracing

COVID, COVID, COVID. All the news these days seems to be about that little virus. If you like that kind of thing, then you’re in luck because we are here to talk about it today. More precisely, we’ll be talking about how governments are misusing the data they’ve been collecting due to the pandemic. 

Back around a year ago as of this writing, tech companies started putting out contact tracing apps designed to let you know if you had been in proximity to someone with COVID. That way you could go get tested, or isolate if you wanted to go that route instead. Sounds like a great idea, doesn't it? The thing is, it's basically a tracking device that keeps tabs on where you are all day, every day. Whoever holds that data can cross-reference it with data from your friends and family who use the same app and so determine who you talk to and what apps you use to do so. The government could even get a fairly good idea of the kinds of things you are buying by determining which stores you walk into based on your location data.
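To make that concrete, here is a minimal sketch in Python of the kind of venue check-in log such an app might collect, and how it could be queried. The schema, field names, and functions are hypothetical illustrations, not TraceTogether's actual protocol or code; the point is simply that the same records that power exposure notifications can just as easily answer unrelated questions about any individual user.

```python
# Hypothetical illustration only: a simplified venue check-in log.
# The field names and matching rules are assumptions made for this sketch,
# not the actual TraceTogether data model.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CheckIn:
    user_id: str
    venue_id: str
    timestamp: datetime

def exposure_contacts(log, infected_user, window=timedelta(hours=1)):
    """Intended use: find users who overlapped with an infected user at a venue."""
    visits = [c for c in log if c.user_id == infected_user]
    contacts = set()
    for visit in visits:
        for other in log:
            if (other.user_id != infected_user
                    and other.venue_id == visit.venue_id
                    and abs(other.timestamp - visit.timestamp) <= window):
                contacts.add(other.user_id)
    return contacts

def movement_profile(log, user_id):
    """Secondary use: the very same log reveals every venue one person has visited."""
    return sorted({c.venue_id for c in log if c.user_id == user_id})
```

Nothing in the data itself limits it to the first function. Only policy does, and policy can change.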

The program was rolled out in Singapore and was held up by many as a model of how to respond not just to COVID but to any future pandemic. However, that was before it was revealed that the government was quite happy to use the data being gathered in ways it was never intended for. This may have come as a surprise to some; after all, the government had promised that there would be 'robust' privacy protections in place. While many wisely doubted those promises and suspected or assumed this was already happening, it only became public knowledge in January, when it came out that Singapore had used the contact tracing data in a murder investigation. Once again, a government promise was bunk, and a program begun with good intentions was perverted from its original intent.

A natural response to this is 'what's the big deal?' After all, don't we want murders solved using every tool possible? The problem isn't really the use of the contact tracing data in the murder investigation; the issue is that this use is the proverbial camel's nose under the tent. While the policy may currently be to use the available data only for serious crimes, how long before it's used for less serious crimes? Or for the 'prevention' of crimes? It's easy to see how, in the United States, that kind of tracking data would get appropriated under the Patriot Act and used to arrest or harass people who haven't actually done anything yet. Fears of exactly that sort of thing happening partially explain why contact tracing apps haven't gained as much traction in America as they have in other places.

Hopefully it will stay that way. In Singapore, 78% of residents use the tracing app, which, as we now know, gives the government access to and control over their location data. That's worrisome precisely because the government takes it for granted that it can do this. Wouldn't it be better if they at least had to ask?

How would that work? Very simply. If a crime were to occur and other evidence showed that you were at the scene or in contact with the criminal (security camera footage or witness testimony, for example), the police could simply ask you for access to your phone because they think it would help. It really is that simple. Not only is it simple, it's ethical, because it treats citizens as individuals with rights rather than as subjects who must obey. It's a big difference, and one that TARTLE considers very important.
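For contrast, here is a rough sketch of what that ask-first model could look like in code: the location history stays with its owner, and an investigator gets nothing until the owner approves a specific, stated request. This is not TARTLE's actual implementation, just a hypothetical illustration of the consent step described above.

```python
# Hypothetical sketch of consent-gated access: nothing is released
# until the data's owner approves a specific, stated request.
from datetime import datetime

class LocationVault:
    """Holds one person's location history; every read requires explicit approval."""

    def __init__(self, history):
        self._history = list(history)   # e.g. [(timestamp, venue_id), ...]
        self._requests = []

    def request_access(self, requester, reason):
        """An investigator files a request; it starts out unapproved."""
        request = {"requester": requester, "reason": reason, "approved": False}
        self._requests.append(request)
        return request

    def approve(self, request):
        """Only the owner calls this, and only if they choose to."""
        request["approved"] = True

    def read(self, request):
        if not request["approved"]:
            raise PermissionError("the owner has not granted access")
        return list(self._history)

# Usage: the police ask, the citizen decides.
vault = LocationVault([(datetime(2021, 1, 5, 20, 15), "gym_42")])
req = vault.request_access("police case 1234", "murder investigation")
vault.approve(req)                      # the owner's choice, not the default
print(vault.read(req))
```

The design choice is the point: access is denied by default, and every grant is tied to one named requester and one stated reason.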

Abusing data in this way violates two of the Big Seven: human rights and government transparency. TARTLE will continue working to improve both for as many people as we can. In doing so, we'll never sell your data or so much as look at it. We are merely a tool for you to use, not the other way around.

What’s your data worth? Sign up and join the TARTLE Marketplace with this link here.

Feature Image Credit: Envato Elements

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Speaker 1 (00:07):

Welcome to TotalCast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future and source data defines the path.

Alexander McCaig (00:25):

Oh, we're back. We're fired up and we're going to have to call out what is it, Singapore?

Jason Rigby (00:29):

Yes, Alex. We have this from bloomberg.com: "Governments tap COVID data for other uses, risking backlash." There was a software engineer named Harish Pillay, and he decided, "I want to do whatever it takes to be able to help Singaporeans," if there's such a word.

Alexander McCaig (00:48):

Yep.

Jason Rigby (00:48):

Singaporeans, live and be able to work through this pandemic.

Alexander McCaig (00:54):

Right.

Jason Rigby (00:54):

So he decides to help. He emails the Minister in Charge and asks how he can help. He becomes part of a fellowship of developers and engineers, and they volunteer.

Alexander McCaig (01:07):

They all volunteered their time,

Jason Rigby (01:10):

Right.

Alexander McCaig (01:10):

Their resources to develop a tracing app.

Jason Rigby (01:13):

Yes.

Alexander McCaig (01:13):

Essentially trace where people are going back and forth. Right?

Jason Rigby (01:16):

Yeah. It's Red Hat open source software.

Alexander McCaig (01:18):

So it was based off of Linux. Okay. And they came in, they did this open source software. They wrote it up and, fantastic, let's get it out to the public. Let's have them start using it so we can just mitigate the spread of this -

Jason Rigby (01:32):

Explain what the trace app does.

Alexander McCaig (01:34):

So the trace app asks you for information. Have you come into this specific place? Scan: "Yes, I'm here, I'm checking in." So they know when you go from place to place outside [crosstalk 00:01:43] right. It's essentially like a low form of GPS.

Jason Rigby (01:45):

And then it turns around and says, "Oh, okay". There was a guy at the gym that you were at.

Alexander McCaig (01:50):

Yeah Joe Smith -

Jason Rigby (01:51):

Yeah was infected -

Alexander McCaig (01:52):

Joe Smith was infected and he was at a gym with a thousand other Joe Smiths and we have all their names and cell phones and data and locations.

Alexander McCaig (01:59):

So now we want to inform those people and have them go get tested or anything that might happen from that.

Jason Rigby (02:04):

So it was TraceTogether. It was an app, and everybody went hardcore on it and decided to hop in.

Alexander McCaig (02:10):

All the open source -

Jason Rigby (02:10):

All collective, super nice.

Alexander McCaig (02:12):

It's all those really good people that are online and really, this is important and we want to develop towards it.

Jason Rigby (02:17):

Then the government decided to do something that it always does.

Alexander McCaig (02:20):

Right.

Jason Rigby (02:21):

Number six, what is number six on our big seven?

Alexander McCaig (02:23):

Oh, government and corporate transparency. Let's talk about them not being transparent.

Jason Rigby (02:26):

Mm-hmm.

Alexander McCaig (02:28):

They found this out after the fact. Can you read those two things down there we have highlighted?

Jason Rigby (02:33):

Yeah. The problem was being solved by creating this tool, but there were aspects of trust and confidentiality which also needed to be addressed.

Alexander McCaig (02:40):

Mm-hmm.

Jason Rigby (02:40):

Said Pillay, who has worked on Red Hat's open source software for much of his career and firmly believes in transparent technologies.

Jason Rigby (02:46):

So we understand all of these things, let the community help you do the right thing. In the beginning, Singapore was held up as a model for other nations

Alexander McCaig (02:54):

Mm-hmm.

Jason Rigby (02:55):

As the government encouraged people to download the TraceTogether app to their smartphones, it published the source code and promised strict limits on data use.

Alexander McCaig (03:01):

Right.

Jason Rigby (03:02):

It promised - here's the government making new promises.

Alexander McCaig (03:04):

Mm-hmm.

Jason Rigby (03:05):

Developers from around the world pitched in to hone and debug it in real time.

Alexander McCaig (03:08):

Yeah.

Jason Rigby (03:09):

Now the early optimism is fading.

Alexander McCaig (03:12):

Yeah. But despite that promise, the architecture was designed so that the government still had control over the data. Before we get into it, all things are mitigated when you take responsibility for your data. Don't give the responsibility to your government or to a business; take responsibility for managing it yourself, hold onto it, and share it when you choose to do so.

Alexander McCaig (03:34):

But because it didn't have that architecture that we've proven here at TARTLE, let's talk about what happened.

Jason Rigby (03:42):

Only months after the Minister In Charge vowed it would only be used for COVID containment and that's it, transparency, they decided to take the app data and use it for a murder investigation.

Alexander McCaig (03:54):

They didn't ask you. They used your personal information, they went in, they were tracking people, and they pushed it over to the police, whatever jurisdiction their police apparently have for doing their investigations. They start scraping and using all your data, doing an analysis so they can solve their own stuff. You have no control over the data. They have control over this entire asset that is yours.

Jason Rigby (04:14):

It would have been simple, and you know they didn't even apologize, and that's what pissed everybody off. Instead, they began to make plans to formalize the ability of police to access data in specific cases.

Jason Rigby (04:24):

Why don't you? This is so simple. [crosstalk 00:04:27] Why don't you ask somebody: can we use your open trace data, or whatever this app is? Can we use your data to help in a murder investigation? Yes or no? Ask them now.

Alexander McCaig (04:41):

What's wrong with that? Oh, by the way, would you mind giving it to us? Oh my gosh. Wow. You know, it's insane. That's like someone just coming up, taking your car, and driving away. It's grand theft auto.

Jason Rigby (04:55):

Yes.

Alexander McCaig (04:55):

What do you call it? Grand data. Auto grand theft data. That's what we're going to call it. It's called grand theft data. That's where a government or corporation comes in and uses your data without your consent. Oh, by the way, don't store your keys at the office.

Jason Rigby (05:10):

Mm-hmm.

Alexander McCaig (05:10):

Keep them at the house, which you own and you have access to.

Jason Rigby (05:14):

Yeah but because they're police and government, they can just walk into your house and grab your keys anytime.

Alexander McCaig (05:19):

You think they can do that anytime they want?

Jason Rigby (05:20):

Yeah. And this is what he said, Mr. Pillay. He said, "I felt disappointed. The trust factor that was there was reduced."

Alexander McCaig (05:30):

What did they say in the military [crosstalk 00:05:31] in the military? Trust, but verify, or something.

Jason Rigby (05:36):

Yeah exactly.

Alexander McCaig (05:37):

You trusted, but you did not verify that these people are persons of their word.

Jason Rigby (05:41):

But isn't that the way it always goes with every government program?

Alexander McCaig (05:43):

Every time.

Jason Rigby (05:44):

It's always a disappointment.

Alexander McCaig (05:46):

Let's talk about this. What was the initial idea? Oh, nuclear technology. So cool. We fixed all of our energy problems. Then a guy comes in: can we make a bomb out of it? Sure. It's the same thing. It's the same damned metaphor. We create something good, we hand it over to the individuals we put our trust in, and then we realize: don't hand them the power. You hold the power.

Jason Rigby (06:06):

Well, here's the sucky part. 78% of Singapore's residents use the app.

Alexander McCaig (06:13):

Great. That means they can ..

Jason Rigby (06:15):

The government has GPS data on almost 80% of the population.

Alexander McCaig (06:20):

They own all this personally identifiable info on people now.

Jason Rigby (06:24):

Yeah.

Alexander McCaig (06:25):

Can the Singaporean people sue their government to get the data back?

Jason Rigby (06:29):

We need a new minister.

Alexander McCaig (06:30):

Yeah. Screw that. Don't sue them. Why don't you guys all just hop over onto TARTLE and start putting your data in there? If you want to do traceability data, we'll do a traceability package on TARTLE.

Jason Rigby (06:39):

Yeah. This is what he said: "The police must be given the tools to bring criminals to justice and protect the safety and security of all Singaporeans."

Alexander McCaig (06:46):

Ethical with morals.

Jason Rigby (06:48):

I agree with him. [crosstalk 00:06:51] Just ask me, or, or have a police officer. It's very simple. [crosstalk 00:06:54] I'm going to throw this laptop, I'm going to rip the studio apart. There's one thing: you could do it this way. Have that police officer that's investigating the thing say, "Hello, Mr. So-and-so, we're investigating a murder, and on your phone you have the ability. But before we get this information, because we want to make sure that we respect you as a citizen, can we, sir, ma'am -

Alexander McCaig (07:19):

yeah.

Jason Rigby (07:20):

Have this data so that we can solve a murder. What are most people going to say?

Alexander McCaig (07:24):

Here's the real -

Jason Rigby (07:25):

Sure.

Alexander McCaig (07:25):

Here's the real question now. How did you know to ask me? Oh, we were already looking at the data.

Jason Rigby (07:34):

Yeah

Alexander McCaig (07:36):

Like, are you kidding? I'm going to flip this table. It's got to weigh a hundred pounds [crosstalk 00:07:41].

Jason Rigby (07:42):

Jesus did that when he went into the temple.

Alexander McCaig (07:43):

Yeah, there we go. We're flipping stuff now. So this is crazy. So South Korea,

Jason Rigby (07:46):

Oh, it's spreading.

Alexander McCaig (07:48):

In South Korea, private-sector contact tracing apps became increasingly invasive. One provided the exact location of every place of business or home visited by a positive case. Government workers were able to review hundreds of hours of surveillance camera footage and go through mobile phone and credit card transactions to track people down.

Jason Rigby (08:03):

That's messed up.

Alexander McCaig (08:06):

We're not going to go into what they've been finding out in China's tracing. In Malaysia, the Health Minister mandated that businesses destroy the personal records of visitors to their premises within six months after government-ordered tracing ended. In Israel, the Supreme Court banned the country's intelligence agency from using technology to track COVID-19.

Jason Rigby (08:24):

That's what I'm talking about.

Alexander McCaig (08:25):

In Australia, Federal legislation was passed to prevent data collected in the country's COVID app from being used for any purposes beyond contact tracing.

Jason Rigby (08:32):

Look at me. I'm clapping. I don't even know if this is getting picked up on the mic.

Alexander McCaig (08:34):

Yes. So there are ethical considerations in using tracking technology.

Jason Rigby (08:39):

We need ethical considerations. You don't have to consider it. It is a thing of ethics. There's no considering it, it is or it isn't.

Alexander McCaig (08:46):

Yes. So you made a decision to ...

Jason Rigby (08:50):

Should we consider being ethical.

Alexander McCaig (08:53):

We won't get into, let's not get into corporate transparency and.

Jason Rigby (08:57):

Oh man.

Alexander McCaig (08:59):

Let's mic all these boardrooms with the C-suite guys and see if that is ever asked in any of their meetings,

Jason Rigby (09:12):

you know [crosstalk 00:09:12] you know how quickly someone would be fired.

Alexander McCaig (09:14):

in a second?

Jason Rigby (09:17):

Did we not just have a TARTLE meeting where I said, as Chief Conscious Marketing Officer, did we not just talk about this thing?

Alexander McCaig (09:23):

We'd literally just talked about it.

Jason Rigby (09:25):

We had a 10, 15 minute conversation about the next direction of where we're going with the marketing, and it was all based off of this.

Alexander McCaig (09:31):

All of it.

Jason Rigby (09:31):

Yes.

Alexander McCaig (09:32):

Do you remember Seinfeld, George Costanza? He's getting all neurotic because he's got that board seat position. He lied about who he was. So he puts the tape recorder in the briefcase and leaves it in the room.

Jason Rigby (09:42):

Yes. I remember that. Yeah.

Alexander McCaig (09:43):

That's where it all ends.

Jason Rigby (09:45):

It's a slippery slope and be careful, you're sovereign over your own data. Be careful -

Alexander McCaig (09:52):

who you give it to.

Jason Rigby (09:53):

who you're giving it to.

Speaker 1 (10:02):

Thank you for listening to TotalCast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future and the source data defines the path. What's your data worth?