Tartle Best Data Marketplace
August 1, 2022

KNOW MORE: Interesting Talks on The Art of War, New Tech, & the Future

BY: TARTLE

If we aren’t careful about our choices, social media can easily turn around and start shaping our identity. But what really gets the ball rolling is how we decide to use the platform. The algorithms are designed to self-reinforce: we decide to consume a specific type of content on social media, and the algorithms shape themselves around that decision.
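
The feedback loop described above can be sketched as a toy model. This is a hypothetical illustration, not any platform’s actual recommender; the topics, learning rate, and update rule are invented:

```python
import random

def update_weights(weights, clicked_topic, lr=0.2):
    """Nudge each topic weight toward 1 if clicked, toward 0 otherwise."""
    for topic in weights:
        target = 1.0 if topic == clicked_topic else 0.0
        weights[topic] += lr * (target - weights[topic])
    return weights

def pick_recommendation(weights):
    """Sample the next post's topic in proportion to the current weights."""
    topics, w = zip(*weights.items())
    return random.choices(topics, weights=w, k=1)[0]

# Start neutral; the user clicks "outrage" content every time.
feed = {"cats": 1.0, "news": 1.0, "outrage": 1.0}
for _ in range(20):
    feed = update_weights(feed, "outrage")

# The feed now overwhelmingly favors the one topic the user kept choosing.
print(max(feed, key=feed.get))   # "outrage"
print(pick_recommendation(feed))  # almost always "outrage"
```

The point of the sketch is that the loop has no opinion of its own: repeated identical choices by the user drive every other topic’s weight toward zero.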

This goes beyond cat videos and memes (which have themselves morphed into political tools). We have to be mindful of how far we take our perspectives on police, guns, reproductive health, and other people and countries, because it is so easy to fall down the rabbit hole of radicalism.

The Role of Social Media in Reinforcing Bias

Alexander and Jason discuss how social media is probably more powerful than nuclear weapons. While it may seem difficult to believe that something so ubiquitous has the capacity to change humanity forever, even the government has created a center for analytics dedicated to artificial intelligence, the internet, and online platforms.

The truth is that social media platforms, through their algorithms, create and reinforce hardcore biases. We live in a new era in which information warfare is capable of shaping entire communities. We saw this happen in the Cambridge Analytica scandal, and it won’t be the last time our echo chambers are exploited for the benefit of those at the top.

Ethical Source Data is the Only Way Forward

If we must live in a world where information is a weapon, we need to protect ourselves by learning how to use our data for our own benefit. 

In theory, artificial intelligence and algorithms are objective. In practice, these technologies are hidden behind so much legal jargon that they are difficult for the average user to comprehend. In addition, the developers behind them have their own biases, which bleed into their creations.

It’s time to move to a platform where you are directly connected to companies and organizations that are invested in ethical data sourcing. Through this shift, we can take back the power to shape societies and return it to the people.

What’s your data worth?

Sign up for TARTLE through this link here.

Feature Image Credit: Envato Image

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Alexander McCaig (00:25):

All right, and we're back. Hurry up intro, it's still not short enough. I just want a noise that goes beep and then they're, "You're on."

Jason Rigby (00:32):

No, we need the noise from Squid Game when it starts.

Alexander McCaig (00:35):

What's that?

Jason Rigby (00:36):

I think it was like one of those... and then... Like those siren things.

Alexander McCaig (00:39):

I like that.

Jason Rigby (00:40):

You know with the door's opening and the red lights flash?

Alexander McCaig (00:42):

That's so cool.

Jason Rigby (00:43):

Or you can push that big plastic red button and it stops.

Alexander McCaig (00:45):

That's so cool.

Jason Rigby (00:48):

With machine guns, whoever gets there first.

Alexander McCaig (00:49):

Was that like an ASMR thing or whatever it's called, we're all...

Jason Rigby (00:54):

There you go, sipping some Guatemalan coffee.

Alexander McCaig (00:56):

It's fantastic, thank you, Guatemala.

Jason Rigby (00:57):

Amazing. We love coffee.

Alexander McCaig (00:59):

To anyone who's in Guatemala listening to this, thank you very much for the coffee.

Jason Rigby (01:02):

100%. Anybody that's working in coffee, we'd love to hear. We need a coffee data packet.

Alexander McCaig (01:12):

What's this about? I didn't even know what we're talking about.

Jason Rigby (01:14):

This is about data scientists who think they can predict the next January 6th. January 6th was the attack on the Capitol, so this is a Washington Post article; the Washington Post reports on this. It says, "For many Americans who witnessed the attack on the Capitol last January 6th, the idea of mobs of people storming a bedrock of democracy was unthinkable." They say-

Alexander McCaig (01:34):

Mobs?

Jason Rigby (01:35):

Mobs.

Alexander McCaig (01:36):

Hold on. Mobs? It's the bedrock of democracy. The bedrock. There's nothing democratic that happens over there. It's borderline fascist. That's absolutely hysterical, whoever wrote that.

Jason Rigby (01:54):

No, no, I like this. Here's the part here. It says, for many Americans who witnessed the attack on the Capitol, the idea of mobs. It's like, these people are still Americans. They were maybe crazy, maybe a little too pissed off, but they're Americans.

Alexander McCaig (02:08):

Still Americans.

Jason Rigby (02:08):

They weren't... And then like you said, the bedrock of democracy.

Alexander McCaig (02:13):

Oh my God. That's classic.

Jason Rigby (02:15):

Was unthinkable.

Alexander McCaig (02:16):

Well...

Jason Rigby (02:17):

This is like a teacher that wrote this or something. You know those prudish teachers.

Alexander McCaig (02:19):

I'm going to analyze this right now. I haven't even read the article; I didn't even know what we were talking about. This is what's going to happen, I'm going to call all the shots on this. Ready?

Jason Rigby (02:26):

Yes.

Alexander McCaig (02:26):

The data scientists analyzed social media data before the event to get a sentiment analysis of what people were going to do, then cross-referenced all their times and comments, everything that was going on, and can predict the next time there will be social unrest and where it will be.
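
The pipeline Alexander guesses at here, scoring post sentiment and cross-referencing it by place, could look something like this toy version. It is a hypothetical sketch only; the word list, locations, and threshold are all invented, and a real system would use a trained sentiment model rather than a keyword lexicon:

```python
from collections import Counter

# Invented "angry" lexicon for illustration; not a real model's vocabulary.
ANGRY = {"storm", "fight", "rigged", "corrupt", "traitor"}

def sentiment_score(post):
    """Fraction of a post's words drawn from the angry lexicon."""
    words = post.lower().split()
    return sum(w in ANGRY for w in words) / max(len(words), 1)

def flag_hotspots(posts, threshold=0.2):
    """Group (location, text) posts and flag places with high average anger."""
    totals, counts = Counter(), Counter()
    for location, text in posts:
        totals[location] += sentiment_score(text)
        counts[location] += 1
    return [loc for loc in totals if totals[loc] / counts[loc] > threshold]

posts = [
    ("dc", "the election is rigged and corrupt"),
    ("dc", "we will storm the capitol and fight"),
    ("omaha", "lovely weather for a picnic today"),
]
print(flag_hotspots(posts))  # ["dc"]
```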

Jason Rigby (02:42):

Well, the worst part about this is...

Alexander McCaig (02:43):

Was I close?

Jason Rigby (02:44):

Yes, you're close.

Alexander McCaig (02:45):

Thank you.

Jason Rigby (02:47):

Machine learning's evolved, which of course you would know. The methods of machine learning: finding political violence, insurrection. And they've actually been working on it for a few years, even before this. He says, "We now have the data and the opportunity to pursue a very different path than we did before." Clayton Besaw helps run Coup Cast, a machine learning-driven program based out of the University of Central Florida that predicts the likelihood of coups and electoral violence for dozens of countries each month.

Alexander McCaig (03:17):

I'd like a good coup.

Jason Rigby (03:20):

Here comes the dichotomy and this is the problem that we have in society right now.

Alexander McCaig (03:24):

What's that, Jason?

Jason Rigby (03:26):

Is, and I want to go on a tangent on this a little bit, and then we'll get back to the article. We have a technology, I would say social media is more powerful than nuclear.

Alexander McCaig (03:41):

Oh.

Jason Rigby (03:41):

I would say social media is creating hardcore biases because the algorithm...

Alexander McCaig (03:48):

Reinforces the bias.

Jason Rigby (03:49):

The algorithm reinforces the bias over and over and over again, it keeps pursuing this. And I shared this: if you watch police violence videos, you're going to think all cops are bad. If you watch amazing things that police do, where they help people and save people... There was a data scientist who ran a study where he watched one hour of each, and he said, "I knew I was in a study and I was still getting emotionally charged."

Alexander McCaig (04:15):

That's unbelievable.

Jason Rigby (04:16):

"Either way, I was still getting swayed, and I knew it in my brain." That's how powerful social media is. We've given our sovereignty up to this. You have AI. Now, AI can do a lot of amazing things. If you have Neuralink and you have AI, we may be able to have consciousness experiences that are amazing in a VR world.

Alexander McCaig (04:37):

You don't even have to actually storm DC, you can do it in your mind.

Jason Rigby (04:42):

Exactly.

Alexander McCaig (04:42):

Check your avatar and storm the bedrock of democracy.

Jason Rigby (04:45):

Could you imagine if they could change the neurochemical in our brains...

Alexander McCaig (04:50):

You can with electro stimulus.

Jason Rigby (04:51):

Yes, through electro stimulus, and then allow us to automatically be in a big delta wave meditation. We could put the goggles on, put whatever apparatus on-

Alexander McCaig (05:01):

Dude, I'd be in a delta wave headset.

Jason Rigby (05:02):

... because they have haptic sensory and all that stuff. And then next thing you know, boom, within a few seconds you can meditate for 10 minutes and you're completely charged like you're a monk that's meditated for two hours. We have potential, technology can be potential.

Alexander McCaig (05:16):

Where are you going with the potential then? Tell me what...

Jason Rigby (05:19):

The problem with this, I think, is it's the human. We want to blame Facebook.

Alexander McCaig (05:27):

It's not.

Jason Rigby (05:28):

We want to blame the mob. We'll get into more of the cool technology shit with here, but I want to get philosophical with this. If you...

Alexander McCaig (05:38):

Okay.

Jason Rigby (05:39):

I'll tell you one more thing. There was that comedian who said all he did on YouTube was watch cat videos, and all that popped up was cute cat videos, over and over. That's all he watched, and that's all that popped up. So why is all this other crazy stuff popping up on our feeds?

Alexander McCaig (05:57):

The algorithms are designed to self-reinforce. We, though, as human beings, reinforce our choice to watch; the algorithm is secondary. You can blame the tech, but you've made conscious choices to repetitiously continue watching these things, thus reinforcing that algorithm to feed you things that are even more catalyzing.

Jason Rigby (06:28):

Bro, it's that simple. It's one thing. The algorithm's all about one thing, engagement.

Alexander McCaig (06:32):

That's it. And you're giving it what it wants through your choice, but you are not recognizing the fact that you are reinforcing the negativity or the positivity that comes to you through the reflection of that media, through the consumption of that visual media. It's all you, it's no one else.

Jason Rigby (06:54):

At TARTLE we talk about sovereignty and personal responsibility and all these things. But when it comes to something like this, no one wants this. It happened in Seattle, where they took over a block. We don't need things like this.

Alexander McCaig (07:09):

We don't need any formed violence.

Jason Rigby (07:14):

We don't need coups in, wherever it may be.

Alexander McCaig (07:15):

Wherever, it doesn't matter.

Jason Rigby (07:16):

Brazil or wherever, or Venezuela having their issues, or Australia, which has the most horrendous lockdown known to man; everybody's talking about that. Now in the UK, you've got thousands and thousands of people on the streets, protesting, not rioting. I made a mistake with that word, riot.

Alexander McCaig (07:32):

Actually, people blend those together.

Jason Rigby (07:34):

They do all the time and it's wrong.

Alexander McCaig (07:35):

It's wrong.

Jason Rigby (07:36):

Because protesting is amazing and it's the freedom that we have and that's awesome. Rioting is a whole different thing.

Alexander McCaig (07:41):

Totally different animal.

Jason Rigby (07:42):

See, there I am getting influenced by the "People are rioting" headlines. But when we look at Clayton Besaw and this Coup Cast, it's a great... Tea cast, but coup cast. He said there have been a ton of alarms sounding that he's seeing in the United States, and he's even worried about a military coup in 2024.

Alexander McCaig (08:16):

All right. Fine. Fuck it. Here we go, you ready? You and I know of...

Jason Rigby (08:22):

You and I.

Alexander McCaig (08:24):

Know a large breadth of people, many different backgrounds, military, non-military what have you.

Jason Rigby (08:30):

Right.

Alexander McCaig (08:32):

What are they all doing right now? They're buying guns.

Jason Rigby (08:36):

Yes.

Alexander McCaig (08:38):

Tons of them.

Jason Rigby (08:38):

People that were on the left have asked me, "Hey bro, what type of gun should I get?"

Alexander McCaig (08:43):

Okay.

Jason Rigby (08:43):

You know how many times I've got that?

Alexander McCaig (08:44):

There is this underbelly talk going on of a new White House, of civil unrest that's going to happen, of separations of government within the United States. If you're paying attention to what these people feel is a reality, they're self-reinforcing that reality in those groups. And if you talk to many different individuals, there is a great sense, here in the US alone, that a major coup is going to happen.

Jason Rigby (09:26):

Yes.

Alexander McCaig (09:27):

Borderline civil war.

Jason Rigby (09:29):

And social media's just the kindling.

Alexander McCaig (09:32):

It's just the kindling.

Jason Rigby (09:33):

It's that dry moss.

Alexander McCaig (09:34):

But every time people meet up, they're having conversations,

Jason Rigby (09:37):

Facebook groups.

Alexander McCaig (09:38):

... backdoor trading of arms, all that other stuff.

Jason Rigby (09:42):

I have somebody that put me on this text thing with everybody else that sends me all these crazy videos-

Alexander McCaig (09:46):

Crazy stuff.

Jason Rigby (09:47):

... conspiracy theory videos. And it's a big text thread.

Alexander McCaig (09:50):

Here's the thing: we know it's crazy and illogical, but the people who are on it are self-reinforcing it for the others who are self-reinforcing it, and that then becomes the reality. It's the same damn thing that happened in Nazi Germany with Goebbels: if the lie is big enough and told enough times, it becomes the truth.

Jason Rigby (10:10):

Yes.

Alexander McCaig (10:11):

People are feeling this subcultural agitation anti-government movement and they're actually physically amassing themselves in groups and arming themselves and taking to social media to reinforce whatever it is that they want to get done. That is what this algorithm is predicting.

Jason Rigby (10:36):

And then if you want to add more kindling or more mines-

Alexander McCaig (10:39):

There you go.

Jason Rigby (10:39):

... you put a pandemic in there, that's...

Alexander McCaig (10:42):

These groups that are already anti-government, and you put a pandemic in where government authorities are telling people how to operate, you are poking the bear.

Jason Rigby (10:53):

Speaking of the bear, here is their design on AI, their model, because you were saying social media. Here's what they do: they have quantifiable variables, of course. A country's democratic history, they take their history into play; the democratic backsliding, they have that in parentheses, backsliding; the economic swings; social trust levels, which is what you were talking about.

Alexander McCaig (11:18):

In the US, do you know what we have? Zero value. Zilch.

Jason Rigby (11:23):

Transportation disruptions, weather volatility, and then he says, and others, whatever his others are. The art of predicting political violence can be more scientific than ever, he says, with data.
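
In the simplest case, the variables listed here (democratic backsliding, economic swings, social trust, transportation disruptions, weather volatility) could feed a logistic risk score. This sketch is purely hypothetical; the weights and bias are invented for illustration and are not Coup Cast's actual coefficients:

```python
import math

# Invented coefficients for illustration only.
WEIGHTS = {
    "democratic_backsliding": 2.0,
    "economic_swings": 1.2,
    "social_trust": -1.5,  # higher trust lowers predicted risk
    "transport_disruption": 0.6,
    "weather_volatility": 0.4,
}
BIAS = -2.0

def unrest_risk(features):
    """Logistic risk score in [0, 1] from feature values normalized to [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

low = unrest_risk({"social_trust": 0.9})
high = unrest_risk({"democratic_backsliding": 0.9, "economic_swings": 0.8,
                    "social_trust": 0.1, "transport_disruption": 0.5})
print(f"{low:.2f} vs {high:.2f}")  # roughly 0.03 vs 0.71
```

Even a toy like this makes the hosts' point visible: the model is only as good as the features someone chose to quantify, and it says nothing about local factors it never measured.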

Alexander McCaig (11:35):

Honestly, it doesn't even seem that scientific, and I'll tell you why. If the weather's chaotic, you feel like your votes are not working for you, and you have a lack of freedom of movement, what the hell do you think is going to happen?

Jason Rigby (11:49):

But I would invite Clayton Besaw from the University of Central Florida: instead of making this huge AI model where you're just passive-aggressively trying to figure it out, why don't you just use TARTLE?

Alexander McCaig (12:00):

Just use TARTLE.

Jason Rigby (12:01):

It's way cheaper.

Alexander McCaig (12:02):

You don't even have to use the algorithm. All you've got to do-

Jason Rigby (12:06):

Just ask people.

Alexander McCaig (12:06):

... just ask, "Do you plan on rioting? Do you hate your government? Do you love your government?"

Jason Rigby (12:10):

One to 10, how agitated are you with the US government?

Alexander McCaig (12:13):

I don't know why they're trying to guess and predict when you can go on TARTLE and know for a fact. We ran a study the other day on asking-

Jason Rigby (12:19):

I wonder how much...

Alexander McCaig (12:20):

... people if they trust their government.

Jason Rigby (12:21):

I wonder how much money they spent on this?

Alexander McCaig (12:22):

So much.

Jason Rigby (12:23):

Millions and millions of dollars.

Alexander McCaig (12:24):

You can leave the grant, I'll give them a damn grant just to use our data. And it'll be, wow, it's wasting my time.

Jason Rigby (12:30):

It's funny how everybody is into this model of... But what they're worried about, and this is the funny part, because there's certain things you can't predict, local factors that play into unrest. You don't know if there's somebody that's... If I would've said in two thousand and... Let's go back to 2008-

Alexander McCaig (12:54):

Ah, I'm charged.

Jason Rigby (12:56):

... how old were you in 2008?

Alexander McCaig (12:59):

[inaudible 00:12:59], credit, default. Ah, it hurts.

Jason Rigby (13:01):

2008, if I'd have said, "Hey, Texas is going to secede from the United States, or Donald Trump, the reality star guy, is going to be president," which would you have predicted to be more reliable?

Alexander McCaig (13:17):

Texas.

Jason Rigby (13:18):

Of course. And look what happened? The local factors, you can't predict things.

Alexander McCaig (13:24):

You can't, there's no predicting.

Jason Rigby (13:27):

I think Texas is still going to, if they can figure out a way to do it, they're going to do it.

Alexander McCaig (13:30):

Of course they will, why wouldn't they? If I were to chalk off the United States, you're going to have the Northeast, you're going to have North Central, you are going to have all of the West Coast and you're going to have Texas, New Mexico, Louisiana, Florida area all as their own governments.

Jason Rigby (13:51):

Yes.

Alexander McCaig (13:54):

You know how I can tell? The cultures. That's all it is.

Jason Rigby (13:58):

It's all it is.

Alexander McCaig (13:59):

It's all [inaudible 00:14:00].

Jason Rigby (13:59):

The South has a whole different culture; it's been this way. Texas has their own culture, the Midwest has their own culture. The West Coast, everybody jokes and calls it the Left Coast, they have their own culture. The Northeast, New York and all that, has its own culture.

Alexander McCaig (14:12):

It has its own culture.

Jason Rigby (14:13):

New York City, in and of itself could be a fucking country.

Alexander McCaig (14:16):

I know. But this is the point, it's not going to separate because of weather, it's not going to separate because of transportation, it's only going to separate... It'll delineate itself on culture and nothing else.

Jason Rigby (14:31):

And doing risk assessments, like he talks about, on electoral violence...

Alexander McCaig (14:34):

Oh my gosh. Is the next breakdown of the United States, just to become the states, going to be fueled by social media and memes? Are memes what broke the camel's back?

Jason Rigby (14:50):

Dude, people do not realize how powerful memes are. How many memes do you get a week? How many memes do you see a week? How many texts-

Alexander McCaig (14:59):

They're all memes.

Jason Rigby (15:00):

... of memes do you get?

Alexander McCaig (15:01):

They're all memes.

Jason Rigby (15:02):

How many memes do you...

Alexander McCaig (15:04):

Make? Send out.

Jason Rigby (15:05):

Yeah, send out.

Alexander McCaig (15:06):

I do all the time.

Jason Rigby (15:08):

We all do, across all countries.

Alexander McCaig (15:11):

Here's the interesting part about a meme. Remember, everyone's like, a picture's worth a thousand words. What happens when you tell people what the picture is? Game changer.

Jason Rigby (15:20):

Yes.

Alexander McCaig (15:21):

Okay.

Jason Rigby (15:22):

And to close this out.

Alexander McCaig (15:24):

What are we talking about? Oh yeah, the algorithm.

Jason Rigby (15:27):

Because this is a long thing. The Pentagon, the CIA, and the State Department have been moving in this direction too.

Alexander McCaig (15:35):

What direction?

Jason Rigby (15:37):

Wanting this data.

Alexander McCaig (15:38):

Why don't they just buy it off the American public?

Jason Rigby (15:41):

The State Department in 2020, while everybody was worried about the pandemic, created a center for analytics. The CIA started to hire AI consultants, and the military has a ton of new projects looking at this.

Alexander McCaig (15:56):

You know what that tells me? The government authorities aren't that intelligent, because they're using AI when you can simply just ask somebody what they're going to do. It just tells me that they're just really not ahead of the game.

Jason Rigby (16:07):

Which is funny, there's this new thing that General Glen VanHerck came up with-

Alexander McCaig (16:12):

Cool name.

Jason Rigby (16:13):

... with NORAD and NORTHCOM. He's the NORAD and NORTHCOM commander. They have models now, AI models and software tools, that determine in advance which US actions, this is public, which US actions might upset China. We've got to do a whole episode on that.

Alexander McCaig (16:31):

Don't do that.

Jason Rigby (16:34):

I got a friend in the military, a mathematician. He spends his days analyzing culturally what's going on in an area to see what the political unrest will be, to see if it's advantageous for the military to actually move in. They analyze cultural movements, they analyze the weather, and they sit there and look at the math and run models, and they're like, there's a weak point here. Oh, this is a good spot. And they run essentially scenarios on cultural changes all day long. And they pass the data off, and then that's when the generals or what have you are like, okay, here's our plan, this is the best time for us to do this, this is what's going on. The military, data driven; politics, data driven. What I would ask is that people stop predicting shit and start knowing for a fact.

Jason Rigby (17:33):

Well, this is the last statement, the last paragraph-

Alexander McCaig (17:36):

Go ahead.

Jason Rigby (17:36):

... and we'll end on this, because this is funny. He says, of the advocates who talk about this program: "But there's enormous unrealized potential to use data for early warning and action. I don't think these tools," this is a quote from him, "I don't think these tools are just optional anymore." But he says this because he doesn't know about TARTLE. He also says, "It's not perfect and it can be expensive."

Alexander McCaig (17:59):

You know how you make it much cheaper and perfect? Go to TARTLE.