Tartle Best Data Marketplace
June 28, 2021

Bio-Revolution Peril - Biological Science and Big Data Analytics

BY: TARTLE

Bio-Forks in the Road

Entertainment is full of examples of technology gone wrong. Every dystopian sci-fi movie makes use of this to some degree. Either technology runs amok and enslaves humanity as in The Terminator or The Matrix, or we become so enamored of a technology we enslave ourselves to it as in Gattaca. In still others, technology becomes a tool that is used to suppress humanity, most famously in the novels 1984, Brave New World and Fahrenheit 451. And if we are honest, we can look to all of these examples and see parallels with technological development today.

That’s because there is no such thing as a free lunch. Everything comes with some sort of trade-off or a dark side. It will always be possible to take an objective good and pervert it into something destructive. The very real-life development of nuclear power is a poignant example. Nuclear power, even from the old-school, brute-force fission reactors that are still the most common, supplies an enormous amount of electricity; a single large reactor generates on the order of a thousand megawatts, around the clock. And it does this with no carbon emissions on the production end. The only thing stopping nuclear from producing more is the relatively small number of plants, with fewer than a hundred reactors operating in the United States.

However, with all that promise comes the proverbial dark side, which Hiroshima and Nagasaki experienced first-hand in 1945. While no nuclear weapon has been used in war since then, the threat has loomed over the world like the Sword of Damocles. Trillions have been spent developing ever more powerful nuclear bombs and methods to deliver them. Trillions that could have been spent researching fusion reactors, an even more powerful energy source with a fraction of the radioactive waste of fission. Instead, fusion research led to the hydrogen bomb, a type of nuke that makes Fat Man and Little Boy look like glorified firecrackers.

We stand at a similar technological fork in the road today. As our knowledge of genetics and our ability to manipulate genes grow, we will face difficult choices about how to use this technology. The same technology that could eliminate genetic predispositions to various diseases could also be used to trigger those predispositions in others. Slowing down or eliminating aging could create a world of selfish would-be immortals actively preventing the birth and development of future generations. The same technology that creates a new vaccine could create a new virus to unleash on an unsuspecting world.

Less dramatic is the idea that companies will simply use these advancements to control whole markets in new ways. Take genetically modified crops. While GMOs have been a great help in getting food to grow in environments that have typically been hostile, allowing more to be grown for and by people in challenging regions, there has also been a cost. Some companies, like Monsanto, control aspects of the GMO market with an iron grip. They do this either by engineering seeds that won’t germinate a second generation, or, in the case of seeds that do, by suing farmers for violating their “intellectual property” when those GMO seeds spread into a neighboring field. That kind of action can kill a farmer’s business. In the case of the non-germinating seeds, a farmer is forced to buy fresh seed every year instead of, as in the old days, growing this year’s crop from last year’s. That keeps prices artificially high and puts farmers at risk should bad weather kill enough of their crop that they can’t afford to buy new seed.

The point is that we have to be very careful with how we use our technology. It can often be used to destroy rather than help others. Not only that, the destructive option is usually the easier one in the short term. Just look at fusion again. We built a bomb with it decades ago but we still haven’t figured out how to make a commercially viable fusion reactor. 

Just as our choices with nuclear power defined much of the world for the latter half of the twentieth century, so our choices with genetic modification will define the world for what’s left of the twenty-first. We must choose, and choose wisely.

What’s your data worth? Sign up and join the TARTLE Marketplace with this link here.


Feature Image Credit: Envato Elements

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Speaker 1 (00:07):

Welcome to TARTLE Cast, with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future and source data defines the path.

Alexander McCaig (00:24):

Oh, somebody manipulate my genes, so I don't have to sleep and I can look young forever.

Jason Rigby (00:28):

Well, I've had the Moderna vaccine, the first shot. I'm getting the second shot in a couple of weeks. So I've been a part of this.

Alexander McCaig (00:34):

Yes, you have.

Jason Rigby (00:35):

Bio, what's it called? Revolution.

Alexander McCaig (00:37):

Yes. The bio revolution. And this whole bio revolution is founded on one thing - data. We have the technology to do things, like the CRISPR machine for gene sequencing, right? And also splicing genes together. But it requires a huge amount of data to analyze that, and also needs a lot of inputs so the algorithms know what to do effectively. Now, I'll just make this thing real simple here. This article comes out, talks about the bio revolution. Not only is it fantastic, it uses COVID as an example with the vaccines, getting all this data together quickly, somebody makes something. But the real gist of it is now that we have this wonderful technology and the availability of data, and for us to start engineering bio organisms or bio materials, we need to be careful. Because with all-

Jason Rigby (01:28):

Why is that, Alex?

Alexander McCaig (01:29):

Well, let's think about... We've used this example with nuclear technology. A lot of benefits, but we made a bomb out of it, right?

Jason Rigby (01:34):

Mm-hmm (affirmative).

Alexander McCaig (01:35):

So if you start engineering organisms, like mosquitoes, so they don't carry malaria or stuff like that, you got to make sure that this stuff doesn't get out of control or used with ill intent, right? Or if you're using a CRISPR machine to manipulate genetics to create a monster that is only there to harm people or anything of that nature.

Jason Rigby (01:56):

Cyborg warriors.

Alexander McCaig (01:56):

Yeah, cyborg warriors. So the long and short of it, okay... It's like, what does it morally and ethically boil down to for how you're going to use this, frankly, quite amazing technology, to elevate humanity for good rather than have it be used against us? And McKinsey, they're all about writing reports that sell risk. That's what it is. Let's talk about risk. That's the only thing we're selling here. They're a big consulting firm. And they were just trying to highlight some of those risks. But one of the biggest things that they keep highlighting is gene research. And for a long time, we've had tons of data. And because of our ability to process these huge sets of data now, with all of our cloud computing power, the ability for gene manipulation, to actually engineer things-

Jason Rigby (02:44):

Yes.

Alexander McCaig (02:44):

Genetically engineered life... is much more of a possibility than it was when they cloned that first sheep.

Jason Rigby (02:51):

Well, he talks about... And this article is great. The scope of today's bio-innovation wave is large. Some 60% of physical inputs to the world economy are either already biological or could be produced using biological processes in the future. And he uses the example of nylon.

Alexander McCaig (03:04):

Yeah. Nylon. It can be made from a yeast strain.

Jason Rigby (03:06):

Yes. Genetically engineered yeast.

Alexander McCaig (03:08):

Yeah. So what happens is the transition away from petroleum products.

Jason Rigby (03:11):

Right.

Alexander McCaig (03:12):

Okay. So we're transitioning away from petroleum products. Now that we have this new substitute, how will it be adopted? How will it affect the oil and gas industry?

Jason Rigby (03:20):

And fewer greenhouse gases [inaudible 00:03:22].

Alexander McCaig (03:23):

Right. So there's some of the benefits. But then he drops down and he goes into CRISPR-Cas9.

Jason Rigby (03:25):

Yeah. CRISPR-Cas9 stands out as an increasingly accessible technology for manipulating genetic material, and is complemented by rapid, low-cost genetic sequencing and advances in data analytics that enable scientists to understand biological processes better.

Alexander McCaig (03:37):

So this is literally exactly what I was just saying.

Jason Rigby (03:39):

Yeah. When we talked about them with the plants.

Alexander McCaig (03:41):

Yeah.

Jason Rigby (03:42):

And we had two episodes on plants.

Alexander McCaig (03:44):

I'm a proponent for genetic engineering.

Jason Rigby (03:46):

Right.

Alexander McCaig (03:47):

Why wouldn't you?

Jason Rigby (03:48):

Yes.

Alexander McCaig (03:49):

Why wouldn't you want to have healthier children, healthier babies, people that live longer? You have more robust crops. Ones that carry a higher nutrient density.

Jason Rigby (03:57):

But what he's talking about is you can buy CRISPR kits online now.

Alexander McCaig (04:00):

Mm-hmm (affirmative).

Jason Rigby (04:02):

So anyone with a degree of biological knowledge can potentially create and release a new living entity, including harmful bacteria or viruses.

Alexander McCaig (04:07):

I don't know much about CRISPR kits. Do you?

Jason Rigby (04:10):

No. And he said this. "Biological organisms are self-replicating, self-sustaining and interrelated. Moreover, as the rapid global spread of COVID-19 has demonstrated, they do not respect political borders. For example, so-called gene drives applied to infectious-disease vectors (such as Anopheles mosquitoes in the case of malaria) could save many lives, but we may not be able to control them." [crosstalk 00:04:31] So these genetically edited mosquitoes, in one [inaudible 00:04:33] in Brazil, were supposed to die. Because they tried this.

Jason Rigby (04:37):

But five years later, they're still breeding.

Alexander McCaig (04:39):

Well, yeah. The thing is, great. You've created it. Put a fail-safe in place.

Jason Rigby (04:42):

Mm-hmm (affirmative).

Alexander McCaig (04:43):

If we're just starting off with all this data that is helping us genetically engineer things, put some fail-safes in place.

Jason Rigby (04:50):

Yes.

Alexander McCaig (04:51):

If you're going to start to do really creative stuff, if you're really starting to create?

Jason Rigby (04:55):

Yes.

Alexander McCaig (04:56):

In an evolutionary stance? Just make sure you're like, "Okay, I want to do the least amount of harm as possible." Right? "I know I'm doing this for good. What's the fail-safe to prevent harm?" And in these CRISPR kits, they blew it out of proportion in this article. It's just for bacteria in your house. So you can actually stitch different types of bacteria together and get them to grow. That's all.

Jason Rigby (05:14):

Yeah. But could you create a harmful bacteria? That I don't know.

Alexander McCaig (05:18):

I mean, I can go walk out in a swamp and probably find thousands more and then go deliver that anywhere over here.

Jason Rigby (05:22):

But the main thing that I liked about this article, and that we specialize in, is data privacy.

Alexander McCaig (05:27):

Mm-hmm (affirmative).

Jason Rigby (05:27):

And he talks about, "Yeah, we have a huge debate with technology companies using personal data, such as purchasing habits, social media activity." That's what it is now.

Alexander McCaig (05:35):

Yeah.

Jason Rigby (05:35):

But he goes... And his future thinking, I love this. How are we going to look at data privacy when it comes to biological data from our bodies and brains?

Alexander McCaig (05:43):

We've been talking about this. So-

Jason Rigby (05:44):

And the intimate knowledge that these companies are going to have about us.

Alexander McCaig (05:48):

No. No. So it's not that they're going to have it. It's the fact that you're going to give it to them. So on TARTLE, your IoT data, like your WHOOP device-

Jason Rigby (05:54):

Right.

Alexander McCaig (05:54):

Apple, Fitbit, all those things, along with your genome sequence that you have from 23andMe, Ancestry.com, you name it. Along with your entire health record from your hospital, right? That you've gone to, or your doctor or your primary care provider... The combination of those things will be in your control in the future.

Jason Rigby (06:10):

Yes.

Alexander McCaig (06:10):

Obviously this guy's never heard of TARTLE.

Jason Rigby (06:12):

No.

Alexander McCaig (06:12):

So someone needs to shake him and wake him up, right? But when you have sovereignty over those things, which you rightfully own, because it's your DNA, it's your habits. So like all these things, this is what you do. These are yours. Your actions, you're responsible for them. You'll be the one that hands them off into the future for research. They will come to you and ask you for permission to use these things rather than using them without you knowing. And the more people-

Jason Rigby (06:35):

Well, I mean, that's the whole idea with abortion clinics taking those disposed fetuses and then using them for biological studies. [crosstalk 00:06:44] Without the... Did you ask that 16-, 19-year-old that came in there-

Alexander McCaig (06:49):

Don't put everyone in a box, but yeah. [crosstalk 00:06:51].

Jason Rigby (06:51):

You know what I'm saying.

Alexander McCaig (06:51):

A large majority of people that do get abortions can't manage a child or don't want to manage a child. Right?

Jason Rigby (06:56):

Yes, exactly.

Alexander McCaig (06:57):

Yeah. I wasn't trying to call you out on air, but I'm just saying-

Jason Rigby (07:00):

No, no. Yeah, yeah. No, I understand completely.

Alexander McCaig (07:02):

Yeah. I understand the example you were making.

Jason Rigby (07:05):

It could be a 45-year-old.

Alexander McCaig (07:06):

That's the thing. Where's the consent? Where's the ask? Remember you don't own that placenta or whatever the hell that thing is you were talking about, right?

Jason Rigby (07:12):

Right.

Alexander McCaig (07:12):

That's for somebody else.

Jason Rigby (07:13):

Yes. [crosstalk 00:07:14].

Alexander McCaig (07:15):

Just ask. All you got to do is ask.

Jason Rigby (07:16):

Yeah.

Alexander McCaig (07:17):

And so the future is about consent.

Jason Rigby (07:21):

Yes.

Alexander McCaig (07:21):

And privacy and consent are going to go hand in hand. And as you begin to control more of this biological, IoT, behavioral, and health data on yourself, you're going to be the one that hands it off to these research centers and they'll compensate you for it. And you choose to give it to them when you choose to do so. It's like, I wouldn't want to give my genome sequence to the government of China. I don't like how they use big data against the people within their own-

Jason Rigby (07:50):

Yeah, within their own [crosstalk 00:07:52]-

Alexander McCaig (07:51):

Society. Yeah. Like that's not cool.

Jason Rigby (07:53):

Right.

Alexander McCaig (07:53):

Right? It's not right. Morally and ethically it's not a good big brother thing going on. So I want the choice. I want to make sure I'm giving it to the people I want to give it to, when I choose to do so.

Jason Rigby (08:04):

Yeah. And I think when governments, scientists, businesses... And we get this decentralized movement with the public and that responsibility shifts from the elite, let's say, to the public, then you're going to see, I think, an outgrowth-

Alexander McCaig (08:22):

A bloom.

Jason Rigby (08:23):

I think... Everybody's so worried about, and we've talked about this a million times. Everybody's so worried about ROI or profits. But whenever you're a company and you have the ability to go to your user, the one that loves you, that uses your product, and you turn around and you go to them and ask permission? And we had a whole episode on this. It creates trust. And in that trust, guess what happens? They're going to understand what you're doing with their data. And in that term, with that trust involved, with them understanding completely what you're doing with their data? Now it's saying, "Oh, okay." If Facebook could turn around and say, "Hey, we're going to use your data to solve climate change."

Alexander McCaig (09:05):

Yeah.

Jason Rigby (09:06):

"And that's the number one thing we do. Can we use your data to solve climate change?" Yes, of course. Yes. Yes. You know, 80% of people are clicking yes. They may want to read how you're going to do that.

Alexander McCaig (09:16):

And the 20%, "No, screw you. I still don't trust you, Facebook."

Jason Rigby (09:18):

Yeah. It's like Amazon does the whole Amazon smile thing, where you could... And I give to Paws and Stripes, shout out to them. They take dogs and then train them to work with veterans with PTSD.

Alexander McCaig (09:33):

Yeah.

Jason Rigby (09:34):

When the dogs actually heal. You know as well as I do, we did a whole episode on dogs. But whenever you look at any type of organization that's like that, it's like, yeah. Now I feel good about my Amazon purchases.

Alexander McCaig (09:47):

Yeah.

Jason Rigby (09:47):

Because I know a percentage of it is going to an organization. And they send me emails and say, "Hey, this month you did $12.31," I think was the last one I got from last month. Yeah. There's not a lot of money, but still, guess what? I know I gave something to help out. And if hundreds and thousands and tens of thousands of people do that collectively, as we saw with Wallstreetbets-

Alexander McCaig (10:08):

Yeah.

Jason Rigby (10:09):

What can happen? [crosstalk 00:10:11] You can topple!

Alexander McCaig (10:12):

You can topple hedge funds. Think about how much a lack of consent, a lack of trust and a lack of transparency has slowed the evolution of humanity because people put profits first.

Jason Rigby (10:24):

Mm-hmm (affirmative).

Alexander McCaig (10:25):

They put resource control first and foremost.

Jason Rigby (10:28):

Well, think about government transparency with extraterrestrials.

Alexander McCaig (10:33):

So genetic engineering-

Jason Rigby (10:34):

[crosstalk 00:10:34] But you threw that up there, Alex.

Alexander McCaig (10:35):

There's a-

Jason Rigby (10:36):

We got a little alien ship going on right now.

Alexander McCaig (10:38):

Yeah. Let's consider this, right?

Jason Rigby (10:40):

Yeah.

Alexander McCaig (10:41):

Say you're millions of years advanced. Aliens are out there. Right? And they're like, "Okay, well we've been engineering biological organisms forever."

Jason Rigby (10:48):

Mm-hmm (affirmative).

Alexander McCaig (10:48):

"We've engineered ourselves to live longer." Right? Maybe they regrow an arm or something.

Jason Rigby (10:52):

Yeah. CRISPR? We did that.

Alexander McCaig (10:53):

Yeah. CRISPR?

Jason Rigby (10:54):

10 million years ago.

Alexander McCaig (10:55):

Please.

Jason Rigby (10:55):

Yeah.

Alexander McCaig (10:56):

We got CRISPR 10 million point oh. You know what I mean? But if we're looking at this, there's been accounts of like the Dulce Base up here [crosstalk 00:11:06] with genetic engineering.

Jason Rigby (11:07):

Right.

Alexander McCaig (11:08):

Where they're saying they were doing hybrid things between aliens and animals.

Jason Rigby (11:11):

Supposedly. Yeah.

Alexander McCaig (11:12):

Supposedly. Right.

Jason Rigby (11:13):

We're just having fun right now.

Alexander McCaig (11:15):

What do I know?

Jason Rigby (11:15):

Yeah.

Alexander McCaig (11:16):

But the thing is, this is like that fail-safe we talked about. McKinsey is talking about these hybrid aliens. If you have part bat person or whatever it might be.

Jason Rigby (11:25):

It's always funny. Everybody always says the aliens stink, but what if we stink to the aliens? [crosstalk 00:11:30]

Alexander McCaig (11:31):

Can we engineer the stink out of them? That's all they were doing at the Dulce Base.

Jason Rigby (11:35):

The aliens with their big beady eyes, they're like, "Do you smell a human?"

Alexander McCaig (11:38):

Yeah. And then humans are like, "Wow. Wow. These greys stink. This smells like a... Smells like a landfill over here."

Jason Rigby (11:48):

Yeah, exactly. It's so funny. [crosstalk 00:11:50].

Alexander McCaig (11:50):

Aliens are like, "There's no oxygen in space. You can't smell us."

Jason Rigby (11:52):

Because that's something that everybody... And I've been listening to, whether it be Elon Musk or Peter Diamandis, or all these future-thinking people. And they're like, "We're not even ready." I mean, to think... And this is my personal view. To think that there's millions and trillions of galaxies, and we're literally the only carbon-based [crosstalk 00:12:12] consciousness around?

Alexander McCaig (12:12):

That is the most ignorant statement I've ever heard in my entire life.

Jason Rigby (12:14):

It's silly. So I personally believe... But whenever I see the "I Want to Believe" poster from The X-Files... I love it.

Alexander McCaig (12:21):

Yeah. X-Files took that from the Billy Meier case. Shout out to Billy Meier. [crosstalk 00:12:24] [inaudible 00:12:26] Switzerland.

Jason Rigby (12:27):

But whenever you look at that and you see that, now the question is yes, we believe in extraterrestrial life. How would we properly communicate with them?

Alexander McCaig (12:37):

Would they want to communicate with us with the way we act right now?

Jason Rigby (12:40):

No.

Alexander McCaig (12:41):

Would they even want to share a technology with us if we're such jackasses with nuclear technology right now?

Jason Rigby (12:45):

No. They saw the one powerful thing we had, and we kind of fucked that up. Pardon my French.

Alexander McCaig (12:50):

Yeah. You guys nuked yourselves? Like good for you.

Jason Rigby (12:55):

You could have used this to... You had this zero-emissions energy. That was your shot, guys. No, think about it. Think of this from the point of view of a species that's millions of years evolved from us.

Alexander McCaig (13:07):

Yeah.

Jason Rigby (13:07):

And they're like, "Oh, okay." Because that's when supposedly alien aircraft were running around [crosstalk 00:13:13], in the 1940s and 50s-

Alexander McCaig (13:15):

Project Blue Book and all of [crosstalk 00:13:17], figure it out.

Jason Rigby (13:17):

So they're like, "Oh, okay. This is really dangerous. They've figured this out."

Alexander McCaig (13:21):

Yeah.

Jason Rigby (13:21):

"Let's see. What are they going to do with it?" [crosstalk 00:13:24]. We could have done net zero, bro. Think about-

Alexander McCaig (13:26):

Dude, they're millions of years advanced, and we're like, "Ooh, let's smash this atom into this one, see what happens."

Jason Rigby (13:33):

What if the world would have got together and said, because this is the problem. Everybody goes back to this, and it's dumb. It's absolutely dumb.

Alexander McCaig (13:40):

Tell me.

Jason Rigby (13:41):

What if the world would have got together and said "Chernobyl? That was idiotic."

Alexander McCaig (13:46):

Mm-hmm (affirmative).

Jason Rigby (13:47):

"It wasted..." I mean, they're still trying to... There's some places [crosstalk 00:13:51].

Alexander McCaig (13:50):

Fukushima Daiichi. Why would you build it on the ocean like that? Idiotic.

Jason Rigby (13:54):

When it comes to... We understand that nuclear energy for the process that we have right now is ultra safe. How can we make sure that those things never happen again? And they've been doing that. Science has made it ultra safe. I mean, it's ultra safe now.

Alexander McCaig (14:08):

You get more radiation flying in an airliner across the United States than you would standing next to most reactors [crosstalk 00:14:14].

Jason Rigby (14:14):

Yeah, yeah. Exactly. Because those were stage ones and they're stage five now.

Alexander McCaig (14:18):

Probably get more radiation eating a banana.

Jason Rigby (14:20):

But I mean, we have this technology that's sitting there. This beautiful technology that would turn around and say... I mean, somebody's had to have done models on this, and we'll have to do some research on it, that said, "If we had all nuclear energy and it was inexpensive, around the globe-"

Alexander McCaig (14:36):

What would we look like?

Jason Rigby (14:36):

"What would that look like? And what's the risk for that?"

Alexander McCaig (14:39):

That'd be cool.

Jason Rigby (14:39):

Yeah. It'd be cool because then now you're seeing... Now they would turn around and say, if you believe in aliens like I do now, they would turn around and say, "Good job."

Alexander McCaig (14:48):

"Hey, we'll come down. We'll hang out for five minutes."

Jason Rigby (14:52):

Yeah.

Alexander McCaig (14:52):

"Don't screw it up."

Jason Rigby (14:53):

Yeah, exactly.

Alexander McCaig (14:56):

Otherwise it's going to be The Day the Earth Stood Still. We're going to have Gort out here and we're just going to start roasting stuff.

Jason Rigby (14:59):

But aren't all the movies the truth, though, when it comes to Independence Day or any of these others? It's like, "The generals get involved. Then there's a threat." And then it's like, why would [crosstalk 00:15:12] Go ahead.

Alexander McCaig (15:12):

Okay. I just want to think about this logically. Say a UFO shows up, okay? If us with our telescopes and everything are looking at galaxies that are so far away, and we can't even travel that distance, and you have an alien craft that shows up. And it's floating around. It's not making any noise. And you're like, "How the hell did this thing get here?" You're going to shoot at it? They obviously have the technology to waste away your entire planet instantaneously, but they don't choose to do so. [crosstalk 00:15:46] They can move across eons of space and time. We have trouble going across the United States without killing ourselves in an airplane. Or a train or a car. Think about it. And you're going to come at them with generals and all this other stuff? Don't be ridiculous. If they wanted to be negative, we would cease to exist.

Jason Rigby (16:03):

That simple.

Alexander McCaig (16:04):

Wouldn't that be just what it is?

Jason Rigby (16:06):

And why would they send one aircraft down? I mean, they could literally [crosstalk 00:16:11].

Alexander McCaig (16:10):

You've got a bunch of guys on like a 40 foot wide ship flying around. And they're like, "Just send one."

Jason Rigby (16:15):

I mean just the whole reactive monkey brain process, like sirens going off and everybody's shuffling together. And okay, we've identified this unidentified object and it's hovering over the lighthouse. Why does it have to all of a sudden go into red alert mode?

Alexander McCaig (16:32):

You want to know? Because we're irresponsible. We're reactive. We look to put out fires, and everything in this world is based off of a threat.

Jason Rigby (16:40):

Authoritative control, too.

Alexander McCaig (16:41):

Yeah. Authoritative control. Much like we talk about with data, we need to be responsible. And if we're looking at this CRISPR machine for genetic engineering, creating new biological forms of life, we need to be responsible. That's what it all boils down to. You and I talk about responsibility all day long. And if you do believe in aliens and want them to show up, well, prove to these people out there that are watching you that you are responsible. Prove to other human beings that you can be responsible. Prove to the Earth that you can be responsible to it. Show us with your data.

Jason Rigby (17:07):

Yeah. You're a data scientist, a scientist. You have the ability, just as scientists did in the 1930s and 40s. You have the ability with these massive amounts of knowledge that you've acquired. These massive amounts of... that I will never even understand; I can't even comprehend the beauty of your brain.

Alexander McCaig (17:27):

No.

Jason Rigby (17:28):

And when you have that much responsibility and it was given to you or you evolved to that point to where you are, that responsibility, it needs to weigh heavy on you.

Alexander McCaig (17:41):

Do the right thing.

Jason Rigby (17:41):

Because you have the potential to solve our greatest problems in the world.

Alexander McCaig (17:47):

Right.

Jason Rigby (17:49):

You have brain-

Alexander McCaig (17:50):

And we, who don't have that intellectual capacity, can come together to give you all the information you could ever need to solve it. If you have the ability to assimilate that knowledge and find the answer, we're here to come help you out.

Jason Rigby (18:04):

Yeah. Great job making an algorithm that gets people to buy more shit.

Alexander McCaig (18:08):

That is only worsening our problem. When I see investment dollars get pumped into new marketing schemes and stuff like that, or companies, "We have an AI thing on marketing." It's like, "You're just going to continue to fund this materialistic idea that is frankly crippling us right now?"

Jason Rigby (18:24):

And even if you are thinking dollars, because everybody thinks dollars. If you have a scientist that's on your payroll and he's free to be able to solve... Maybe solving climate stability in Europe-

Alexander McCaig (18:40):

Think about how many lives he saved. Don't pay him-

Jason Rigby (18:42):

What do you think his worth is in your company then? And what do you think... When he goes around and he speaks to all these world organizations, and he's giving a speech on this algorithm that he built or whatever it may be, this model. And he throws in, "I'm also an employee from blah, blah, blah." Why wouldn't you want that?

Alexander McCaig (19:04):

That's exactly what you would want.

Jason Rigby (19:06):

The worth is, dollar-wise, the worth is it's an intangible asset.

Alexander McCaig (19:10):

It doesn't matter. Who cares about the dollar value. This person is using their knowledge to frankly save this earth and save lives.

Jason Rigby (19:17):

And that needs to be-

Alexander McCaig (19:19):

That's what has to be championed.

Jason Rigby (19:21):

Yes.

Alexander McCaig (19:21):

And we have to be responsible to champion that. And we, as the collective of people of humanity, need to be responsible with our data and share it with those people that want to be responsible with it. And you have the choice to do so.

Jason Rigby (19:33):

And that choice is Tartle.co.

Speaker 1 (19:45):

Thank you for listening to TARTLE Cast, with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future, and source data defines the path. What's your data worth?