The Internet of Things (IoT) has expanded incredibly quickly. Believe it or not, there are over 25 billion IoT devices on the planet right now. That's right, billion with a 'b': more than three times the number of people. These devices include obvious things like Fitbits, smartwatches, and smart thermostats, and things that aren't so obvious, like refrigerators and cordless drills. All of these are constantly generating data, mountains of it. Information like your heart rate when jogging, or how quickly you drain the battery in the drill you got last Father's Day, is being collected and stored somewhere. The challenge isn't figuring out how to collect more data; it's figuring out how to make the best use of the data we already have.
There are three basic levels of data. The first is the raw data: everything collected by all of these devices. The second is metadata, the data about the data. The best-known example of metadata is the information attached to your digital photos, things like the time and place where a photo was taken and sometimes even the people in it. Finally, there is transformed or processed data, data that has gone through some sort of process to make the information usable on a larger scale.
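To make the three levels concrete, here is a minimal sketch in Python. The heart-rate numbers, the field names, and the 120 bpm spike threshold are all invented for illustration, not taken from any real device.

```python
# Level 1: raw data -- the unprocessed readings a device records.
raw_readings = [72, 75, 71, 140, 74, 73]  # heart rate in beats per minute

# Level 2: metadata -- data about the data.
metadata = {
    "device": "wrist-worn heart-rate monitor",
    "unit": "beats per minute",
    "sample_interval_seconds": 60,
}

# Level 3: transformed data -- processed into something usable at scale.
transformed = {
    "mean_bpm": sum(raw_readings) / len(raw_readings),
    "max_bpm": max(raw_readings),
    "spikes": [r for r in raw_readings if r > 120],  # readings worth a closer look
}

print(transformed["spikes"])  # [140]
```

The raw readings alone are just numbers; the metadata says what they mean, and the transformed level is what someone can actually act on.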
To help with that last part, artificial intelligence/machine learning (AI/ML) has been rushing to keep up with the growth of IoT. A lot of progress has already been made, with over 5 billion of those 25 billion IoT devices having onboard AI/ML. What's the advantage? Onboard AI/ML gets things processed faster, rather than waiting for a server to collate and analyze the data. Also, if the device detects signs of trouble, the user can be made aware of it sooner. Of course, processing data from groups still requires that the data be collected elsewhere, but for the individual, this could be a literal lifesaver.
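As a rough sketch of why onboard processing can alert a user sooner, consider checking each reading the moment the sensor produces it, instead of waiting for a server to collate a batch. The thresholds and the function name here are invented for illustration, not any real device's logic.

```python
def check_reading(bpm, resting=True):
    """Return an alert string if a single heart-rate reading looks abnormal."""
    if resting and bpm > 120:
        return f"Alert: resting heart rate of {bpm} bpm is unusually high"
    if bpm < 35:
        return f"Alert: heart rate of {bpm} bpm is unusually low"
    return None  # nothing to flag; the raw data can still be uploaded later

# On-device: each sample is checked as it arrives, with no server round trip.
for sample in [70, 72, 135, 71]:
    alert = check_reading(sample)
    if alert:
        print(alert)  # fires immediately on the 135 bpm sample
```

The group-level analysis still happens elsewhere, but the individual gets the warning without waiting for it.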
Not that AI/ML is always used as it should be. Too often it's treated as an infallible prediction tool, a way of generalizing about individuals. There are uses for that, but the problem lies in how these computer models are often treated as infallible oracles, even when broken down to the individual level. You can often get away with that with groups. However, when applied specifically to me or you, these predictions tend to break down. Just because you've always gone along with a certain group to buy the latest smartphone doesn't mean you will when the next model comes out. If the AI/ML software detects a break from the pattern like that, it will either ignore it or spend time and resources reanalyzing data, or gathering more, in order to explain the deviation.
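A toy sketch of that pattern-break problem: the purchase history, the median "habit," and the one-week threshold below are all made up. The point is only that a rigid model flags any deviation as something it must explain.

```python
# Days after launch that a customer bought each recent phone model.
days_to_purchase = [1, 2, 1, 1, 45]  # the last entry breaks the habit

history = sorted(days_to_purchase[:-1])
habit = history[len(history) // 2]  # rough median of past behavior
latest = days_to_purchase[-1]

# A rigid model treats any large deviation as a problem to be explained,
# spending resources on reanalysis instead of simply asking the person.
flagged = abs(latest - habit) > 7  # arbitrary one-week threshold

print(flagged)  # True
```

Once `flagged` is true, the software's only options are to ignore the deviation or to go hunting for an explanation in more data.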
TARTLE has a novel approach to this kind of situation. Instead of spending a lot of time, energy and resources creating software to figure out why you didn't go to the latest Marvel movie on opening day like you normally do, we thought we might just (brace yourself) ask you. That's right, we, or the client purchasing your data through TARTLE, would simply say, "The data you sell us shows that you normally get together with your friends and go see superhero movies on opening day. This time, you waited three weeks and apparently went by yourself. Any particular reason?" We don't have to guess, and even better, you don't have to answer. If you decide you are just fine keeping your reasons to yourself, you are free to do so. That's the beauty of TARTLE: you get to decide what to share, when, and why. Yes, that even applies to IoT devices. Just include compatible devices on your TARTLE account and you then have control over the data those devices are sending out. Not so much as a byte goes out without your say-so. TARTLE puts you back in control.
What’s your data worth?
Speaker 1 (00:07):
Welcome to TARTLE Cast, with your hosts Alexander McCaig and Jason Rigby. Where humanity steps into the future and source data defines the path.
Alexander McCaig (00:27):
You're wearing a whoop band.
Jason Rigby (00:30):
Whoop, there it is.
Alexander McCaig (00:31):
And whoop, there it is. Joke again. Same joke, two episodes in a row.
Alexander McCaig (00:37):
I need to cover some new material, sorry. I will, I'm going to work on it.
Alexander McCaig (00:41):
Tonight, I want you to think about it.
Jason Rigby (00:42):
My material is almost as bad as my presentations. I like my B stuff in there, though.
Alexander McCaig (00:48):
No, listen, I like the B stuff.
Jason Rigby (00:51):
So, this is funny. So, I did a marketing presentation for TARTLE. I love to be real personal with people, so getting my brain, getting all of this down into a presentation, to make it make sense, I can do it speaking-
Alexander McCaig (01:08):
You can speak it to me, yeah.
Jason Rigby (01:10):
Yeah, I can take some of your information, and you guys should listen to Higher Density Living plug. I can take your information and put it down into that level, but to actually like type something out and get it written. I can do audio. I speak that way.
Alexander McCaig (01:28):
I felt like your ideas were everywhere. I'm like, "Where did the continuity go in this?"
Jason Rigby (01:32):
Could you imagine? No, here's what you may think. This is what I was thinking when you told me that. I've prefaced all of that, and you know all of that.
Alexander McCaig (01:43):
So could you imagine-
Jason Rigby (01:44):
Somebody that has no idea?
Alexander McCaig (01:45):
... that has no idea and looking and then looking at that.
Alexander McCaig (01:47):
They'll be like, "Two plus two is, you're saying is equaling five, Jason?"
Jason Rigby (01:56):
That doesn't make any sense.
Jason Rigby (01:56):
No, that's what I mean. So, that's just a little edit that's going to be required of it.
Alexander McCaig (01:56):
This is how we do TARTLE business from now on. On camera.
Jason Rigby (02:03):
We're going to tell everybody about the mistakes.
Alexander McCaig (02:03):
Everybody's coming on here, because the title is going to talk about Internet of Things and then they're going to be like, "They're having just a little business meeting."
Alexander McCaig (02:08):
They'll learn about our imperfections.
Alexander McCaig (02:09):
Yes. I love it. That's what I like.
Alexander McCaig (02:11):
It's all growth.
Jason Rigby (02:13):
That's what TARTLE's about.
Alexander McCaig (02:15):
That's why we have data, because of imperfections.
Jason Rigby (02:17):
That's precisely correct. We can mention those.
Alexander McCaig (02:20):
I like that.
Jason Rigby (02:21):
So, get into IoT and whoop.
Alexander McCaig (02:24):
Yeah. So IoT means Internet of Things. What it's talking about is all these devices now, because wireless card chipsets have become cheaper and Bluetooth technology is cheaper, smaller devices that otherwise didn't have the ability to connect themselves to the internet, that were quite static, have now become more dynamic, and we can wear them and integrate them into all of these different things. Like for instance, this band right here collects about 128 megabytes a day, I believe, somewhere around there. It collects more heart rate data than anything else, right? It's got my respiratory rate. It's got the variance in my heart rate, so it's testing the electrical signals.
Alexander McCaig (03:03):
And that is interesting, because this data is not only well connected now, but it's pulling in huge troves of information. So the question is, what are we going to do with all these devices that have become more interconnected, right? As more things come online, even the hot lamps we have on us right now for this filming, those will probably one day be an IoT device, right? With some sort of sensor on it.
Jason Rigby (03:27):
Yes. It gets hooked to the cameras and then watches to see if it aligns with the blue screen and so forth.
Alexander McCaig (03:35):
Yeah, and I've been to some smart factories. I did quite a good chunk of work, and I'm not going to mention their name, I don't know if it's fair to do that to them, with a couple of the leaders in Massachusetts, especially around the Newton area with outfitting smart factories. And it was amazing to see the sensor readouts on everything. You almost become inundated. It's almost too much to handle. Where you're like, "Wow, that's a massive output." So then that begs the next question. If we're absorbing all this data right here on your band and mine at the same time, what does that structure look like? How do we read that structure? How do we know what to properly interpret from it? So why don't you walk us through this article you have, and we can talk about that.
Jason Rigby (04:19):
Like my presentation, we're getting all this absorption of data from a big factory.
Jason Rigby (04:24):
It's just absorbing it, now we've got to-
Jason Rigby (04:26):
Now, just like what you said, that's what I'm going to do.
Jason Rigby (04:29):
We need to make sense.
Alexander McCaig (04:31):
Yes, having to make sense?
Jason Rigby (04:32):
I love that. So the artificial intelligence and big data analytics and IoT market for data capture information and decision support services.
Jason Rigby (04:39):
They've got to simplify that.
Jason Rigby (04:39):
It's a big report covering 2020 to 2025, and so we're going to see a lot of different things. So this report evaluates the various AI technologies used relative to analytics solutions within this huge, rapidly growing enterprise and industrial data arena. I mean, and we're going to get into exactly how much, but like you said, with the Internet of Things, you have consumer data, you have enterprise data, you have industrial data and you have government market segments.
Alexander McCaig (05:08):
And each of them have their unique needs in terms of infrastructure, devices, systems, and processes.
Jason Rigby (05:15):
So everybody has their own way of how they want to read and interpret things. And everybody has their own way of speaking. Everybody's got their own language, essentially. Not everyone's speaking English. So then how do you communicate value properly across multiple languages and multiple needs?
Alexander McCaig (05:32):
That's what they're saying. So if this data is spitting out multiple languages and people need it in different formats, how do we effectively stitch that all together? How do we make sense of that?
Jason Rigby (05:43):
Well, and one of the problems that they said is that, like you said, it's just producing mass amounts of data and it's all unstructured.
Alexander McCaig (05:50):
Yeah. It's unstructured. I mean, maybe it's just sitting in one column on this massive CSV file, comma-separated value file, something you pull into Excel, but there's 10,000 rows of it. Right? How do you manipulate that effectively, even from your own understanding, with whatever key markers are in that data? Right? What columns should be in it? What kind of headings? Is this something that can be visual? Is this something that belongs on an [inaudible 00:06:16], right? To just say, "Okay, is this something that should sit in the legend to describe it?"
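One way to start imposing structure on a dump like that can be sketched with Python's standard csv module. The file contents and the key=value line format below are invented; a real dump would need its own parsing rules.

```python
import csv
import io

# Stand-in for a massive one-column CSV: each row is one unstructured line.
dump = io.StringIO(
    "reading\n"
    "temp=21.5\n"
    "pressure=101.2\n"
    "temp=22.1\n"
)

# Split each line on '=' and group the values under their key markers.
by_key = {}
for row in csv.DictReader(dump):
    key, _, value = row["reading"].partition("=")
    by_key.setdefault(key, []).append(float(value))

print(by_key)  # {'temp': [21.5, 22.1], 'pressure': [101.2]}
```

Once the values are grouped under keys, deciding on columns, headings, and visuals becomes a much smaller problem.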
Alexander McCaig (06:23):
That is going to fall back on the algorithms that machine learning is going to be programmed with, to actually take this unstructured information, the swaths of it, and then create some sort of understanding from it. So it's like, okay, I'm going to say I want it in English and I want it to tell me about fire safety. And say I'm talking about government bus lines, it's pulling out the sensor data. Okay, great. Well, the machine learning model understands your parameters ahead of time. So it'll take all that data, scan through it, pull out the key markers for what you need and then give you some sort of readable format or visual so you can understand what's going on.
Jason Rigby (07:01):
Yeah, and the author of this article from researchandmarkets.com, give him credit, from Dublin. He said that he sees three different types of IoT data. So I want you to speak to this, the raw, the untouched and unstructured. The meta, the data about data.
Jason Rigby (07:18):
And then number three, the transformed, value-added data.
Alexander McCaig (07:21):
Okay, cool, so-
Jason Rigby (07:21):
Because this is going to go right into TARTLE.
Alexander McCaig (07:21):
So, the raw data, okay, is something that this band right here is reading. It's reading raw heart rate, raw quantitative stuff. Essentially it's meaningless numbers. Okay, great. But it's ingesting it. The sensor is picking up, the sensor is doing its job and it's taking its binary zero, one. It's triggers on, triggers off. Okay, that's a value. And it's going to record that. That's category number one. Number two is?
Jason Rigby (07:47):
Meta. Data about data.
Alexander McCaig (07:48):
Data about data. Okay, so now that I have this data ingested off my IoT band, my device, or my sensor, well, now I want to see what I can pull from it. So say, for instance, a photo on our phone, we talked about this before. You take a picture, the picture is going to have geolocation data in it. It's going to have the time, it's going to have the date, and possibly it's going to have the people that are in it if it's smart enough, right? That's going to be the metadata that sits inside of it. It's almost descriptive data about data. And then the third one is?
Jason Rigby (08:18):
It's transformed value-added data.
Alexander McCaig (08:19):
It's transformed value-added data. So after you've collected the raw data, and you have the metadata about that data, right? Its categories, what should be applied to it? Now we can transform it, we can manipulate it into some sort of visual or output that is readable and understandable for us to make decisions on.
Jason Rigby (08:39):
Yeah. And that's what they're wanting to use. The article says they want to use artificial intelligence to take those three types of data and then be able to create the value in that data.
Alexander McCaig (08:47):
Yeah. You create the value in it. It's just, when we say artificial intelligence, how do we efficiently manipulate data faster than a person would, going in and doing pivot tables in Excel?
Jason Rigby (08:57):
Yeah, and what they were talking about is the ability to capture streaming data, determined valuable attributes and make decisions in real time.
Alexander McCaig (09:04):
Yeah. So, that's important. So say I run a gas pipeline company, and I have a bunch of sensors in all my pipes, measuring pressures, temperature, flow rates, you name it. So what I would want to know is, if I'm pulling things down in real time, and that data immediately gets pumped into the machine learning algorithm, structured or unstructured, it will automatically be able to pick up and notify me if there's an issue or a leak in one of the pipes, or if there's an excess buildup of pressure. Right? All those different things can come into play and then I can create immediate, actionable items or insights off of what's going on.
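That pipeline example can be sketched as a simple real-time check on each streamed reading. The sensor names, limits, and alert messages below are invented for illustration; a production system would configure or learn these rather than hard-code them.

```python
# Acceptable (low, high) ranges per sensor -- invented numbers.
LIMITS = {"pressure_kpa": (200.0, 900.0), "temp_c": (-10.0, 60.0)}

def check_message(sensor, value):
    """Return an alert string if a streamed reading falls outside its limits."""
    low, high = LIMITS[sensor]
    if not (low <= value <= high):
        return f"{sensor} out of range: {value}"
    return None

# Each message is checked the moment it arrives, not after batch collation.
stream = [("pressure_kpa", 450.0), ("pressure_kpa", 951.0), ("temp_c", 20.0)]
alerts = [a for s, v in stream if (a := check_message(s, v))]

print(alerts)  # ['pressure_kpa out of range: 951.0']
```

The out-of-range pressure reading becomes an actionable alert immediately, which is the whole point of processing the stream in real time.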
Jason Rigby (09:37):
Yeah. That was really neat. This was on television, I didn't see this personally, but I saw these drones following-
Alexander McCaig (09:40):
So you meta-viewed it?
Jason Rigby (09:45):
Yes. I meta-viewed it. Yes, there we go. So I meta-viewed these drones on either side of a train and they were flying right alongside the train. And their cameras were viewing how the train hits the track. And then they were taking that video footage and then seeing where weaknesses were on the tracks and where they need to fix it. Where it would take people to go look at the tracks and how long would that take compared to-
Alexander McCaig (10:10):
It takes a while. So I've actually seen... [inaudible 00:10:12] The stuff you see in your life. I've actually worked in large transit systems. So you'll find pitting on the tracks, areas of extra wear, especially on corners or things of that nature that corrode early, and you have to actually go and check that surface. If there's an excess of shearing on that surface, or we lose contact, there's an inefficiency, there's a slip that you might find in that transportation. And that's a cost over time.
Alexander McCaig (10:37):
Whether it's a safety cost or an economic cost, right? Because now my wheels have worn down too much and the tracks are worn down, or there was a crash and we've got to pay for all this stuff. So it's better we view it ahead of time, analyze it and fix it. And so, I've seen-
Jason Rigby (10:52):
Because a drone would be an IoT device.
Alexander McCaig (10:53):
A drone is an IoT device. It's flying around, it's internet connected, and it takes the data and the metadata, so it's filming and recording data with it. It can then take that and put it into a system, and that can be analyzed at different levels on the track, right? But the interesting part... Wait a minute, now I'm lost. Oh yeah, the interesting part is that when they rebuild tracks, they usually have a guy that goes out there with this machine, two guys, and it has a whole stockpile of fresh track and they'll just reset the whole thing.
Alexander McCaig (11:21):
Because frankly, that's less of a risk for them, but if you can find an efficient way just to change one small piece of track, these things weigh a lot. They're very heavy, even for one piece of track, it's a lot of iron. So in their mind, they're like, "Let's just swap it all out." But if you can just grab one, analyze that area, pull up the spikes and replace it with a big metal machine that does it automatically, super cool to watch by the way, that's just a more efficient process for them.
Jason Rigby (11:47):
No. Yeah. So that to me shows how AI, how IoT devices, can give a company cost savings with efficiency.
Alexander McCaig (11:57):
It's essentially a cost savings, that's all it is. We'll invest a little bit more in this machine learning plus the IoT device. And it will give us enough of a cost saving so that we don't have to have guys out there on the tracks as much, not as long. The trains can run better.
Jason Rigby (12:11):
I like the idea of giving real-time data. We're not looking at something that is happening in the past.
Alexander McCaig (12:16):
No, and we talk about this with TARTLE, right? The data is real time. People are updating this constantly. It's pulling down APIs from integrated systems constantly. And if you want to subscribe to those datasets, you know when someone's going to take a left-hand turn. You know when they're going to change their lipstick color. You know what they're going to eat in the next five minutes, but you're not going to look at the past and assume that they're going to do the same thing in the future. What a waste. Why guess?
Jason Rigby (12:40):
You know what assume does.
Alexander McCaig (12:42):
We all know what assuming does?
Jason Rigby (12:44):
Yeah. So why are we? So I thought this was interesting. So this is just AI with IoT. The global market for AI, big data and IoT as a whole will exceed $26 billion by 2025. Embedded AI in support of IoT things, objects, will reach 5.7 billion.
Alexander McCaig (12:59):
Yeah. So what they're saying is, we have the onset of sensor technology, IoT, okay, and it's connected to the internet. Now, why don't we just put the machine learning directly into that device? So rather than just having to go through the process of pumping out the raw data, have it analyzed on the device immediately and just send out the answer, yeah.
Jason Rigby (13:20):
Yeah, and they were talking about that collaborative robot growth will be at 42.5% by 2025.
Alexander McCaig (13:25):
Robots are cool. I was-
Jason Rigby (13:26):
So 42% by 2025. That's a lot.
Alexander McCaig (13:29):
There's a gentleman over here in Albuquerque, I went and met him, and he's designing robots to actually paint. So painters unions, I don't know if you know much about them, they're the highest-paid union workers. So if you run a large construction operation, you'd be like, "Yeah, I'll invest in the robots." And it was really cool to actually watch. They have perfect coverage and pressure the entire time.
Jason Rigby (13:57):
Oh, of course. Yeah.
Alexander McCaig (13:58):
I was watching the robot just paint the room. I give it to you, that's pretty cool to watch.
Jason Rigby (14:03):
Yeah. I don't know. And I may have shared this story with you before, but how Mazda did their painting? The car manufacturer, Mazda. I thought it was really neat, in Hiroshima.
Alexander McCaig (14:10):
You might have...
Jason Rigby (14:11):
Yeah. So they took their senior painters that had painted for 30-plus years.
Alexander McCaig (14:17):
Oh yeah, to program the machines.
Jason Rigby (14:17):
They put the sensors-
Alexander McCaig (14:17):
Yeah I love that.
Jason Rigby (14:17):
They put the sensors on their hands and then followed how they did. And then they taught the robots how to paint exactly like that. And you can watch them, so instead of just going ... they're actually moving around, just like how that [crosstalk 00:14:30] master painter would do. Yeah.
Alexander McCaig (14:31):
It's super cool.
Jason Rigby (14:32):
But I think that's a prime example of looking and seeing. And each of these robots, of course, are exhibiting data.
Alexander McCaig (14:39):
Every single one of them.
Jason Rigby (14:40):
Exhuming data out too.
Alexander McCaig (14:41):
Yeah. Anytime there's a movement of the machine, there's some sort of programmable interface, there's an output that comes with it. It has to. Anything on a computer is input, output.
Jason Rigby (14:51):
Yeah, and they're saying autonomous weapons systems in AI, in military robotics, will grow 40% in 2021.
Alexander McCaig (14:55):
Of course it will, because the budget's so large for it. And that's going to also be dependent on the political party that comes in. There's a ton of things that play into that.
Jason Rigby (15:03):
And then finally there's going to be-
Alexander McCaig (15:04):
But what's the point of the frigging weapon? Who cares? Who cares if you put a smart device on a missile, you're just going to blow it up. What a waste. Don't put the money into that.
Jason Rigby (15:12):
Yeah. Top three segments will be, number one, data mining and automation. Number two, automated planning, monitoring and scheduling, and number three, data storage and customer intelligence.
Alexander McCaig (15:22):
Yeah. Where's the data going to sit? How do we get it out of there? How do we get it to move, and then what is it going to tell us about human beings? It's everything that TARTLE does right now.
Jason Rigby (15:32):
Yes, exactly. That's why [crosstalk 00:15:35].
Jason Rigby (15:34):
I thought I'd tee you up with this one.
Alexander McCaig (15:36):
Why do I need an AI algorithm to find out all this stuff? I can just go ask somebody. Duh.
Jason Rigby (15:42):
Duh. Yeah, exactly. Duh.
Alexander McCaig (15:43):
Yeah, we're going to end it on, duh, like Homer Simpson, "Duh."
Jason Rigby (15:46):
Sign up for TARTLE.
Alexander McCaig (15:47):
Yeah, sign up for TARTLE, T-A-R-T-L-E.co. Thank you everybody.
Speaker 1 (15:58):
Thank you for listening to TARTLE Cast with your hosts, Alexander McCaig and Jason Rigby, where humanity steps into the future and source data defines the path. What's your data worth?