If we aren’t careful about our choices, social media can easily turn around and start shaping our identity. But what really gets the ball rolling is how we decide to use the platform. The algorithms are designed to self-reinforce: we choose to consume a specific type of content on social media, and the algorithms shape themselves around that choice.
This goes beyond cat videos and memes (which, in themselves, have morphed into political tools). We have to be mindful of how far we take our perspectives on cops, guns, reproductive health, and other people and countries, because it’s so easy to fall down the rabbit hole of radicalism.
Alexander and Jason discuss how social media is probably more powerful than nuclear weapons. While it may seem difficult to believe that something so ubiquitous has the capacity to change humanity forever, even the government has created an analytics center dedicated to artificial intelligence, the internet, and online platforms.
The truth is that social media platforms, through their algorithms, create and reinforce hardcore biases. We live in a new era where information warfare is capable of shaping entire communities. We saw this happen in the Cambridge Analytica scandal, and it won’t be the last time our echo chambers are exploited for the benefit of those at the top.
If we must live in a world where information is a weapon, we need to protect ourselves by learning how to use our data for our own benefit.
In theory, artificial intelligence and algorithms are objective. But these are passive-aggressive technologies hidden behind so much legal jargon that the average user struggles to comprehend them. In addition, the developers behind them have their own biases, which bleed into their creations.
It’s time to move to a platform where you are directly connected to companies and organizations that are invested in ethical data sourcing. It is through this shift that we can take the power to shape societies and return it to the people.
What’s your data worth?
Sign up for TARTLE through this link.