Why is there so much emphasis on the thoughts and actions that govern our day to day lives, but not as much on the ones that happen as we sleep?
It’s difficult to overstate the influence of modern technology because its effects are so tangible. We see how platforms like TARTLE are geared towards a clear end goal. We’re connected to our smartphones and devices around the clock. But we seem to be forgetting about the first device and data set that we were given to work with: the human mind, and our subconscious.
At most, dreams are an interesting icebreaker or topic for idle talk—but we think that they can mean something more. It is time to revisit how dreams can have an impact on the course of our lives, as well as that of the people around us.
Sidarta Ribeiro shared a personal experience involving a fellow PhD candidate. One day, he needed a ride to the field center of Rockefeller University for an experiment. However, he was unable to proceed because the facility was already in use by the other candidate.
This setback meant that he had to reschedule his experiment, which hurt his productivity. Understandably, it also soured his perception of his colleague, and he went to sleep feeling annoyed and irritated.
However, he dreamt of a scenario where he angrily confronted the person and ended up getting physically hurt. When he woke up, he found himself in the right mindset and mood to peacefully discuss what happened with his colleague, and they made amends.
This is a personal example of how dreams can be used to simulate instances of the future using references that we have made in the past. It can help guide us and give us insight. Giving the mind some space to process what has happened throughout our day can have some benefits for our wellbeing.
Alex mentioned how, surprisingly, we only spend 55 percent of our lives awake. This means that if we don’t pay attention to our dreams, we’re missing out on almost half of our entire life experience. In the modern world, there is a growing dichotomy between inner work and outer work that we need to bring our attention to, especially when we put so much value on what is external, but choose to forego focused introspection on the self.
Sidarta Ribeiro pointed out that today’s research into mental health and wellbeing appears to be closely intertwined with drugs that induce a dream-like state. This could be the first step in a collective effort to bring back emphasis on our subconscious.
It’s time to return to our inner world and start using dreams, one of our most ancient technologies, to our advantage once more.
The dream state has had a massive impact on the course of history. One solid example is the Oracle of Delphi, a widely revered high priestess of the Temple of Apollo who gave predictions and guidance to both individuals and city-states. Her words influenced the decisions of important figures such as Aegeus, the king of Athens; Croesus, the king of Lydia; and Alexander the Great, conqueror of the ancient world.
Ancient and contemporary Mayan religion also posited that dreams are sacred, because they functioned as portals that helped an individual connect with their ancestors for guidance. The dream state is closely intertwined in the definition of spirituality across several religions and concepts of faith.
The role that dreams fulfilled in older societies is now being filled by a variety of different mechanisms and technologies. Amidst all this progress, it’s time to take a break and ask ourselves: do we like where we’re going, now that we’re leaving our subconscious in the dust?
We are consistently pressured to maximize our productivity and levels of efficiency. The technologies we develop are influencing us to think of our value according to the volume of our work. While our reliance on the dream state has, to a large extent, been diminished due to our increased proficiency in technical knowledge, we forget to ask ourselves about the implications of this change.
It is undeniable that our subconscious has played a massive role—not just in the individual lives of ordinary people, but in the rise and fall of civilizations. The dream state is a data mine that we, as a collective, are slowly losing out on. It’s an opportunity for introspection that can help us make better decisions. Most importantly, it helps us regulate our wellbeing through proper rest and recreation.
What’s your data worth?
Technology, philosophy, and society. We have been primed to think that a capitalist system is capable of giving everyone the compensation they deserve—but we also know that this isn’t always the case, especially for those who may need it the most.
How do we take a closer look at the technologies and the organizations that provide the quality of life we have now? A foundation on the theories that apply to our circumstances is a step in the right direction.
In this episode, Alexander McCaig explores these ideas with Bernd Stahl, author of Information Systems: Critical Perspectives. Bernd is also a Professor of Critical Research in Technology and Director of the Centre for Computing and Social Responsibility at De Montfort University.
The process of emancipating someone may seem like a noble goal. However, it can be difficult to gauge whether or not we are actually doing harm by taking this opportunity on their behalf. For example, one common view is that companies have a social responsibility to make profits, because those profits would be distributed to shareholders and trickle down to employees.
Unfortunately, there are plenty of people who do not share in this success and are not able to participate in this economic system. Wealth is generated unevenly, and access to its opportunities is even more uneven, especially in a capitalist structure.
There are plenty of possible approaches to this problem. Pessimists believe that the system inherently ensures that some people will always be “outside.” As a result, the only true solution would be to implement radical change.
Others believe that information systems can be used to make the economy more inclusive and spread wealth more evenly. With such polarizing views on how the labor system should be structured, it may seem like an impossible task to bring everyone into a discussion where they can give their own benchmarks for what is best.
The magnitude of such a feat is further emphasized when we think of the tech-driven world we live in. Due to our different backgrounds and preferences, Bernd points out that the idea of maximizing individual potential can vary widely from one person to another. The essence of critical theory would be to have a society where people are free to flourish, without other individuals or systems telling them what success is and how it should be achieved.
It’s an extension of our capacity to practice our individual liberties. Sadly, those in power often influence the system to fulfill their vested interests—and a crucial part in making this possible is taking away our ability to self-reflect, or to practice reflexivity.
This is TARTLE’s mission: to give people the avenue to practice critical reflection and self-awareness, bringing back that sense of common responsibility to humanity one step at a time.
In such a tech-driven landscape, providing goods and services leaves little opportunity to interact with other people. Bernd illustrates this by pointing to electronic marketplaces and discussing how straightforward the transactions are.
If this seems like an advantage, we need to dig a little deeper. We are no longer encouraged to think of the human realities behind eBay, Amazon, or other e-commerce platforms. All we need to consider is the availability of the product, estimated shipping time, and the most competitive cost.
As a result, these platforms discourage us from taking more discursive action, all part of surveillance capitalism efforts by big internet service providers to keep us from thinking deeper about our purchases. The formula across different systems is similar: structure our work, extract our data, and lead us to buy something that we may or may not need. Regardless, the end result is to influence the general population’s behavior so that these companies gain an advantage.
“The potential for giving people freedom or reducing their freedom is there in any type of technology, across different types of political systems, even though it may look very different in different systems,” Bernd concluded.
Modern technology draws parallels to the panopticon, a prison design in which inmates could be watched around the clock. While the original intention of the set-up was framed as benefiting prisoners through observation and feedback, the concept is now shorthand for a mechanism of control.
Indeed, when we are under constant surveillance from devices we’ve become so reliant on, it can have either a chilling effect or a normalizing effect. The outcomes are undetermined, but constant observation certainly plays a crucial role in altering human behavior. Transparency in information systems will be important in bringing back the power, and the capacity to speak, to the people.
When asked about his parting words, Bernd encouraged listeners to think of humanity as an ecosystem: the reality that we live in a society of other individuals and other actors, with unique needs and desires. It’s a fragile ecosystem, and one that we should try and balance in our capacity, as stewards of the earth and of each other.
Businesses and information systems were previously thought to be all about improving efficiencies and maximizing productivity. However, we’ve moved far beyond such a profit-driven perspective; now, Bernd hopes we remember that technology is always socio-technical, with human beings working alongside modern devices to improve the living circumstances of their fellow human beings.
It is this sense of urgency to uplift the living conditions for humans across the board that encouraged us to develop TARTLE. Data-driven measures are the key to rebuilding the self-awareness we’ve lost in the great tech race for the boldest, biggest, and flashiest devices. The power is back in your hands.
What’s your data worth?
A Critique of Capitalism With Author, Professor, and Director, Bernd Stahl by TARTLE is licensed under CC BY-SA 4.0
Technology is quickly becoming the backbone of modern infrastructure. At the pace that it is progressing, it may someday become as ubiquitous and as vital to our economy as cement and concrete. However, AI is agnostic. Despite its immense computing capabilities, it will never be capable of human understanding and discernment.
One example of this is A/B testing, where researchers compare two versions of a marketing asset to see which one performs better. While it can show which campaign performs better, it cannot, on its own, generate any new understanding of why.
With this limitation in mind, is it still beneficial for us to know what the most probable outcome for a certain event would be—even if we don’t understand the why or how for its occurrence?
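To make that limitation concrete, a two-proportion z-test is one common way to score an A/B comparison. The sketch below is plain Python with entirely hypothetical traffic numbers: it can tell us that variant B outperformed variant A, but nothing in the arithmetic explains why users preferred it.

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how surprising is the gap between variants?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # conversion rate if variants were identical
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: B converts 26% vs. A's 20%, 1,000 visitors each.
z = ab_z_score(200, 1000, 260, 1000)
print(z > 1.96)  # True at the usual 95% threshold: B "wins", but the test says nothing about why
```

The z-score answers exactly one question, whether the observed gap is likely to be noise, which is the "which one performs better" knowledge the text describes and nothing more.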
At this point, David discussed an imaginary scenario where even something as uncontroversial as spam filtering could become a problem: suppose legitimate emails from businesses owned by people of color were falsely marked as spam at a higher rate than emails from white-owned businesses. Beyond the inefficiency, the AI would become an unfair gatekeeper for email and might even damage businesses on the basis of race.
The decision-making process behind sorting emails into the spam folder is compromised because the technology is using so many signals in “deep, complicated, and multi-independent patterns of probability” that will be near-impossible to comprehend without a lot of time, money, and effort. At this point, this massive system is hurting communities who are already disenfranchised in the first place.
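Even when the model's internals are impenetrable, the disparity David imagines can be measured from the outside. A minimal per-group false-positive-rate audit, sketched in Python with made-up numbers, might look like this:

```python
def false_positive_rate(examples):
    """Share of legitimate (non-spam) emails that were wrongly flagged as spam."""
    legit = [(truth, flagged) for truth, flagged in examples if truth == "legit"]
    if not legit:
        return 0.0
    return sum(1 for _, flagged in legit if flagged) / len(legit)

# Hypothetical audit samples: (ground truth, was it sent to the spam folder?)
group_a = [("legit", True)] * 9 + [("legit", False)] * 91   # 9% of legit mail flagged
group_b = [("legit", True)] * 1 + [("legit", False)] * 99   # 1% of legit mail flagged

fpr_a = false_positive_rate(group_a)
fpr_b = false_positive_rate(group_b)
print(fpr_a / fpr_b)  # a disparity ratio far above 1 signals unequal treatment
```

An audit like this does not explain the "deep, complicated" signal interactions, but it does reveal whether one community's legitimate mail is disappearing at a disproportionate rate, which is the harm at issue.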
This brings to mind Microsoft’s Tay.ai, a chatbot on Twitter created by the tech giant in 2016 that was designed to mimic the conversational patterns of a 19-year-old girl. It would learn from continuous interaction with other users on the social media platform.
Immediately after its release, Tay became controversial after it started tweeting inflammatory and offensive comments. As a result, Microsoft was pushed to shut down the service only sixteen hours after it was launched.
It’s a clear indication that the people responsible for programming AI have a corresponding social burden to fulfill, particularly in ensuring that their technology does not harm anyone. This burden can become even bigger when machine learning and AI are applied to other fields, such as medicine and smart transportation.
Beyond Tay.ai, computer scientists and engineers around the world find themselves at the helm of constructing technologies with so much potential. How do we address inherent human bias in these individuals?
David reveals that most people who have the knowledge to work with these complex technologies do not necessarily have a comparable depth of understanding of social justice. This has led to calls for participatory machine learning, an approach associated with the design justice movement.
Participatory machine learning involves people who are familiar with related issues on social justice, as well as communities who would be most affected by the presence of new technologies. They are given a position in planning and management.
Their input is important from the get-go because it does have an impact on how these systems work. To further explain, David painted the picture of an imaginary emerging smart city that decided to use AI to reinvent its bus system.
Ultimately, all the new bus stops, routes, and schedules succeed in moving people to their destinations faster, and the numbers echo that success. However, there is a caveat: the statistics are averages, and they mainly show how well the system moves affluent communities, not those located on the outskirts of the city. Those living on the outskirts, who need efficient transportation more than most for work and productivity, become isolated from the system.
At this point, it would be difficult to unravel all the work put into making the new transportation system a success. It’s important for the marginalized to be consistently consulted on the impacts of new infrastructure and technologies, even after construction and installation are finished. Those responsible for creating these systems have a special responsibility to ensure that those who do not have the same footing finally get a seat at the table.
David agrees that it may be a lengthier, more expensive process. After all, it will take more time, money, and effort to locate these people, recruit them, and ensure that everyone is on the same page. However, it is the cost that we need to pay if we want a shot at eliminating inequality.
Beyond the cost of bringing people to the table, David acknowledges that technological progress is already expensive in and of itself. Machine learning systems require individuals who are highly educated in computer science and engineering, and the systems themselves demand massive computing infrastructure to run.
Finally, lingering questions on data sharing and ownership prevent communities from fully utilizing what they have. To what extent do you own your data and what should your relationship with it be? What does it mean to own something?
We do not live in individual data cocoons that we own. We live in a community. This public community cannot be run without public data, and public sharing of information about one another.
The thoughts that define my actions within this system of public information and data, however, are missed by algorithms, analysis, and machine learning. This is because people either do not want to, or are unable to, share why they are driven to take certain actions.
Ultimately, it appears that one of our most profound discoveries from machine learning is that the world is much more complex than we ever wanted to believe. Despite these sophisticated machines processing massive amounts of information, we do not have the capability to provide a completely accurate and precise prediction of what will happen.
This does not mean that the approximate knowledge we have now is worthless. It helps us appreciate our universe in a new way by teaching us to be comfortable with complexity.
In line with TARTLE’s mission to promote stewardship and collective responsibility, Alexander asked about the implications of machine learning for helping humans make better decisions and more informed choices based on the observable universe. To this, David asked a thought-provoking question: why do you think humans are entitled to understanding?
Machine learning and artificial intelligence are capable of taking us to greater heights without the interference of human cognitive biases. With their objective oversight, they have the potential to bring out the best in us as human beings living in a complex system.
As technology continues to innovate at an unprecedented pace, David leaves us with a parting message: machine learning will drive us to examine all the values that we hold, and sometimes to consider painful trade-offs between two or more equally important values.
“So don’t hold on too tightly to any one value; think about how you may have to give up on some of it in order to support other very important targets,” David concluded.
Everyday Chaos: Technology, Complexity, and How We're Thriving in a New World of Possibility Harvard Senior Researcher and Best Selling Author David Weinberger, Ph.D. by TARTLE is licensed under CC BY-SA 4.0
Technology Breakthroughs: Good and Bad
People are amazing creatures. We are constantly developing new, exciting, and at times worrying technology. Sometimes, it is all of these at once. When the first person harnessed fire, figuring out how to start it and carry it from place to place, it was no doubt all of these things, and for obvious reasons. Even something as benign as the wheel has led to technologies the world might be better off without. Yet we continue to innovate, striving for the good and often stumbling along the way. Recently, the MIT Technology Review released its annual round-up of breakthrough technologies. Let’s take a look at some of them, along with their positives and negatives.
The first on the list is mRNA vaccines. While the recent deployment of COVID-19 vaccines has gotten a lot of attention for being the first of this kind, the base technology has existed since the 1990s. And while many are skeptical of the COVID-19 vaccines, this type of medical application could have a hugely positive impact on our ability to fight a whole host of diseases, such as HIV.
Next up is GPT-3, a language model. It can mimic what people write thanks to being trained on vast quantities of books and, of course, the internet. The aim of this program is to help computers better understand the way people think and express those thoughts, and thus take another step closer to artificial intelligence.

On the negative end is the fact that some of the people working on it seem to think they need to train it not to hold certain biases. On the surface, that might not get your attention; it would be a good thing if the AI didn’t have any biases, wouldn’t it? Sure, but what about the biases of those who are training it? Since the AI learns by reading what people have written, the programmers training it are making determinations about bias not just for themselves but for the AI, by deciding which human writings count as biased. Even worse, the AI is likely to be seen as completely unbiased and objective by the general public. Many people will accept its conclusions without question, making the bias inevitably built into the system something that shapes the opinions of potentially millions.
Shifting gears a bit, TikTok has done some interesting things with its recommendation algorithms. The system works not just off your likes but cross-references the preferences of others who liked the same videos. That helps it recognize communities of people, niches with similar interests. You then get recommendations for videos that others in your taste network have enjoyed.
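That overlap-based approach is a form of collaborative filtering. Here is a toy sketch in Python, with invented users and video IDs, that recommends videos liked by users whose tastes overlap with yours:

```python
def recommend(likes, user, k=3):
    """Rank unseen videos by how many liked-in-common users also liked them."""
    mine = likes[user]
    scores = {}
    for other, vids in likes.items():
        overlap = len(mine & vids)
        if other == user or overlap == 0:
            continue  # only users who share at least one like count as "neighbors"
        for v in vids - mine:  # never re-recommend what the user already liked
            scores[v] = scores.get(v, 0) + overlap  # weight votes by shared taste
    return sorted(scores, key=scores.get, reverse=True)[:k]

likes = {
    "ana":  {"v1", "v2"},
    "ben":  {"v1", "v3"},        # shares v1 with ana
    "cara": {"v2", "v3", "v4"},  # shares v2 with ana
}
print(recommend(likes, "ana"))  # v3 is backed by both neighbors, so it ranks first
```

TikTok's production system is vastly more elaborate, but this captures the core idea from the text: recommendations flow along the network of people who liked what you liked.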
Naturally, lithium-metal batteries are great. They can help store energy from renewables for a long time. As always, though, there is a downside: those batteries involve a lot of mining operations that are more than a little rough on the environment. Is it a net gain if we can make better use of renewables? Maybe, but figuring that out will take a lot more data.
The one we are going to leave off with today is data trusts. The idea is that some sort of entity will manage your data on your behalf. Why on earth do I need someone, anyone, to manage my data? The answer is that I don’t. I, and thousands of people on TARTLE from all around the world, are perfectly capable of taking care of our own data. All anyone has to do is sign up and then choose whether or not to share it. It’s very simple. When people talk about doing things on your behalf, it usually means they are looking for ways to get something from you. In this case, it is the data itself. This feels very much like an attempt to get more access to people’s data and use it to find ways to manipulate them.
As we said, sometimes technological innovation is both exciting and concerning. There is almost always some kind of downside. One thing that doesn’t have a downside is TARTLE. All you do is sign up and protect or share your data at your own convenience and get rewarded for it. For once, there isn’t a downside.
What’s your data worth? www.tartle.co