World 3
Biascope
Introducing Biascope
Imagine...
Imagine a world in which car crashes could be prevented, school shootings eliminated, and racism significantly reduced, and in which we could feel better about ourselves, with greater capacities for self-respect and self-worth.
We envision a world in which humans are better able to regulate their emotional and physical wellbeing through an app (downloadable to both smart devices and laptops). Like a smart watch, the app monitors heart rate, counts steps, and so on, but it is also equipped with an AI brain that allows it to detect our subconscious thoughts and emotions. It will be able to monitor our brain waves and activity and to pick up early signals of developing bias, hatred, anger, and the like.
The Raging Waters of Adolescence
Adolescent years are some of the most personally challenging and traumatic years we will ever face. Our bodies experience biological and physiological changes, our emotions fluctuate constantly, and our mental wellness is tried on a continuous basis by our changing hormones. In addition, our psychological developmental task during these years is to begin creating our own identity and breaking away from the one our parents created for us. It is a time during which we are in between childhood and adulthood, and in which we can safely begin wading through the waters of adult responsibilities without being held fully accountable for them.
By and large, this self-identity relies heavily on our peer relationships - the groups we fit into (and hope to be included in), the ways others think of us, and the ways we perceive ourselves based on these influences. Adolescence is also a time to develop our sense of right and wrong, truth and falsehood. It is during these years that we begin asking ourselves the "big" questions and thinking about the larger world around us and our place therein. We take risks - largely because our cognitive capacities for decision-making are not fully developed yet - and push the boundaries of what we once thought was possible.
All this is good and healthy, and should be encouraged.
Potential Preventions
But what about when we lose control? What about when we begin developing unhealthy, biased, or racist ideas about other people or groups? What about when we give in to the pressures of fitting in and lose our sense of self-worth and dignity?
What if we could prevent this?
What if we each had an app with face recognition that could warn us of incoming threats? Or that could read our facial expressions and help us develop friendlier demeanours? What if the app measured our brain activity and sent us notifications and empowering texts when we were feeling down or, more dangerously, attempting self-harm or suicide?
What if this app could sync with our cars, detect an oncoming vehicle, and prevent an accident? Or stop us from going over the speed limit? What if this app were connected to the police and alerted the necessary individuals to local dangers or medical emergencies?
Outweighing the Potential Negatives
The dangers of such an app are obvious: its information could fall into the hands of the wrong individual or group, or it could take away our freedom to make a mistake and grow from it. But what if it were designed so that each user could configure its settings to release certain information only to specific individuals (much as a doctor requires patients to sign a medical release before consulting a colleague or sharing a medical chart)? What if it gave people a greater sense of autonomy and kept our communities, our schools, and our social venues safer?
Opportunities & Challenges
While we don’t have apps and devices to specifically monitor and log our biases, emotions, and subconscious, social media and other apps can monitor our habits, location and other aspects of our lives.
So many times, we'll be talking about a certain product, and within the same day Google or Facebook seem to know exactly what to advertise to us, whether through Facebook, YouTube, Instagram, or even a plain Google search. Our "suggested" tabs fill up with products we may have mentioned once or twice in passing. Maybe you spoke to your brother about bulking up? Whey protein, exercise bands, and weight sets appear in your "recommended products", fitness pages and models fill your explore tab on IG, and GoodLife and LA Fitness banners are now littered across your Facebook page… and of course we all click. We want to check out that "goal physique". We're curious: how much is that protein powder? What are the benefits? If you've liked, shared, or saved any of these posts, you're now telling the algorithm I'M INTERESTED! I WANT THAT! So even if you weren't thinking about it too heavily, it will be constantly marketed to you.
Facebook also has the ability to track other websites you visit even when you are not using its app or website. Because Facebook owns Instagram and WhatsApp, it is not far-fetched to think that all of these apps are constantly monitoring the websites you visit, the conversations you have, and the places you have been. Google tracks you through services like Google Maps, weather updates, and browser searches, and can build a picture of the places and businesses you frequent, the topics and products you are "googling", the videos that interest you (YouTube), and so on.
Wearable tech such as Fitbit and smart watches makes it easier to track your heartbeat, temperature, and other vital signs. Front-facing cameras on most up-to-date smartphones can distinguish one human face from another. Fingerprint sensors can detect and store fingerprints; however, there are currently no apps that translate this data into bias or emotional data.
While the algorithms behind these websites and apps track your "buying bias" and search history, it is hard for them to determine your bias regarding race, age, gender, sexual orientation, and so on, or to read your emotions and subconscious. There are tests and quizzes that measure bias and general personality type; however, there is no current app or technology that registers these qualities in real time.
IBM has released a program (OpenScale) to monitor AI and machine learning models for bias against protected attributes such as sex, ethnicity, age, religion, and marital status. These bias monitors help speed up configuration and ensure that AI models are checked for fairness with respect to sensitive attributes.
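The core idea behind such bias monitors can be sketched in a few lines: compare the rate of favorable outcomes a model gives one group against another. The function name, the toy loan data, and the 0.8 threshold below are illustrative assumptions for this sketch, not OpenScale's actual interface or metrics.

```python
# Sketch of a simple fairness check: the "disparate impact" ratio,
# i.e. how often the unprivileged group receives a favorable outcome
# relative to the privileged group.

def disparate_impact(outcomes, groups, privileged, favorable=1):
    """Ratio of favorable-outcome rates: unprivileged / privileged."""
    priv = [o for o, g in zip(outcomes, groups) if g == privileged]
    unpriv = [o for o, g in zip(outcomes, groups) if g != privileged]
    rate = lambda xs: sum(1 for o in xs if o == favorable) / len(xs)
    return rate(unpriv) / rate(priv)

# Hypothetical loan decisions (1 = approved) tagged with an
# applicant attribute ("A" privileged, "B" unprivileged).
decisions = [1, 1, 0, 1, 1, 0, 0, 1, 0, 0]
attribute = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ratio = disparate_impact(decisions, attribute, privileged="A")
# A common heuristic (the "80% rule") flags the model when the
# ratio falls below 0.8. Here group B is approved far less often.
flagged = ratio < 0.8
```

In this toy data, group A is approved 80% of the time and group B only 20%, so the ratio is 0.25 and the check flags the model. A production monitor would run checks like this continuously over live predictions rather than over a fixed list.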
Biascope's Press Release on Education
Responses to Biascope
Marketing Projections
Biascope was developed in 2045 using a combination of technologies to reach and uncover the deepest part of the human psyche – the subconscious. It was introduced to assist the court system, which for decades had been criticized as unfair and outdated. Indeed, it has worked to free thousands of accused criminals who were wrongly imprisoned by jury decisions and legal loopholes. On the other hand, it has also brought justice to those who once walked free because of their power and status within society. The results were promising: an honest population and a step towards a perfect world. The problems that once plagued humanity suddenly had a clear solution.
Over the next few years, Biascope was licensed and distributed for personal use. It was not mandatory then, yet by 2048 the federal government made the weighty decision to implement this technology in schools. Their reasoning was simple: morally upright students will shape the future into one with less division and more contentment. In the studies that were conducted, researchers found that Biascope genuinely makes people better human beings. They noted that constantly being reminded of personal bias, along with respectful thoughts, encourages individuals to work towards change. While it is too early to tell, there is speculation that the moderate number of personal users outside of schools has already decreased the amount of conflict that goes on in public places.
Resistance to Biascope
Since the public introduction of Biascope, there has been scepticism of and even full rebellion against this technology. Several groups have publicly declared it an "invasion of privacy" (Citizens Against Surveillance, 2049), stating that they refuse to have their subconscious thoughts monitored regardless of who it benefits. Even with the government's justification for installing this technology in schools, there is concern about why all students must participate. In response, some radical groups have suggested reverting to how things were in the early twenty-first century, when "freedom of speech meant something" (Nationalists of NA, 2047), to which critics responded that free speech should not come at the expense of other groups or identities - which is precisely what Biascope aims to fix.
Others have called attention to the growing suspicion that this app is, whether intentionally or not, steering people to behave and think in the same way. Because individuals receive similar alerts for similar thoughts, they may unknowingly be influenced to change their patterns of thinking altogether. There is, essentially, an ideal way the app assumes people should be thinking. Multiple parents have come forward with reports of growing similarities between their children's once-distinct personalities, causing them to question whether the use of Biascope is necessary for so many youths at all times.
Personal Experiences
Several months after installing Biascope in schools, a number of students and teachers were asked to provide their testimonies. The following are two notable transcripts that showcase the drawbacks and benefits of this technology.
Diary of an anonymous 10th grade student:
"I’ve been using Biascope for about two or three months now, and I’m going to write a little about how it’s been. At first, it was good because it helped me feel better about myself and I was able to learn about all the ways I’ve been unfair or had bad thoughts without even realizing it. But then, I realized it was also stopping me from making mistakes that I could learn from on my own. Biascope didn’t let me do even the little things that might be problematic because it would warn me right away and I would know, okay, I can’t do that. I’m not saying it’s bad, since I know it’s doing a lot of good things for the world. I just wish it would let me be human without making every negative thought into a lesson."
Diary of an anonymous high school teacher:
"After installing Biascope, my job has become much more enjoyable and perhaps even easier! There are fewer problems among the students in school and I think everyone feels more included. It’s safer for sure. In the classroom, we’re able to have respectful discussions and I’m able to assess students based strictly on their learning. I’ve always worried that deep down, I would play favourites, but now I don’t worry about that at all. I can focus on doing what needs to get done and my old worries have been washed away."