Difference between revisions of "World 3"

From Dadaab Wiki

Revision as of 09:10, 24 November 2020

Biascope


Introducing Biascope

Imagine...

Imagine a world in which car crashes could be prevented, school shootings could be eliminated, racism could significantly decline, and we'd feel better about ourselves, with greater capacity for self-respect and self-worth.

We envision a world in which humans are better able to regulate their emotional and physical wellbeing through the use of an app (downloadable to both smart devices and laptops). Similar to a "smart watch", this app monitors heart rate, counts steps, and so on, but it is also equipped with an AI brain that allows it to detect our subconscious thoughts and emotions. It is able to monitor our brain waves and activity and to pick up signals of developing thoughts of bias, hatred, anger, and the like.

The Raging Waters of Adolescence

Adolescent years are some of the most personally challenging and traumatic years we'll ever face in our lives. Our bodies experience biological and physiological changes, our emotions go through constant fluctuations, and our mental wellness is tried on a continuous basis because of our changing hormones. In addition, our psychological developmental task during these years is to begin creating our own identity and breaking away from the one our parents created for us. It's a time during which we're in between childhood and adulthood, and in which we can safely begin wading through the waters of adult responsibilities without being held totally accountable for them.

By and large, this self-identity relies heavily on our peer relationships - the groups we fit into (and hope to be included in), the ways others think of us, and the ways we perceive ourselves based on these influences. Adolescence is also a time for us to develop our senses of right and wrong, truth and falsehood. It's during these years that we begin asking ourselves the "big" questions and thinking about the larger world around us and our place therein. We take risks - largely because our cognitive capacities for decision-making aren't fully developed yet - and push the boundaries of what we once thought was possible.

All this is good and healthy, and should be encouraged.

Potential Preventions

But what about when we lose control? What about when we begin developing unhealthy, biased, or racist ideas about other people or groups? What about when we give in to the pressures of fitting in and lose our sense of self-worth and dignity?

What if we could prevent this?

What if we each had an app with face-recognition that could warn us of incoming threats? Or that could read our facial expressions and help us develop more friendly demeanours? What if the app measured our brain activity and sent us notifications and empowerment texts when we were feeling down or more dangerously, attempting self-harm or suicide?

What if this app could sync with our cars and detect an oncoming vehicle and prevent an accident? Or prevent us from going over the speed limit? What if this app was connected to the Police and alerted the necessary individuals of local dangers or medical emergencies?

Outweighing the Potential Negatives

The dangers of such an app are obvious: its information could fall into the hands of the wrong individual or group, or it could take away our freedom to make a mistake and grow from it. But what if it were designed so that each user could program its settings to release certain information only to specific individuals (similar to how a doctor requires patients to sign a medical release before consulting a colleague or sharing a medical chart)? What if it gave people a greater sense of autonomy and kept our communities, schools, and social venues safer?


Opportunities & Challenges

While we don’t have apps and devices to specifically monitor and log our biases, emotions, and subconscious, social media and other apps can monitor our habits, location and other aspects of our lives.

So many times, we'll be talking about a certain product and, within the same day, Google or Facebook seem to know exactly what to advertise to us, whether through Facebook, YouTube, Instagram, or even just a Google search. Our "suggested" tabs fill up with products we may have mentioned once or twice in passing. Maybe you spoke to your brother about bulking up? Whey protein, exercise bands, and weight sets appear in your "recommended products", fitness pages and models fill your Instagram explore tab, and GoodLife and LA Fitness banners are now littered across your Facebook page… and of course we all click. We want to check out that "goal physique". We're curious: how much is that protein powder? What are the benefits? If you've liked, shared, or saved any of these posts, you're telling the algorithm I'M INTERESTED! I WANT THAT! So even if you weren't thinking about it too heavily, it'll now be constantly marketed to you.

Facebook can also track the other websites you visit even when you're not using its app or website. Because Facebook owns Instagram and WhatsApp, it's not crazy to think that all of these apps are constantly monitoring the websites you visit, the conversations you have, and the places you've been. Google tracks you through services like Google Maps, weather updates, and browser searches, and can build a picture of the places and businesses you frequent, the topics and products you're "googling", the videos you watch on YouTube, and so on.

Wearable tech such as Fitbit and smart watches makes it easier to track your heartbeat, temperature, and other vital signs. Front-facing cameras on most up-to-date smartphones can distinguish distinct human faces from one another, and fingerprint sensors can detect and store fingerprints; however, there are currently no apps that translate this data into bias or emotional data.

While the algorithms behind these websites and apps track your "buying bias" and search history, it is hard for them to determine your biases toward race, age, gender, sexual orientation, and so on, or to read your emotions and subconscious. There are tests and quizzes that measure bias and general personality type, but there is no current app or technology that registers these qualities in real time.
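The best-known of these bias tests is the Implicit Association Test, which infers implicit bias from how much slower people respond when a task pairs concepts that clash with their associations. A simplified, illustrative sketch of that scoring idea (the real IAT uses a more elaborate D-score algorithm with error penalties and trial filtering, which this omits; the data here are hypothetical):

```python
# Simplified IAT-style bias score: compare mean reaction times across the
# two pairing conditions, scaled by the pooled standard deviation.
# Illustrative only -- not the full published scoring algorithm.
from statistics import mean, stdev

def iat_d_score(compatible_ms, incompatible_ms):
    """Difference in mean response times (ms), scaled by the pooled SD.

    A larger positive score means slower responses in the 'incompatible'
    condition, conventionally read as a stronger implicit association.
    """
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

# Hypothetical reaction times for one test-taker
compatible = [650, 700, 675, 690]      # congruent pairing: fast responses
incompatible = [820, 870, 845, 860]    # incongruent pairing: slower responses
score = iat_d_score(compatible, incompatible)
```

Note that this is a one-off questionnaire-style measurement; turning something like it into the continuous, real-time monitoring Biascope imagines is exactly the leap no current technology makes.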

IBM has released a program (OpenScale) to monitor its AI and machine-learning models for bias against protected attributes such as sex, ethnicity, age, religion, and marital status. These bias monitors help speed up configuration and ensure that AI models are checked for fairness against sensitive attributes.
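At their core, fairness monitors of this kind compare the rate of favorable outcomes between a monitored group and a reference group. A minimal sketch of that underlying calculation - the "four-fifths rule" disparate impact ratio - with hypothetical numbers (this is not the OpenScale API, just an illustration of the check such tools automate):

```python
# Disparate impact ratio: favorable-outcome rate of the monitored group
# divided by that of the reference group. A ratio below 0.8 is the
# conventional "four-fifths rule" threshold for flagging possible bias.
def disparate_impact(favorable_monitored, total_monitored,
                     favorable_reference, total_reference):
    rate_monitored = favorable_monitored / total_monitored
    rate_reference = favorable_reference / total_reference
    return rate_monitored / rate_reference

# Hypothetical loan decisions: 30 of 100 monitored-group applicants
# approved vs. 60 of 100 reference-group applicants approved.
ratio = disparate_impact(30, 100, 60, 100)
flagged = ratio < 0.8  # ratio is 0.5, below the four-fifths threshold
```

The important contrast with Biascope is scope: tools like this audit a model's aggregate outputs after the fact, whereas Biascope imagines detecting bias inside an individual mind as it forms.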

Implications on Society

App in Education: Although Biascope was not used within the education system at first, as other fields began to observe tangible positive outcomes it became vital in Canada's education system. Until the 2030s, some of the most significant concerns around teacher-student relationships, in-school assessment, and overall student success were the implicit biases of educators. Research from the 2010s and 2020s shows that students from marginalized groups continued to be streamed into college-based courses rather than university education. Additionally, even students from marginalized groups who overcame academic barriers were assessed poorly on their "Learning Skills" compared to their peers, largely due to educator implicit bias. To address, alleviate, and eventually eliminate these biases, it was deemed necessary that the app be used in schools, especially by professionals in education. Until the 2030s the education system also relied heavily on standardized testing, where every subject in every grade and school was assessed through "question and answer" test formats. The system has since moved toward less standardized testing and more collaborative tasks that assess a student's progress. By combining culturally responsive pedagogies and Indigenous ways of knowing, students and educators work together to co-construct multiple assessment methods focused on meaningful learning, unlike the past, where "passing" was the focus. With this change of culture, students rely a great deal on self-awareness and self-reliance, and using Biascope during the learning stages has helped them identify their own biases early, which holds the promise of their developing into empathetic adults.
With education now completely global and remote learning available in every country, people from many different cultures and backgrounds interact with each other daily and are more likely to hold implicit biases toward one another. Biascope is thus highly beneficial for understanding those barriers and working through them to maintain positive relationships.


Law and Enforcement

Biascope was first implemented for officers of law enforcement. It was predicted to be highly useful in this system because of the repeated ill-treatment of marginalized persons (specifically, the Black community) by certain police officers. In an effort to address the concerns of Black Lives Matter, the app was expected to identify implicit biases in law enforcement officers, and a report of those biases was required to be submitted in a timely manner. Research shows that the use of Biascope produced an immediate reduction in police violence against marginalized groups, but recent research claims that this change was only temporary. On CTV News last week, a critic laid out the following concern:

“Detecting implicit bias is not sustainable, because we are not addressing the conscious problematic beliefs and biases. We are assuming that none of us have actual racist beliefs in 2040 but this is not the case. Some people are just hiding such beliefs”

Cons: Are officers who show implicit bias penalized, or assessed by higher officials? If so, whom do those officials report to? Or is this more a matter of self-improvement, and if so, how do we know that progress is being made?


Law and Order

Biascope ensures that jurors, lawyers, and judges do not let their implicit biases get in the way of their judgments of crimes. As difficult as this may be, the app makes them aware of their biases so they can make informed decisions free from bias.

Social Lives and Relationships: With globalization having taken over and relationships moving extensively online, the app helps people interact across cultures and backgrounds, easing any barriers they may have between each other.

Cons: Government Policies and Global Politics: With a globalized education system and world, are we merging into one large blob? Are our individual cultures and backgrounds being diffused and forgotten? And since the app only detects implicit bias, what about individuals who hold openly conscious biases such as racism, prejudice, homophobia, or xenophobia?


Biascope's Press Release on Education


Responses to Biascope

Marketing Projections

Biascope was developed in 2045 using a combination of technologies to reach and uncover the deepest part of the human psyche – the subconscious. It was introduced to assist the court system, which for decades had been criticized as unfair and outdated. Indeed, it has worked to free thousands of accused criminals who were wrongly imprisoned due to flawed jury decisions and legal loopholes. On the other hand, it also brought justice to those who had walked free due to their power and status within society. The results were promising: an honest population and a step toward a perfect world. The problems that once plagued humanity suddenly had a clear solution.

Over the next few years, Biascope was licensed and distributed for personal use. It was not mandatory then, yet by 2048 the federal government made the weighty decision to implement this technology in schools. Their reasoning was simple: morally correct students will shape the future into one with less division and more contentment. Within the studies that were conducted, researchers found that Biascope genuinely makes people better human beings. They noted that constantly being reminded of personal bias, as well as of respectful thoughts, encourages individuals to work toward change. While it is too early to tell, there is speculation that the moderate number of personal users outside of schools has already decreased the amount of conflict in public places.

Resistance to Biascope

Since the public launch of Biascope, there has been scepticism toward, and even full rebellion against, this technology. Several groups have publicly declared it an "invasion of privacy" (Citizens Against Surveillance, 2049), stating that they refuse to have their subconscious thoughts monitored regardless of whom it benefits. Even with the government's justification for instituting this technology in schools, there is concern about why all students must participate. In response, some radical groups have suggested reverting to how things were in the early twenty-first century, when "freedom of speech meant something" (Nationalists of NA, 2047), to which critics responded that free speech should not come at the expense of other groups or identities, which is what Biascope aims to fix.

Others have called attention to the growing suspicion that this app is, whether intentionally or not, gearing people to behave and think in the same way. This is because individuals receive similar alerts for similar thoughts, which may unknowingly influence them to change their patterns of thinking altogether. There is, essentially, an ideal way the app assumes people should be thinking. Multiple parents have come forward with reports of growing similarities between their children’s distinct personalities, causing them to question whether the use of Biascope is necessary for so many youths at all times.

Personal Experiences

Several months after Biascope was installed in schools, a number of students and teachers were asked to provide their testimonies. The following two notable transcripts showcase the downfalls and benefits of this technology.

Diary of an anonymous 10th grade student:

"I’ve been using Biascope for about two or three months now, and I’m going to write a little about how it’s been. At first, it was good because it helped me feel better about myself and I was able to learn about all the ways I’ve been unfair or had bad thoughts without even realizing it. But then, I realized it was also stopping me from making mistakes that I could learn from on my own. Biascope didn’t let me do even the little things that might be problematic because it would warn me right away and I would know, okay, I can’t do that. I’m not saying it’s bad, since I know it’s doing a lot of good things for the world. I just wish it would let me be human without making every negative thought into a lesson."

Diary of an anonymous high school teacher:

"After installing Biascope, my job has become much more enjoyable and perhaps even easier! There are fewer problems between the students in school and I think everyone feels more included. It’s safer for sure. In the classroom, we’re able to have respectful discussions and I’m able to assess students based strictly on their learning. I’ve always worried that deep down, I would play favourites, but now I don’t worry about that at all. I can focus on doing what needs to get done and my old worries have been washed away."