
Biascope

Effects of Biascope on World 3

Imagine a world in which car crashes could be prevented, school shootings could be eliminated, racism could significantly decline, and we'd feel better about ourselves, with a greater capacity for self-respect and a stronger sense of worth.

We envision a world in which humans are better able to regulate their emotional and physical wellbeing through an app (downloadable to both smart devices and laptops). Like a smart watch, the app monitors heart rate, counts steps, and so on, but it is also equipped with an AI brain that allows it to detect our subconscious thoughts and emotions. It'll be able to monitor our brain waves and activity and pick up signals of developing bias, hatred, anger, and the like.
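
To make this more concrete, here is a minimal, purely illustrative sketch of how such an app might combine wearable-style vitals with a hypothetical "affect score" derived from brain activity. The class name, fields, thresholds, and the idea of a single numeric score are all assumptions made for illustration; nothing here is part of the Biascope design itself.

```python
# Illustrative only: hypothetical readings and thresholds, not a real Biascope spec.
from dataclasses import dataclass

@dataclass
class Reading:
    heart_rate_bpm: float   # from a wearable-style sensor
    step_count: int         # daily steps, as a smart watch would track
    affect_score: float     # hypothetical 0-1 score for anger/bias/distress

def flag_reading(reading: Reading,
                 resting_hr: float = 70.0,
                 affect_threshold: float = 0.8) -> list[str]:
    """Return human-readable flags for signals the app might act on."""
    flags = []
    if reading.heart_rate_bpm > resting_hr * 1.5:
        flags.append("elevated heart rate")
    if reading.affect_score > affect_threshold:
        flags.append("possible developing anger or bias")
    return flags

# Example: a spike in both vitals and the hypothetical affect score.
print(flag_reading(Reading(heart_rate_bpm=118, step_count=4200, affect_score=0.9)))
```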

Adolescent years are some of the most personally challenging and traumatic years we'll ever face. Our bodies experience biological and physiological changes, our emotions fluctuate constantly, and our mental wellness is tried on a continuous basis by our changing hormones. In addition, our psychological developmental task during these years is to begin creating our own identity and breaking away from the one our parents created for us. It's a time when we're in between childhood and adulthood, and when we can safely begin wading through the waters of adult responsibility without being held fully accountable for it.

By and large, this self-identity relies heavily on our peer relationships - the groups we fit into (and hope to be included in), the ways others think of us, and the ways we perceive ourselves in light of those influences. Adolescence is also a time for developing our sense of right and wrong, of truth and falsehood. It's during these years that we begin asking ourselves the "big" questions and thinking about the larger world around us and our place in it. We take risks - largely because our cognitive capacities for decision-making aren't fully developed yet - and push the boundaries of what we once thought possible.

All this is good and healthy, and should be encouraged.

But what about when we lose control? What about when we begin developing unhealthy, biased, or racist ideas about other people or groups? What about when we give in to the pressure to fit in and lose our sense of self-worth and dignity?

What if we could prevent this?

What if we each had an app with face recognition that could warn us of incoming threats? Or that could read our facial expressions and help us develop friendlier demeanours? What if the app measured our brain activity and sent us notifications and empowerment texts when we were feeling down or, more dangerously, attempting self-harm or suicide?

What if this app could sync with our cars, detect an oncoming vehicle, and prevent an accident? Or keep us from going over the speed limit? What if it were connected to the police and alerted the necessary individuals to local dangers or medical emergencies?
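
As a rough illustration of the layered alerting described in the two paragraphs above, the sketch below routes signals by severity: gentle nudges stay on the user's own device, supportive messages go to a chosen support contact, and only the most serious cases reach emergency services. The severity levels, recipients, and wording are all invented for illustration and are not part of any actual Biascope specification.

```python
# Illustrative only: hypothetical severity tiers and recipients.
from enum import Enum

class Severity(Enum):
    LOW = 1        # e.g. mild stress, drifting over the speed limit
    MODERATE = 2   # e.g. a sustained low mood
    CRITICAL = 3   # e.g. signs of self-harm risk or an imminent collision

def route_alert(severity: Severity, user: str) -> list[str]:
    """Decide who is notified for a given severity level."""
    if severity is Severity.LOW:
        return [f"nudge sent to {user}'s own device"]
    if severity is Severity.MODERATE:
        return [f"supportive message sent to {user}",
                f"summary shared with {user}'s chosen support contact"]
    return [f"emergency alert for {user} sent to local emergency services",
            f"notification sent to {user}'s designated emergency contact"]

# Example: a sustained low mood triggers supportive outreach, not an emergency call.
print(route_alert(Severity.MODERATE, "Amina"))
```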

The dangers of such an app are obvious: its information could fall into the hands of the wrong individual or group, or it could take away our freedom to make a mistake and grow from it. But what if it were designed so that each user could program its settings to release certain information only to specific individuals (much as a doctor requires a patient to sign a medical release before consulting a colleague or sharing a medical chart)? What if it gave people a greater sense of autonomy and kept our communities, schools, and social venues safer?
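
The user-controlled release settings suggested above could be imagined as something like a consent table, loosely modelled on a medical release form: information leaves the device only if the user has explicitly granted that recipient access to that category of data. The categories, names, and structure below are assumptions made purely to illustrate the idea.

```python
# Illustrative only: invented users, data categories, and recipients.
consent_settings = {
    "Amina": {
        "heart_rate": {"family_doctor"},
        "mood_summary": {"school_counsellor", "parent"},
        "location": set(),            # shared with no one
    }
}

def may_release(user: str, category: str, recipient: str) -> bool:
    """Only release data the user has explicitly granted to this recipient."""
    granted = consent_settings.get(user, {}).get(category, set())
    return recipient in granted

print(may_release("Amina", "mood_summary", "school_counsellor"))  # True
print(may_release("Amina", "location", "school_counsellor"))      # False
```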



Opportunities & Challenges

Bronson

Implications on Society

Education & Biascope

Critiques of Biascope