🌏 Project URL
https://ekkezia.github.io/shared-minds/week1
🐈 Github
https://ekkezia.github.io/shared-minds/week1
🖍️ Notes
blink-demo


🛠️ Making
The reason I'm writing this now is that I was awakened at 5:37am (is it the jetlag? or, as my mom suggested, this is the perfect healthy waking-up time, so maybe it's just my body growing older and realizing what's good for it). The first thing I do when I wake up: opening my eyes. The rest of the things that happen after waking up... I don't really remember what they are, and I don't keep track of their order either. The act of opening my eyes is a reliable marker that I have woken up.

With that in mind, I’m creating an interface that only renders when the user closes their eyes. The media to be rendered is a webcam stream, which essentially shows the user in it as well. Such an interaction defeats the traditional purpose of an interface (to be seen) because the user cannot see the message on the screen while their eyes are closed. This kind of interaction shuts down the input of visual perception. The interface subverts the idea that “an interaction is needed for a connection (e.g., messages, a video stream) to happen” into “an interaction is needed for the user to omit the connection itself”. In this case, the interaction is the act of closing the eyes, shutting down one’s visual perception, and the connection to be omitted is seeing the camera stream that would display the user.
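
How the gating could work, as a minimal browser sketch: it assumes some face-landmark model (e.g. MediaPipe Face Mesh or face-api.js) is wrapped in a hypothetical estimateEyeOpenness() helper; that helper and the 0.2 threshold are assumptions for illustration, not the actual project code.

// Sketch: render the webcam stream only while the viewer's eyes are closed.
// estimateEyeOpenness() stands in for a real face-landmark model and is assumed here.
declare function estimateEyeOpenness(video: HTMLVideoElement): number; // 0 = closed, 1 = open

const video = document.querySelector('video') as HTMLVideoElement;

async function start(): Promise<void> {
  // Attach the webcam stream, but keep it hidden until the eyes close.
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();
  video.style.visibility = 'hidden';
  requestAnimationFrame(checkEyes);
}

function checkEyes(): void {
  const openness = estimateEyeOpenness(video);
  // Inverted logic: the stream is visible only when the eyes are (probably) closed.
  video.style.visibility = openness < 0.2 ? 'visible' : 'hidden';
  requestAnimationFrame(checkEyes);
}

start();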

To get closer to the consciousness, do we utilise our human senses intensely or do we avoid using them (just like in my web experiment)?

Though our computer experience is not an essential requirement for survival, in a way it could briefly simulate the act of reaching for my consciousness. In our meditation exercise in class last week, closing the eyes was an interaction mantra to reach our inner consciousness; similarly, my web interface instructs the user to close their eyes in order to “materialize” their consciousness.


🚦 Bonus

My consciousness is optimizing my school workload by connecting this week's project (from this class) with PComp. For the PComp assignment, we have to create a switch of our own, so I'm using the same interaction (closing the eyes) and the project website as an interface to turn an LED ON/OFF.
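
If the page talks to the microcontroller over USB, one plausible route is the Web Serial API (Chromium-only, and it needs a user gesture); the one-byte '1'/'0' protocol and the 9600 baud rate below are assumptions for illustration, not the documented PComp build.

// Sketch: forward the eye state to a microcontroller so that closing the eyes flips an LED.
let writer: WritableStreamDefaultWriter<Uint8Array> | undefined;

async function connectSerial(): Promise<void> {
  // Must be called from a user gesture, e.g. a "connect" button click.
  const port = await (navigator as any).serial.requestPort();
  await port.open({ baudRate: 9600 });
  writer = port.writable.getWriter();
}

async function sendEyeState(eyesClosed: boolean): Promise<void> {
  if (!writer) return;
  // '1' = LED on, '0' = LED off -- whatever the Arduino sketch on the other end expects.
  await writer.write(new TextEncoder().encode(eyesClosed ? '1' : '0'));
}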

💬 Rambling

implicit x explicit -> what differentiates us from computers

mind wanders to nasty places, because evolution favours thinking the worst thing in order to survive

think about an ecosystem i'd like people to try living in



💡 Ideas
console log vs the visual on frontend -> what if it's flipped?
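
A tiny sketch of what that flip could look like, with a placeholder #log element: console.log gets redirected onto the page, while the “visual” content only goes to the devtools console.

// Sketch: swap the roles of the console and the page.
const logPane = document.getElementById('log') as HTMLElement; // placeholder element id
const originalLog = console.log.bind(console);

// console.log now draws on screen...
console.log = (...args: unknown[]) => {
  logPane.textContent += args.map(String).join(' ') + '\n';
};

// ...while the "frontend" render only exists in the devtools console.
function render(content: string): void {
  originalLog(content);
}

render('hello, invisible interface');
console.log('this sentence appears on the page instead');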


🧠 Thinking
In “The Interface Theory of Perception”, Hoffman suggested that our perception does not give an accurate representation of the world, but rather a species-specific user interface tailored to the goal of survival. Perception is a component of consciousness.


Code Cheatsheet

running a web server -> python -m http.server 8000 (Python 3; the old Python 2 form was python -m SimpleHTTPServer 8000)

Elizabeth Kezia Widjaja © 2025 🙂