🖍️ Notes

This week Anto and I finalized the main features and UI of our dashboard. As agreed before, we’ll display the sketch on the main UI board. Additionally, we’ll add a playback timeline scrubber with a graph view on top of it, in a small menu at the bottom-right corner.

The bottom-left corner will hold settings for toggling the visibility of our graph helper grids, saving the sketch, etc.

Despite all this, we still had a question: how is this information useful to the user? So far this has been an expressive, more of an “outputting” device rather than an informative one. Since this assignment emphasizes the ability of a connected device to “translate” things in our lives that are often invisible or imperceptible to human eyes (e.g., weather, signals) into human-understandable data, we augmented our sketching idea into something directly impactful to our computer: a remote mouse!


Using robotjs on our server, we’re able to use the IMU data from the Arduino to control the mouse position on our computer. So far we have only implemented the heading (for the direction of the mouse movement); we still need to figure out how to use the gyroscope and accelerometer to determine its magnitude.
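As a sketch of the mapping we have in mind (the function name and angle conventions here are our own illustration, not the final implementation): a compass heading can be converted into a per-tick mouse delta, which the server could then feed to robotjs’s `moveMouse`. This assumes heading is in degrees with 0° pointing “up” on the screen:

```javascript
// Hypothetical sketch: map an IMU compass heading (degrees) to a mouse delta.
// Assumes 0° = "up" on screen and 90° = right; `speed` scales the step size.
function headingToDelta(headingDeg, speed) {
  const rad = (headingDeg * Math.PI) / 180;
  return {
    dx: Math.round(speed * Math.sin(rad)),
    dy: Math.round(-speed * Math.cos(rad)), // screen y grows downward
  };
}

// On the server, each incoming reading would then nudge the cursor, e.g.:
//   const robot = require("robotjs");
//   const { x, y } = robot.getMousePos();
//   const { dx, dy } = headingToDelta(reading.heading, 8);
//   robot.moveMouse(x + dx, y + dy);
```

The fixed `speed` is a stand-in for whatever magnitude we eventually derive from the gyroscope and accelerometer.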


There is also the unresolved issue of the network connection stopping every once in a while, and a suspicion that the Arduino may not have enough memory to run all these operations. We tried switching to MQTT, and while it does seem to reconnect (unlike TCP which, in our experience, never reconnects), it still stops sending sensor data every once in a while.
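One way we could at least detect the stalls on the server side (this is a hypothetical watchdog sketch, not something we’ve wired in yet) is to track the time of the last sensor message and flag the stream as stalled when it goes quiet for too long:

```javascript
// Hypothetical watchdog: flags the sensor stream as stalled when no
// message has arrived within `timeoutMs`. `now` is injectable for testing.
function makeStallDetector(timeoutMs, now = Date.now) {
  let last = now();
  return {
    seen() { last = now(); },                      // call on every incoming message
    stalled() { return now() - last > timeoutMs; },
  };
}

// With an MQTT client this could wrap the message handler, e.g.:
//   client.on("message", () => detector.seen());
//   setInterval(() => {
//     if (detector.stalled()) console.warn("sensor stream stalled");
//   }, 1000);
```

Injecting the clock keeps the logic testable without waiting out real timeouts.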


Enough talk about the problems.


We’re even more excited about this project now, as it takes a non-technological object we’re familiar with (i.e., the magic wand, a toy associated with childhood, whimsy, and campiness) and bestows on it the power to connect to the network, transforming it into a connected device.


Our typical computer devices and accessories are mostly designed as distinct from the non-technological objects that predate them. We suggest that housing the program in hardware adapted from other non-technological tools could help our “alienated” computers recede into the background of our lives. Wearables have been at the forefront of this direction, but we want silly, familiar things that everyone can use (accessibility please!) to become connected devices as well.



That being said, the mouse data can be visualized as a heatmap on the sketchboard (denser color over the areas with more mouse movement). This kind of information has been useful to some proprietors, for instance in determining which area of a page is most optimal for ads. Of course we’re not aiming for that purpose, and we’re still figuring out how the heatmap will be useful to the user. Maybe it can suggest moving their most-used window/widget/app to that area? Or maybe it should just stay a heatmap so people are aware of their movement, just like how people want to know their daily step count in a health app.
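As a sketch of how the heatmap could be built (the grid size and function name are illustrative assumptions), the recorded mouse positions can be binned into a coarse grid, and each cell’s count mapped to color intensity when drawn on the sketchboard:

```javascript
// Hypothetical sketch: bin recorded mouse positions into a rows × cols grid.
// Each cell counts how many samples fell inside it; the renderer would map
// the count to color intensity.
function binPositions(points, width, height, cols, rows) {
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of points) {
    // Clamp indices so edge samples (e.g. x === width) stay in range.
    const c = Math.min(cols - 1, Math.max(0, Math.floor((x / width) * cols)));
    const r = Math.min(rows - 1, Math.max(0, Math.floor((y / height) * rows)));
    grid[r][c] += 1;
  }
  return grid;
}
```

A real dashboard would likely use a much finer grid, but the binning step is the same.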


Another reason the mouse idea is more suitable than sketching alone is that the mouse position is, by design, bounded to the screen size. So we’ll never run into the problem of the movement path going out of canvas bounds (when you draw in thin air there is no clear boundary unless we define one). That being said, it will still be nice to display the mouse movement as a saveable sketch, still sticking with the idea that this project is an expressive tool.
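Since robotjs exposes the screen size via `getScreenSize()`, the bound is easy to enforce; a minimal clamping sketch (the helper name is our own, not part of robotjs):

```javascript
// Hypothetical helper: clamp a target cursor position to the screen bounds
// so the remote mouse can never be driven off-canvas.
function clampToScreen(x, y, width, height) {
  return {
    x: Math.min(Math.max(x, 0), width - 1),
    y: Math.min(Math.max(y, 0), height - 1),
  };
}

// With robotjs the bounds would come from the OS, e.g.:
//   const { width, height } = require("robotjs").getScreenSize();
//   const pos = clampToScreen(targetX, targetY, width, height);
```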

Elizabeth Kezia Widjaja © 2026 🙂