Skip is a so-called smart lamp. It can vary its brightness and point in different directions, both of which can be set by the lamp itself or by its owner. It runs a machine learning algorithm: by adjusting brightness or position, its owner teaches it what kind of light they prefer in a given situation.
Learning algorithms, like the one used in Skip, are part of the fast-growing presence of Artificial Intelligence. Smart homes, intelligent lighting: apparently our newest possessions are brilliant, maybe even more than we are? But one's smart assistant does not read one's mind, nor does one's thermostat. They only learn from the data they are provided with. With this in mind, Skip was designed to be explicitly trained, just like you would train a dog. This metaphor makes the machine learning algorithm something everyone can grasp: praise makes the lamp show the behavior more often in similar situations; punishment does the opposite.
Skip uses a combination of reinforcement learning and supervised learning to develop autonomous behavior. The full training process is exposed, allowing users who are new to the field to understand how machine learning can work. Skip is both an attempt to explain the learning algorithms of so-called smart devices and a way of questioning why machine learning always has to be hidden away, an attribute that alienates the technology from less tech-savvy users.
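To make the dog-training metaphor concrete, here is a minimal sketch of reward-driven learning in that spirit. All names, the learning rate, and the scoring scheme are illustrative assumptions, not the lamp's actual implementation:

```python
class TrainableBehavior:
    """Keeps a preference score per (context, setting) pair;
    praise raises it, punishment lowers it."""

    def __init__(self, learning_rate=0.2):
        self.learning_rate = learning_rate
        self.scores = {}  # (context, setting) -> preference score

    def feedback(self, context, setting, reward):
        """reward = +1 for a stroke (praise), -1 for a correction."""
        key = (context, setting)
        old = self.scores.get(key, 0.0)
        # nudge the score toward the reward signal
        self.scores[key] = old + self.learning_rate * (reward - old)

    def best_setting(self, context, candidates):
        """Pick the candidate setting scored highest for this context."""
        return max(candidates,
                   key=lambda s: self.scores.get((context, s), 0.0))


lamp = TrainableBehavior()
for _ in range(5):
    lamp.feedback("evening", "warm_dim", +1)   # praised repeatedly
lamp.feedback("evening", "bright", -1)         # corrected once
print(lamp.best_setting("evening", ["warm_dim", "bright"]))  # warm_dim
```

Repeated praise drives the score for a behavior toward +1, so it wins out over corrected alternatives in the same context, much as a dog repeats the trick that earned the treat.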
The three learning processes, briefly explained.
Most smart objects use voice commands as their medium of interaction. Although this seems very intuitive, it lacks feedforward communication of the device's limitations. These devices aren't real people, yet some users ask them to perform tasks they aren't able to execute. Perhaps because the other side of the object is a humanoid voice, users lack awareness of the device's limitations.
Through a metaphor, we explore a tangible interaction approach to teaching a 'smart' lamp certain preferences in a given context. By designing the interactions with the lamp as if it were a dog, we hope to make the object's limitations visible while aiding communication between object and user. The interaction also becomes more embodied and expressive.
Moving the lamp and setting the light intensity teaches the algorithm a preferred light level and position for a given situation. The lamp registers this interaction and measures other contextual input data, such as ambient light, time, and movement.
When a future situation matches one in which the lamp was manually set, the algorithm commands the lamp to repeat that light and position. If the algorithm gets the user's preferences right, the user can stroke the lamp as a reward for choosing correctly, which also tells the algorithm to repeat this setting whenever the conditions are the same.
When a situation doesn't match any previously saved setting, the lamp predicts a light and position for it. The user can then either stroke the lamp to positively reinforce those settings for that situation or manually move the lamp to their preference.
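The teach/replay/predict loop above can be sketched as follows. The context features (ambient light, hour, movement), the nearest-neighbour prediction, and all names are assumptions made for illustration; the project does not publish its exact algorithm:

```python
# Sketch of the decision loop: store (context -> setting) lessons,
# replay an exact match, otherwise predict from the closest saved
# context. A stroke confirms a prediction, saving it like a lesson.
import math

class SkipMemory:
    def __init__(self):
        # context: (ambient_light 0-1, hour 0-23, movement 0-1)
        # setting: (brightness 0-1, position label)
        self.memory = {}

    def teach(self, context, setting):
        """User manually sets the lamp: store the pairing."""
        self.memory[context] = setting

    def stroke(self, context, setting):
        """A stroke rewards a prediction, so it is kept for next time."""
        self.memory[context] = setting

    def act(self, context):
        if context in self.memory:        # situation seen before: replay
            return self.memory[context]
        if not self.memory:
            return (0.5, "neutral")       # untrained fallback
        # new situation: predict from the closest saved context
        nearest = min(self.memory, key=lambda c: self._distance(c, context))
        return self.memory[nearest]

    @staticmethod
    def _distance(a, b):
        # hour wraps around midnight; scale it roughly to the 0-1 features
        dh = min(abs(a[1] - b[1]), 24 - abs(a[1] - b[1])) / 12
        return math.hypot(a[0] - b[0], dh, a[2] - b[2])


skip = SkipMemory()
skip.teach((0.1, 22, 0.0), (0.3, "bed"))    # dim light for reading in bed
skip.teach((0.8, 14, 0.5), (0.9, "desk"))   # bright light at the desk
print(skip.act((0.2, 23, 0.1)))             # late and dark: the bed lesson wins
```

A stroke on an unseen-but-predicted situation calls `stroke()` with the predicted setting, so confirmed guesses become lessons, exactly the reinforcement path described above.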
Skip's points of rotation and direction (arrows).
Stroking the textile on Skip's back tells the lamp about the user's preference.
Our bodies provide rich ways to explore tangible interactions in order to design more expressive ways to interact with products. We used an embodied interaction method to explore the lamp's behavior (e.g. reading in bed).
This exercise allowed us to quickly explore the different movements the lamp should be able to execute.
An MDF mock-up of the lamp was built to test the sensors, motors, and the algorithm created for Skip.
Inspiration, Sketching, Ideation.
Lamp's 'head' detail. ON/OFF button and brightness knob.
Detail of the textile on Skip's back. The textile is meant to be conductive, so that when the user strokes it, the lamp knows what that interaction means.
Front and back view of the lamp.
Skip is able to rotate on three axes: body, neck, and head.
Skip was developed by the students Davide Amorim, Christian Sivertsen, Mick Haegens and Simone Rietmeijer.