To make the Processing sketch work physically in the installation, we needed a system that translated the passage of people through the sensor zones (which the mouse had simulated in the visual sketch) into real inputs as people entered and left the alley.
This meant adding an electronic layer able to talk with the Processing sketch and let it start the installation once the right conditions were satisfied.
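As an illustration of this link, here is a minimal Processing sketch that listens to an external board over a serial port. The port choice, baud rate and the one-character protocol ('I' for someone entering, 'O' for the alley becoming empty) are assumptions used only for this example, not the final design:

    // Minimal Processing sketch listening to external electronics over serial.
    // Port, baud rate and the single-character protocol are assumptions.
    import processing.serial.*;

    Serial board;                    // connection to the external electronics
    boolean alleyOccupied = false;

    void setup() {
      size(400, 400);
      // Serial.list()[0] is simply the first available port; adjust as needed.
      board = new Serial(this, Serial.list()[0], 9600);
    }

    void draw() {
      while (board.available() > 0) {
        char event = (char) board.read();
        if (event == 'I') alleyOccupied = true;    // someone entered
        if (event == 'O') alleyOccupied = false;   // the alley is empty again
      }
      background(alleyOccupied ? 255 : 0);         // placeholder reaction
    }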
During the sketch development we had worked with a virtual sensor zone; the same concept now had to be translated into physical computing. We ran a series of experiments and tried different technical solutions:
The first experiment used two PIR motion sensors (A and B) to detect a person's movement and infer their direction from the order in which the sensors fired. However, the sensing angle of each PIR was so wide that the two detection areas overlapped almost all the time and gave us wrong inputs.
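The direction logic we were aiming for was simple: if sensor A fires before sensor B the person is entering, the opposite order means they are leaving. A rough sketch of that idea, assuming the two PIR states are already available as booleans (how they reach the sketch is left open):

    // Direction detection from two ordered triggers (the approach we abandoned).
    // pirA and pirB are assumed to be updated elsewhere from the real sensors.
    boolean pirA = false, pirB = false;
    char firstTriggered = ' ';   // remembers which sensor fired first

    void updateDirection() {
      if (pirA && firstTriggered == ' ') firstTriggered = 'A';
      if (pirB && firstTriggered == ' ') firstTriggered = 'B';

      if (pirA && pirB) {                        // both zones crossed
        if (firstTriggered == 'A') println("entering the alley");
        else if (firstTriggered == 'B') println("leaving the alley");
        firstTriggered = ' ';                    // reset for the next person
      }
    }

In practice the overlapping sensing cones meant both sensors often fired at once, so the "first triggered" information was unreliable.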
We then tried a camera to track people's movement in the alley and count them.
We experimented with the CYA People Vision software connected to Processing, tuning the blob size so we could tell exactly when more than one person was in the alley.
In the image you can see a single blob, correctly showing that just one person is walking in the alley.
Unfortunately, the tracking sometimes lost a person for a few moments and the system wrongly concluded that they had exited the alley when in reality they had not. We struggled to tune all the software's parameters precisely enough to make the camera tracking work reliably.
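One workaround we considered was to treat a lost blob as a real exit only after a grace period. A minimal sketch of that idea, assuming the blob count is fed in from the tracking software (the variable blobCount and the two-second timeout are assumptions for illustration only):

    // Counting people from camera blobs, with a grace period for lost tracking.
    // blobCount is assumed to be updated elsewhere from the vision software.
    int blobCount = 0;
    int peopleInAlley = 0;
    int lastSeenMillis = 0;

    void updateCount() {
      if (blobCount > 0) {
        peopleInAlley = blobCount;    // trust the camera while it sees someone
        lastSeenMillis = millis();
      } else if (millis() - lastSeenMillis > 2000) {
        peopleInAlley = 0;            // declare the alley empty only after 2 s
      }
    }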
Once we realised that neither of these technologies was the right solution, we settled on a very simple but robust way to count exactly how many people were in the alley.
The "Prototype" section explains what we chose to use: switch sensors. A preview of the counting idea is sketched below.
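The sketch below assumes two switches, one at each end of the alley, whose states are made available to Processing; the variable names are placeholders and the actual wiring and logic are described in the prototype section:

    // Keeping an exact head count with two switches at the ends of the alley.
    // switchIn / switchOut are placeholders for the real switch readings.
    boolean switchIn = false, switchOut = false;   // current switch states
    boolean prevIn = false, prevOut = false;       // states from the previous frame
    int peopleInAlley = 0;

    void updatePeopleCount() {
      if (switchIn && !prevIn) peopleInAlley++;    // someone stepped in
      if (switchOut && !prevOut && peopleInAlley > 0) peopleInAlley--;  // someone stepped out
      prevIn = switchIn;
      prevOut = switchOut;
    }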