Client: Science North
Date: December 31, 2018
Categories: SimWave
A large multimedia experience brings visitors inside the minds and bodies of extreme athletes to explore their experiences and motivations.
In January 2018, SimWave was approached by Science North to produce three experiences for their Beyond Human Limits exhibit, in collaboration with the Ontario Science Centre. The Central Multimedia Experience was envisioned as a hybrid quiz game/motivational experience for visitors who had already visited the other zones within the exhibit. The area was designed as a hexagon and segmented into three opposing walls, each with a projector playing videos that either presented educational content or posed a hypothetical question.
During the quiz section of the experience, visitors would be asked what the most beneficial approach to learning a new skill would be and given two scenarios. All three projected walls would play a video presenting the question, then two would show the possible scenarios for achieving the desired goal. Visitors would then have to move to the wall with the scenario they believed to be the correct answer. After a brief time had elapsed, cameras suspended from the ceiling would determine how many individuals were standing by each wall and select the majority's answer. The selected answer would play on all three walls, followed by the correct answer and a brief segment following the life and struggles of an extreme athlete.
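In application terms, each round followed a fixed sequence: pose the question, show the scenarios, wait, poll the cameras, then play the results. A rough Python sketch of that loop (the object and attribute names here are illustrative, not the production TouchDesigner logic):

    import time

    def run_round(question, player, cameras):
        # Pose the question on all three walls.
        player.play(question.intro_segment)
        # Show the two candidate scenarios on opposing walls.
        player.play_scenarios(question.answer_segments)
        # Give visitors time to walk to the wall they believe is correct.
        time.sleep(question.decision_seconds)
        # Ceiling cameras report how many people occupy each answer zone.
        chosen = cameras.busiest_zone(question.answer_segments)
        # Play the crowd's pick, the correct answer, then the athlete story.
        player.play(chosen)
        player.play(question.correct_segment)
        player.play(question.athlete_segment)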
TouchDesigner was chosen as the development platform largely because of its built-in video processing capabilities, and because Science North had licenses remaining from a previous jointly developed project. During development, we had initially aimed to use RGB cameras to keep costs down, since we already owned licenses to software that could perform person detection on a live video feed and provide the results back to us via a RESTful API. We were provided all three videos in a single video file, as a single computer would be driving the experience across three projectors. I developed a JSON configuration file that contained the start/end points for each video segment, the correct and incorrect answers, and the overall flow of the application, so that the staff could reconfigure it at their discretion.
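The configuration might have looked roughly like the sketch below; the field names and timings are illustrative, not the production schema:

    {
      "video_file": "central_experience.mp4",
      "segments": {
        "question_01":  { "start": 0.0,  "end": 22.0 },
        "scenario_01a": { "start": 22.0, "end": 40.0 },
        "scenario_01b": { "start": 40.0, "end": 58.0 },
        "athlete_01":   { "start": 58.0, "end": 120.0 }
      },
      "questions": [
        {
          "id": "question_01",
          "answers": ["scenario_01a", "scenario_01b"],
          "correct": "scenario_01a"
        }
      ],
      "flow": ["attract", "question_01", "athlete_01"]
    }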
Once I arrived on-site to begin installation, I noticed that the lighting in the area was very dim and found that the software could no longer distinguish people accurately. Moreover, the software had trouble with individuals wearing shirts printed with faces and with individuals carrying backpacks. At this point, the RGB cameras were replaced with RealSense depth cameras, and the camera handling was rewritten to remove the dependence on the person-tracking software and instead perform blob detection via background removal. As time on-site was very limited, background removal was done by capturing reference images of the area with no participants and subtracting the live feed from them at runtime. As I completed my time on-site, the client asked that children be weighted more heavily than adults, so I applied a valuation curve to a height map and computed the average weighted pixel value to determine which zone held the selected answer.
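The resulting counting logic boiled down to depth-based background subtraction plus the height weighting. A simplified NumPy sketch, assuming each zone's depth frames arrive as 2D arrays (the function names and threshold are mine, not the production code):

    import numpy as np

    # Assumed threshold: how much closer (in mm) a pixel must be than the
    # empty-room reference before it counts as occupied.
    DEPTH_THRESHOLD = 150.0

    def capture_background(empty_frames):
        """Average depth frames of the empty area into a reference image."""
        return np.mean(np.stack(empty_frames), axis=0)

    def zone_score(depth_frame, background, height_weights):
        """Weighted occupancy for one answer zone. height_weights encodes the
        valuation curve, so pixels at a child's height contribute more."""
        diff = background - depth_frame  # positive where something is closer than the empty room
        occupied = diff > DEPTH_THRESHOLD
        return float(np.mean(height_weights * occupied))

    def select_answer(zone_frames, backgrounds, weights):
        """Return the index of the zone with the highest weighted occupancy."""
        scores = [zone_score(f, b, w)
                  for f, b, w in zip(zone_frames, backgrounds, weights)]
        return int(np.argmax(scores))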
Once installation was complete and the client was satisfied, I continued to provide developer support until the contract expired at the end of the year.
Role: Sole Developer/Support
Time on Project: Jan 2018 – Dec 2018
Technology used: TouchDesigner, Intel RealSense cameras
Language used: Python, C++