Yuri’s Night access at NASA Ames was limited to the expanse of undifferentiated tarmac outside one of the hangars. A quite impressive expanse, but nothing really substantial to map. We were nicely situated outside with a view of everything, fortunate since I hardly saw anything else in detail, busy with the exhibit from 8:30am to 2:30am. During mapping parties, people love the party render, animating all our GPS trails together. Even I, GPS curmudgeon, still get excited about seeing the party render. So we wanted to give Yuri’s Night some form of party render. We only had two GPS units lined up, and not everyone would have been out mapping at the same moment, both challenges to the party render model. The OSM exhibit would also be set among some of the most technically advanced technology, creative and engaging performances, fire, materials, and music ever brought together in one place .. like Burning Man with reliable power, no dust, and slightly more nerds.
We went through a lot of ideas on how to adapt party render to an actual party. The key thing I wanted to capture was some sense of interaction among GPS traces, so the animation would synchronize all the traces to the same time. The first idea was to draw out a large clock face pattern on the ground, quantize the traces to attract to each point on the dial, and tween in between. This became obviously impossible when we were shown the map of the grounds a few weeks before; there would be no space large enough to spare. So, just draw out the traces, but give some kind of gravity or resistance between them .. so perhaps the most recent actual trace could knock the other traces out of their path. We played around with the foam ActionScript physics engine, but found it just too intensive for this app. We resorted to doing some offline post-processing of the GPS traces to determine proximities, and then visualized that with growing orbs of attraction. This made it into the performance version, but honestly no one understood what it was, even after explanation.
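The offline proximity pass can be sketched out simply: resample each trace onto a common time base, then flag the moments when two trails come within a few meters of each other. Here's a minimal sketch in Python; the `(time, lat, lon)` trace format, the function names, and the 20-meter threshold are my assumptions for illustration, not the exhibit's actual code.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def resample(trace, times):
    """Linearly interpolate a trace (sorted list of (t, lat, lon)) at the given times."""
    out = []
    i = 0
    for t in times:
        # advance to the segment containing t
        while i < len(trace) - 2 and trace[i + 1][0] <= t:
            i += 1
        t0, la0, lo0 = trace[i]
        t1, la1, lo1 = trace[i + 1]
        f = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
        f = max(0.0, min(1.0, f))  # clamp outside the segment
        out.append((la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0)))
    return out

def proximities(trace_a, trace_b, step=5.0, threshold=20.0):
    """Return the synchronized times at which two traces come within `threshold` meters."""
    start = max(trace_a[0][0], trace_b[0][0])
    end = min(trace_a[-1][0], trace_b[-1][0])
    times = [start + k * step for k in range(int((end - start) / step) + 1)]
    a_pts = resample(trace_a, times)
    b_pts = resample(trace_b, times)
    return [t for t, pa, pb in zip(times, a_pts, b_pts)
            if haversine_m(pa[0], pa[1], pb[0], pb[1]) <= threshold]
```

Each hit would then seed one of the growing orbs of attraction in the animation. The real traces would also want smoothing and gap handling before a pass like this, which GPS logs always seem to need.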
Shawn adeptly banged out most of the application in a few hours; I think he’s done this a few times before. Visually enticing stuff, and it looked great on Jess’s huge TV. Going retro 8-bit was a good choice .. reducing the amount of information actually helps our limited human brains get a grip on the situation. It suggests a game, a game board. LocoMatrix have figured that out as well with their gaming platform.
Jess and I primed the field by drawing out each letter of YURISNIGHT. This was immediately picked up on: one of the first runs was an energetic rendition of “F**K YOU”, run directly through exhibits and over tables .. nothing would stop this young man from writing out his impressive profanity. Later a woman drew her name in near-perfect cursive! Some just wandered around. One couple wandered, then stopped for 15 minutes and had a close chat, then repeated .. very sweet, but not very exciting traces. More designs, writings, wanderings. People had fun, and the GPS units were out most of the time.
Most everyone thought we were tracking people in real time. Nope, it was much simpler, but having a sophisticated-looking display made up for it. Maybe we’ll get real-time tracking together for next year, via GPS radio or some other location-sensing method. Towards the end of the night, when explaining the exhibit started getting repetitious and the atmosphere was getting stranger, Jess started saying that every ticket had an embedded tracking device; a couple hours later the rumor found its way back, a random passerby informing us of this fact!
I really want to see some actual interaction between tracks. GPS Tron light cycles would be awesome. Maybe we’ll try Processing next time.
If you’re interested, the Yuri’s Night GPS code is here. It’s messy, and probably won’t build straight from this package, but feel free to use and extend it for other party render exhibits.