Ben Snell created a portable rig for carrying the STUDIO's Hokuyo LIDAR, and developed custom software for compiling and visualizing the data slices he obtained by walking and driving down Pittsburgh streets.
Ben also created a custom, motorized, laser-cut rig for producing spherical LIDAR captures of entire environments.
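The slice-compiling step reduces to simple geometry: each 2D LIDAR scan is a fan of (angle, range) readings, and stacking successive scans along the direction of travel yields a 3D point cloud. Here is a minimal sketch of that idea, assuming the scans have already been parsed into arrays (this is an illustration, not Ben's actual software):

```python
import numpy as np

def slices_to_cloud(scans, angles, step=0.05):
    """Stack successive 2D LIDAR slices into a 3D point cloud.

    scans  : list of 1D arrays of range readings in meters, one per slice
    angles : 1D array of beam angles in radians (same length as each scan)
    step   : assumed distance traveled between slices, in meters
    """
    points = []
    for i, ranges in enumerate(scans):
        valid = ranges > 0                           # discard missing returns
        x = ranges[valid] * np.cos(angles[valid])    # across the scan plane
        z = ranges[valid] * np.sin(angles[valid])    # up
        y = np.full(x.shape, i * step)               # along the direction of travel
        points.append(np.column_stack([x, y, z]))
    return np.vstack(points)

# Example with synthetic data: 100 slices of a 120° fan, each reading a constant 3 m range.
angles = np.linspace(-np.pi / 3, np.pi / 3, 682)
scans = [np.full(682, 3.0) for _ in range(100)]
print(slices_to_cloud(scans, angles).shape)          # (68200, 3)
```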
Piloting a drone, Irene Alvarado captured the main façade of St. Peter and Paul's Church in East Liberty, and then used the drone video and photogrammetry software to create a high-resolution 3D scan of the façade.
Using multiple synchronized-flash cameras, Irene and Smokey were able to apply photogrammetry to high-speed photos, in order to develop a 3D capture of a quickly-changing liquid.
In Underkey, Miles Peyton applied panoramic stitching software to numerous images from an inexpensive USB microscope, in order to create a high-resolution recording of the detritus which had accumulated underneath the keys of a public CMU computer cluster keyboard.
Gigapan panoramas captured with a robotically controlled microscope platform.
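The stitching step behind mosaics like these can be approximated with OpenCV's high-level Stitcher, which matches features between overlapping frames and composites them into a single image. Below is a sketch under assumed file names; Underkey used off-the-shelf panorama software, not this code.

```python
import glob
import cv2

# Load overlapping USB-microscope frames (hypothetical folder and naming).
paths = sorted(glob.glob("underkey_frames/*.png"))
frames = [img for img in (cv2.imread(p) for p in paths) if img is not None]

# SCANS mode assumes a mostly planar subject and affine motion between frames,
# which suits a microscope panning over a flat keyboard tray.
stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("underkey_mosaic.png", mosaic)
else:
    print("Stitching failed with status", status)
```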
Michelle Ma photographed a subject with a DSLR camera, fed the images into photogrammetry software to develop a 3D model, and then replaced the photographic textures with her own hand-drawn portraits.
Claire Hentschker developed Shining360, a 360° video which presents 3D reconstructed scenes from The Shining, and allows the viewer to look around them in real time while following the original path of the camera.
Scott Fitzgerald, a workshop student, recorded a drive with a catadioptric 360° lens mounted to the front of his car, and developed custom software in Jitter to compute a 3D slitscan tube from the resulting video.
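A slitscan of this kind takes the same one-pixel-wide ring (or, once the catadioptric "donut" image is unwrapped, the same column) from every video frame and stacks those samples over time, so that time becomes a spatial axis. Scott's software was written in Jitter; a rough Python/OpenCV equivalent of the unwrapped-column version might look like the sketch below, where the column position and file name are assumptions and the unwrapping step is not shown.

```python
import cv2
import numpy as np

def slitscan(video_path, column=None):
    """Stack one pixel column from every frame into a single slitscan image."""
    cap = cv2.VideoCapture(video_path)
    slits = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if column is None:
            column = frame.shape[1] // 2      # default: center column
        slits.append(frame[:, column, :])     # one (height, 3) slice per frame
    cap.release()
    # Time runs along the horizontal axis of the output image.
    return np.stack(slits, axis=1)

image = slitscan("unwrapped_360_drive.mp4")
cv2.imwrite("slitscan.png", image)
```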
Faith Kim developed a portable, binaural DepthKit rig in order to record the annual Pittonkatonk festival in immersive 3D.
Chloé Desaulles made high-speed Schlieren recordings of air currents disturbing candles.
Using an RC car, a laser line, and custom software, Hizal Celik made a machine to create 3D scans of the undersides of cars.
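The principle here is laser-line triangulation: a camera mounted at an offset from the laser sees the projected line shift in proportion to the distance of the surface it hits, so locating the line in each video frame yields one depth profile per frame. The sketch below is a deliberately simplified illustration of the line-finding step; the baseline and focal-length constants are placeholders, not Hizal's calibration.

```python
import cv2
import numpy as np

def laser_profile(frame, baseline=0.20, focal_px=800.0):
    """Rough depth profile from one frame of laser-line video.

    Assumes the laser sheet is parallel to the camera's optical axis at a
    lateral offset of `baseline` meters, so a surface at depth z projects the
    line at roughly focal_px * baseline / z pixels from the image center.
    """
    red = frame[:, :, 2].astype(np.float32)     # laser line is brightest in red
    line_rows = np.argmax(red, axis=0)          # brightest row in each column
    offset = np.abs(line_rows - frame.shape[0] / 2.0) + 1e-6
    return focal_px * baseline / offset         # approximate depth per column

cap = cv2.VideoCapture("undercar_laser.mp4")    # hypothetical recording
profiles = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    profiles.append(laser_profile(frame))
cap.release()
depth_map = np.array(profiles)                  # one depth profile per frame
```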
Caroline Hermans used an ultrasonic proximity sensor and an Arduino to make a custom illumination probe.
She then used this probe, together with a 360° camera, to make a light-painting video recording of herself exploring an unfamiliar space (a men's restroom) in total darkness. The result is a navigable video.
Using the STUDIO's UR5 arm and computer-controlled cameras, Evi and Soonho conducted a number of experiments in robotic cinematography.
In her project Skies Worth Seeing, Kaitlin Schaer wished to understand: "What are the qualities of the sky that transform it into a moving and even sublime subject for a photograph?"
Using the openFrameworks add-on ofxFlickr, she scraped thousands of images of skies. She then developed a tool in Processing that allowed her to quickly crop images so that they exclusively contained the sky, and not the horizon or land. Finally, in order to organize the images, she sorted them spatially using t-SNE.
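Kaitlin's actual tools were openFrameworks and Processing; the t-SNE step can be illustrated in Python with scikit-learn, embedding a simple color descriptor of each cropped sky into 2D so that similar skies land near each other. The feature choice and file paths below are assumptions for the sake of the sketch.

```python
import glob
import cv2
import numpy as np
from sklearn.manifold import TSNE

def color_histogram(path, bins=8):
    """Small HSV color histogram as a crude descriptor of a sky image."""
    img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([img], [0, 1, 2], None, [bins] * 3,
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, None).flatten()

paths = sorted(glob.glob("cropped_skies/*.jpg"))       # hypothetical folder
features = np.array([color_histogram(p) for p in paths])

# t-SNE embeds the high-dimensional descriptors into 2D layout coordinates;
# nearby points correspond to visually similar skies.
xy = TSNE(n_components=2, perplexity=30, init="pca",
          random_state=0).fit_transform(features)

for path, (x, y) in zip(paths, xy):
    print(f"{path}\t{x:.2f}\t{y:.2f}")
```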
Kristin Yin used the Sensel Morph pressure sensor, and neural-network machine learning, to create a Footprint Recognizer.
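A recognizer of this kind treats each pressure reading as a small grayscale image and trains a classifier to map it to a person. A minimal sketch using scikit-learn's MLPClassifier on pre-recorded pressure frames follows; the data files and labels are hypothetical, and the Sensel capture code is not shown.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical dataset: flattened Sensel pressure frames, one per recorded step,
# with an integer label identifying whose foot made each frame.
frames = np.load("footprint_frames.npy")    # shape (n_samples, n_sensels)
labels = np.load("footprint_labels.npy")    # shape (n_samples,)

X_train, X_test, y_train, y_test = train_test_split(
    frames, labels, test_size=0.2, random_state=0)

# A small fully connected network is plenty for distinguishing a few people.
model = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500)
model.fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))
```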
Geep Warhaftig developed Sphinct!, a "lifestyle sphinctometer" modeled after personal fitness trackers. With the help of an Arduino, pressure sensors, and a custom rubber cast, Sphinct! captures data on pressure, muscular performance, and stress levels in the rectum. Sphinct! syncs to your smartphone, "where you can track your progress and take control of your body. Compare your results to friends and even unlock achievements."
Related to this, see Plethysmography & Photoplethysmography.