Present a design concept for a networked device or system that helps a community in need.
My concept is a GSM button that sends emergency SMS messages to the family, friends, lawyer and other key contacts of someone caught in an Immigration and Customs Enforcement (ICE) raid.
The device looks like a nondescript watch or bracelet. It contains a SIM card, Adafruit FONA or any other kind of GSM breakout board with SMS support, microcontroller, battery and button. When the device's button is pressed, the device is programmed to send a message to a predetermined list of recipients. If the person wearing it is in trouble, they can press the button on their watch or bracelet and it will send the message to their group.
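Under the hood, SMS-capable GSM breakouts like the FONA are built around SIM800-class modems, which accept a short AT-command sequence per message. Here's a minimal sketch of the logic the microcontroller would run on a button press; the phone number and message are hypothetical placeholders, and in the real device each command would be streamed to the modem's serial port for every recipient in the list.

```cpp
#include <string>
#include <vector>

// Build the AT-command sequence a SIM800-based modem (like the one on the
// Adafruit FONA) expects for one text message.
std::vector<std::string> smsCommands(const std::string& number,
                                     const std::string& message) {
    const char CTRL_Z = 0x1A;  // Ctrl-Z terminates the message body
    return {
        "AT+CMGF=1",                   // put the modem in text mode
        "AT+CMGS=\"" + number + "\"",  // address the message
        message + CTRL_Z               // body, terminated by Ctrl-Z
    };
}

// On a button press, the sketch would loop over the predetermined recipient
// list and send this sequence once per contact.
```

Since the message is a pre-agreed code, the payload is the same for every recipient; only the number changes per contact.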
As a product designed for immigrants at risk of being arrested, the device serves as a distress signal. It can help a user's friends and family quickly address the immediate needs of the user: contact their lawyer, take care of their kids, etc. It also provides the emotional service of letting people close to the user know they've been detained. Safety status broadcasting is an increasingly common consideration- take Facebook's new Safety Check feature, which allows users to broadcast to their network that they're safe in the event of an emergency. This device would allow for a discreet form of broadcast.
One of the biggest challenges I had in developing this product concept was security. It's critical that the device not become a honeypot that ICE officials could use to map networks of immigrants. Here are some security measures I've developed.
Before programming the device, the wearer should meet with their group of intended message recipients and agree on a coded message, e.g. "04850" or "Let's meet at mom's house?"- any message that carries no meaning out of context but that the group understands as "distress signal from [watch wearer]." Everyone in the group should memorize the coded message and never write it down. A memorized, pre-arranged code is one of the oldest and most reliable ways to conceal a message's meaning.
The device has its own pre-paid SIM card and is not linked to any external phone account. The device has no GPS tracking. The user's group of recipients subscribes to the SMS broadcast with Google Voice numbers, which aren't tied to a physical SIM and are harder to trace back to a person. Additional layers of removal are encouraged.
The device should look as nondescript as possible.
I presented this product to my class and a panel of two investment advisors. My classmate Dorothy Lam is interested in building out a prototype with me for her homemade hardware class. She suggests making the device a bracelet to ensure wearability. I like that idea; in any case, we'll need to make custom circuits. Here's her blog post.
The most substantial feedback came from a panelist, who suggested I repackage this product to emphasize its wide-market appeal, and accept this specific use as one of many potential fringe uses for a product that many people can use. He's right- this product is a great alternative to smartphone-based emergency button apps for anyone who doesn't have a smartphone (e.g. seniors, children, low-income people) or who must be discreet in asking for help (e.g. abuse victims).
I'm going to prototype this with Dorothy, which will include making custom small circuits. I'll then get in touch with the panelist about a revised presentation.
Make a device to play Atari’s Lunar Lander game in a browser. Here’s a different version that uses the up, down, left and right arrows. Your device should support the following actions:
- Send up, down, left, right keypresses, or W, A, S, D keypresses as a USB keyboard
- Send mouse clicks
- Move mouse in X and Y direction to desired location to click
- Your controls should be arranged so that the player can watch the screen, not concentrate on the controls
Make a housing for your controller. Document it according to the project documentation guidelines at the end of this page. For this assignment, you’ll need a microcontroller that can be programmed to be a USB HID device (keyboard or mouse).
For this assignment I wired four arcade buttons to an Arduino MKR1000. I housed the game controller in the same silverware divider box from the Container Store that I used for my music player controller, but with a new custom laser cut top. The box is a great size for a game controller because it's shallow enough to fit nicely in hand and long enough to control with two hands.
The game controller is designed to play the in-browser version of Atari’s Lunar Lander game, which relies on W, A, S, D keypresses for navigation and fuel, and the space bar for boosts. I included only what would be needed when the game is active (the mouse and click are used to navigate to the game and restart the game). Since W, S, A and D translate roughly to "more/less/left/right", I positioned their corresponding buttons into a diamond, with the space bar "boost" button on the opposite side. I chose a flat USB cable because it's thin and flexible enough to fold into the player's hand when they hold the controller.
When demoing the game, I realized that I had designed the controller for right-handed players. I'd like to add a flip switch to the controller that changes the corresponding keys of the diamond of buttons, so that the controller can be oriented to either-handedness with the flip of a switch. I would also like to include a way to restart the game with the controller, so that the player can remain in the world of the game.
Note: the Keyboard library is required for your computer to recognize the Arduino MKR1000 as a USB keyboard.
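The core of the sketch is just a mapping from button pins to keys, which the Keyboard library then types with `Keyboard.press()`/`Keyboard.release()`. Here's a sketch of that mapping, plus the handedness flip I'd like to add; the pin numbers are hypothetical, not the ones on my actual board.

```cpp
#include <cstdint>

// Map each button pin to the key it should type. In the sketch, a press
// fires Keyboard.press(keyForPin(pin)) and a release fires Keyboard.release().
char keyForPin(uint8_t pin) {
    switch (pin) {
        case 2: return 'w';   // thrust (up)
        case 3: return 's';   // down
        case 4: return 'a';   // left
        case 5: return 'd';   // right
        case 6: return ' ';   // boost (space bar)
        default: return 0;    // unmapped pin
    }
}

// For the planned handedness switch: rotating the diamond 180 degrees
// means swapping w<->s and a<->d, leaving the boost button alone.
char flipped(char key) {
    switch (key) {
        case 'w': return 's';
        case 's': return 'w';
        case 'a': return 'd';
        case 'd': return 'a';
        default:  return key;
    }
}
```

With this split, the flip switch only has to toggle whether `flipped()` is applied before the keypress is sent; the wiring stays the same.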
DESIGN CHALLENGE: Make a device to control playback of pre-recorded music. You do not have to build the playback engine, just the physical control device. Your device will send asynchronous serial messages to another device which will play the music. All serial messages should be terminated by a linefeed. The playback device will echo your message back to you as an acknowledgement message when it executes your command. Your device should support the following features:
- Your device should be operable by a user who cannot see the device.
- The user should get an acknowledgement when they activate any control.
- Start or stop the playback of a track (start plays track from the beginning).
- Pause or resume playback of a track (resume plays track from last position stopped).
- Skip ahead one track in the playlist.
- Skip back one track in the playlist.
- Fast-forward the current track (double speed of the track).
- Skip to a random track in the playlist.
Make a housing for your controller.
Because this music controller should be usable by someone who can't see it, I decided to follow the standard layout for music player controls. Here's the initial design I drew, which all of my iterations have in common.
I wanted the playhead to be central, with "backward" and "forward" buttons grouped around it, and the start/stop and random buttons on either side. I figure the playhead button, as an arrow, can provide directionality for how the entire device should be oriented.
The most challenging part of the assignment was getting p5 and the Arduino to talk to each other. I'd done this a year ago, but had forgotten how it worked, and the p5 Serial communication tools and workflow had changed since then. I ended up supplementing Benedetta and Tom's guide with Angela's tip to open the local server in Incognito.
Once I understood how p5 Serial worked, the Arduino code was easier to program as well. My process became:
- Run the p5 Serial tutorial to ensure the Serial port is working properly
- Test Tom's p5 Music Controller code to make sure it's working with keyboard keys
- Merge the p5 Serial tutorial with the p5 Music Controller code
- Code the Arduino to Serial.write() the specified number codes when switches are pressed
- Update the p5 Music Controller code with the correct port
- Run the p5 Serial Monitor GUI (still not sure why we do this)
- Run the p5 sketch in a local server in Incognito mode
- Test to ensure the physical Music Playback Controller is effectively controlling the music
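The Arduino side of the steps above boils down to formatting one message per switch press and checking the echo that comes back. Here's a minimal sketch of that protocol logic; the number codes are hypothetical stand-ins for the "specified number codes" in the assignment.

```cpp
#include <string>

// Format one control message for the playback engine. Per the assignment,
// every serial message must be terminated by a linefeed.
std::string commandMessage(int code) {
    return std::to_string(code) + "\n";
}

// The playback device echoes the message back when it executes the command,
// so an echoed line matching what we sent is the acknowledgement.
bool isAck(const std::string& sent, const std::string& echoed) {
    return sent == echoed;
}
```

In the sketch, each debounced switch press calls the equivalent of `commandMessage()` and writes it out over serial; the ack check is what lets the controller confirm a command actually executed before signaling the user.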
Here's a sadly vertical video of the raw device working:
I ended up with a number of different iterations of this project, and realized that with product design, the number of potential iterations and improvements must be limitless. Eventually I had to stop.
ITERATION 1: BIG CARDBOARD BOX
Big and sloppy! This was a prototype to measure wiring.
ITERATION 2: CARDBOARD + WOOD
This version was housed in a 3" x 9" silverware divider from The Container Store, topped with a piece of laser cut cardboard. After putting it together, I decided I might as well opt for a wood top for a more stable feel. Here's a photo I took of the wiring mess as a reference for keeping my inputs in order for this version.
ITERATION 3: ALL-WOOD ENCASING
This version is internally neater, with wiring trimmed to size and a solid encasing. It's still far from perfect.
Ultimately this is a straightforward product that I think could be ever-improved.
Ideally I would love to make my own buttons for this product. I think casting them from silicone or ceramics would be nice- anything that had a tactile sign of its function. I designed the signs on this last prototype to be placed on the actual button, but ultimately decided to put them underneath because wood-covered buttons didn't provide enough assurance that they had been pressed.
I would also like to incorporate haptic feedback. I started out with haptic feedback, but soon eliminated it because I was spending too much time troubleshooting.
DESIGN CHALLENGE: Make a device to control a light. Use an RGB LED as a stand-in for a larger lamp in class. Your device should support the following actions:
- A control to turn the light on and off. When it turns on, it should retain the relative red, green, and blue brightness levels from when it was last turned on.
- The ability to fade each color channel (red, green, and blue) from off to full brightness.
- Fade level should be maintained when you release the fade controller.
- Fade should be interruptible by other controls, e.g. on/off or another fade controller.
- The ability to fade the overall brightness of the light.
Make a housing for your controller.
I collaborated with Olivia on this project. We used an Arduino Uno, three potentiometers and two sewable NeoPixels Olivia had purchased from Adafruit. We housed the project in a reused cardboard box and diffused the light with two acrylic cylinders Olivia found on the junk shelf.
The prototype we made reminds me of Mike Kelley's resin Krypton sculptures.
From the beginning our plan was that each potentiometer would control one color on the RGB scale via an analog pin.
We came into the project thinking that wiring and programming the lighting controller would be much more complicated than it actually was. Most of our time programming was spent backtracking out of complicated formulas and referencing basic Arduino tutorials to update our code. We ultimately referenced code from Tom Igoe’s NeoPixel examples, Jamie Ruddy’s assignment, and Olivia's armor game project.
Fabricating this project took forever precisely because we tried to cut corners. Realizing this made us feel dumb. We didn't laser cut a box, solder our circuits, or even cut our wires down to size, and we paid for it by troubleshooting late into the night after we put everything together.
The final product was beautiful, although while making it we couldn't figure out how to program an overall dimmer. Ultimately we learned that we wouldn't need to program one at all- the solution is much simpler: a fourth potentiometer that scales the voltage feeding the three color potentiometers, dimming every channel's reading at once. This is something we learned back in the first few weeks of ITP!
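Since the NeoPixels are digitally driven, the same master-dimmer effect can also be done in software by scaling each color reading. Here's a minimal sketch of that math, assuming 10-bit pot readings (0-1023) and 8-bit NeoPixel channels (0-255).

```cpp
#include <cstdint>

// Scale one color channel by the master pot. This is the software
// equivalent of feeding the color pots from the master pot's wiper.
uint8_t channelLevel(int colorReading, int masterReading) {
    long scaled = (long)colorReading * masterReading / 1023;  // dim by master
    return (uint8_t)(scaled * 255 / 1023);                    // 10-bit -> 8-bit
}
```

Each loop, the sketch would read the three color pots plus the master pot and pass the results through this function before setting the pixel color.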
GRAPPLING HOOK PROTOTYPE
The first section detailed cable controls, and in class we made prototypes of venetian blinds and cable-controlled arms. In that exercise I learned that cutting slits into one side of a straw lets it bend at that spot. For my exploration assignment, I used this technique to make a grappling hook prototype. I stuck with the simple materials we used in class- straws, buttons and string. I taped a few straws together and cut slits at the same point along one side of each straw. Then I threaded the straws with string and tied a button to one end of each string. Lastly I tied the other ends of the strings together with a separate button. When the group of strings is pulled, the opposite side of the straw grouping bends outward and holds, forming a grappling hook until the strings are released.
GRAPPLING HOOK QUALITIES OF SOFTNESS
Soft - yes
Flexible - yes
Elastic - no
INFLATABLE ROBOTIC ARM PROTOTYPE
The second section detailed inflatables, and for my exploration I made an inflatable spiral arm prototype. It’s made of mylar and has a straw on the outside end that you can blow into to inflate it. When inflated it clenches inward and upward. When not inflated it’s flat and limp, and can be maneuvered like a ribbon. Taken together, this functionality could be used to grab something like a shoelace, or something fragile, both of which would be difficult for a claw to grab. It’s also lightweight, stowable, and portable.
INFLATABLE ARM QUALITIES OF SOFTNESS
Soft - yes
Flexible - yes
Elastic - no
Last week our class researched Material Connexion's new materials library, and this week we visited their showroom, which features the newest materials in the collection.
Despite the gorgeous polymer materials, I at first felt obligated to be drawn to natural materials, though I’m not sure if naturally extracted materials would even be more sustainable in every case. Either way, they were beautiful in their own right.
Here’s a form of bacteria secretion that becomes leathery and durable when dried.
Here’s a thick paper made from mulberry bark, beautifully dyed to show its capacity for color fastness. There were a few featured materials derived from mulberry bark.
After the visual spectacle wore off, there were two materials I kept thinking about after I left.
The first is this fabric solar cell. It’s soft, flexible, and maybe even somewhat elastic, but maybe that’s only to the extent that you can smooth out its tendency to wrinkle.
The second is this soft circuit printed on film- a process, not a material. It’s soft, flexible, and elastic. It’s in the Materials Library HERE. Description:
“Screen printing of multiple inks, both conductive and insulative, to form circuits on TPU films. This technique in combination with the inks and substrates selected creates an extensible, or stretchable circuit. After the circuit pattern is determined, screens are “burned” into stencils of the appropriate shape, using photo-reactive masking liquid. The ink is then squeegeed onto the screens depositing the ink onto the substrate, usually a flexible elastomeric film. This is repeated with all the ink layers in their particular pattern and sequence. Multiple ink types are needed for their individual properties, which comprise the circuit.”
For my final, I'm interested in prototyping a soft solar panel.
This week I watched The Drawing Room, an animated short 360 film by Jan Rothuizen and Sara Kolster. I watched it in 360 on YouTube, but it’s made for the Oculus Rift, which it uses for head tracking with six degrees of freedom to create an immersive experience with more 3D depth.
The story is Jan’s attempt to convey his experience in The Room on the Roof artist residency space, over a span of time in which nothing really happened outside of his own emotional adjustment to being there. He conveys this successfully, employing immersive visuals to transport us to the space, and voiceover to recount his internal monologue while there.
Here’s a video of people watching it on the Oculus Rift.
A lot of the wow-factor seems to come from head tracking gimmicks, but overall the piece is a very successful transportation into his experience. It also works as a proof-of-concept for two elements that can be recycled in other VR pieces: simple line drawing environments and illustrated text.
Line Drawing in VR
VR scenes crafted in Unity are often at once hyper-real and lacking life. The line drawing environment allows your imagination to fill in the blanks- it’s more gentle, somehow more giving. The Drawing Room felt wonderful to watch, even on YouTube.
I was surprised by how much I wanted to read the text scattered around. Usually text in VR is a headache to read. The text in The Drawing Room was hand-written and charming- always a delight to find. Here’s all of the text scattered across a single frame:
There was another type of narrow-block lettering used as an overlay sometimes which I found very difficult to read. This may have been deliberate, but it was too much for me. Here’s an example - it says SHOPPING HERE SHOULD BE A PREMIUM EXPERIENCE. This might have been a neat idea in 2D, but in VR it was headache-inducing.
I’d like to watch The Drawing Room on the Oculus next. If I do, I’ll update this post with my thoughts.
Last week Dhruv Mehrotra and I went to North Dakota to document the protest against the certified-dangerous Dakota Access crude oil pipeline being constructed just upstream of Sioux Tribe reservation land.
We collected 360 video, transponder data of surveillance planes flying low overhead, a sense of the overwhelming police surveillance of the Standing Rock protest, and an understanding of why many people believe the fight is about much more than DAPL.
Our final project will present what we learned at Standing Rock. Our final project proposal can be viewed here.
We believe the pipeline is dangerous and cannot proceed as planned. In advocating for this cause, we also want to depict the greater conviction that many DAPL protesters feel for native sovereignty.
This week I was instructed to craft a scene in Unity that evokes the New York Times’ complete list of The 282 People, Places and Things Donald Trump Has Insulted on Twitter.
Here’s a screenshot of my scene:
To construct this scene, I purchased a $5 Trump.obj from Turbo Squid and used Mixamo’s animation library to make Trump cry. I then enclosed Trump in a circle of the people, places and things he’s insulted on Twitter. The collage is composed of .png files imposed on 2D sprites.
I thought it would be great to be able to look at a person, place or thing in the scene and hear Trump’s tweets about it, recited in his voice. The interaction: if you look at Ted Cruz or Whoopi Goldberg, your gaze triggers audio of Trump saying all of the insulting things he’s tweeted about them. I don’t think the same effect could be achieved with text- to me, audio in VR feels more impactful.
I first turned to https://clash.me/trump, a text-to-speech program that draws from a pool of Trump speeches on YouTube to create a corpus of words you can hear in Trump’s voice. Unfortunately, it was only able to provide a few words. Even the word “politician” was missing, so I wonder if the program might just be dated.
For now, the scene simply opens with an audio mashup of insults Trump has lobbed verbally/in person. I don’t feel this achieves my interactive vision for the project, and I’m considering working with AudioGrep to create the Trump speech corpus I would need.
Originally posted on ITP’s Directing Virtual Reality class blog.
My classmates Jomo and Aimee’s presentation on “Cinematography in VR” last week left me thinking a lot about how environmental lighting can be leveraged to create a richer immersive experience in VR.
This still from Paris, Texas stuck out to me in particular. I find myself wanting more of the space than this frame can capture- I want to be able to look around, to be there. Jomo mentioned that the filmmakers wrapped green cellophane over the fluorescent lights at the gas station to create this otherworldly contrast, adding that VR filmmakers could get better at manipulating and utilizing light.
I started thinking about films that use the dumbfounding effect of ambient light to slow down pacing and highlight the textures of a space. Days of Heaven uses the golden hour to draw out existential prairie scenes that feel effortless to watch.
Looking back at the successful immersive documentary VR films we’ve discussed all semester, I’m now struck by the ubiquity of the golden hour, or horizontal light in general, in outdoor spaces.
Here’s The Displaced, for example:
A newer film, NYT’s 10 Shots Across the Border tells the story of José Antonio, a Mexican teen who was walking to his mom’s house when he was fatally shot through the US/Mexico dividing wall by a US border patrol agent. The film leverages horizontal light in nearly every scene to establish a sense of immersion.
Here’s the first frame of the film, an aerial shot that establishes where the scene of interest takes place. This opening shot is filmed during the golden hour, which serves a dual purpose: it evenly lights the focal point of interest below, and it illuminates the level at which the viewer is situated- especially the helicopter behind the viewer- providing transparency and a sense of safety that lets the viewer orient themselves in this 360 space.
Shots on the ground– of José Antonio’s house, of the wall itself– are filmed during the golden hour or in horizontal light from streetlamps. Both types of light cast long shadows over the space, across any one field of view. This prompts the viewer to look around, creating a sense of immersion and making the viewer feel connected to the space.
The exception is the place where José Antonio was shot, which is filmed in a harsher, more direct light. It’s illuminated from the side, enough to see it clearly, but it’s not romantic or interesting. In this light the scene is flat and washed out. As a result, it feels hard to connect to this space. This perspective is that of the US Border Patrol– a lens established by being driven around by them at the same height, in roughly the same light, just before cutting to this shot. It’s hard to have empathy for the person who sees the world in this unromantic light.
Warm light helps the viewer to slow down and feel comfortable. Placed horizontally, it casts long shadows that encourage the viewer to look around past the view of the frame and highlight the dynamic textures of the space. Whether the rule of horizontal light in documentary VR film is emerging or already standard, its value is huge in helping a viewer feel immersed in an outdoor space.
But now the illusion is ruined for me, and every film looks the same.