TANGIBLE INTERACTION WORKSHOP | MUSIC PLAYBACK CONTROLLER

DESIGN CHALLENGE: Make a device to control playback of pre-recorded music. You do not have to build the playback engine, just the physical control device. Your device will send asynchronous serial messages to another device which will play the music. All serial messages should be terminated by a linefeed. The playback device will echo your message back to you as an acknowledgement message when it executes your command. Your device should support the following features:

  • Your device should be operable by a user who cannot see the device.
  • The user should get an acknowledgement when they activate any control.
  • Start or stop the playback of a track (start plays track from the beginning). 
  • Pause or resume playback of a track (resume plays track from last position stopped). 
  • Skip ahead one track in the playlist. 
  • Skip back one track in the playlist. 
  • Fast-forward the current track (double speed of the track). 
  • Skip to a random track in the playlist. 

Make a housing for your controller. 

[image: IMG_2547.JPG]

Because this music controller should be usable by someone who can't see it, I decided to follow the standard layout for music player controls. Here's the initial design I drew, which all of my iterations have in common.

I wanted the play button to be central, with the "backward" and "forward" buttons grouped around it and the start/stop and random buttons on either side. I figured the play button, shaped as an arrow, could indicate how the entire device should be oriented.

The most challenging part of the assignment was getting p5 and the Arduino to talk to each other. I'd done this a year ago, but had forgotten how it worked, and the p5 Serial communication tools and workflow had changed since then. I ended up supplementing Benedetta and Tom's guide with Angela's tip to open the locally served sketch in an Incognito window.

Once I understood how p5 Serial worked, the Arduino code was easier to program as well. My process became:

  1. Run the p5 Serial tutorial to ensure the Serial port is working properly 
  2. Test Tom's p5 Music Controller code to make sure it's working with keyboard keys
  3. Merge the p5 Serial tutorial with the p5 Music Controller code
  4. Code the Arduino to Serial.write() the specified number codes when switches are pressed (see the sketch just after this list)
  5. Update the p5 Music Controller code with the correct port
  6. Run the p5 Serial Monitor GUI (this is the WebSocket-to-serial bridge; a sketch running in the browser can't open the serial port directly)
  7. Run the p5 sketch on a local server in an Incognito window
  8. Test to ensure the physical Music Playback Controller is effectively controlling the music
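
Since step 4 is where the serial protocol actually gets implemented, here's a minimal sketch of the Arduino side. The pin numbers and single-character command codes below are placeholders, not the exact values Tom's p5 code expects, and the wiring assumes each switch connects its pin to ground:

    // Minimal sketch of the Arduino side of the controller.
    // Assumes six momentary switches wired between the pins below and ground,
    // using the internal pull-up resistors. Pin numbers and command codes are
    // placeholders; substitute whatever the p5 sketch expects.

    const int NUM_BUTTONS = 6;
    const int buttonPins[NUM_BUTTONS] = {2, 3, 4, 5, 6, 7};
    // start/stop, pause/resume, skip back, skip ahead, fast-forward, random
    const char commandCodes[NUM_BUTTONS] = {'1', '2', '3', '4', '5', '6'};
    int lastState[NUM_BUTTONS];

    void setup() {
      Serial.begin(9600);
      for (int i = 0; i < NUM_BUTTONS; i++) {
        pinMode(buttonPins[i], INPUT_PULLUP);
        lastState[i] = HIGH;  // HIGH = unpressed with pull-up wiring
      }
    }

    void loop() {
      for (int i = 0; i < NUM_BUTTONS; i++) {
        int state = digitalRead(buttonPins[i]);
        // Falling edge means the button was just pressed
        if (state == LOW && lastState[i] == HIGH) {
          Serial.write(commandCodes[i]);
          Serial.write('\n');  // every message is linefeed-terminated
          delay(20);           // crude debounce
        }
        lastState[i] = state;
      }
    }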

Here's a sadly vertical video of the raw device working: 

I went through a number of different iterations of this project, and realized that with product design, the potential iterations and improvements are limitless. Eventually I had to stop.

ITERATION 1: BIG CARDBOARD BOX
Big and sloppy! This was a prototype to measure wiring.

ITERATION 2: CARDBOARD + WOOD
This version was housed in a 3" x 9" silverware divider from The Container Store, topped with a piece of laser-cut cardboard. After putting it together, I decided I might as well opt for a wood top for a more stable feel. Here's a photo I took of the wiring mess, which I used as a reference for keeping my inputs in order on this version.

[image: IMG_2615.JPG]

ITERATION 3: ALL-WOOD ENCLOSURE
This version is internally neater, with the wiring trimmed to size and a solid wooden enclosure. It's still far from perfect.

Ultimately this is a straightforward product that I think could be improved endlessly.

Ideally I would love to make my own buttons for this product. I think casting them from silicone or ceramic would be nice: anything with a tactile sign of its function. I designed the signs on this last prototype to sit on the buttons themselves, but ultimately decided to put them underneath, because the wood-covered buttons didn't provide enough assurance that they had been pressed.

I would also like to incorporate haptic feedback. I started out with it, but cut it early on because I was spending too much time troubleshooting.
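
If I revisit haptics, the acknowledgement echo from the playback device seems like a natural trigger. Here's a rough sketch of the idea, assuming a vibration motor driven from pin 9 (the pin and pulse length are my placeholders, and this logic would be merged into the main controller sketch above):

    // Hypothetical haptic acknowledgement: pulse a vibration motor whenever
    // the playback device echoes a command back. The motor pin and pulse
    // duration are assumptions, not values from the finished project.

    const int MOTOR_PIN = 9;

    void setup() {
      Serial.begin(9600);
      pinMode(MOTOR_PIN, OUTPUT);
    }

    void loop() {
      // Each acknowledgement arrives as a linefeed-terminated echo
      if (Serial.available() > 0) {
        if (Serial.read() == '\n') {  // a complete echo has arrived
          digitalWrite(MOTOR_PIN, HIGH);
          delay(100);                 // short buzz as tactile confirmation
          digitalWrite(MOTOR_PIN, LOW);
        }
      }
    }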

BREADBOARD

SCHEMATIC

SOFT ROBOTICS | EXPLORING INFLATABLES, CABLE CONTROLS

GRAPPLING HOOK PROTOTYPE
The first section covered cable controls, and in class we made prototypes of venetian blinds and cable-controlled arms. In that exercise I learned that cutting slits into one side of a straw lets it bend at that spot. For my exploration assignment, I used this technique to make a grappling hook prototype. I stuck with the simple materials we used in class: straws, buttons, and string. I taped a few straws together and cut slits at the same point along one side of each straw. Then I threaded the straws with string and tied a button to one end of each string. Lastly, I tied the other ends of the strings together with a separate button. When the bundle of strings is pulled, the straws bend outward at the slits and hold, forming a grappling hook until the strings are released.

GRAPPLING HOOK QUALITIES OF SOFTNESS
Soft - yes
Flexible - yes
Elastic - no

PROTOTYPE OF SOFT ROBOTIC ARM: RETRACTS INWARD AND UPWARD WHEN INFLATED

INFLATABLE ROBOTIC ARM PROTOTYPE
The second section covered inflatables, and for my exploration I made an inflatable spiral arm prototype. It's made of Mylar, with a straw at the outer end that you blow into to inflate it. When inflated, it clenches inward and upward. When deflated, it's flat and limp and can be maneuvered like a ribbon. Together, these behaviors could be used to grab something like a shoelace, or something fragile, either of which would be difficult for a rigid claw to grab. It's also lightweight, stowable, and portable.

INFLATABLE ARM QUALITIES OF SOFTNESS
Soft - yes
Flexible - yes
Elastic - no

SOFT ROBOTICS | MATERIAL CONNEXION

Last week our class researched Material Connexion's new materials library, and this week we visited their showroom, which features the newest materials in the collection.

Despite the gorgeous polymer materials, I at first felt obligated to be drawn to the natural ones, though I'm not sure naturally derived materials are more sustainable in every case. Either way, they were beautiful in their own right.

Here’s a form of bacterial secretion that becomes leathery and durable when dried.

Here’s a thick paper made from mulberry bark, beautifully dyed to show its colorfastness. A few of the featured materials were derived from mulberry bark.

After the visual spectacle wore off, there were two materials I kept thinking about after I left. 

The first is this fabric solar cell. It’s soft, flexible, and maybe even somewhat elastic, though perhaps only to the extent that you can smooth out its tendency to wrinkle.

The second is this soft circuit printed on film: a process, not a material. It’s soft, flexible, and elastic. It’s in the Materials Library HERE. From the description:

“Screen printing of multiple inks, both conductive and insulative, to form circuits on TPU films. This technique in combination with the inks and substrates selected creates an extensible, or stretchable circuit. After the circuit pattern is determined, screens are “burned” into stencils of the appropriate shape, using photo-reactive masking liquid. The ink is then squeegeed onto the screens depositing the ink onto the substrate, usually a flexible elastomeric film. This is repeated with all the ink layers in their particular pattern and sequence. Multiple ink types are needed for their individual properties, which comprise the circuit.”

For my final, I'm interested in prototyping a soft solar panel. 

ALT DOCS | THE DRAWING ROOM - JAN ROTHUIZEN

This week I watched The Drawing Room, an animated 360 short film by Jan Rothuizen and Sara Kolster. I watched it in 360 on YouTube, but it’s made for the Oculus Rift, where head tracking with six degrees of freedom creates a more immersive experience with real 3D depth.

The story is Jan’s attempt to convey his experience in The Room on the Roof artist residency space, over a span of time in which nothing really happened outside of his own emotional adjustment to being there. He conveys this successfully, employing immersive visuals to transport us to the space, and voiceover to recount his internal monologue while there. 

Here’s a video of people watching it on the Oculus Rift. 

A lot of the wow factor seems to come from head-tracking gimmicks, but overall the piece very successfully transports you into his experience. It also works as a proof of concept for two elements that could be recycled in other VR pieces: simple line-drawing environments and illustrated text.

Line Drawing in VR

VR scenes crafted in Unity are often at once hyper-real and lifeless. A line-drawing environment lets your imagination fill in the blanks; it’s gentler, somehow more giving. The Drawing Room felt wonderful to watch, even on YouTube.

Illustrated Text

I was surprised by how much I wanted to read the text scattered around. Usually text in VR is a headache to read. The text in The Drawing Room was hand-written and charming, always a delight to find. Here’s all of the text scattered across a single frame:

Another style, a narrow block lettering sometimes used as an overlay, I found very difficult to read. This may have been deliberate, but it was too much for me. Here’s an example; it says SHOPPING HERE SHOULD BE A PREMIUM EXPERIENCE. It might have been a neat idea in 2D, but in VR it was headache-inducing.

I’d like to watch The Drawing Room on the Oculus next. If I do, I’ll update this post with my thoughts. 

ALT DOCS | DOCUMENTING STANDING ROCK

Last week Dhruv Mehrotra and I went to North Dakota to document the protest against the certified-dangerous Dakota Access crude oil pipeline being constructed just upstream of the Standing Rock Sioux reservation.

We came back with 360 video, transponder data from surveillance planes flying low overhead, a sense of the overwhelming police surveillance of the Standing Rock protest, and an understanding of why many people believe the fight is about much more than DAPL.

Our final project will present what we learned at Standing Rock; the proposal can be viewed here.

We believe the pipeline is dangerous and cannot proceed as planned. In advocating for this cause, we also want to depict the deeper conviction many DAPL protesters hold: that this is a fight for native sovereignty.

ALT DOCS | DEPICTING A DATASET

This week I was instructed to craft a scene in Unity that evokes the New York Times’ complete list of The 282 People, Places and Things Donald Trump Has Insulted on Twitter.

Here’s a screenshot of my scene:

To construct this scene, I purchased a $5 Trump.obj from Turbo Squid and used Mixamo’s animation library to make Trump cry. I then enclosed Trump in a circle of the people, places, and things he’s insulted on Twitter. The collage is composed of .png files applied to 2D sprites.

I thought it would be great to look at a person, place, or thing in the scene and hear Trump’s tweets about it, recited in his voice: if you look at Ted Cruz or Whoopi Goldberg, your gaze triggers audio of Trump saying all of the insulting things he’s tweeted about them. I don’t think text could achieve the same effect; to me, audio in VR feels more impactful.

I first turned to https://clash.me/trump, a text-to-speech program that draws from a pool of Trump speeches on YouTube to create a corpus of words you can hear in Trump’s voice. Unfortunately, it was only able to provide a few words. Even the word “politician” was missing, so I wonder if the program might just be dated. 

For now, the scene simply opens with an audio mashup of insults Trump has lobbed verbally, in person. This doesn’t achieve my interactive vision for the project, and I’m considering working with AudioGrep to create the Trump speech corpus I would need.

DIRECTING VR | HORIZONTAL LIGHT IN OUTDOOR SPACES

Originally posted on ITP’s Directing Virtual Reality class blog

My classmates Jomo and Aimee’s presentation on “Cinematography in VR” last week left me thinking a lot about how environmental lighting can be leveraged to create a richer immersive experience in VR.

This still from Paris, Texas stuck out to me in particular. I find myself wanting more of the space than the frame can capture; I want to be able to look around, to be there. Jomo mentioned that the filmmakers wrapped green cellophane over the fluorescent lights at the gas station to create this otherworldly contrast, adding that VR filmmakers could get better at manipulating and utilizing light.

I started thinking about films that use the dumbfounding effect of ambient light to slow down pacing and highlight the textures of a space. Days of Heaven uses the golden hour to draw out existential prairie scenes that feel effortless to watch.

Looking back at the successful immersive documentary VR films we’ve discussed all semester, I’m now struck by the ubiquity of the golden hour, or horizontal light in general, in outdoor spaces.

Here’s The Displaced, for example:

A newer film, NYT’s 10 Shots Across the Border, tells the story of José Antonio, a Mexican teen who was walking to his mom’s house when he was fatally shot through the US/Mexico border wall by a US Border Patrol agent. The film leverages horizontal light in nearly every scene to establish a sense of immersion.

Here’s the first frame of the film, an aerial shot that establishes where the story takes place. It’s filmed during the golden hour, which serves a dual purpose: it evenly lights the focal point below, and it illuminates the viewer’s own level, especially the helicopter behind the viewer. That transparency establishes a sense of safety and lets the viewer orient themselves in the 360 space.

Shots on the ground, of José Antonio’s house and of the wall itself, are filmed during the golden hour or in horizontal light from streetlamps. Both kinds of light cast long shadows that stretch beyond any one field of view. This prompts the viewer to look around, creating a sense of immersion and making the viewer feel connected to the space.

The exception is the place where José Antonio was shot, which is filmed in a harsher, more direct light. It’s illuminated from the side, enough to see clearly, but it’s not romantic or interesting; in this light the scene is flat and washed out, and it’s hard to connect to the space. This is the perspective of the US Border Patrol, a lens established just before the cut, when we’re driven around by agents at the same height and in roughly the same light. It’s hard to have empathy for the person who sees the world in this unromantic light.

Warm light helps the viewer slow down and feel comfortable. Placed horizontally, it casts long shadows that encourage the viewer to look past the initial view and highlight the dynamic textures of the space. Whether horizontal light in documentary VR film is an emerging rule or already a standard, its value in helping a viewer feel immersed in an outdoor space is huge.

But now the illusion is ruined for me, and every film looks the same.