Wander Watch – Fashion Technology Final Project

Wander Watch is a compass that lights up in the direction of your destination. You can use an app to enter where you’re going, and then put the phone away. This is part of a series of products I’m creating that help undo our overconnected lives.

This project was created for Expressive Interfaces: Introduction to Fashion Technology, a course taught at ITP at NYU.

A slide deck can be found here.

Role: Ideation, concept development, physical design, fabrication, and coding.

Tools: Bluetooth LE, Cordova PhoneGap app, Don Coleman’s Bluetooth library, JavaScript, Google Maps API, Tinkercad, 3D printers, Flora microcontroller, Flora Bluetooth LE module, Neopixel ring, webbing, velcro, and needle and thread for the strap.


How It Works

The user opens the app, selects a destination, and sends it to the watch using Bluetooth LE. The watch lights up in the direction of the destination. The watch itself is 3D printed and encloses a stack of components.
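The direction logic can be sketched like this: compute the great-circle bearing from the phone’s location to the destination, then map that bearing onto the ring of LEDs. This is an illustrative sketch in JavaScript; the function names and the 16-pixel ring size are my assumptions, not the project’s actual code.

```javascript
// Great-circle bearing from the phone's location to the destination,
// then the index of the Neopixel to light. Illustrative sketch only;
// a 16-LED ring with pixel 0 at 12 o'clock is assumed.
function bearing(fromLat, fromLng, toLat, toLng) {
  const toRad = d => (d * Math.PI) / 180;
  const phi1 = toRad(fromLat);
  const phi2 = toRad(toLat);
  const dLng = toRad(toLng - fromLng);
  const y = Math.sin(dLng) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLng);
  const theta = (Math.atan2(y, x) * 180) / Math.PI;
  return (theta + 360) % 360; // degrees clockwise from north
}

// Pick the LED closest to the bearing, indices increasing clockwise.
function ledIndex(bearingDeg, pixelCount = 16) {
  return Math.round(bearingDeg / (360 / pixelCount)) % pixelCount;
}
```

For example, a destination due east of the phone gives a bearing of 90°, which lands on the pixel a quarter of the way around the ring.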


Creative Process

Stage 1 – Concept development

  • Brainstormed what a wearable navigation piece might look like.
  • Illustrated the look and feel of the watch.


Stage 2 – Research

 

Stage 3 – Code

  • Step by step, I assembled my working code:
    • Confirm Bluetooth connectivity between the Flora microcontroller & Bluetooth module and the Bluefruit app.
    • Send commands to turn specific Neopixel ring LEDs on and off.
    • Download and set up Don Coleman’s Cordova PhoneGap app example.
    • Use the app to send commands to the Neopixels.
    • Insert Google Maps API heading code into the Bluetooth app.
    • Replace one map pin with the phone’s actual location.
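The app-to-watch step above can be sketched as follows: encode a small text command as bytes and write it over BLE. The command format and UART UUIDs below are assumptions for illustration, not the project’s actual protocol; `ble.write` is the call provided by Don Coleman’s cordova-plugin-ble-central.

```javascript
// Sketch of the app side: encode a "turn pixel N on/off" command as
// ASCII bytes for a BLE UART write. The command format and the UUIDs
// are assumptions, not the project's actual protocol.
const UART_SERVICE = '6e400001-b5a3-f393-e0a9-e50e24dcca9e'; // assumed UART service
const UART_TX      = '6e400002-b5a3-f393-e0a9-e50e24dcca9e'; // assumed TX characteristic

// e.g. pixel 4 on -> the ASCII bytes of "4,1\n"
function encodeCommand(pixel, on) {
  const text = `${pixel},${on ? 1 : 0}\n`;
  const bytes = new Uint8Array(text.length);
  for (let i = 0; i < text.length; i++) bytes[i] = text.charCodeAt(i);
  return bytes.buffer; // ble.write expects an ArrayBuffer
}

// In the Cordova app (only runs on a device with the plugin installed):
// ble.write(deviceId, UART_SERVICE, UART_TX, encodeCommand(4, true),
//           () => console.log('sent'), err => console.error(err));
```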

Stage 4 – Fabrication

  • Meanwhile, I developed and printed the physical design of the watch.
    • Research 3D printed watch designs online.
    • Create my own unique design in Tinkercad, including a tailored closing mechanism and opening for the strap.
    • Print test examples on the Ultimaker 3D printers at ITP.
    • Print a final prototype at NYU’s LaGuardia print shop on the Mojo printer.
    • Assemble strap by sewing velcro for an adjustable fit.

 


 

Next Steps

In the future, I hope to add a GPS and magnetometer to the watch itself. In addition, I’d like to add three buttons on the side, so that a user can pre-program a few locations at home and leave their phone behind!


Collective Play: Final Project Playtesting

How might we create something intentional out of the unintentional? For this play test, Hadar and I asked people to leave behind an object from their pocket, along with a thought they wanted to get rid of.

 


Thought Process

We discussed emotions and dynamics we wanted to create in people. We liked the idea of turning a random action into a purposeful action.

Playtesting

We set up a table underneath one of the TVs in the lounge area with paper and pens. The monitor above gave play instructions. Folks started to leave behind objects and thoughts.


Expressive Interfaces: LED Circuit with a Switch

For this project I used snap buttons to create a switch that turns an LED on and off, in a piece shaped like a cat.

 


How the piece is turned on, by connecting snap buttons.


Illustrated diagram of the top piece’s circuit.


Illustrated diagram of the back piece’s circuit.

Tangible Interaction Workshop – Lighting Controller


This is a wireless controller for the type of stage lights found in many performance venues. The controller selects colors, changes the brightness, and turns the light on and off. The enclosure has a strong visual design that doubles as a control panel, suggesting to users how to interact with a circular soft potentiometer, a sensor unfamiliar to most people.

Tools: Soft potentiometer and rotary encoder, Arduino MKR1000 microcontroller, Tom Igoe’s sACNSource.h library, and vinyl adhesive sheets.

Demonstrating the controller’s color selection feature.

How It Works

The user’s interactions with the tangible sensors cause analog and digital signals to be sent to the Arduino microcontroller. Code on the microcontroller runs logic to decide what to do based on these signals, then sends the results via WiFi to the lights using the DMX protocol (the standard language for communicating with lights in performance and retail spaces).
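The core signal-to-color logic can be sketched in JavaScript for clarity (the actual controller runs Arduino code using Tom Igoe’s sACNSource.h). The three-channel red/green/blue layout is an assumption; real fixtures vary.

```javascript
// Sketch of the controller's logic: map a soft-pot reading to a hue,
// convert the hue to RGB, and scale by brightness to get DMX values.
// Shown in JavaScript for clarity; the real device runs Arduino code.
function hueToRgb(hueDeg) {
  // simple HSV->RGB at full saturation and value
  const h = ((hueDeg % 360) + 360) % 360;
  const c = 255;
  const x = Math.round(c * (1 - Math.abs(((h / 60) % 2) - 1)));
  if (h < 60)  return [c, x, 0];
  if (h < 120) return [x, c, 0];
  if (h < 180) return [0, c, x];
  if (h < 240) return [0, x, c];
  if (h < 300) return [x, 0, c];
  return [c, 0, x];
}

// Soft-pot reading (0-1023) -> hue -> three DMX slots (R, G, B),
// scaled by a brightness factor between 0 and 1.
function dmxFrame(potReading, brightness) {
  const [r, g, b] = hueToRgb((potReading / 1023) * 360);
  return [r, g, b].map(v => Math.round(v * brightness));
}
```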


The circuitry inside.

Creative Process

Prototype 1 – Test Connectivity

  • Ran generic tests to guarantee connectivity to existing wifi and lights systems. Used simple code to connect microcontroller to private WiFi network and send DMX commands to stage lights.

Prototype 2 – Draft Pseudo Code & Test Features

  • Drafted code’s logic and structure to realize new designs for lighting controller.
  • Got help from Tom Igoe during office hours to clarify my understanding of the rotary encoder’s behavior and how data is compiled into packets to be sent.
  • Obtained clean readings from sensors & tested new features by implementing one block of code at a time to isolate bugs and ensure smooth progress.
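The rotary encoder behavior mentioned above comes down to quadrature decoding: each change of the encoder’s two pins is compared with the previous state, and valid transitions step the position up or down. Sketched here in JavaScript for clarity; the real controller does this in Arduino code, and the state table is the standard gray-code sequence.

```javascript
// Quadrature decoding logic for a rotary encoder. A full clockwise
// cycle of the two pins is 00 -> 01 -> 11 -> 10 -> 00; reversing the
// order means counter-clockwise. Unknown transitions (bounce) add 0.
const TRANSITIONS = {
  '00->01': +1, '01->11': +1, '11->10': +1, '10->00': +1,
  '00->10': -1, '10->11': -1, '11->01': -1, '01->00': -1,
};

function makeDecoder() {
  let state = '00';
  let position = 0;
  // step() takes the current A/B pin readings and returns the
  // accumulated position.
  return function step(a, b) {
    const next = `${a}${b}`;
    position += TRANSITIONS[`${state}->${next}`] || 0;
    state = next;
    return position;
  };
}
```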

Final Prototype – Write Final Code & Fabricate

  • Controlled lights and fabricated enclosure.

 

Challenges & Next Time

I’d like to understand why all of my controller’s features work when powered by my laptop, but only half work when powered by a LiPo battery (I used a 3.7v 2000mAh battery, which should be sufficient, but I’ll have to investigate). I also have some flow issues in the structure of my code: a command exists to shut off the lights entirely, but it never gets activated. I hope to completely finish this controller and the MIDI music controller this spring.

Code

Designing Meaningful Interactions – Draft of Product Prototype

Here is a flow chart & drafted working prototype for my friend’s idea of a Studio Rental app. I’m looking forward to getting user feedback next week.

Link to the prototype here, and a few screenshots below. I used Sketch to create the prototype, and InVision to create a shareable version online. I’ll work on connecting the hotspots. https://invis.io/WYG3SWTBNRC#/282238030_Studio_Page

I made the Flow Chart in Illustrator, using the graphics I created for the User Journey last week.

User Flow Chart

 


Collective Play – Pictionary

This is an assignment for Collective Play, in which neither player should be the leader or the follower: both should share the lead as equally as possible.


 

In this game by myself and Hadar Ben-Tzur, two players draw a word together before the buzzer goes off. They need each other to be successful, because one player can only press the up and down arrow keys, whereas the other player can only press the left and right arrow keys. See links below.
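The shared-control rule can be sketched in plain JavaScript: each player’s page only accepts the keys that player is allowed to press, and the surviving key presses become pen movement. The names here are illustrative, not the actual Glitch code.

```javascript
// Which keys each player may press (the game's core constraint).
const ALLOWED = {
  player1: ['ArrowUp', 'ArrowDown'],
  player2: ['ArrowLeft', 'ArrowRight'],
};

// How each key moves the shared pen: [dx, dy].
const DELTAS = {
  ArrowUp: [0, -1], ArrowDown: [0, 1],
  ArrowLeft: [-1, 0], ArrowRight: [1, 0],
};

// Returns the [dx, dy] a key press contributes, or null if that
// player isn't allowed to use the key.
function move(player, key) {
  return ALLOWED[player].includes(key) ? DELTAS[key] : null;
}
```

Because player1’s page returns null for left/right and player2’s page returns null for up/down, neither player can draw anything alone.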

Play yourself!

  1. One player can only press UP and DOWN arrow keys, and visits glitch.com/edit/#!/collectivepictionary/udinput
  2. Second player can only press LEFT and RIGHT arrow keys, and visits glitch.com/edit/#!/collectivepictionary/rlinput
  3. Everyone watches by going to glitch.com/edit/#!/collectivepictionary/output

Check out the code here: glitch.com/edit/#!/collectivepictionary