ALIVE – FINAL

 

Set up for ITP Winter Show 2017:

LEVEL 1 – Follow the sine!
LEVEL 2 – Hold your breath!
LEVEL 3 – Random!

Code

Special thanks to our professors: Jeff Feddersen, Tom Igoe, and Daniel Shiffman for their advisement and feedback. Big hugs and kisses to Leon Eckert for helping us with all our coding inquiries, Aiden Nelson, and all our dear friends at ITP who tested our many, many prototypes along the way.

Wearable Enclosure

I thought last week’s class about rulers was my favorite until Ben taught us about enclosures. It’s always a wonderful treat to listen to someone talk about their passions with enthusiasm. It helps to put things into perspective and learn to appreciate them.

Materials:

Leftover Acrylic – Canal Plastic

Arcade Joystick – courtesy Roland

Bamboo Tray – Container Store

With finals in full swing, I planned on taking it easy and making the classic Luisa Pereira enclosure to mount my joystick and house my Arduino. That didn’t come to fruition, however.

For PCOMP/ICM finals, I am working with Simon Jensen on a wearable heartbeat and respiration monitor. In line with our timeline and this week’s user testing, it was time to enclose our respiration and heart sensors in a wearable. On Monday, I made a first prototype. Originally, I wanted to build it into a sports bra / tank because I want to think of the components as parts of a whole and understand how best to integrate them in an intuitive way. As a patternmaker, two of my primary concerns are fit and closures (how one gets into/out of a garment). I hit a roadblock thinking about how best to accommodate as many people as possible for user testing without having to make multiples. (Sizing standards exist for a reason, and no size actually fits all.) Simon convinced me to focus on integrating the components first instead:

The design is essentially an adjustable belt that clips into position beneath the chest and above the abdomen. It has 2 channels for the respiration sensor and Polar belt to feed through:

It also has a pocket bag to house our Feather board, LiPo battery, and Bluetooth module. This is a video Simon documented of me wearing it beneath my shirt and testing it with a data visualization of my breath:

I got great feedback from Simon and Aiden regarding size, comfort, sensor placement, and the adjustable strap closure. On my commute home, I sketched a design between bouts of falling asleep:

The next morning, I went to work and during my break, I drafted a pattern on CAD based on the measurements and feedback I received:

The pattern itself is quite simple. It is a rectangle cut on the fold with 3/8″ seam allowances and clean-finished edges; the channels are created by applying topstitching. The top channel (1 1/8″) is for the respiration sensor, followed by a 1 1/2″ spacing between it and the Polar sensor channel to pocket the hardware. Notice the topstitching stops in the middle so alligator clip wires can attach to the respiration sensor. I made a quick prototype out of muslin:

The trick to getting all clean edges is to sew an L-shape, flip the remaining side through the pocket hole to sew it together, then flip it inside out, then altogether right side out.

Topstitching applied to create the channels:

I made some adjustments to the pattern and channel widths and remembered I had some leather leftover from a previous project:

Sadly, when I tried applying the final topstitching using the awful home-sewing machine in the ITP Soft Lab, it tore the leather. I know I’m not supposed to blame the machine, and I know they aren’t equipped to sew leather to begin with, but I think our Soft Lab needs some upgrades. I almost lost my shit, but there’s no point moping, and since I’d already made three, I knew the fourth would come easier:

Back view w/ all parts enclosed:

Front view w/ all parts enclosed:

User testing today had a 6/7 success rate and received a lot of good feedback on our progress!

PCOMP/ICM Final: Project Planning + Respiration

Project Planning

Timeline:

BOM:

Respiration Prototype Testing:

Last week, Simon and I met with Dan O’Sullivan and Simon’s PCOMP professor, Tom Igoe, to get feedback and advisement on our project idea. We were advised to rethink ways to incorporate more interactivity for our user, and perhaps to focus on factors we can consciously control, such as breathing, as opposed to heart rate, which we cannot control without a universal trigger.

On Friday night, we tested the EeonTex conductive stretch fabric we ordered a week ago, which we want to incorporate into our wearable to measure respiration (and to test as a potential alternative to electrodes with adhesive backing). We tested its conductivity and stretch with a multimeter, then cut two 1/2″ x 13″ strips held together with safety pins to wrap around the chest. As we moved the alligator clips closer to the sternum, about 1″ ~ 1 1/4″ apart, we were able to visualize via the serial monitor a consistent variation of ~30 points between deep breaths in and deep breaths out. We tested on ourselves, Sam, MH, and Leon.

To visualize our data in P5, we sat down with Leon to program the visualization to adjust to each individual’s readings.
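The per-person adjustment can be sketched as a running min/max calibration: track the lowest and highest readings seen so far for a wearer and remap each new raw value into a normalized 0–1 range. This is a simplified, hypothetical version of the idea, not our actual sketch:

```javascript
// Remap a raw sensor reading into 0–1 using a running min/max,
// so the visualization adapts to each wearer's breathing range.
// (Illustrative helper — names and structure are hypothetical.)
function makeCalibrator() {
  let min = Infinity;
  let max = -Infinity;
  return function calibrate(raw) {
    if (raw < min) min = raw;
    if (raw > max) max = raw;
    if (max === min) return 0; // not enough data yet
    return (raw - min) / (max - min);
  };
}

// Example: deep breath out ~510, deep breath in ~540 (a ~30-point swing)
const calibrate = makeCalibrator();
calibrate(510); // establishes the minimum
calibrate(540); // establishes the maximum
console.log(calibrate(525)); // → 0.5, halfway through this wearer's range
```

Each wearer gets their own calibrator, so the same sketch maps one person’s shallow range and another’s deep range onto the same visual scale.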

My task this weekend was to make a working and adjustable prototype to playtest on Monday. I made and tested many variations to optimize the data visualization: different lengths, widths, layers, stitches, and combinations of fabrics. What we found in testing each prototype that followed, though, was that our first safety-pin prototype worked best.

The biggest challenge is working with stretch fabric with poor elastic retention. Every time the fabric is stretched out, it grows. That means we need to design a closure that not only accommodates people of different chest widths, but also accounts for fabric growth over time. This was a headache because I would test on myself and it would work perfectly, but if I tried to test on myself again after Simon, Jesse, and Kai, it would be too loose and noise would appear on screen. After many bouts of momentary successes followed by failure and frustration, I think I managed to make an adjustable prototype by repurposing some clasps from a backpack someone left on the junk shelf. The real challenge will come when we attempt to incorporate it in a garment, but for now, I have documentation it worked last night:

PCOMP/ICM Final Project Proposal: Wearable ECG

In my first year of working in the fashion industry, my supervisor told me, “It’s just clothes. You’re not saving lives.” This sentence continues to have a profound and resounding influence on me. On one hand, its repetition can provide a sense of relief when I’m confronted with challenging obstacles and demanding deadlines. In times of quiet introversion, it has the adverse effect of propelling me into existential crisis. Last year, I attended a panel discussion about brain-machine interfaces during the World Science Festival, where neuroscientist Miguel Nicolelis gave a talk about his Walk Again project, a robotic exoskeleton aiming to restore full-body mobility to patients who suffer from paralysis. Further research opened my eyes to the potential of wearables. After learning my professional background could be useful in designing and redesigning assistive wearable technology, I applied to ITP to learn the technical side.

For our final project, I am working with Simon Jensen on a pulse and respiration sensing interactive wearable designed to make the ECG (electrocardiography) procedure more efficient and “seamingly wireless” by incorporating conductive soft materials (threads and fabric) and Bluetooth. Instead of getting hooked up to a series of wires, we want the user (i.e. patients, athletes, etc.) to be able to wear a garment with the technology built in. We will use P5 serial control to visualize the data we collect from a person’s pulse using an ECG sensor and breathing patterns using a stretch sensor. The data will be synced to the beat of a P5 sketch of an anatomically correct heart and a diaphragm expanding and collapsing with every breath:

Progress:

Last week, we successfully replaced ECG wires with conductive threads and metal snaps!

Over the weekend, we made our first prototype to understand whether incorporating our setup into a shirt might affect our data. We learned a lot, particularly about which power sources to use and not to use (coin cell batteries, although lightweight, are not reliable for powering our multiple components: the Arduino and the ECG sensor). Although we got fairly accurate pulse readings, they only registered when the user was standing or sitting still. Movement distorted and added too much noise to the data. We will do more research on how to eliminate noise and filter accurate data. More to come…
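A common first step for this kind of motion noise (one option among many; we hadn’t committed to a filtering approach yet) is a simple moving average: each displayed value is the mean of the last few raw readings, which damps sudden spikes. A minimal, stand-alone sketch of the idea:

```javascript
// Simple moving-average filter: average the last N readings to
// smooth out motion spikes in pulse data before plotting it.
// (Illustrative only — not the filter we ended up using.)
function makeSmoother(windowSize) {
  const buffer = [];
  return function smooth(reading) {
    buffer.push(reading);
    if (buffer.length > windowSize) buffer.shift(); // drop oldest
    const sum = buffer.reduce((a, b) => a + b, 0);
    return sum / buffer.length;
  };
}

// A movement spike (600) gets pulled toward the surrounding values:
const smooth = makeSmoother(4);
[512, 600, 508, 516].forEach(r => console.log(smooth(r)));
// last value → (512 + 600 + 508 + 516) / 4 = 534
```

The trade-off is lag: a bigger window smooths more but delays the waveform, which matters when the visualization is supposed to beat in time with the heart.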

Saturday, 11/04 – Our sweet classroom set up

We used one of Simon’s old shirts to make our 1st wearable prototype

Simon wearing prototype and resoldering some questionable ports

Fortune Teller

Tonight, I had a chat and ICM review session with my friend/ITP alumna/ICM goddess, Wipawe Sirikolkarn. I expressed to her that I’m having a hard time finding programming appealing and, as a result, am struggling to be creative in this context. I asked her for resources to spark my interest, but after looking at a number of examples online, she put it best: I’m just not “turned on” by it. I’m still hopeful it will kick in at some point, but for now, I will do my best to complete the homework assignments.

For this week’s homework, I had two ideas. After attending the fly-by last Thursday where 2nd-year student Cristobal introduced us to Mappa, I thought it might be fun and vindictive to create a map with planes that visualizes all the delayed and cancelled flights of unreliable airlines, (ahem) Spirit Airlines. (I tried to save a couple bucks last year on a flight home to CA and ended up stuck in the Chicago airport for two days.)

I found this API resource: https://developer.flightstats.com/

This idea was a bit more ambitious to execute than I thought, so I might save it for a different project. I ended up referring to the JSON files shared by dariusk and landed on divination. A colorful array came to mind: psychics, crystal balls, tarot cards, horoscopes, magic 8-balls, and Zoltar, those kitschy coin-operated animatronic fortune tellers. The spectrum of ways in which people desire to be understood and to anticipate the future has always fascinated me. My idea is to create a fortune generator triggered by the click of a mouse.

First, I looked up images of fortune tellers to use as my background and loaded it in preload(). I then created a button to trigger a fortune.

In separate tabs, I referenced the data set and used the code from class to print the data I wanted to use.

 

The tricky part was putting them together. I needed a way to wrap the text so it fit inside the crystal ball, but didn’t know how. Thanks to a resident, I was introduced to divs.

Chino walked through drawing a rectangle in the crystal ball with me to figure out the dimensions and font size for our text box, and opened up the editor to test it out. We then declared our variables, set up the fortunes in an array, and used a for loop to push a random fortune to the screen when the “Get your fortune!” button is pressed.
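The core logic boils down to a common pattern: load the JSON entries into an array with a for loop, then pick a random index on each button press. A stripped-down, stand-alone sketch of the idea (the data here is a placeholder, not the actual dariusk corpus file):

```javascript
// Placeholder data standing in for the loaded JSON file.
const data = {
  fortunes: [
    "Great fortune awaits.",
    "Ask again later.",
    "Beware of Mondays."
  ]
};

// Push each entry from the data set into our fortunes array.
const fortunes = [];
for (let i = 0; i < data.fortunes.length; i++) {
  fortunes.push(data.fortunes[i]);
}

// Called when the "Get your fortune!" button is pressed:
// pick a random index and return that fortune for display.
function getFortune() {
  const index = Math.floor(Math.random() * fortunes.length);
  return fortunes[index];
}

console.log(getFortune()); // one of the three strings above
```

In the real sketch, the returned string is set as the text of the div positioned over the crystal ball.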

It worked!

Yikes! I sure hope not!

Serial Input to P5.js

This weekend, I tried serial input to p5.js. Given my background in fabrication, programming is an abstract concept and language for me. For the past month, I’ve had this nagging discomfort in the back of my mind every time I’ve had to open up p5. A few weeks ago, I finally figured out what it is: functions looping infinitely, frame by frame, give me the same anxiety and impulse as a running faucet I need to shut off. This is why, when Shiffman introduced serial to us in class last week, I actually breathed a sigh of relief. At least with my Arduino, I can physically unplug it to stop all the madness. My homework’s not due until Wednesday, but I was excited to get started.

I watched the Serial Output from Arduino video from Tom and Jeff and used Shiffman’s simplified serial example code to get it working. At first, it didn’t seem to work and I kept getting “NaN” in my canvas. I noticed that in the corresponding Physical Computing lab, the code for including p5.serialport.js in the index is slightly longer than Shiffman’s simplified version and wondered if that was the problem.

After I replaced

<script src="p5.serialport.js"></script>

with:

<script language="javascript" type="text/javascript" src="p5.serialport.js"></script>

The sketch worked fine with each turn of my potentiometer, and I was finally in control of the size or position of the ellipse on my screen.
 


I later tried running the simplified code and it worked as well, so I'm not certain if there is any correlation or if there was just a lag when I tried it the first few times.
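For what it’s worth, one common cause of “NaN” in a serial sketch is an empty or partial line arriving before a full value has been sent, which then gets parsed into a non-number. A small guard like this (illustrative, not taken from my sketch) keeps bad data out of the drawing:

```javascript
// Parse an incoming serial line into a number, falling back to a
// previous/default value when the line is empty or not numeric.
// (Hypothetical helper — variable names are illustrative.)
function parseReading(inString, fallback) {
  const trimmed = inString.trim();
  const value = Number(trimmed);
  // Number("") is 0, so an explicit empty-string check is needed too.
  if (trimmed === "" || Number.isNaN(value)) return fallback;
  return value;
}

console.log(parseReading("512\r\n", 0)); // → 512
console.log(parseReading("", 0));        // empty line → 0 (fallback)
console.log(parseReading("garbage", 0)); // non-numeric → 0 (fallback)
```

With a guard like this, a dropped line just reuses the last good reading instead of flashing NaN onto the canvas.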

iPhone Unlocked!

Last week, I simulated the iPhone’s slide-to-unlock function for the group assignment, but only had enough time to accomplish a gradient effect with the slider using an alpha mask over the screen underneath. This week, I looked into how to import an image into the screen and reformatted the code to draw with functions, creating a more realistic and interactive simulation of a motion we’re all too familiar with.
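The interaction itself reduces to two small pieces of logic: clamp the drag position to the slider’s track, and unlock once it passes a threshold near the end. A rough stand-alone sketch (the dimensions here are made up, not the actual sketch’s values):

```javascript
// Clamp the mouse x-coordinate to the slider track, the way
// p5's constrain() would. (Illustrative numbers throughout.)
function slidePosition(mouseX, trackStart, trackEnd) {
  return Math.min(Math.max(mouseX, trackStart), trackEnd);
}

// Unlock once the slider has travelled most of the way across.
function isUnlocked(position, trackEnd, threshold = 0.95) {
  return position >= trackEnd * threshold;
}

console.log(slidePosition(500, 40, 280)); // → 280 (clamped to track end)
console.log(isUnlocked(280, 280));        // → true (fully slid)
console.log(isUnlocked(100, 280));        // → false (released early)
```

In the p5 version, the same clamped position drives both the slider graphic and the alpha mask, so the unlock animation stays in sync with the drag.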

iPhone Unlocked!

Homage to Homage to the Square

To build on last week’s composition, I wanted to explore interactive elements I could use to demonstrate the interaction of color. Josef Albers was very skillful at creating compositions that highlight the subtleties of color and how we perceive colors in different contexts:

        

To recreate Albers’ composition, I set my background to display a different color each frame. I created squares and filled them with the complementary colors yellow and purple using lerpColor. lerpColor blends two colors to find the colors in between, creating a gradient effect. These colors remain static.

However, they look different in the context of a perpetually changing background color. As you can see, the yellow in the dark teal composition takes on a warmer tone compared to the yellow in the magenta one.
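For reference, lerpColor’s blending can be approximated in plain JavaScript as per-channel linear interpolation (a simplification: the real p5 function also respects the current color mode and alpha):

```javascript
// Roughly what p5's lerpColor does in RGB mode: interpolate each
// channel linearly. amt = 0 gives the first color, amt = 1 the
// second; values between give the gradient steps.
function lerpColorRGB(c1, c2, amt) {
  return c1.map((channel, i) => Math.round(channel + (c2[i] - channel) * amt));
}

const yellow = [255, 255, 0];
const purple = [128, 0, 128];

console.log(lerpColorRGB(yellow, purple, 0));   // → [255, 255, 0]
console.log(lerpColorRGB(yellow, purple, 0.5)); // → [192, 128, 64]
console.log(lerpColorRGB(yellow, purple, 1));   // → [128, 0, 128]
```

Because the blended squares are computed once and stay fixed, all the perceived variation in the sketch comes from the changing background, which is exactly Albers’ point.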

   

I added alpha to see more variations:

  

Lastly, I set a square to the (mouseX, mouseY) coordinates so users can see and explore the interaction for themselves. I think this can be helpful for learning color theory.

  

Play

Code

Computing: Interaction of Color

When we attempt to learn a new language, it’s typical to cover basic greetings/niceties and jump straight to the bad words. I’ll be honest: sometimes I like to imagine I’m sitting in a dark minivan, hacking the National Archives security system and talking strategy with Nicolas Cage through a headset. Sometimes I like to imagine we have a female president. I know hacking is not what we will be covering in this class, but it’s a start. I’m hoping understanding computation will add definition and realism to my characters when I develop these specific simulations in VR in the near future.

For my first composition in computation, I chose to render the iconic book cover of Josef Albers’ “Interaction of Color”. This book is important to me because prior to reading it as an undergrad design student, I worked primarily in grey-scale (pencil, pen, and charcoal). I didn’t have a good grasp of color and the vast amount of options overwhelmed me. After recreating exercises from the book with color cards for a class assignment, I gained a better understanding of how to utilize and manipulate color. If it were not for this book, I’m not sure I would have stepped outside my comfort zone. 

I wanted to be as faithful as I could. In the process, I noticed subtle nuances in the composition that I hadn’t before. The rectangles are not exactly the same dimensions as I had previously assumed, and I struggled for several hours trying to find the precise rotation angles, colors, and degrees of transparency:

  

  

In the end, I think I got pretty close. It’s not perfect but if I had kept at it, I probably would have lost my mind.

Is there an eyedropper function I can use in my code?