Craft

Inclusive VR Painting Solution | Inclusive UX Design

TECHNOLOGY: UE4 / Oculus Rift / Sketch

TEAM MEMBERS: Shimin Gu / Cherisha Agarwal / Joanna Yen / Pratik Jain / Raksha Ravimohan / Srishti Kush

ROLE: Technical Director / UX Researcher


Designing for every person means embracing human diversity as it relates to the nuances in the way we interact with ourselves, each other, objects, and environments. Designing for inclusivity opens up our experiences and reflects how people adapt to the world around them. What does this look like in the evolving deskless workplace? By "deskless workplace," we mean settings where people aren't constrained to one traditional or conventional office.

Goal: Design a product, service, or solution to solve for exclusion in a deskless workplace.

Since we are all designers in a program that involves various artists, we connected with the idea of helping artists with limited hand mobility create content in VR. Our professor Dana Karwas and Nick Katsivelos from Microsoft encouraged us to work in this area since not much has been done.

Initial Research

How might we build a multi-modal tool for people with limited mobility in their arms to create art in Virtual Reality? 

Approaching this idea was not easy, and it involved talking to a lot of designers at IDM and understanding their workflows. We spoke to them about how they use software and the potential problems that people with limited hand mobility would run into. We were fascinated by the idea of creating a multi-modal application that uses eye tracking and voice commands to enable people to draw in VR.

To dive deep into the technology and understand its use, we read a few papers and got great insights from professors at NYU like Dana Karwas, Todd Bryant, and Claire K Volpe, and from experts in UX research like Serap Yigit Elliot from Google and Erica Wayne from Tobii. Every person we spoke to gave us more insights for tackling the challenge from a user experience, VR technology, and accessibility perspective.

Interview

We had conversations with 115 individuals to validate our assumptions. We tested our prototype with potential users who face mobility issues. We also spoke to accessibility experts and disability centers to get a sense of whether they would invest in our product to benefit their communities.

PEOPLE WITH DISABILITIES

  • Paralysis

  • Cerebral Palsy

  • Multiple Sclerosis

  • Spinal Cord Injury

ORGANIZATIONS

  • Hospitals

  • Senior Homes

  • Disability Centers

  • Accessibility Research Centers

EXPERTS

  • VR Prototypers

  • UX Designers

  • Accessibility Experts

  • Therapists

User Persona and Journey

Low-fi Prototype

After extensive research, we decided to get our hands dirty and build our very first prototype. How do we make a prototype for an art tool in VR? We figured the best way to convey our interaction was through a role-play video. We did some rapid prototyping with paper, sharpies, and clips to bring our interface to life.

High-fi Prototype

The final prototype was built in Unreal and had functionality for drawing using head movements, changing the stroke, changing the environment, and teleporting.

The most basic function of an eye-tracking painting tool in virtual reality was realized in our prototype by using head movements. With the final prototype, the user could choose tools, paint, teleport, and change the environment, but it was still constrained by some technical limitations. A few functions like erase, undo, and redo could not yet be realized in Unreal, but we hope to make them work using other software and hardware. We also hope to look into eye-tracking technology so tools can be selected and strokes drawn with gaze movements alone. In our current prototype, the voice instructions are manually monitored; we would like to automate this as well to make a truly multi-modal solution for our users.
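Conceptually, head-movement drawing comes down to projecting the headset's forward vector onto the canvas plane. Here is a minimal illustrative sketch in Python of that projection; the function name and coordinate setup are hypothetical, and our actual prototype was built in Unreal, not Python.

```python
# Illustrative sketch (not our actual Unreal logic): intersecting a head-gaze
# ray with a flat canvas to get a 2D brush position. All names are hypothetical.

def gaze_to_canvas(head_pos, gaze_dir, canvas_z):
    """Intersect the gaze ray with the canvas plane z = canvas_z.

    head_pos: (x, y, z) position of the headset
    gaze_dir: normalized forward vector of the head
    Returns the (x, y) brush point on the canvas, or None if the
    user is looking parallel to or away from the canvas.
    """
    dz = gaze_dir[2]
    if dz <= 1e-6:                       # gaze never reaches the canvas
        return None
    t = (canvas_z - head_pos[2]) / dz    # ray parameter at the plane
    x = head_pos[0] + t * gaze_dir[0]
    y = head_pos[1] + t * gaze_dir[1]
    return (x, y)

# Looking straight ahead from the origin hits the canvas center.
print(gaze_to_canvas((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0))  # (0.0, 0.0)
```

Moving the head tilts the gaze ray, which sweeps the intersection point across the canvas and traces the stroke.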

User Test

Next, we went out to meet potential users from the ADAPT community. ADAPT Community Network, formerly United Cerebral Palsy of New York City, is a leading non-profit organization providing cutting-edge programs and services that improve the quality of life for people with disabilities. There we met Carmen, Eileen, and Chris, who were very interested in painting and loved the idea of drawing in VR. This kind of experience was new to them, and to our surprise, we got good feedback. All three expressed interest in trying a tool that let them draw using eye movements and voice commands.

Paper Prototype and User Test

Once we had insights and a whole lot of motivation from our users, we decided to give structure to our interface. This time, we printed our tools on paper and stuck them together in an organized way. We used a laser pointer as the "eye tracker," and we were ready for our second round of user testing. The whole process of prototyping for VR was very interesting, and it made us think clearly about the details of the interactions and potential issues.

Our second round of user testing revealed two kinds of scenarios based on prior experience. Users who had experience with design software could complete the tasks well and pointed out potential issues with the concept, such as figuring out the z-axis, feedback for interactions, and adding stamps. Users from the ADAPT community without prior experience wanted the tools to be less ambiguous, and some of the tools felt unnecessary to them.

Taking cues from these responses, we went on to build the high-fidelity prototype using Unreal Engine, which is a good tool for building virtual reality content. To make a working prototype, there were several tasks we needed to accomplish.

Drugmatic

VR Experience | Reality Virtually Hackathon 2019 | MIT Media Lab


Drugmatic is an educational, immersive VR experience that helps teens understand the effects of drugs. Based on research into what drug experiences actually feel like, teenagers get a taste of the illusion, have chances to make a change and stop this dramatic, weird, and fake world, and ultimately realize how unpredictable the results of drugs can be.

TECHNOLOGY

UE4 / Tiltbrush / Maya / 3Ds Max

Oculus Rift/ Leap Motion

TIME

2019.01

TEAM

Shimin Gu (Self)

Yunzhu Li / Shuting Jiang / Tianyue Wu / Wentao Pu

ROLE

Technical Director

AWARD

Drugmatic was a Top 8 ‘Best VR’ finalist out of 105 projects.


INSPIRATION

Drugs have always been a serious problem, especially for the younger generation. Unfortunately, most teenagers first try drugs out of curiosity. So how might we help them stay away from drugs while respecting their curiosity and imagination?

What It Does

"Drugmatic" is a virtual reality drug experience focused on teenagers. Based on research into what drug experiences actually feel like, teenagers get a taste of the illusion. During the experience, they also have chances to make a change and stop this dramatic, weird, and fake world.

Once a decision is made, there is no going back. Fantastic or horrifying, they have to go through the entire illusion by themselves. By the end of the experience, they will realize how unpredictable the results of drugs can be.

How We Built It

We started this project by exploring the VR drawing tools Tilt Brush and Quill in order to draw a drug-induced illusion. With this VR experience, we tried to rebuild the scene where such an event could happen. Players make their decisions themselves using Leap Motion.

  • BRAINSTORMING AND PROBLEM DEFINITION

To decide on the concept, we held a brainstorming session. At first we considered several ideas, but in the end we made up our minds to build a drug experience.

  • RESEARCH

For us, it's very difficult to recreate an experience without a real reference, so we started by researching what the illusion would be like after taking drugs. We looked at articles, videos, and movies, and decided to create the entire illusion world based on that research.

  • STORYBOARD

The storyboard took us a lot of time. We hoped our players could enjoy the fancy part of the illusion but also see its scary side. It was very important to let our players know that the illusion of drugs can lead to a horrible world, with consequences you may never get a chance to regret.

  • STYLE AND DESIGN

During our research, we found that drug illusions have a lot in common: heightened color contrast, a fluid and strange world, odd physical sensations... These qualities reminded us of hand drawing, so we decided to use Tilt Brush and Quill to build the world and animations. We also wanted some interaction with our players inside the illusion world.

  • PROGRAMMING AND EXPORTING ASSETS

This is when everything started to get messy. We watched a lot of tutorials, tried countless approaches to exporting our drawings into Unreal Engine, and learned how to build animations... The process was bitter, but it was also a fun journey.

Normal World
Illusion Stage 1
Illusion Stage 2
Team Working

Challenges We Ran Into

  • Developing a solution to explore the world of drug effects based on research

  • Integrating Tilt Brush assets into different graphics engines

  • Limited time and resources for creating an educational experience that depends on high-quality art pieces

Accomplishments that We are Proud of

  • Successfully brought Tilt Brush VR painting into the whole creation workflow

  • Built an incredible illusion world

Interaction

  • Natural hand movement instead of traditional controllers

  • Create your own illusion by hand drawing

  • Choose your own adventure

  • Incorporate taste and touch into your own VR experience

What We Learned

  • Drugs may make you high for a moment, but the cost of the illusion will be extremely high, too.

  • New creation tools

  • Teamwork

  • Time Management is important

  • Some toolkits are not compatible!!

What's Next for Drugmatic

  • Refine the project by adding interactions and visual effects

  • Research more on drug effects

  • Playtest among teenagers

  • Reach out to schools and offer this as a new tool for advisors and teachers

  • Reach out to researchers and medical centers and offer our project as an aid in helping people quit drugs

Leo and Barry

Interactive VR Experience | Narrative | 3D Animation

This is an interactive animation about the friendship between two robots in VR.

Time

2018

Technology

UE4 / Oculus Medium / Maya / Mixamo / Oculus Rift


Concept

I got my inspiration from the puzzle/adventure game Machinarium. I really like its steampunk art style, and I wanted to create a warm story between robots. The story is surreal, but it's about friendship, which should be heart-touching.

 
moodboard.png

Story

The main character, Leo, is a small robot who read a story about an iron giant and wanted to build one to be his friend, but he needed a battery to start it. He happened to know that another robot nearby, Barry, had collected a powerful battery, so Leo invited him to his house. Barry thought the battery was valuable and could serve as his backup, since he didn't have much power left. If Leo wanted it, he had to do a lot of things to make Barry happy. After that, Barry promised to give Leo the battery, but Leo realized that he just wanted a friend, and he already had one. In the end, the two little robots were dancing in front of the giant.

Character

I created the two small robots in Oculus Medium. After exporting them, I used MeshLab to reduce the face count and then uploaded them to Mixamo to auto-rig and animate. Following my story, I found all the animations I might need and downloaded them into a folder for later use.

Leo

Barry


Environment Setting

I found a steampunk interior asset online, but I felt the environment was too cluttered and I didn't like the yellowish color, so I rearranged the assets, changed the color tone by adjusting texture parameters, and reset the lighting. I also added a post-process volume to disable auto exposure, which I found annoying.

Original environment

After changing the color tone and lighting

Animation

Since I planned to make an interactive piece, Barry's animations should be triggered by user input, so I first tested how to call an animation by pressing a key on the keyboard. The difficult part was making every interaction follow a consistent logic.


Process in Unreal

Animation Map

Cast to animation map in level blueprint

Trigger by overlap and motion controller input

SOUND

Generate robot voice

Robot voice triggered by user input and animation time

VR Animation: Two

Narrative VR Experience | 3D Animation

This is the first time I tried to tell a narrative story in virtual reality using Unreal. The short story is about loneliness and hope.

Due to a huge disaster, most of the elves who used to live in the forest have died or left, so only two elves live there now. The little male elf, Charlie, keeps praying to the old tree for help, but the tree never responds. The other elf tries to comfort him, but it doesn't work. Charlie is almost desperate, but he sees sunlight through the leaves and holds onto a little hope.

The audience sees the story from the perspective of the second elf.

Technology

UE4 / Oculus Medium / Maya / Mixamo / Oculus Rift

Time

2018


Idea

MOOD BOARD

STORYBOARD

Character

I decided to make my own character using Oculus Medium. Making VR content in a VR environment is a very cool experience. I found a baby-sized body in Medium and used it as a reference, which was much easier for me than sculpting from thin air. To make rigging easier, I changed his position to the traditional T-pose. The cute little elf was then named Charlie! I uploaded the .fbx file to Mixamo to rig the character and add animations.


Environment Setup

This part is about Unreal. I bought an asset package from the Unreal Marketplace; I think its lovely animation style matches Charlie. But the color and placement of the assets were not what I had pictured, so I changed the placement of the elements and their color tone.


Animation in Sequencer

I used the Sequencer to do the animation. The difficult part for me was making Charlie's animations continuous. I also found it hard to control the movement of the camera: what we see in normal preview mode is always a little different from VR mode, and the camera has to move smoothly to avoid discomfort in VR.


100% Human

Future Dystopia | 3D Stealth Game

Imagine our civilization 20 years from now. Things have either collapsed beyond repair or life has become a harmonious dream. Through some crazy combination of wormholes and string theory, you have the opportunity to create a piece of information that will be sent back to the current population.

What does your imagined future look, sound, and feel like?  What message would you transmit to the current population from the future?

Technology

Unity / Photoshop

Time

2017


Ideation: Imagine a Eugenics dystopia

Moodboard

Eugenics tree

Storytelling

What if we completely cracked the secret of genes in the future? Maybe we would use that knowledge to generate perfect offspring, where "perfect" means higher intelligence, faster learning, and healthier mental and physical conditions.

Because of this, our social system would change completely, becoming a militarily structured society. Everyone would be conceived in vitro, so marriage would no longer exist. The government would decide whether you can have a baby based on your genes and social contributions. Natural births would not be allowed; when a woman got pregnant accidentally, she would be forced to have an abortion. Someone permitted to have a baby would be assigned a well-matched sperm or egg, along with a set of good genes such as high intelligence and a strong body. Although every baby would use the same set of good genes, each would develop differently after birth. Babies who could not reach the common standard would be eliminated within one month.

Picture from Gattaca

Picture from Fallout Shelter

The main character is a male scientist who does gene research in a laboratory. He gets permission to have his own daughter. Usually, parents are not allowed to see their baby before the final gene report comes out, but the scientist has access because of his work. He accidentally sees his daughter's serial number appear on the "watch list": her gene index is only 20%.

Although people at that time don't have much concept of family, the scientist doesn't want her to be killed. According to his recent research, if he can obtain other sets of good genes, he can inject them into his daughter to improve her index. He decides to steal them. Armed soldiers guard those genes, and he must get them without being caught. Every time the scientist puts one more good gene into his daughter's body, the index on the gene-testing machine increases a little. His goal is to recreate a 100% human.

Finally, he succeeds, and the girl passes the examination. All the children are then uniformly educated as parts of the state apparatus; they rarely see their parents. In some sense, they are raised as "functional replicants." Of course, the scientist never tells his daughter the story. Since she got more good genes than the others, she grows up far superior to her peers, and also more arrogant. In the end, she becomes the leader of the gene supremacists.

The message is sent by her. In her description, the future is a utopia: all humans are perfect, and there are no diseases, crime, or racial discrimination. She wants us to accelerate gene research and carry out the principles of eugenics, because if we do so now, their elimination work will be much easier.

It is an ironic ending but I think we can make our own judgment.

Storyboard

Physical Prototype and Timeline

Timeline

Tools

Physical Prototype

High-fi Prototype

Based on my story, I built a 3D stealth game in Unity. The main character is the scientist, whose mission is to steal genes from the warehouse without being killed by robot guards and then inject them into his daughter's body.

In the first version, I put some basic props into the scene and created some simple functions. The player can walk, run, jump, and use the mouse to change perspective. The winning condition is to get the genes from the warehouse, go back to the nursery room, and interact with a certain tube.

If the player is spotted by the guard, the game ends. I put one guard in the gene warehouse as the enemy. The biggest problem was that the AI guard was rather "stupid": it would sometimes walk through models.

Also, because I hadn't put any background information into it, players had a hard time understanding the purpose, and it wasn't clear which object they needed to interact with.

I then iterated on the game, using different scripts to control the guards. If they see the player, they shoot and the alarm system turns on. The player has health, and shooting damage depends on the distance between the player and the guard. I also added one more guard to increase the difficulty, and more props in the scene to convey the background information.
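The distance-based damage rule can be sketched like this. This is an illustrative Python version, not the actual Unity script, and the falloff shape and constants are my assumptions:

```python
# Hypothetical sketch of distance-based shot damage: full damage at point
# blank, falling off linearly to zero at the guard's maximum range.

def shot_damage(distance, base_damage=30.0, max_range=25.0):
    """Return the damage dealt by a guard's shot at the given distance."""
    if distance >= max_range:        # target is out of range
        return 0.0
    return base_damage * (1.0 - distance / max_range)

# A point-blank shot does full damage; a shot at the edge of range does none.
print(shot_damage(0.0))    # 30.0
print(shot_damage(25.0))   # 0.0
```

In the game loop, the returned value would be subtracted from the player's health each time a guard lands a shot.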


I changed the color of certain objects the player needs to interact with, and the player can inspect some objects to see details. These serve as the narrative and also as instructions for what you need to do. I also added background music to create a tense atmosphere.

Here is the gameplay process (a successful run):

The successful ending of the game is ironic: the scientist saved his daughter, but she then became the leader of eugenics. The message is sent by her, Lady S. She describes the future as a utopian world and encourages present-day people to keep developing DNA technology. I created a news webpage to show her message.

Although she describes a utopian world, my real intent is to criticize this kind of eugenics and show a dystopia.

Social Class

2D Narrative Game

This piece reflects on the problems of social class. Try it on GitHub.

Technology

P5.js / Photoshop / Hand drawing

Time

2017

Voice Chamber

Wearable | Physical Computing

A way to help you understand/get out of social isolation and loneliness

Social isolation and the feeling of loneliness are common among human beings and can cause health issues. The reasons behind them vary, and there is no general solution that fits everyone. Social isolation and loneliness are hard to measure and have no clear standards; people who have frequent social activities may still face these issues, and sometimes people aren't even aware of them, or are afraid to face them. Through this project, I'd like to show people that this situation is common rather than something to hide or be ashamed of. Hopefully, it can help reduce the pain and negative health outcomes social isolation may cause.

Technology

Arduino / Adafruit / Soldering / Lasercutting / MaxMsp

Time

2018


RESEARCH

Since social isolation and loneliness is a very human-centered topic and everyone's situation varies, I decided to design an initial survey first to get a basic sense of people's attitudes toward social isolation and loneliness and whether they have ever felt that way, as well as to gather demographic and contact information.

The link to the survey is: https://goo.gl/forms/c5qM5Zl2ClFrXMK33 

  • INSIGHTS

The results were a little surprising: most people said they had felt socially isolated or lonely before and considered social isolation or loneliness a neutral thing, which corresponds to the opinion I want to express through my project: you don't have to feel bad about social isolation and loneliness. Meanwhile, more people think social isolation is a kind of feeling rather than an external state, which does not match the academic definition. When I designed the survey, I intentionally didn't define social isolation anywhere, because I don't entirely agree with the "official" definition. I'm not sure whether I'm right, but I wanted to hear people's opinions.

  • CONSTRAINTS

So far I have 24 responses to my survey, and the age group is centered on young adults. The number is too small and the demographics are not varied enough. Also, I didn't ask any deep questions about reasons, or ask people to share stories, because I think it would be more effective to have those conversations in person.

1st prototype: Echo Chamber

The ‘echo chamber gear’ works as a simulation of social isolation/loneliness, serving the project's informative purpose. Inside, the gear is a closed, dark environment where the wearer can hear only the echo of their own voice instead of outside sounds. The unpleasant experience can end only when the wearer takes off the gear or when someone else touches the touch sensor outside, because I believe social isolation/loneliness can only be overcome through one's own initiative or with help from the community.


For the first prototype, I decided to test whether the concept could really make people feel isolated and lonely without building any circuitry. So I used a paper box as the gear and Max MSP to create the echo sound. Below is a short demo video:

2nd Prototype: Voice Changer

Another idea came to me: since a lot of people become socially isolated because of their personal situations, including financial status, appearance, etc., what if I created a wearable that can fuzz up their identities?

Basically, I reused the core components from my first prototype. I used the Adafruit Wave Shield with an Arduino to build a voice changer, laser-cut a wooden box, and put all the electronics inside it. I designed the box to have a cute face because I assumed that would encourage people to try it and make others willing to interact with the person inside.

I invited multiple people to try it on, and the process was really fun. People were willing to put it on, talk, and act as an anonymous person. But I'm still not sure whether it will give isolated people the courage to go out and connect with others; I need to test with more people in the future.

  • SOLDERING PROCESS

After I got the Wave Shield, I started soldering all the components, which was a lot of work. Before and after pictures are attached below.

  • PLAY SOUND FILES

Credit to lady ada on Adafruit for providing this tutorial: https://learn.adafruit.com/adafruit-wave-shield-audio-shield-for-arduino.

The first test I did was running through the most basic function of the Wave Shield, which is playing sound from the SD card. One thing worth mentioning is that the shield can only play 16-bit mono PCM WAV files at a 22kHz sample rate, so any sound file with different settings needs to be converted; I used Audacity for that. In the end, all the sounds played perfectly in a loop, and I could use the potentiometer to adjust the volume in real time.

See the video below:
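As a side note, the shield's format requirement (16-bit mono PCM at 22.05kHz) can be checked programmatically before copying files to the SD card. This helper is my own sketch, not part of the Adafruit tutorial:

```python
# Check whether a WAV file matches the Wave Shield's playback requirements:
# mono, 16-bit PCM samples, 22050 Hz sample rate.

import wave

def is_shield_compatible(path):
    """Return True if the WAV file at `path` can play on the shield as-is."""
    with wave.open(path, "rb") as w:
        return (w.getnchannels() == 1 and
                w.getsampwidth() == 2 and      # 2 bytes per sample = 16-bit
                w.getframerate() == 22050)
```

The `wave` module only reads uncompressed PCM files, so a compressed WAV raises an error; either way, a failure means the file should go through Audacity first.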

  • VOICE CHANGER

Credit to Phillip Burgess on Adafruit for providing this tutorial: https://learn.adafruit.com/wave-shield-voice-changer/overview

I used a breadboard to hook everything up. Basically, the Arduino code processes the incoming voice and maps a potentiometer reading from 0 to 1023 onto the pitch of the voice output.
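The mapping itself is simple. Here is an illustrative Python version of the idea; the real code is Arduino C, and the 0.5x to 2x pitch range below is my assumption, not the tutorial's exact values:

```python
# Illustrative mapping of a 10-bit ADC reading (the potentiometer, 0-1023)
# onto a playback-pitch multiplier. Range endpoints are hypothetical.

def adc_to_pitch(reading, low=0.5, high=2.0):
    """Map a 0-1023 potentiometer reading to a pitch multiplier
    between `low` (half pitch) and `high` (double pitch)."""
    reading = max(0, min(1023, reading))     # clamp to the ADC range
    return low + (high - low) * reading / 1023.0

print(adc_to_pitch(0))      # 0.5  -> lowest pitch
print(adc_to_pitch(1023))   # 2.0  -> highest pitch
```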


It worked as shown in the video:

One constraint is that the pitch potentiometer can't work in real time, because the Arduino and the Wave Shield can't read the sound files and change the pitch at the same time; after changing the pitch by turning the potentiometer, I have to press the reset button.

  • LASER CUT

Here comes the wearable part. I designed the box as a cute robot avatar, which I think makes other people more willing to interact with the wearer. I put acoustic foam inside to isolate sound as well as provide a soft surface to wear.


To make it portable, I soldered the wires onto a PCB and powered it with a 9V battery. Luckily, the battery provides enough power for both the Arduino and the Wave Shield.

Then I cut a hole in the side of the box to mount the circuit on the edge. It looks fairly nice from both outside and inside. I left the Wave Shield exposed because I have to press the small reset button; there should be a better way to cover the shield, which I could do later.


User Test

I had multiple people try on my voice box, and we had a lot of fun.


NEXT STEP

My next step is to find more people for my survey and talk to them, especially people who have experienced, or are deeply experiencing, social isolation. I'm trying to reach people through Facebook groups, meetups, and organizations. One of the most difficult parts of this project is that I can't easily reach somebody who is very socially isolated. I'm also trying to interview more experts in this field, including sociologists, psychologists, and therapists; talking to specialists should give me better insights and may help me find people to talk to.

Meanwhile, I will keep developing my echo chamber prototype, do user testing, and iterate based on the feedback. I would also love to run some cultural-probe experiments when I have time; I think that would be a really good way to understand people better.

 

Power Running Suit

Wearable

An eco-friendly way to harvest the kinetic energy the human body generates while walking or running and transform it into power for lighting, using the mechanism of a shake flashlight.

Technology

Circuit / Soldering / Sewing

Time

2018


Mechanism


If we move a magnet through a coil of wire (or rotate a wire loop in a magnetic field), the changing field induces an electric current in the wire.
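This is Faraday's law of induction: the induced voltage (EMF) is proportional to the number of turns in the coil and to how fast the magnetic flux through it changes:

```latex
\mathcal{E} = -N \, \frac{\mathrm{d}\Phi_B}{\mathrm{d}t}
```

where N is the number of turns and Φ_B is the magnetic flux through the coil. More turns, stronger magnets, and faster shaking all increase the output voltage.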

 

While we walk or run, our lower body keeps moving back and forth, which can be used to keep the magnet moving inside the coil. I tried to figure out which part of our lower body moves the most.

 
Human Running
 

Ideally, the coil should be tied to the shank or foot, and it should be designed small enough to fit on the body. LEDs will sit on the side of the leg.

 
Draft

In order to test which placement works better and to learn the inner structure of a shake flashlight, I bought one on eBay and tried tying it to different parts of the body. It turned out that tying it to a foot was the most efficient solution.

1st Prototype


Materials

  • Coil (Wire & Pipe)

  • Magnet

  • LED lights

  • Diode bridge

  • Capacitor

  • Resistor

The first thing I did was make the coil. I tried to use materials I already had, so I made the pipe by wrapping up a plastic sheet and wound the coil from ordinary electric wire. I put both a sphere and a disc magnet inside. However, when I shook the coil, the connected LED didn't light up.


The problem might have been that I used the wrong type of wire, didn't have enough turns in the coil, or had a magnet that wasn't strong enough. To troubleshoot, I bought some new supplies online and built a second prototype.

2nd Prototype

For the second prototype, I used enameled copper magnet wire (Remington Industries 30SNSP.125 Magnet Wire, Enameled Copper Wire Wound, 30 AWG, 2 oz, 402' Length, 0.0108" Diameter, Red), which is very thin, and wrapped it hundreds of turns around the center of the coil. I also used some super-strong magnets (10Pc Super Strong N42 Neodymium Magnet 1.26" x 1/8" W/3m Adhesive NdFeB Discs) to get a strong magnetic field. I used a multimeter to test whether any power was generated.

It turned out I got some power!

Then I added the circuit to a breadboard. A diode bridge takes the alternating output of the coil and gets it flowing in one direction; I built the bridge from four individual diodes. The capacitor stores some of the electricity.

The second prototype worked, but it was not steady enough to become a wearable and be tested on the body.

Final Prototype

To make it steadier and smaller, I decided on a double-walled design for the coil, adding an extra acrylic tube outside it. Meanwhile, I soldered the circuit onto a smaller PCB and sewed it onto a pair of sport pants.

The main problem I faced was how to tie the coil onto my foot. I haven't found the best solution yet, so I just used some rope. It worked fine, but I'm still looking for a better way.

Three versions!