How I Taught Myself to Code in 3D
As an interaction designer, I’m always testing my limits and teaching myself new technologies. And whenever possible, I like to use those new tricks to make people smile. My projects often have no purpose other than giving me an excuse to play on the computer. But I’ve found that combining my passion for the absurd with my passion for learning new tools can lead to glorious outcomes.
I’ve always been a self-taught coder, so a few months back, I decided I’d get a jump on AR by playing around with 3D coding. What better way to figure it out than to create 3D dancing avatars of my friends so I can host virtual dance parties? (The idea may have been inspired by some of my previous work.)
So I went around asking my colleagues if I could 3D-scan them. I know what you’re thinking: How did I get anyone to agree to this? When I asked people to be scanned, they were so excited to "be 3D" they didn't even ask what I was going to do with the scans.
Left: IDEO CTO Tom Eich IRL. Right: Tom's 3D model mid-groove.
Once I got the scans, I "rigged" them, meaning I gave them bones. Once they had bones, I could assign them motion-capture dance data that I pulled from 3D software. Finally, I loaded them into my code, which I had written with the help of countless tutorials on three.js, a JavaScript library for 3D graphics. I created desktop, AR, and VR versions so that people could experience the dance party in the reality of their choosing. (3D assets in motion are surprisingly easy to transition between realities.)
The IDDDEO New York dance floor.
Tom Eich dancing in AR.
When participants saw their avatars in motion, they were definitely confused; for many, it was their first time seeing themselves in 3D. Being able to rotate a camera 360 degrees to see yourself from every angle is just not possible in reality, especially when you’re in motion. Most people immediately pointed out all the color/modeling imperfections in their 3D likenesses that resulted from my use of (what is clearly) an off-the-shelf 3D scanner.
Once the studio saw their coworkers’ 3D models dancing, everyone wanted an invite to the dance party. My fellow tinkerer Danny DeRuntz even brought the party to IDEO Cambridge, where he had long lines of people waiting to be scanned in.
The IDDDEO Cambridge dance floor.
My next endeavor will be creating an audio-visual beat machine that lets anyone sample and manipulate YouTube clips using my own pseudo format, AZZ. (Here’s a demo for the curious.) In the meantime, I’ll be fine-tuning the 3D dance code in hopes of eventually bringing this experience to real-life parties in NYC. Sadly, elegant and portable 3D body scanning does not yet appear to be a thing. Hopefully soon?
Check out the dance floor and create your own party here.
Want to follow the project? I’ll be posting updates on my Instagram.