x.pose is a wearable, data-driven sculpture that exposes a person's skin as a real-time reflection of the data the wearer is producing. In the physical realm, we can deliberately control which portions of our bodies are exposed to the world by covering them with clothing. In the digital realm, we have far less control over which personal aspects we share with the services that connect us. In the digital realm, we are naked and vulnerable.


In today's data-driven society, individuals carrying smartphones and interacting with social media networks have agreed, often without conscious consideration, to policies that grant service providers explicit rights to harvest and utilize personal data on a massive scale. These include companies like Google and Facebook that offer "free" services for a value exchange of our data.

There currently exists a paradox in our internet culture. As a generation, we are simultaneously obsessed with publicity and privacy. While we publish and post details about our lives online, at the same time we demand the most advanced privacy protection software. An unprecedented degree of potential exposure comes with the current mode of existence.

I have ceded control of my data emissions. Based on my activity logs, Google clearly knows where I am, where I've been, and possibly even where I'm going. Yet when I wanted a log of my location history, I had to go through numerous steps just to "enable" tracking...

By participating in this hyper-connected society while having little to no control of my digital data production, how much of myself do I unknowingly reveal? To what degree does the aggregated metadata collected from me paint an accurate portrait of who I am as a person? What aspects of my individuality are reflected in this portrait?

x.pose is my exploration of these questions. Since I have already ceded control of my data, I wanted to go a step further and broadcast it for anyone and everyone to see.

The first step was to build a mobile app and server, using Node.js and PhoneGap, to automatically collect my data over time.

Second, the recorded data set served as the basis for the generative aspects of the personalized wearable couture. The output is an abstract 3D mesh armature derived from my location data points, collected over about a month. The dataset was fed into Processing to produce the pattern, then exported to Rhino to build the 3D mesh.
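The core of that pattern-generation step is projecting raw GPS fixes into canvas coordinates. The sketch below is an assumed, simplified version of that normalization (the `toPattern` name and linear projection are illustrative, not the project's actual algorithm): it maps each (lat, lon) point into a width-by-height pattern space so point density can drive the generated geometry.

```javascript
// Hypothetical normalization step: project location fixes into
// 2D pattern coordinates, as a Processing sketch might before
// generating the mesh pattern.
function toPattern(fixes, width, height) {
  const lats = fixes.map((f) => f.lat);
  const lons = fixes.map((f) => f.lon);
  const minLat = Math.min(...lats), maxLat = Math.max(...lats);
  const minLon = Math.min(...lons), maxLon = Math.max(...lons);
  // Linear map into the canvas; flip y so north points up.
  // The "|| 1" guards against division by zero for degenerate data.
  return fixes.map((f) => ({
    x: ((f.lon - minLon) / (maxLon - minLon || 1)) * width,
    y: (1 - (f.lat - minLat) / (maxLat - minLat || 1)) * height,
  }));
}
```

A real sketch would follow this with density estimation or triangulation over the projected points; this only shows the coordinate mapping that everything downstream depends on.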

Lastly, the mobile app and server provide real-time data transmission over Bluetooth to an Arduino, which controls reactive displays that change in opacity to reveal the wearer's skin, in proportion to the volume of information being passively generated.
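The "in proportion to" mapping can be expressed as a small pure function. This is a hedged sketch, not the actual firmware logic: `opacityLevel`, the event-count window, and the 0-255 output range (a PWM-style value an Arduino could apply to a reactive panel) are assumptions chosen to illustrate the idea that more passive data emission means more revealed skin.

```javascript
// Hypothetical mapping from passive data volume to display level.
// events: data emissions counted in the current window;
// maxEvents: the rate treated as "fully exposed".
// Returns 0-255, where higher means more transparent (more skin shown).
function opacityLevel(events, maxEvents) {
  const ratio = Math.min(events / maxEvents, 1); // clamp at full exposure
  return Math.round(ratio * 255);
}
```

On the Arduino side, a value like this would simply be received over the Bluetooth serial link and written out to the display driver each update cycle.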

ITP NYU Thesis 2014
Special thanks to: Nancy Hechinger, Ben Light, Eric Rosenthal, Thesis class, Talya Stein, ITP community, Family & Friends

In collaboration with Pedro G. C. Oliveira
Photography By: Roy Rochlin
Model: Heidi Lee
Makeup: Rashad Taylor

Featured on FastCompany, Wired UK, AnimalNY, The Creators Project, CNET, iO9, NOTCOT, Ars Technica, DesignBoom, Engadget, Huffington Post, ELLE, and many more!

Winner of YouFab Global Creative Awards 2014!

Exhibited at the HopeX conference in NYC July 18-20, 2014 & The Engadget Expand Wearables on the Runway show November 7th, 2014.