Pop Pop: A Living Pedestrian Signal in NYC

Pop Pop is a project to re-imagine New York’s pedestrian signals: namely, to see whether they can be imbued with personality and emotion that might make daily life a bit more fun and interesting. To that end, Pop Pop is a to-scale, working pedestrian signal that has real human emotions (based on real-time data about its location) and interacts with people as they walk by. Are there too many jaywalkers? How crowded is the corner? Is it raining? Is traffic bad? Pop Pop knows and reacts accordingly. The goal of the project is to explore how “connected” objects can take on personalities that bring delight into our lives.

"Pop Pop" can be imagined like a caring older gentleman who is protective of his intersection: he wants to make sure everyone is safe and happy as they cross through his intersection.

People can see and interact with Pop Pop either at the corner of Broadway and Waverly or on the project website, which updates his current mood in real time.

And if they feel inclined, they can let Pop Pop know he’s doing a good job by pressing his crossing button.

Who Did It?

Alexandra Coym, Sam Slover, and Steve Cordova, all Master’s students in tech and design at NYU’s Interactive Telecommunications Program

Why Did We Do It?

With more and more objects becoming connected devices (the “Internet of Things”), we felt it was important to explore the relationships that can come from these objects, and how they may make our lives more interesting and fun. We also wanted to explore how personification can change people’s relationship to an object.

How Does it Work?

Pop Pop has both Web and physical computing aspects.

Physical System

For the physical build of the pedestrian signal, we constructed custom LED matrix panels (each powered and updated by an Arduino Yun). One panel depicts Pop Pop’s current emotional state, and the other shows the corresponding text communicated to the passer-by.

The signal housing is built from pine, masonite, and black plexiglass to match a real pedestrian signal as closely as possible.
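One way to think about the two panels: each emotion reduces to a pair, a face for one panel and a message for the other. The sketch below is our own illustration of that mapping in JavaScript; the emotion names and messages are assumptions, not Pop Pop’s actual vocabulary.

```js
// Hypothetical mapping from an emotion to what the two panels display.
// Names and messages here are illustrative, not the production set.
const PANELS = {
  upbeat:       { face: 'grin',  message: 'LOVELY DAY, RIGHT?' },
  happy:        { face: 'smile', message: 'THANKS FOR WAITING!' },
  'a bit down': { face: 'frown', message: 'ROUGH DAY ON THE CORNER' }
};

function panelStateFor(emotion) {
  // Fall back to a neutral face if an unknown emotion comes through.
  return PANELS[emotion] || { face: 'neutral', message: 'CROSS SAFELY' };
}

module.exports = { panelStateFor };
```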

Web System

The Web system is a Node.js app that collects real-time data through Amazon Mechanical Turk and several hyper-local APIs. Every few minutes, the app sends Mechanical Turk workers a task to watch a live feed of the intersection and report quantitative data on what they see happening there (the camera is set up so faces cannot be detected). The app then combines this data with other APIs (weather, traffic, crime) and runs sentiment analysis to come up with a current emotion. This can be anything from upbeat to happy to a bit down, and more.
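As a minimal sketch of how that scoring step might look in the Node.js app (all field names, weights, and thresholds below are our own assumptions, not the app’s real values):

```js
// Illustrative scoring step: combine crowd observations and local
// conditions into one of Pop Pop's named emotions.
function computeEmotion({ turk, weather, traffic }) {
  let score = 0;

  // Observations reported back by Mechanical Turk workers.
  score -= (turk.jaywalkers || 0) * 2;                // jaywalking worries him
  score += Math.min(turk.pedestrians || 0, 20) * 0.5; // company cheers him up

  // Hyper-local conditions from the other APIs.
  if (weather.isRaining) score -= 5;
  if (traffic.congestion > 0.7) score -= 3;

  // Map the numeric score onto a small set of named emotions.
  if (score > 10) return 'upbeat';
  if (score > 0) return 'happy';
  if (score > -5) return 'a bit down';
  return 'grumpy';
}

module.exports = { computeEmotion };
```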

When a new emotion is computed, it is automatically pushed both to the physical pedestrian signal (via the Web-enabled Arduino Yuns) and to the website’s front end (via WebSockets).
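A sketch of that fan-out step, assuming socket.io for the browser push and the Yun’s Bridge REST endpoint (/arduino/...) for the panels; the hostnames and the route are illustrative, not the project’s actual setup:

```js
const http = require('http');

// Push a newly computed emotion to the website and to both panels.
function broadcastEmotion(io, emotion) {
  // WebSocket broadcast to every browser viewing the project site.
  io.emit('emotion', { emotion, at: Date.now() });

  // With the Bridge library, an HTTP request to /arduino/<command> on a
  // Yun is handed to the running Arduino sketch, which can redraw its panel.
  for (const host of ['face-panel.local', 'text-panel.local']) {
    http
      .get(
        `http://${host}/arduino/emotion/${encodeURIComponent(emotion)}`,
        (res) => res.resume() // discard the response body
      )
      .on('error', (err) => console.error(`${host}: ${err.message}`));
  }
}

module.exports = { broadcastEmotion };
```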

Where Did We Do It?

The project was done at NYU’s Interactive Telecommunications Program, under the umbrella and guidance of Microsoft's Humanizing Data Research Group and New York's InSITE Fellowship.

Did Everything Go Smoothly?

In the end, yes. But as with any project, we had our share of hiccups along the way.
