
Runway In Store

Runway in Store is a projection-based augmented reality (AR) installation designed for Nike retail stores. It explores the future of the retail experience, using AR and a Kinect to give shoppers compelling new interactive experiences.

This prototype project is supported by Verizon Envrmnt and NYC Media Lab.

Tech: Unity3D, C#, Kinect, Shader programming

Team: Luqian Chen

My role: Developer, Visual Designer

Experiential Retail Is The Future

The retail industry is undergoing a major transformation as e-commerce disrupts traditional brick-and-mortar store models and gives rise to new modes of “experiential retail.” By creating a more immersive retail experience, retailers can draw people into their stores and ensure they leave not just with products but also with memories. A key element of experiential retail is the innovative use of technology to provide interactive and immersive experiences.

[Image: Moodboard - experience marketing]

AR: Creative, Interactive And Affordable

Beyond cool but expensive physical installations, what other technologies could open up more possibilities and convey a brand's attitude more creatively? Some forward-thinking retail brands are incorporating the emerging technology of augmented reality into the customer experience, which has the following advantages:

  • Enables customer engagement
  • Triggers more customer-generated content on social media
  • Is scalable, with themes that are easy to change
  • Costs much less than physical installations

Runway in Store is a physical playground that encourages customers in the retail store to move around and interact with augmented visuals projected on a wall or shown on an LED screen. A Kinect v2 captures the real-time human image and drives the real-time movement of a 3D character in Unity. With Nike as the brand, the AR installation tracks human motion, supports multiple players, and explores metaphorical augmented 3D visuals.
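As a rough illustration of that pipeline, here is a minimal sketch of how Kinect v2 body tracking can drive a character in Unity, using Microsoft's Kinect for Windows v2 Unity plugin (the Windows.Kinect namespace). The bone fields and the position-only mapping are my simplifications; a production rig would map joint orientations across the whole skeleton.

```csharp
using UnityEngine;
using Windows.Kinect; // Kinect for Windows v2 Unity plugin

public class SkeletonDriver : MonoBehaviour
{
    // Bones of the rigged character, assigned in the Inspector (illustrative).
    public Transform leftHand;
    public Transform rightHand;

    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (sensor == null) return;
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        if (reader == null) return;
        using (var frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);
        }

        foreach (var body in bodies)
        {
            if (body == null || !body.IsTracked) continue;
            // Drive the rig from camera-space joint positions (in meters).
            leftHand.position = ToUnity(body.Joints[JointType.HandLeft].Position);
            rightHand.position = ToUnity(body.Joints[JointType.HandRight].Position);
            break; // first tracked body only; multiplayer would iterate all six
        }
    }

    // Mirror X so the on-screen character moves like a reflection of the player.
    private static Vector3 ToUnity(CameraSpacePoint p)
    {
        return new Vector3(-p.X, p.Y, p.Z);
    }
}
```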

Customers can then save their looks and share them on social media, so the brand can use social channels to reach and connect with potential customers. Fortunately, this idea drew the attention of Verizon, who gave us a lot of support to move the project forward.

User Research

We narrowed the concept down to Nike in particular and designed a customized retail experience for their stores. As a sportswear brand, Nike adapts to current trends, continuously uses new technologies to make experiences more enjoyable for customers, and maintains its brand image in all aspects, which my thesis idea fits perfectly. The target audience is young adult customers (aged 18 to 30) who are tech-savvy and love sports. Most importantly, they are constantly looking for fun experiences to share on social media.

[Image: User Persona]

Prototyping Process

Initially I wanted to build a mobile AR app that could track the user's foot movement and show triggered effects on the screen, which led to three technical directions for detecting human movement: machine learning with the Vision framework or YOLO with CoreML (iOS), Vuforia 3D object recognition, or a Kinect with creative coding.

My first mobile prototype used the CoreML/Vision framework, which processes a live feed from the camera and extracts information from each frame using both built-in and external machine learning models. However, the real-time object detection results were unreliable even without big movements. The marker-based AR framework Vuforia limits the size of the scanned object (which ideally would be the shoes), and its 3D object recognition can also lose track at times.

[Image: Vuforia 3D recognition]

[Image: CoreML object detection]

So I finally decided to go with the Kinect v2, which can capture RGB images and sense depth data, and chose Unity3D as the development platform for 3D visual programming. Here is how the space is set up: customers enter the playground facing a Kinect and either a projected wall or a big LED screen. The Kinect tracks their movements, and augmented 3D visuals are drawn on top of the RGB images.

[Image: Kinectron demo (JavaScript)]

[Image: Unity3D with Kinect]
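For the background layer, the Kinect color stream can be copied into a Unity texture every frame. Below is a minimal sketch along the lines of Microsoft's Unity plugin samples; the full-screen quad and the RGBA conversion are assumptions about the scene setup.

```csharp
using UnityEngine;
using Windows.Kinect;

// Streams the Kinect v2 color feed into a texture used as the background layer.
public class ColorBackground : MonoBehaviour
{
    private KinectSensor sensor;
    private ColorFrameReader reader;
    private Texture2D texture;
    private byte[] pixels;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (sensor == null) return;
        reader = sensor.ColorFrameSource.OpenReader();
        // Ask the sensor to convert raw color frames to RGBA for us.
        var desc = sensor.ColorFrameSource.CreateFrameDescription(ColorImageFormat.Rgba);
        texture = new Texture2D(desc.Width, desc.Height, TextureFormat.RGBA32, false);
        pixels = new byte[desc.BytesPerPixel * desc.LengthInPixels];
        // Assumes this component sits on a full-screen quad behind the 3D effects.
        GetComponent<Renderer>().material.mainTexture = texture;
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        if (reader == null) return;
        using (var frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.CopyConvertedFrameDataToArray(pixels, ColorImageFormat.Rgba);
            texture.LoadRawTextureData(pixels);
            texture.Apply();
        }
    }
}
```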

Visual Programming In Unity3D

I designed and implemented an algorithm for the augmented 3D visuals, with motion trails and particles. All of the parameters, such as color, particle lifespan, and texture, can be adjusted and customized easily. The algorithm works like this:

  • Layer 0: Kinect RGB image as the background.
  • Layer 1: Augmented 3D visual effects, using the Kinect v2 to drive the real-time movement of a 3D character (model):
    • Map the Kinect skeleton data to a rigged mesh.
    • Use the Skinned Mesh Renderer component to render bone animations, where the shape of the mesh is deformed by the animated bones.
    • Use the vertices of the animating skinned mesh as emitting points.
    • Compute the velocity and acceleration of each emitting point and generate particle effects accordingly.
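Here is a minimal sketch of that emitter, assuming a ParticleSystem set to world simulation space with its own emission turned off; stride and velocityScale are illustrative knobs rather than the installation's actual values.

```csharp
using UnityEngine;

// Emits particles from the vertices of an animating skinned mesh,
// giving each particle a velocity based on how fast its vertex is moving.
public class VertexParticleEmitter : MonoBehaviour
{
    public SkinnedMeshRenderer skin;   // the Kinect-driven character's renderer
    public ParticleSystem particles;   // world-space system, emission disabled
    public int stride = 10;            // emit from every Nth vertex
    public float velocityScale = 0.5f;

    private Mesh baked;
    private Vector3[] previous;

    void Start()
    {
        baked = new Mesh();
    }

    void LateUpdate()
    {
        // Snapshot the current deformed pose of the skinned mesh.
        skin.BakeMesh(baked);
        Vector3[] vertices = baked.vertices;

        bool firstFrame = previous == null || previous.Length != vertices.Length;
        if (firstFrame) previous = new Vector3[vertices.Length];

        var emit = new ParticleSystem.EmitParams();
        for (int i = 0; i < vertices.Length; i += stride)
        {
            // Baked vertices are local to the renderer; move them to world space.
            Vector3 world = skin.transform.TransformPoint(vertices[i]);
            if (!firstFrame)
            {
                // Finite-difference velocity of this vertex since the last frame.
                Vector3 velocity = (world - previous[i]) / Time.deltaTime;
                emit.position = world;
                emit.velocity = velocity * velocityScale;
                particles.Emit(emit, 1);
            }
            previous[i] = world;
        }
    }
}
```

Acceleration can be derived the same way by also keeping the previous frame's velocities and differencing them.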
[Image: Particles]

[Image: Particles with trails]

[Image: Particles with larger texture]

User Test and Improvement

For the scalability of my AR installation, I thought about how to make it easy to change themes and flexible to deploy in different stores, on a minimal budget. The solution:

  • Extract the human silhouette from what the Kinect sees
  • Remove the distracting background surroundings
  • Replace them with a customized 2D image or 3D scene
  • Add a shader that reduces the jagged edges caused by silhouette extraction (sketched below)
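A minimal sketch of the silhouette step, assuming the Kinect v2 body-index stream, where each pixel reads 0-5 for a tracked body and 255 for background. The _MaskTex shader property is hypothetical, and a full pipeline would also run the mask through the SDK's CoordinateMapper so this depth-resolution mask lines up with the color image before the edge-smoothing shader samples it.

```csharp
using UnityEngine;
using Windows.Kinect;

// Builds a per-pixel silhouette mask from the Kinect v2 body-index stream.
// A shader can sample this mask to keep the player and replace everything
// else with a themed background, softening the edge to hide extraction artifacts.
public class SilhouetteMask : MonoBehaviour
{
    public Material compositeMaterial; // blends the player over the themed scene

    private KinectSensor sensor;
    private BodyIndexFrameReader reader;
    private Texture2D mask;
    private byte[] indexData;
    private byte[] maskPixels;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (sensor == null) return;
        reader = sensor.BodyIndexFrameSource.OpenReader();
        var desc = sensor.BodyIndexFrameSource.FrameDescription;
        mask = new Texture2D(desc.Width, desc.Height, TextureFormat.Alpha8, false);
        indexData = new byte[desc.LengthInPixels];
        maskPixels = new byte[desc.LengthInPixels];
        compositeMaterial.SetTexture("_MaskTex", mask); // hypothetical property
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        if (reader == null) return;
        using (var frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.CopyFrameDataToArray(indexData);
        }
        // 0-5 means the pixel belongs to a tracked body; 255 means background.
        for (int i = 0; i < indexData.Length; i++)
            maskPixels[i] = indexData[i] == 255 ? (byte)0 : (byte)255;
        mask.LoadRawTextureData(maskPixels);
        mask.Apply();
    }
}
```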
Video Documentation

We showcased our piece at several locations (NYC Media Lab, NYU Tisch), and here is some of the feedback we got from audiences.

“This is the typical Instagrammable experience! I love how I look in this psychedelic filter and it’s really cool when I move!” — Chris

“If I saw my friend post this on Instagram, I would really wanna go visit that store!” — Jane