Eric Wang

(Cheng-Yao Wang)

I'm a human-computer interaction researcher interested in mobile interaction, novel input/output techniques, and ubiquitous computing.

About me

My name is Eric Wang (Cheng-Yao Wang). I'm a Human-Computer Interaction researcher interested in inventing interaction technologies that bridge the gap between computing resources and people's daily lives in a natural and efficient way. My interests fall into the research fields of mobile interaction, novel input/output techniques, and ubiquitous computing.

Advised by Prof. Mike Y. Chen and Prof. Bing-Yu Chen, I received my Master's degree from the Graduate Institute of Networking and Multimedia at National Taiwan University. Through a series of research projects during my master's studies, a year of independent research, and a research internship with Prof. James Landay at Stanford University, I have been crafting novel interaction technologies across a variety of platforms, including smart wristbands/watches, smartphones, head-mounted displays, and personal drones. Along the way, I have published at top conferences and won first prizes in competitions.

In addition to my background in computer science, I have gained invaluable knowledge and practical experience in hardware prototyping, user study design, and interaction design. My role in Human-Computer Interaction is to use these research tools to foster powerful and natural interactions between humans and computers.

Education

  • M.S. in CS, National Taiwan University (Sep '12 – Jun '14)
  • B.S. in CS, National Taiwan University (Feb '06 – Aug '10)

Experience

  • Chief Counselor, Compulsory Military Service, Taiwan (Jun '10 – Jul '11)
  • Software Engineer, EZTABLE (Jun '09 – Jun '10)


Novel Technologies Published at Top Conferences and First Prizes Won in Competitions

projects

MobileHCI 2015
PalmType: Using Palms as Keyboards for Smart Glasses

Cheng-Yao Wang, Wei-Chen Chu, Po-Tsung Chiu, Min-Chieh Hsiu, Yih-Harn Chiang, Mike Y. Chen

Published in MobileHCI 2015 (full paper)

We present PalmType, which uses palms as interactive keyboards for smart wearable displays such as Google Glass. PalmType leverages users' innate ability to pinpoint specific areas of their palms and fingers without visual attention (i.e., proprioception), and provides visual feedback via the wearable display. With wrist-worn sensors and wearable displays, PalmType enables typing without requiring users to hold any device or look at their hands. We conducted design sessions with 6 participants to see how users map the QWERTY layout onto their hands based on proprioception.
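A rough sketch of the mapping idea in Python (illustrative only: the key centers below are made-up calibration values, not PalmType's real data). A touch point sensed on the palm is assigned to the nearest calibrated key center:

    import math

    # Hypothetical per-user calibration: (x, y) centers of a few QWERTY keys
    # on the palm, in sensor coordinates (values invented for this sketch).
    KEY_CENTERS = {
        "q": (5, 40), "w": (15, 40), "e": (25, 40),
        "a": (7, 25), "s": (17, 25), "d": (27, 25),
    }

    def key_for_touch(x, y):
        """Return the key whose calibrated center is closest to the touch."""
        return min(KEY_CENTERS, key=lambda k: math.dist(KEY_CENTERS[k], (x, y)))

    print(key_for_touch(16, 38))  # -> 'w'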

MobileHCI 2015
PalmGesture: Using Palms as Gesture Interfaces for Eyes-free Input

Cheng-Yao Wang, Min-Chieh Hsiu, Po-Tsung Chiu, Chiao-Hui Chang, Liwei Chan, Bing-Yu Chen, Mike Y. Chen

Published in MobileHCI 2015 (full paper)

With abundant tactile cues and proprioception, the palm can be leveraged as an interface for eyes-free input, which decreases the visual attention interfaces demand and minimizes cognitive/physical effort. We explored eyes-free gesture interactions that enable users to interact with devices by drawing stroke gestures on their palms without looking at them. To understand how users draw gestures on their palms and how those gestures are affected by palm characteristics, we conducted two 24-participant user studies. We also implemented EyeWrist, which turns the palm into a gesture interface by embedding a micro-camera and an IR laser line generator in a wristband, and proposed three interaction techniques that take advantage of palm characteristics.
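To make the stroke-gesture idea concrete, here is a minimal $1-recognizer-style matcher in Python (a simplified illustration, not the recognizer used in the paper; scale and rotation normalization are omitted). Strokes are resampled to a fixed number of points and compared to templates by mean point-to-point distance:

    import math

    def resample(points, n=32):
        """Resample a stroke (a list of (x, y) tuples with nonzero path
        length) to n evenly spaced points along its path."""
        total = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
        step, acc, out = total / (n - 1), 0.0, [points[0]]
        for a, b in zip(points, points[1:]):
            d = math.dist(a, b)
            while acc + d >= step and len(out) < n:
                t = (step - acc) / d
                a = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
                out.append(a)
                d = math.dist(a, b)
                acc = 0.0
            acc += d
        while len(out) < n:          # guard against floating-point shortfall
            out.append(points[-1])
        return out

    def match(stroke, templates):
        """Return the name of the template closest to the stroke."""
        s = resample(stroke)
        return min(templates, key=lambda name: sum(
            math.dist(p, q) for p, q in zip(s, resample(templates[name]))) / len(s))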

CHI 2014
EverTutor: Automatically Creating Interactive Guided Tutorials on Smartphones by User Demonstration

Cheng-Yao Wang, Wei-Chen Chu, Hou-Ren Chen, Chun-Yen Hsu, Mike Y. Chen

Published in CHI 2014 (full paper)

We present EverTutor, which automatically generates interactive tutorials on smartphones from user demonstration. It simplifies tutorial creation, provides tutorial users with contextual step-by-step guidance, and avoids the frequent context switching between tutorials and users' primary tasks. To generate tutorials automatically, EverTutor records low-level touch events to detect gestures and identify on-screen targets. When a tutorial is browsed, the system uses vision-based techniques to locate the target regions and contextually overlays the corresponding input prompt. It also verifies the correctness of users' interactions to guide them step by step.
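As a hedged illustration of the vision step (assuming OpenCV; the file names are placeholders, and EverTutor's actual pipeline may differ), template matching can locate a target region recorded during demonstration on the current screenshot:

    import cv2

    screenshot = cv2.imread("current_screen.png")    # placeholder file names
    target = cv2.imread("recorded_button.png")

    # Slide the recorded target over the screenshot and score each position.
    result = cv2.matchTemplate(screenshot, target, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)

    if score > 0.8:                          # tunable confidence threshold
        h, w = target.shape[:2]
        center = (top_left[0] + w // 2, top_left[1] + h // 2)
        print("overlay the input prompt at", center)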

MM 2013
RealSense: Directional Interaction for Proximate Mobile Sharing

Chien-Pang Lin, Cheng-Yao Wang, Hou-Ren Chen, Wei-Chen Chu, Mike Y. Chen

Published in MM 2013 (short paper)

We present RealSense, a technology that enables users to easily share media files with nearby users by detecting each other's relative direction using only the built-in orientation sensors on smartphones. Under the premise that users are arranged in a circle with everyone facing its center, RealSense continuously collects the directional heading of each phone to calculate each user's virtual position in real time while sharing.
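The underlying geometry is simple; here is a sketch under the stated premise (illustrative, not RealSense's actual code). A phone whose compass heading is h faces the circle's center, so its user sits at bearing (h + 180) mod 360 as seen from the center, and relative directions between users follow from the differences:

    def position_on_circle(heading_deg):
        """Bearing of the user from the circle's center, in degrees
        (0 = north), assuming the user faces the center."""
        return (heading_deg + 180.0) % 360.0

    headings = {"alice": 0.0, "bob": 90.0, "carol": 225.0}  # made-up readings
    positions = {u: position_on_circle(h) for u, h in headings.items()}
    print(positions)  # alice is due south of the center, bob due west, ...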

EyeWrist: Enabling Gesture-Based Interaction on Palm with a Wrist-Worn Sensor

Cheng-Yao Wang, Po-Tsung Chiu, Min-Chieh Hsiu, Chiao-Hui Chang, Mike Y. Chen

Excellent Work, The 8th Acer Long-Term Smile Innovation Contest, 2014

We present EyeWrist, which uses palms as the gesture interface for smart wearable displays such as Google Glass. With the palm's abundant tactile cues and proprioception, EyeWrist can also serve as a device-less, eyes-free remote for smart TVs. EyeWrist embeds a micro-camera and an IR laser line generator in a wristband and uses computer vision algorithms to calculate the finger's position on the palm. Because it does not require a line of sight to users' fingertips, the camera can sit lower, making the whole device more portable. We also implemented a gesture recognizer to distinguish symbols, letters, and touchscreen gestures (e.g., swipe, pinch) drawn on palms. The recognition result is sent to smart devices via Wi-Fi for gesture-based interaction.
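A minimal sketch of the vision step under this setup (assuming OpenCV; not the project's actual pipeline): a finger touching the palm interrupts the IR laser plane and reflects a bright spot, so thresholding the IR frame and taking the centroid of the largest bright blob approximates the fingertip position:

    import cv2

    frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder input
    _, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        blob = max(contours, key=cv2.contourArea)     # brightest reflection
        m = cv2.moments(blob)
        if m["m00"] > 0:
            x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"finger at ({x:.0f}, {y:.0f}) in camera coordinates")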

EyeWatch: Touch Interaction on Back of the Hand for Smart Watches

Cheng-Yao Wang, Po-Tsung Chiu, Min-Chieh Hsiu

2nd Prize, MobileHero – user experience design competition, 2014

We present EyeWatch, which uses the back of the hand as a gesture interface for smart watches. EyeWatch not only overcomes two major smartwatch input problems, occlusion and the fat-finger problem, but also enables more powerful and natural interaction, such as quickly drawing a symbol to open an application or intuitively handwriting on the back of the hand to input messages. Our proof-of-concept implementation consists of a micro-camera and an IR laser line generator on the smart watch, with computer vision algorithms used to calculate the finger's position on the back of the hand.
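Once a drawn symbol is recognized, the shortcut idea reduces to a lookup table; a hypothetical sketch (symbol names and actions are invented for illustration):

    SHORTCUTS = {
        "m": "open_messages",       # draw 'm' to start a message
        "c": "open_camera",
        "circle": "open_app_launcher",
    }

    def dispatch(symbol):
        action = SHORTCUTS.get(symbol)
        if action is None:
            print(f"no shortcut bound to {symbol!r}")
        else:
            print(f"launching {action}")  # a real watch would start the app

    dispatch("m")   # -> launching open_messages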

The Incredible Shrinking Adventure

Cheng-Yao Wang, Min-Chieh Hsiu, Chin-Yu Chien, Shuo Yang

Final Shortlist, ACM UIST 2014 Student Innovation Contest

Imagine you were shrunk: you could explore your suddenly enormous house and play with giant pets and family members. To fulfill this fantasy and provide users with an incredible shrinking adventure, we use a robotic car and a Google Cardboard, which turns a smartphone into a VR headset. We built a robotic car and attached a smartphone to its pan/tilt servo bracket. Stereo images are generated from that smartphone's camera and streamed to the other smartphone inside the Google Cardboard. Seeing the world through the smartphone on the robotic car makes users feel as if they had been shrunk.
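A minimal sketch of composing the side-by-side stereo frame for Cardboard (an assumed approach using NumPy; the app's actual rendering may differ): shift a single camera image a few pixels horizontally to approximate binocular disparity, then place the two views next to each other:

    import numpy as np

    def side_by_side(frame, disparity=8):
        """Compose a Cardboard-style frame: left/right views offset by
        `disparity` pixels, concatenated horizontally (one half per eye)."""
        left = np.roll(frame, disparity // 2, axis=1)
        right = np.roll(frame, -disparity // 2, axis=1)
        return np.concatenate([left, right], axis=1)

    frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in camera frame
    print(side_by_side(frame).shape)                  # (480, 1280, 3)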

drone.io: Gesture Input and Projected Output for Collocated Human-Drone Interaction

Under review for ACM CHI 2016 (full paper)

We introduce drone.io, a body-centric interface that facilitates natural human-drone interaction (HDI): users interact with simple gestures above a projected radial menu. drone.io is a fully embedded input-output system that uses a depth camera to recognize the position and shape of the user's hand and a mobile projector to display the interface in real time. We show that it is an easy, effective, and enjoyable way for people to interact with drones.
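A hedged sketch of radial-menu selection (the menu items and geometry are illustrative, not drone.io's design): the hand's angle around the projected menu's center picks one of the equally sized sectors:

    import math

    ITEMS = ["take off", "land", "follow me", "take photo"]  # hypothetical menu

    def select_item(hand_x, hand_y, center_x, center_y):
        """Map a hand position to the menu sector containing its angle."""
        angle = math.degrees(math.atan2(hand_y - center_y,
                                        hand_x - center_x)) % 360
        return ITEMS[int(angle // (360 / len(ITEMS)))]

    print(select_item(10, 0, 0, 0))   # -> 'take off'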

my skills

Computer Science

  • App development for the web, smartphones, smartwatches, and head-mounted displays
  • C/C++, Java, iOS, Android, Rails, Node.js
  • Computer Vision, Gesture Recognition, Machine Learning, Augmented Reality

Hardware Prototyping

  • Using rapid hardware prototyping tools to build systems
  • Infrared sensor, depth camera, robotic car, drone
  • Arduino, Raspberry Pi, Processing, 3D printing

User Study Design

  • Mixed qualitative and quantitative study design
  • Statistical analysis

publications

PalmType: Using Palms as Keyboards for Smart Glasses

ACM MobileHCI 2015, Full Paper

Cheng-Yao Wang, Wei-Chen Chu, Po-Tsung Chiu, Min-Chieh Hsiu, Yih-Harn Chiang, Mike Y. Chen

PalmGesture: Using Palms as Gesture Interfaces for Eyes-free Input

ACM MobileHCI 2015, Full Paper

Cheng-Yao Wang, Min-Chieh Hsiu, Po-Tsung Chiu, Chiao-Hui Chang, Liwei Chan, Bing-Yu Chen, Mike Y. Chen

EverTutor: Automatically Creating Interactive Guided Tutorials on Smartphones by User Demonstration

ACM CHI 2014, Full Paper

Cheng-Yao Wang, Wei-Chen Chu, Hou-Ren Chen, Chun-Yen Hsu, Mike Y. Chen

RealSense: Directional Interaction for Proximate Mobile Sharing Using Built-in Orientation Sensors

ACM MM 2013, Short Paper

Chien-Pang Lin, Cheng-Yao Wang, Hou-Ren Chen, Wei-Chen Chu, Mike Y. Chen

drone.io: Gesture Input and Projected Output for Collocated Human-Drone Interaction

Under review for ACM CHI 2016

awards

EverTutor: Automatically Creating Interactive Guided Tutorials on Smartphones by User Demonstration

1st Prize, The 11th Deep Innovations with Impact, National Taiwan University, 2013

1st Prize, The 11th Y.S. National Innovation Software Application Contest, 2014

MagicWrist - Connect the world

Excellent Work, The 8th Acer Long-Term Smile Innovation Contest, 2014

The Incredible Shrinking Adventure

Final Shortlist, ACM UIST 2014 Student Innovation Contest

EyeWatch: Touch Interaction on Back of the Hand for Smart Watches

2nd Prize, MobileHero – user experience design competition, 2014

EyeWrist: Enabling Gesture-Based Interaction on Palm with a Wrist-Worn Sensor

Final Shortlist, MediaTek Wearable device into IoT world competition, 2014