FLASH Robotics | Social robots

Keeping eye contact

FLASH™ took part in an innovative experiment aimed at discovering the key features that affect human-robot interaction.


The aim of the experiment was to study various elements of interaction with a social robot using unique techniques. We proposed the following hypotheses:

  1. Gaze patterns of people interacting with FLASH™ follow rules observed in interaction between humans.
  2. Anthropomorphic elements of FLASH's design attract attention the most and cause the robot to be perceived as humanlike.


The main experiment was preceded by a pilot study on a group of 19 people. As a research tool we used a mobile eye-tracker, which yields both qualitative results (heat and focus maps, gaze paths, etc.) and quantitative ones (statistical quality indicators known as KPIs). Interviews with the participants were also carried out. The main experiment was carried out at the Main Railway Station in Wrocław. The choice of this location was motivated by the need to acquire a large number of participants with diverse demographic and social backgrounds. Test subjects were between 15 and 82 years of age, with an average of 26.9. 90 people (42 females and 48 males) participated in the experiment with the eye-tracking device and filled out questionnaires afterwards. Additionally, 66 people took part in an in-depth interview.
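As a minimal sketch of the kind of quantitative indicator mentioned above, the snippet below computes dwell time per area of interest (AOI) from a list of fixations. The AOI labels, data format and sample values are purely illustrative, not the actual output format of the SMI software.

```python
from collections import defaultdict

def dwell_time_kpis(fixations):
    """Aggregate fixation durations per area of interest (AOI).

    fixations: list of (aoi_label, duration_ms) tuples.
    Returns {aoi: (total_ms, share_of_total_dwell_time)}.
    """
    totals = defaultdict(float)
    for aoi, duration_ms in fixations:
        totals[aoi] += duration_ms
    grand_total = sum(totals.values())
    return {aoi: (ms, ms / grand_total) for aoi, ms in totals.items()}

# Hypothetical fixation log from one interaction (labels are illustrative)
sample = [("face", 420), ("face", 310), ("torso", 150), ("hands", 90)]
kpis = dwell_time_kpis(sample)
```

A ranking of AOIs by dwell-time share is one simple way to quantify which parts of the robot attract the most attention.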

Site and equipment

The mobile eye-tracking device from Sensomotoric Instruments - SMI EyeTracking Glasses, provided by Simply User, is a high-end piece of equipment allowing the recording of a person's point of view and the path that their gaze follows. It is completely unobtrusive to the user and consists of a sleek pair of glasses connected to a laptop that the test subject carries with them in a backpack. This kind of technology allows various eye-tracking experiments, such as shelf-testing, driving research or even analysis of training methods in sports, to be conducted out in the field.


Eye tracker and user

During the whole experiment FLASH™ operated autonomously and delivered all the instructions to the participants himself. The scenario implemented for this experiment was divided into 2 parts:

  • INTRODUCTION - the robot greeted the person and held out his hand for a moment to allow the person to shake it. Next, FLASH™ said a few words about himself (where, when and for what purpose he was created) and demonstrated his gesticulation abilities. As the last part of the introduction, the robot complimented the color of the participant's clothes. All of these actions were undertaken to familiarize the person with the machine and create a bond between them.
  • GAME - the main part of the experiment consisted of a simple game. The participant had to show toys to the robot in any order they wished. FLASH™ then reacted to the choice of toy by pronouncing an emotional comment and expressing his emotions through facial expression and gesticulation. The robot's commentary added context to the emotions he expressed, e.g. when expressing sadness he would inform the participant that "They never let me play with this toy!" or when surprised he would say "I was sure that I had lost this toy!". This was done to increase the recognizability of the emotions. Four emotions were implemented for the experiment: anger, happiness, sadness and surprise. The game ended when the participant had shown six toys, after which the robot fell asleep.
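The two-part scenario above can be sketched as a simple scripted flow. Everything in this snippet is a hypothetical illustration of the scenario's structure, not FLASH's actual control software; the function and event names are invented.

```python
import random

# The four emotions implemented for the experiment
EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def run_scenario(toys_shown, react):
    """Drive the two-part scenario: introduction, then the toy game.

    toys_shown: toy labels in the order the participant presents them.
    react: callback(toy, emotion) standing in for the robot's comment,
           facial expression and gesticulation.
    Returns the log of scenario events.
    """
    # Part 1 - INTRODUCTION: greeting, handshake, self-introduction, compliment
    log = ["greet", "handshake", "self_introduction", "compliment"]
    # Part 2 - GAME: react to each toy; fall asleep after the sixth
    for count, toy in enumerate(toys_shown, start=1):
        emotion = random.choice(EMOTIONS)  # illustrative; the real mapping
        react(toy, emotion)                # depended on the chosen toy
        log.append(f"react:{toy}:{emotion}")
        if count == 6:
            log.append("fall_asleep")
            break
    return log

events = run_scenario(["ball", "bear", "car", "doll", "train", "blocks"],
                      lambda toy, emotion: None)
```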

Data collection

After the main part of the experiment, paper-and-pencil and individual in-depth interviews were conducted. During the whole experiment FLASH™ operated autonomously, using his competencies to detect a person and their face, differentiate between human body parts and various colors, and express emotions: anger, joy, sadness and surprise. The main aim was to check whether people look at FLASH™ just as they would at another human being, during the whole experiment and also its individual parts, i.e. the handshake, introduction, compliment, different toys and different emotions. Our expectations were that the results would turn out to be similar to those shown below:

Heatmap examples


The experiment proved to be instructive and allowed us to verify the proposed research methodology and the robot's capacity for long-term autonomous operation. The results suggest the validity of our hypotheses. The face of the robot drew the most attention, which suggests that, just as in human relationships, the respondents maintained eye contact with the robot. FLASH's torso as well as his hands and forearms also drew participants' gaze. More examples can be found here: www.eyetracker.com

Heat and Focus map obtained from the whole experiment. Subjects were watching the whole robot. However, they focused the most on FLASH's face.


Heat and Focus map obtained from the welcome part (handshake). Subjects still focus their eyes on FLASH's face, but also on his right hand.


Heat and Focus map obtained from the final part of the experiment. Subjects are used to the robot by now and they focus only on FLASH's face.


Based on data gathered using IDIs (individual in-depth interviews), we conclude that people could become attached to FLASH™. Popular reasons given by participants include:

"Since he can communicate I believe I could get attached pretty easily." - ANNA (participant)
"People are able to get attached to different things nowadays. That is why they will get attached to FLASH as well." - MICHAŁ (participant)
"It will take time, but it is possible." - KAROLINA (participant)
"Yes. Especially if FLASH had some more sophisticated options and functions." - TOMASZ (participant)


  • M. Dziergwa, M. Frontkiewicz, P. Kaczmarek, J. Kędzierski, M. Zagdańska, 2013. "Study of a Social Robot’s Appearance Using Interviews and a Mobile Eye-Tracking Device", Social Robotics, Lecture Notes in Computer Science, 10.1007/978-3-319-02675-6_17, pages 170-179, Springer International Publishing [LINK] [DOWNLOAD].


The project has received funding from the National Science Center, grant no. 2012/05/N/ST7/01098.