ICAT 2008
18th International Conference on Artificial Reality and Telexistence
December 1-3, 2008
Keio University, Yokohama, Japan

General information

Oral presentation

  • Each oral presentation will be 20 minutes, including discussion.
  • For oral presentations, you can use either your own laptop PC or a PC prepared by the ICAT2008 committee.
  • If you would like to use the PC prepared by the committee, please bring your presentation files on a USB memory stick. The PC has Microsoft PowerPoint, Adobe Acrobat, etc.
  • Please check your presentation items (your PC and/or your files) before your session.


Poster presentation

  • A poster panel (W910 x H1820 mm) will be provided for each poster presentation.
  • Authors should print their posters and bring them to the conference. The ICAT2008 organizing committee will NOT print posters for the poster session.
  • Please note that the core time of your poster presentation will be 13:30-14:30. However, posters may be displayed all day, so you are welcome to discuss your poster with the audience during breaks and even over lunch.



You can download the detailed program here: program.pdf.
The latest version of the program is here: program2.pdf.

December 1st (Mon)

09:00-09:30 Opening
09:30-10:30 Invited Talk (Bruce Thomas)
10:50-12:30 Session 1: Display (20min x 5)
12:30-14:30 Lunch, Poster Session 1 (15 posters), demos
14:30-16:10 Session 2: Haptic Interface (20min x 5)
16:20-17:40 Session 3: Multimodal System (20min x 4)

December 2nd (Tue)

09:30-10:30 Invited Talk (Michael Haller)
10:50-12:30 Session 4: Application (20min x 5)
12:30-14:30 Lunch, Poster Session 2 (15 posters), demos
14:30-15:50 Session 5: Perception (20min x 4)
16:00-17:40 Session 6: Virtual Environment (20min x 5)

December 3rd (Wed)

09:30-10:30 Invited Talk (Takashi Maeno)
10:50-11:50 Session 7: Augmented Reality / Mixed Reality (20min x 3)
11:50-13:00 Lunch
13:00-14:00 Ending Talk (Sabine Coquillart)
14:00- Continued by ACE2008: ICAT2008 registrants can attend ACE2008 on this day.
17:30 Bus to Banquet Site will leave from Conference Site.

Invited talks


You can download the invited talk summary here: invited.pdf.

Bruce H. Thomas : How to Make Augmented Reality User Interfaces Work

Augmented Reality has existed for a number of decades, but we still do not have a set of solid user interface technologies. There have been many great innovations in user interfaces for AR, but these have not translated into a pervasive user interface paradigm. The classic question is: "Why does AR not have the equivalent of WIMP for desktop computers?" This talk will examine what is great, and what is lacking, in the current state of the art for AR user interfaces. More importantly, the talk will explore what we can do to make AR user interfaces that people will want to use. Many of these activities we, as the Mixed and Augmented Reality community, can start doing today.

Professor Thomas is currently the Director of the Wearable Computer Laboratory at the University of South Australia. He is also a NICTA Fellow, CTO of A-Rage Pty Ltd, and a Visiting Scholar with the Human Interface Technology Laboratory, University of Washington. Prof. Thomas is the inventor of ARQuake, the first outdoor augmented reality game. His current research interests include wearable computers, user interfaces, augmented reality, virtual reality, CSCW, and tabletop display interfaces.

Michael Haller : Natural user interfaces for collaborative environments

Until recently, the limitations of display and interface technologies have restricted the potential for human interaction and collaboration with computers. For example, desktop computer style interfaces have not translated well to mobile devices and static display technologies. However, the emergence of interactive whiteboards has pointed to new possibilities for using display technology for interaction and collaboration. A range of emerging technologies and applications could enable more natural and human-centered interfaces, so that interacting with computers and content becomes more intuitive. This will be important as computing moves from the desktop to being embedded in objects, devices, and locations around us, and as our "desktop" and data are no longer device-dependent but follow us across multiple platforms and locations. The impact of Apple's iPhone and an increasing number of multi-touch surfaces show that users' expectations about using these devices in their daily lives have increased. The reaction to these natural interface implementations has been very dramatic. With the increasing development of interactive walls, interactive tables, and multi-touch devices, both companies and academics are evaluating their potential for wider use. These newly emerging form factors require novel human-computer interaction techniques, which will be discussed in this presentation. My research goal is to design, develop, and evaluate natural user interfaces that will enable everyone, not just experts, to use our interactive surfaces. In this presentation, we will describe particular challenges and solutions for the design of tabletop and interactive wall environments and present our user-centered design approach.

Michael Haller works in the Department of Digital Media at the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he heads the Media Interaction Lab and is responsible for computer graphics, multimedia programming, and augmented reality. He received his Dipl.-Ing. (1997), Dr. techn. (2001), and Habilitation (2007) degrees from Johannes Kepler University Linz. He is active in several research areas, including interactive computer graphics, augmented and virtual reality, and human-computer interfaces. In 2004, he received the Erwin Schroedinger fellowship award presented by the Austrian Science Fund.

Takashi Maeno : Haptics of Humans and Robots

Technologies for tactile sensation have not progressed as far as visual and auditory technologies. For example, the orthogonal axes representing the fundamental characteristics of tactile sensation have not been clarified, nor has the role in texture perception of the four main mechanoreceptors underneath the skin. Hence, the presenter has been involved in research on the mechanical characteristics of human skin and their relationship to tactile perception, as well as psychological analyses of humans touching various object surfaces. As a result, the relationship between humans' texture perception and physical properties has been clarified. The presenter is also involved in the development of tactile sensors and tactile displays, and examples of these sensors and displays will be shown. The tactile sensors detect the surface texture of objects; they can be used both as industrial devices to quantify the texture of products and as sensors for humanoid robots. The tactile displays present the texture, softness, and friction of various objects to human fingers; they are realized using amplitude modulation of the ultrasonic vibration of a Langevin-type vibrator, as well as force feedback using a force display. I hope these technologies prove useful for the progress of haptic technology in the fields of virtual reality and robotics.

Takashi Maeno received his B.S. and M.S. degrees in mechanical engineering from the Tokyo Institute of Technology, Tokyo, Japan, in 1984 and 1986, respectively. From 1986 to 1995, he worked for Canon, Inc., in Tokyo, Japan. He received his Ph.D. degree in mechanical engineering from the Tokyo Institute of Technology in 1993. Since 1995, he has been with Keio University, Yokohama, Japan, where he is currently a Professor. He was a Visiting Industrial Fellow at the University of California, Berkeley, from 1990 to 1992, and a visiting professor at Harvard University in 2001. His research interests include tactile sensors/displays, recognition by robots/humans, and large-scale complex system design.

Sabine Coquillart : Haptics and Pseudo-Haptics: from Research to Industry

This talk will begin by extending the presentation I gave at ICAT 2002, where I presented a new first-person projection-based visuo-haptic environment named the "Stringed Haptic Workbench". This talk will present applications developed with this environment, including one now in industrial use. New trends in pseudo-haptics, as well as recent applications, will also be presented.

Sabine Coquillart is a research director at INRIA (the French National Institute for Research in Computer Science and Control) and LIG (the Laboratory of Informatics of Grenoble). Her research interests include VR and 3D user interfaces. Coquillart received her PhD in computer science from Grenoble University and a research-supervising Habilitation from the University of Paris XI. Contact her at sabine.coquillart@inria.fr.