Smart Environments

Smart environments are sensorized environments that can perceive the users and their activities within them, and that aim to support these users during their activities. Examples are smart houses, smart meeting rooms, and smart lecture rooms.

We have worked extensively on building perceptual technologies for smart environments, including smart offices, meeting rooms, and lecture rooms. Recently this has mainly been done in the framework of the European Integrated Project CHIL - Computers in the Human Interaction Loop - in which we were responsible for the overall scientific project coordination, and to which we contributed critically important perception modules, including person tracking, person identification, estimation of head pose and focus of attention, as well as recognition of activities.

An example application for a smart office is the so-called Connector service, which we developed in the CHIL project. This service uses perception components in and in front of an office to determine the current activity inside the office, and uses this information to handle incoming phone calls in a context-specific way. In a user study we could show that this led to a significantly reduced number of disruptions while still preserving caller satisfaction.
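The core idea of such a service can be sketched as a simple mapping from the perceived office activity to a call-handling decision. The following is a minimal illustrative sketch, not the actual CHIL implementation; the activity labels and the routing policy are assumptions made for the example.

```python
# Hypothetical sketch of context-specific call handling, in the spirit of
# the Connector service described above. Activity labels and the routing
# policy are illustrative assumptions, not the CHIL system's actual logic.

from enum import Enum


class Activity(Enum):
    MEETING = "meeting"
    PHONE_CALL = "phone call"
    FOCUSED_WORK = "focused work"
    AVAILABLE = "available"


# Assumed policy: only when the callee is available does a call ring through.
RING_THROUGH = {Activity.AVAILABLE}


def handle_incoming_call(current_activity: Activity, caller: str) -> str:
    """Decide how to handle an incoming call given the perceived activity."""
    if current_activity in RING_THROUGH:
        return f"ring through: {caller}"
    # Otherwise defer the call, reducing disruptions for the callee while
    # giving the caller feedback on why they are being diverted.
    return (f"divert to voicemail: {caller} "
            f"(callee is in {current_activity.value})")
```

In a real system the activity would come from the audio-visual perception components rather than being passed in directly, and the policy could additionally depend on caller identity or urgency.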

Building a smart room to support crisis control is currently the main focus of our research in the Perceptual User Interfaces research group at Fraunhofer IITB. In this project we build a smart room in which we use audio-visual perception to support team collaboration and to facilitate multimodal interaction with large displays. Please see the IITB group's website for further information. 

Selected Recent Publications:

  • R. Stiefelhagen, J. Garofolo (Eds.). Multimodal Technologies for Perception of Humans. Proceedings of the First International Evaluation Workshop on Classification of Events, Activities and Relationships, CLEAR 2006. Springer Lecture Notes in Computer Science, No. 4122, January 2007.
  • D. Mostefa, N. Moreau, K. Choukri, G. Potamianos, S. M. Chu, J. R. Casas, J. Turmo, L. Cristoferetti, F. Tobia, A. Pnevmatikakis, V. Mylonakis, F. Talantzis, S. Burger, R. Stiefelhagen, K. Bernardin and C. Rochet. The CHIL Audiovisual Corpus for Lecture and Meeting Analysis inside Smart Rooms. Journal on Language Resources and Evaluation, Vol. 41, No. 3-4, December 2007, pp. 389-407, Springer.
  • R. Stiefelhagen, K. Bernardin, H. K. Ekenel, J. McDonough, K. Nickel, M. Voit, M. Woelfel. Audio-Visual Perception of a Lecturer in a Smart Seminar Room. Signal Processing - Special Issue on Multimodal Interfaces, Vol. 86, No. 12, December 2006, Elsevier.
  • See publications page for more!