Behaviour Lab

The Behaviour Lab provides a flexible laboratory environment for studying human movement behavior. In an 8.0x4.0x2.6 m room, several people can move freely while their motion is captured by an infrared (IR) camera system. Combined with the presentation of visual experimental stimuli on head-mounted displays, a stereoscopic powerwall, or a three-sided CAVE with integrated eye-movement capture, the laboratory offers ideal conditions for investigating perception-action coupling under high experimental control. It also allows group processes to be observed through audio and video recording of the entire room.

Three-sided CAVE: Three Barco F70 projectors project high-resolution and optionally three-dimensional images onto three walls (pixel size 1 mm). Integrated warping and blending automate the recalibration process and ensure geometrically correct, color-accurate reproduction of the content without artifacts from unevenly illuminated image edges.

Powerwall: A Powerwall (projection area 2.7x1.6 m), rear-projected by two Infocus 5110 projectors (Full HD, 60 fps), allows shadow-free display of stereoscopic visual content using Infitec passive filters and glasses.

Head-mounted displays: Two head-mounted displays with integrated eye tracking (Vive Pro Eye) enable complete immersion in virtual worlds. In contrast to the three-sided CAVE, this setup is particularly suited to the highly controlled presentation of experimental tasks that involve less dynamic movement.

Room cameras: Four wide-angle streaming cameras (Axis, Full HD, 30 fps) mounted in the room's corners allow video and audio recording of the entire laboratory space, so behavior can be observed from the neighboring control room without the experimenter being physically present in the recording room.
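
As an illustration, a live feed from one of the room cameras could be pulled into the control room with a few lines of Python via its RTSP stream; this is only a sketch, the IP address is a placeholder, and the stream path shown is a common Axis default that may need to be adapted to the cameras' actual configuration.

```python
# Minimal sketch: display one of the Axis room cameras in the control room.
# The address is a placeholder; the RTSP path is a typical Axis default.
import cv2

stream_url = "rtsp://192.168.0.90/axis-media/media.amp"  # placeholder camera address
capture = cv2.VideoCapture(stream_url)

while capture.isOpened():
    ok, frame = capture.read()                 # grab the next Full HD frame
    if not ok:
        break
    cv2.imshow("Behaviour Lab - room camera", frame)
    if cv2.waitKey(1) == 27:                   # Esc closes the viewer
        break

capture.release()
cv2.destroyAllWindows()
```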

Rendering workstations: The projectors and HMDs are fed by three workstations that generate the images on Nvidia Quadro M4000 graphics cards and send them to the respective devices. The graphics are rendered by a modern game engine that was extended with research-relevant features with the support of the Faculty's technology platform.

Motion capture: Kinematic parameters are captured at high spatial resolution and high frequency using a marker-based Vicon 3D motion capture system with 10 cameras (Vicon T20s, 500 Hz, 2 MP). The low latency of the 3D motion reconstruction allows it to be used in interactive virtual reality settings.
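
As a sketch of how such a low-latency stream might be consumed by an experiment application, the following snippet polls labelled marker positions via the Vicon DataStream SDK Python bindings; the host address and stream-mode choice are placeholders, and the exact binding details may vary between SDK versions.

```python
# Minimal sketch: stream labelled marker positions from a Vicon DataStream server.
# Assumes the Vicon DataStream SDK Python bindings (vicon_dssdk); host and
# stream mode are placeholders for the lab's actual configuration.
from vicon_dssdk import ViconDataStream

client = ViconDataStream.Client()
client.Connect("localhost:801")      # default DataStream port
client.EnableMarkerData()            # only labelled marker data is needed here
client.SetStreamMode(ViconDataStream.Client.StreamMode.EServerPush)  # lowest latency

while True:
    try:
        if not client.GetFrame():    # wait for a new frame
            continue
    except ViconDataStream.DataStreamException:
        continue                     # no frame available yet
    for subject in client.GetSubjectNames():
        for marker, _parent in client.GetMarkerNames(subject):
            (x, y, z), occluded = client.GetMarkerGlobalTranslation(subject, marker)
            if not occluded:
                pass                 # hand (x, y, z) in mm to the VR renderer / experiment logic
```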

Integrated eye-movement capture: A lightweight binocular eye-tracking system based on the Pupil Labs open-source project captures eye movements at up to 200 Hz. The synchronous capture of head movements (Vicon) and the integration with Streamix allow gaze analyses to be automated and used in real time (in the loop), which conventional video-based analyses of gaze data do not permit.
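
As an illustration of such real-time (in-the-loop) use, gaze data can be received live over the Pupil Labs network API (ZMQ and msgpack); this is only a sketch, the address is a placeholder, and the topic and field names follow the publicly documented Pupil Labs message format.

```python
# Minimal sketch: subscribe to live gaze data from Pupil Capture's network API.
# The IP address is a placeholder; 50020 is the default Pupil Remote port.
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port on which data is published.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze messages on the announced port.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload)
    # 'norm_pos' is the gaze point in normalized scene-camera coordinates;
    # combined with the synchronous Vicon head pose it can be mapped into room coordinates.
    x, y = gaze["norm_pos"]
```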

Experiment management system: The in-house developed experiment management system provides an intuitive user experience while maximizing control over the experiment. A web-based user interface allows the researcher to configure and control all relevant features, while the controlled application, based on the coordination language Streamix, ensures, among other things, the synchronicity of the data streams.