12 April 2026

AME: The Autonomous Multimodal Experiment System

behavioural-science · experiment-platform · video-conferencing · webrtc · participant-matching · multimodal-data-collection · real-time-interaction · body-tracking · computer-vision

An experiment tool for automated multimodal data collection in online settings.

AME — Autonomous Multimodal Experiment



AME is a web-based platform for designing, deploying, and running online behavioural experiments autonomously. It enables researchers to conduct
experiments involving manipulated video conferences, multimodal data collection, and automated participant matching — without needing to be
present in each session.

What Researchers Can Do

Design Experiments Visually
Build experiments using a three-panel visual canvas: a routine tree with element palette on the left, a 12-column grid canvas in the centre, and a
property inspector on the right. Drag and drop elements to compose each step of your experiment — no coding required.

Compose Multi-Step Experiment Flows
Experiments follow a hierarchical structure: Experiment → Progress → Routine → Element. Each progress step maps to a routine containing one or
more element sets. Researchers define the full sequence of instructions, stimuli, tasks, surveys, and video chat sessions that participants
experience.

Automate Participant Matching
Define matching rules at the routine level — not the experiment level. Each video-chat routine can use a different strategy:
- FIFO — first-come-first-served queue, match N participants as they arrive
- Condition-aware — match participants across experimental conditions (e.g., one from Group A with one from Group B)
- Manual — assign groups from the admin dashboard in real time

A single experiment can contain multiple video-chat routines, each with its own matching strategy and partner assignments.
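The two automated strategies can be sketched as queue operations. This is an illustrative sketch, not AME's actual implementation; the `Participant` shape and function names are hypothetical:

```typescript
// Hypothetical participant record; AME's real data model is richer.
type Participant = { id: string; condition: string };

// FIFO: take the first N participants from the waiting queue as they arrive.
function matchFifo(queue: Participant[], groupSize: number): Participant[] | null {
  if (queue.length < groupSize) return null;
  return queue.splice(0, groupSize); // removes the matched group from the queue
}

// Condition-aware: form a group with one participant per required condition
// (e.g. one from Group A with one from Group B).
function matchByCondition(queue: Participant[], conditions: string[]): Participant[] | null {
  const group: Participant[] = [];
  for (const c of conditions) {
    const i = queue.findIndex(p => p.condition === c && !group.includes(p));
    if (i === -1) return null; // a required condition has no waiting participant yet
    group.push(queue[i]);
  }
  // Only remove participants from the queue once a full group is possible.
  for (const p of group) queue.splice(queue.indexOf(p), 1);
  return group;
}
```

Returning `null` when a full group cannot yet be formed leaves the queue untouched, so the check can simply be re-run whenever a new participant arrives.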

Collect Multimodal Data
Capture diverse data types within a single experiment:
- Video/audio recording via webcam with resumable background upload
- Body, hand, and face detection using browser-based ML models
- Facial expression analysis and speech recognition
- Surveys with built-in question builder
- Keyboard input, buttons, text fields, sliders, option groups, and file pickers
- Volume checks and timers for monitoring participant state
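A volume check of the kind listed above typically reduces to a root-mean-square level over an audio sample buffer (in the browser, such a buffer could come from a Web Audio `AnalyserNode`). A minimal sketch, with a hypothetical threshold:

```typescript
// RMS level of a buffer of audio samples in the range [-1, 1].
function rmsLevel(samples: Float32Array): number {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i];
  return Math.sqrt(sum / samples.length);
}

// A microphone check might require the level to cross a minimum threshold;
// 0.05 here is an illustrative value, not AME's actual setting.
function passesVolumeCheck(samples: Float32Array, threshold = 0.05): boolean {
  return rmsLevel(samples) >= threshold;
}
```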

Run Video Chat Sessions
Native WebRTC video conferencing is built directly into experiment routines. Participants are matched into peer-to-peer sessions (2–4
participants), with local stream recording and non-blocking background upload after the routine completes.

Use Supplementary Devices
Participants can connect a mobile phone or tablet as a secondary camera or remote controller by scanning a QR code. The device streams video to
the desktop via WebRTC, enabling capture from multiple angles or freeing participants' hands during tasks.

Manage Conditions and Randomisation
Assign participants to experimental conditions via round-robin. Randomise progress sequences using Latin-square designs. Each condition maps to a
distinct sequence of routines, and matching periods are detected automatically for multi-participant sessions.
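Round-robin assignment and Latin-square ordering are both small, deterministic computations. The sketch below uses the standard balanced Latin-square construction for an even number of sequences; the function names are illustrative, not AME's API:

```typescript
// Round-robin: cycle through conditions in arrival order.
function assignCondition(arrivalIndex: number, conditions: string[]): string {
  return conditions[arrivalIndex % conditions.length];
}

// Row k of a balanced Latin square with n sequences (n even):
// base row 0, 1, n-1, 2, n-2, ..., shifted by k for each participant.
function latinSquareRow(k: number, n: number): number[] {
  const row: number[] = [0];
  for (let i = 1; i < n; i++) {
    const prev = row[i - 1];
    row.push(i % 2 === 1 ? (prev + i) % n : (n + prev - i) % n);
  }
  return row.map(x => (x + k) % n);
}
```

Each participant's row gives the order in which they see the n progress sequences, and across n participants every sequence appears once in every position.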

Operate Fully Autonomously
Once an experiment is started, it runs without researcher intervention:
- Automated position and field-of-view checks using browser-based ML
- Automatic condition assignment and participant matching
- Session resumption on disconnect with roll-back enforcement
- Auto-withdrawal of inactive participants
- Content delivery driven entirely by server-pushed WebSocket events
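Server-pushed delivery means the client never polls; it reacts to typed messages as they arrive. A minimal sketch of such a client-side handler, with hypothetical event names (AME's actual protocol is not shown here):

```typescript
// Illustrative server-pushed event types; not AME's real message schema.
type ExperimentEvent =
  | { type: "show_routine"; routineId: string }
  | { type: "matched"; partnerIds: string[] }
  | { type: "withdrawn"; reason: string };

// Parse a pushed message and route it on the discriminant field.
function handleServerEvent(raw: string): string {
  const e = JSON.parse(raw) as ExperimentEvent;
  switch (e.type) {
    case "show_routine": return `routine:${e.routineId}`;
    case "matched": return `matched:${e.partnerIds.join("+")}`;
    case "withdrawn": return `withdrawn:${e.reason}`;
  }
}
```

In a real client this function would be attached as the WebSocket's `onmessage` handler, with each branch updating the participant's view instead of returning a string.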

Import Content via CSV
Upload CSV files to define trial-by-trial content — each row overrides default routine configurations, enabling content-driven experiment flows
where stimuli vary across trials and conditions.
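The row-overrides-defaults rule can be sketched as a simple merge, where an empty CSV cell falls back to the routine's default. Field names here are hypothetical:

```typescript
// A routine configuration as flat key-value pairs (illustrative).
type RoutineConfig = Record<string, string>;

// Merge one CSV trial row over the routine defaults:
// non-empty cells override, empty cells keep the default.
function applyTrialRow(defaults: RoutineConfig, row: RoutineConfig): RoutineConfig {
  const merged = { ...defaults };
  for (const [key, value] of Object.entries(row)) {
    if (value !== "") merged[key] = value;
  }
  return merged;
}
```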

Monitor and Export
- View participant status and progress in real time from the admin dashboard
- Manually match or reassign participants during live sessions
- Export results as CSV and download recordings as ZIP archives

Contact

For access requests or enquiries, contact jy2154@bath.ac.uk