Experimental platform: development begins!

We’ve started developing our experimental platform.

We’re developing a serious-game program to help us evaluate blind navigation performance across multiple settings.
Software
Experimental
Published

January 1, 2022


The platform is built on Unity. It connects to the various motion-tracking devices used by the consortium (Polhemus, VICON, Pozyx), uses PureData for sound-wave generation and Steam Audio for 3D audio modeling, and communicates wirelessly with the consortium’s non-visual interfaces.


(a) Testing environment with a PureData audio beacon

 


(b) Auto-generated maze with 3D audio beacons on waypoints

Figure 1: Screenshots from SAM-Guide’s experimental platform (in development)

This platform makes it easy to spin up experimental trials by specifying the desired characteristics in a JSON file (an approach based on the OpenMaze project). Unity automatically generates each trial’s environment according to those specifications and populates it with the relevant items (e.g. a beacon emitting a tactile signal that marks a target to reach in a maze), handles the transitions between successive trials and blocks of trials, and logs all the relevant user metrics to a data file.
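As a rough illustration of this JSON-driven approach, the sketch below generates and sanity-checks a minimal protocol file in Python. All field names here are hypothetical, not SAM-Guide’s actual schema:

```python
import json

# Hypothetical protocol specification, loosely modeled on the
# OpenMaze-style approach described above. Field names are
# illustrative only, not the platform's real schema.
protocol = {
    "avatar": {"height_m": 1.75, "move_speed": 1.2},
    "blocks": [
        {
            "name": "training",
            "randomize_trials": True,
            "trials": [
                {
                    "environment": "maze",
                    "maze_size": [10, 10],
                    "beacons": [{"type": "audio_3d", "at": "waypoints"}],
                    "repetitions": 3,
                },
            ],
        },
    ],
}

# Serialize to the protocol file an experiment runner would read.
with open("protocol.json", "w") as f:
    json.dump(protocol, f, indent=2)

# Minimal sanity check: reload and confirm the structure round-trips.
with open("protocol.json") as f:
    loaded = json.load(f)
assert loaded["blocks"][0]["trials"][0]["repetitions"] == 3
```

On the Unity side, such a file would be parsed at startup to instantiate the environment, beacons, and block/trial ordering before the session begins.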


(a) Specifying the avatar and the experimental blocks’ characteristics

 


(b) Specifying experimental trials, which can be repeated and randomized within blocks

Figure 2: Examples of settings used to generate experimental trials on the fly.