Overview
This is an overview of the cosound platform. The platform consists of two parts:
- Media Processing handles digital media objects in a structured way and enables large-scale processing using a number of audio processing tools and machine learning models.
- Cockpit revolves around running controlled behavioural experiments on the web. The aim is a framework that makes the design, specification and deployment of experiments, and the later retrieval of results, easy, while ensuring that results and data are collected and recorded in a reliable and coherent manner.
Cockpit
The Cockpit framework consists of three major parts:
- A front-end user interface implemented in TypeScript.
- A back-end that handles experiment specifications and participant answers.
- An SDK that allows the user to specify experiments using a Java/Matlab interface.
Frontend
The TypeScript implementation of the Cockpit front-end is centered around defining new experiments with a wide range of response formats (e.g. Likert and continuous scales, radio-button and checkbox groups, ranking and k-AFC), interaction mechanisms, and different types of stimuli such as music, audio or images. The front-end implementation can be found here
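The response formats listed above lend themselves to a discriminated union. The sketch below is illustrative only; the type and field names are assumptions, not the actual front-end definitions:

```typescript
// Illustrative sketch of the response formats mentioned above.
// Type and field names are assumptions, not the real front-end API.
type ResponseComponent =
  | { kind: "likert"; items: string[] }
  | { kind: "continuous"; min: number; max: number }
  | { kind: "radiobutton"; options: string[] }
  | { kind: "checkbox"; options: string[] }
  | { kind: "ranking"; options: string[] }
  | { kind: "kafc"; k: number; alternatives: string[] };

// A trial pairs a stimulus (music, audio or image) with one or more components.
interface StimulusTrial {
  stimulusUri?: string; // e.g. a music excerpt
  components: ResponseComponent[];
}

const exampleTrial: StimulusTrial = {
  stimulusUri: "https://example.org/excerpt.mp3", // placeholder URI
  components: [{ kind: "likert", items: ["1", "2", "3", "4", "5"] }],
};
```

Modeling each format as its own variant lets the renderer switch exhaustively on `kind` when drawing a trial.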
Backend
This is a simple installation of a flat-structured database with an API that can create and delete experiments. The code can be found here, along with installation instructions.
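As a rough illustration of such an API, the sketch below builds hypothetical create and delete requests. The endpoint paths, payload shape, and API-key handling are assumptions for illustration, not the documented routes:

```typescript
// Hypothetical request descriptors for an experiment API.
// Paths, payloads and auth handling are illustrative assumptions.
interface ApiRequest {
  method: "POST" | "DELETE";
  path: string;
  body?: unknown;
}

// Build a request that would create a new experiment.
function createExperimentRequest(apiKey: string, name: string): ApiRequest {
  return { method: "POST", path: "/experiments", body: { apiKey, name } };
}

// Build a request that would delete an experiment by id.
function deleteExperimentRequest(apiKey: string, id: string): ApiRequest {
  return { method: "DELETE", path: `/experiments/${id}`, body: { apiKey } };
}

const createReq = createExperimentRequest("my-api-key", "Cockpit:Test");
const deleteReq = deleteExperimentRequest("my-api-key", "a9f56a58-aaaa");
```

Keeping request construction separate from transport makes the calls easy to unit-test without a running server.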
Interface
In order to interface with the system, an SDK has been created in Java with a thin Matlab layer on top.
SDK
The SDK is responsible for letting users define experiments, upload experiments to a server, and retrieve answers from participants when an experiment has run its course. The experiments are represented in a hierarchical manner in which trials, trial components, participant answers, etc. are defined as self-contained objects. A given experiment is organized in a tree structure where a node corresponds to an element of the experiment with the experiment itself being the root and participant feedback being the leaves:
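The tree organisation described above can be sketched as nested objects. These interfaces are illustrative assumptions, not the SDK's actual classes:

```typescript
// Illustrative sketch of the experiment tree: the experiment is the root,
// trials and trial components are interior nodes, and participant
// answers are the leaves. Names are assumptions, not SDK classes.
interface Answer { participant: string; value: string }
interface TrialComponent { label: string; answers: Answer[] }
interface Trial { components: TrialComponent[] }
interface Experiment { name: string; trials: Trial[] }

const experiment: Experiment = {
  name: "Cockpit:Test",
  trials: [{
    components: [{
      label: "How does this music make you feel?",
      answers: [{ participant: "p1", value: "3" }],
    }],
  }],
};

// Leaves are reached by walking root -> trial -> component -> answer.
const leaves = experiment.trials
  .flatMap(t => t.components)
  .flatMap(c => c.answers);
```

Because every node is a self-contained object, subtrees (e.g. a single trial) can be built and inspected independently before being attached to the experiment root.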
Example
The following example illustrates how to create an experiment with a single trial containing a Likert scale with an attached sound stimulus:
% Create top-level experiment
ex = cosexperiment('Name', 'Cockpit:Test',...
'Id', 'a9f56a58-aaaa-eeee-1351-fa4358765432',...
'CreatedBy', 'Anders Kirk Uhrenholt');
% Create first trial
trial1 = coscreatetrial(ex);
% Add a header text and a Likert scale component
cosaddheader(trial1, 'HeaderLabel', '{{center|This is the first trial}}');
likertOptions = {...
'1{{n}}{{b|sleepy}}{{n}}{{b|tired}}{{n}}{{b|drowsy}}';...
'2';...
'3{{n}}{{b|neutral}}';...
'4';...
'5{{n}}{{b|alert}}{{n}}{{b|wakeful}}{{n}}{{b|awake}}';...
};
cosaddlikertscale(trial1, 'HeaderLabel', 'How does this music make you feel?',...
'Items', likertOptions,...
'Stimulus', {musicURI, 'Music excerpt'});
% Upload experiment to server
cosuploadexperiments(apikey, ex);
% Retrieve participant answers from finished experiment
answers = cosgetanswers(apikey, ex.id);
After being uploaded to the server, the browser representation of the above experiment looks as follows:
Media Processing
The media processing part of the cosound platform revolves around handling multimedia objects. The system handles the raw content, content processed using feature extraction, and FRBR-structured metadata.
Frontend
To interface with the system, an API is used.
API calls
Backend
Database
The back-end is built around the MCM system. The code can be found here
Datastructure
The FRBR structure is used to handle large archives of media data.
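FRBR (Functional Requirements for Bibliographic Records) models each record at four standard levels: Work, Expression, Manifestation, and Item. A minimal sketch of how a media archive entry might map onto these levels follows; the field names are illustrative assumptions, not the platform's actual schema:

```typescript
// Minimal FRBR sketch using the four standard FRBR levels.
// Field names are illustrative, not the platform's actual schema.
interface Item { location: string }                        // a concrete copy, e.g. a file
interface Manifestation { format: string; items: Item[] }  // a physical/digital embodiment
interface Expression { manifestations: Manifestation[] }   // a particular realisation
interface Work { title: string; expressions: Expression[] } // the abstract creation

const work: Work = {
  title: "Example Recording",
  expressions: [{
    manifestations: [{
      format: "audio/mpeg",
      items: [{ location: "/archive/example.mp3" }],
    }],
  }],
};
```

Separating the abstract work from its concrete files lets the archive track many encodings and copies of the same underlying media object.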
Octopus
Octopus is a distributed computing system specifically designed to be accessed from API end-points. The code can be found here.
For additional information, please contact Jens Madsen by email.