[mocaptoolbox] MoCap updates from Oslo

  • From: Kristian Nymoen <krisny@xxxxxxxxxx>
  • To: mocaptoolbox@xxxxxxxxxxxxx
  • Date: Fri, 31 Aug 2012 12:44:27 +0200

Hi list,

I've adapted two of my MoCap scripts for MATLAB to interface with the MoCap Toolbox: one script for plotting data from a potentially large number of markers, and another for reading motion data from the Gesture Description Interchange Format (GDIF).

I have included some links, descriptions, and references below, in case anyone is interested in having a look.

cheers,
Kristian



1. mcmocapgram

The function plots what we call a mocapgram of a mocap recording: XYZ position coordinates are normalised and projected onto an RGB space, and each marker is assigned a separate row. This gives a good visualisation of recurring and periodic patterns in the recording.
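The mapping can be sketched as follows (the actual mcmocapgram is a MATLAB function; this is a hedged Python illustration, and the exact normalisation used by the toolbox may differ):

```python
import numpy as np

def mocapgram(pos):
    """Sketch of the mocapgram idea.

    pos: array of shape (frames, markers, 3) holding XYZ coordinates.
    Each coordinate axis is normalised to [0, 1] over the whole
    recording and mapped to one RGB channel, giving an image of shape
    (markers, frames, 3): one row per marker, one column per frame.
    """
    lo = pos.min(axis=(0, 1), keepdims=True)   # per-axis minimum over the recording
    hi = pos.max(axis=(0, 1), keepdims=True)   # per-axis maximum over the recording
    rgb = (pos - lo) / (hi - lo)               # normalise XYZ into RGB range [0, 1]
    return rgb.transpose(1, 0, 2)              # rows = markers, columns = frames
```

The resulting array can be shown directly with an image-display call (e.g. matplotlib's imshow), so similar poses produce similar colours and periodic movement shows up as repeating colour bands along each row.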

direct link:

references for mocapgrams:
A. R. Jensenius, S. A. Skogstad, K. Nymoen, J. Torresen, and M. E. Høvin. Reduced displays of multidimensional motion capture data sets of musical performance. In Proceedings of ESCOM 2009: 7th Triennial Conference of the European Society for the Cognitive Sciences of Music, Jyväskylä, Finland, 2009.

K. Nymoen, J. Torresen, R. Godøy, and A. R. Jensenius. A statistical approach to analyzing sound tracings. In S. Ystad, M. Aramaki, R. Kronland-Martinet, K. Jensen, and S. Mohanty, editors, Speech, Sound and Music Processing: Embracing Research in India, volume 7172 of Lecture Notes in Computer Science, pages 120–145. Springer, Berlin Heidelberg, 2012.



2. mcreadgdif

We have been recording motion data into SDIF files, using a toolbox developed for Max/MSP and Jamoma. The script "mcreadgdif.m" reads a data stream from these so-called "GDIF files" and organises it into a MoCap data structure. This means that practically any sensor data stream can be recorded as GDIF files and then imported into the MoCap Toolbox for analysis.
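The final packing step can be sketched in Python (mcreadgdif.m itself is MATLAB and handles the SDIF parsing; the field names below mirror the MoCap Toolbox data structure, and the helper name is hypothetical):

```python
import numpy as np

def frames_to_mocap(frames, freq, marker_names):
    """Hypothetical sketch: pack per-frame XYZ rows (as decoded from a
    GDIF/SDIF stream) into a MoCap-Toolbox-style structure.

    frames: sequence of rows, each holding 3 * nMarkers coordinates
    freq: capture frequency in Hz
    marker_names: one label per marker
    """
    data = np.asarray(frames, dtype=float)     # shape: (nFrames, 3 * nMarkers)
    return {
        "type": "MoCap data",
        "freq": freq,                          # capture frequency in Hz
        "nFrames": data.shape[0],              # number of recorded frames
        "nMarkers": len(marker_names),         # number of markers/streams
        "markerName": list(marker_names),
        "data": data,                          # one row per frame
    }
```

Once the data is in this shape, the rest of the toolbox's analysis and plotting functions can be applied to it regardless of which sensor originally produced the stream.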



references for GDIF / recording of motion data into SDIF files:

A. R. Jensenius, T. Kvifte, and R. I. Godøy. Towards a gesture description interchange format. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 176–179, Paris, France, 2006. IRCAM – Centre Pompidou.

A. R. Jensenius, K. Nymoen, and R. I. Godøy. A multilayered GDIF-based setup for studying coarticulation in the movements of musicians. In Proceedings of the International Computer Music Conference, pages 743–746, Belfast, 2008.

K. Nymoen and A. R. Jensenius. A toolbox for storing and streaming music-related data. In Proceedings of SMC 2011: 8th Sound and Music Computing Conference "Creativity Rethinks Science", pages 427–430. Padova University Press, 2011.




__________
Med vennlig hilsen / Best regards,

Kristian Nymoen
ROBIN / fourMs Lab
Dept. of Informatics, University of Oslo
Office phone: (+47) 22841693
Mobile phone: (+47) 99708138
http://fourms.uio.no
http://ifi.uio.no/robin
