Rhythm and Tempo Analysis Toward Automatic Music Transcription

Authors:
Takeda, H.; Nishimoto, T.; Sagayama, S. (Graduate School of Information Science and Technology, University of Tokyo)

This paper discusses model-based rhythm and tempo analysis of music data in the MIDI format. The data is assumed to come from a multi-pitch analysis module operating on music acoustic signals inside an automatic transcription system. In performed music, observed note lengths and local tempo fluctuate around the nominal note lengths and the long-term tempo. Applying the framework of continuous speech recognition to rhythm recognition, we take a probabilistic top-down approach to the joint estimation of rhythm and tempo from the performed onset events in MIDI data. Short-term rhythm patterns are extracted from existing music samples to form a "rhythm vocabulary," and local tempo is represented by a smooth curve. The entire problem is formulated as an integrated optimization that maximizes a posterior probability, solved by an iterative algorithm that alternately estimates rhythm and tempo. Evaluation of the algorithm through various experiments is also presented.
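The alternating rhythm/tempo estimation described in the abstract can be illustrated with a toy sketch. This is not the authors' implementation: it assumes a fixed vocabulary of nominal note lengths and refines a single scalar tempo instead of the paper's smooth tempo curve and full MAP formulation, but it shows the alternation between quantizing inter-onset intervals given a tempo and re-estimating tempo given the quantized rhythm.

```python
# Hypothetical simplification of the paper's joint rhythm/tempo estimation:
# a flat "rhythm vocabulary" of note lengths (in beats) and a scalar tempo,
# updated by alternating the two estimation steps.

NOTE_LENGTHS = [0.25, 0.5, 1.0, 1.5, 2.0]  # sixteenth note .. half note, in beats

def quantize(ioi_beats):
    """Snap an inter-onset interval (in beats) to the nearest vocabulary entry."""
    return min(NOTE_LENGTHS, key=lambda n: abs(n - ioi_beats))

def estimate(onsets, tempo_bpm=120.0, iters=10):
    """Alternately estimate rhythm (quantized note lengths) and tempo."""
    iois = [b - a for a, b in zip(onsets, onsets[1:])]  # intervals in seconds
    for _ in range(iters):
        beat = 60.0 / tempo_bpm                          # seconds per beat
        rhythm = [quantize(ioi / beat) for ioi in iois]  # rhythm given tempo
        tempo_bpm = 60.0 * sum(rhythm) / sum(iois)       # tempo given rhythm
    return rhythm, tempo_bpm

# Performed eighth notes at roughly 100 BPM with slight timing jitter:
onsets = [0.0, 0.31, 0.60, 0.91, 1.20, 1.52, 1.80]
rhythm, bpm = estimate(onsets)
```

Starting from a deliberately wrong tempo guess (120 BPM), the alternation converges: the jittered intervals are all quantized to eighth notes and the tempo estimate settles at 100 BPM. The paper's version replaces the scalar tempo with a smooth curve and scores rhythm-pattern sequences probabilistically, but the alternating structure is the same.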

Published in:

2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Volume 4

Date of Conference:

15-20 April 2007