
WO2006037366A1 - Apparatus and method for generating an encoded rhythmic pattern - Google Patents

Apparatus and method for generating an encoded rhythmic pattern

Info

Publication number
WO2006037366A1
Authority
WO
WIPO (PCT)
Prior art keywords
rhythmic
pattern
group
encoded
music
Prior art date
Application number
PCT/EP2004/011266
Other languages
English (en)
Inventor
Markus Cremer
Matthias Gruhne
Jan Rohden
Christian Uhle
Original Assignee
Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. filed Critical Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.
Priority to EP04790208A priority Critical patent/EP1797507B1/fr
Priority to PCT/EP2004/011266 priority patent/WO2006037366A1/fr
Publication of WO2006037366A1 publication Critical patent/WO2006037366A1/fr

Links

Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00: Details of electrophonic musical instruments
    • G10H 1/36: Accompaniment arrangements
    • G10H 1/40: Rhythm
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2210/00: Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H 2210/031: Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
    • G10H 2210/071: Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal, for rhythm pattern analysis or rhythm style recognition
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00: Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/121: Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
    • G10H 2240/131: Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set

Definitions

  • The present invention relates to audio data processing and, in particular, to metadata suitable for identifying an audio piece using a description of the audio piece in the form of a rhythmic pattern.
  • Metadata about audio data are used, for example, to detect the genre of a song, to specify music similarity, to perform music segmentation on the song, or simply to recognize a song by scanning a database for similar metadata.
  • Generally, metadata are used to determine a relation between test pieces of music having associated test metadata and one or more reference pieces of music having corresponding reference metadata.
  • While some people are interested in an algorithm for the automated transcription of rhythmic (percussive) accompaniment in modern-day popular music, others try to capture the "rhythmic gist" of a piece of music rather than a precise transcription, in order to allow a more abstract comparison of musical pieces by their dominant rhythmic patterns. Nevertheless, one is not only interested in rhythmic patterns of percussive instruments, which focus less on playing certain notes than on generating a certain rhythm; the rhythmic information provided by so-called harmonic sustained instruments such as a piano, a flute, a clarinet, etc. can also be of significant importance for the rhythmic gist of a piece of music.
  • Rhythmic elements of music, determined by drum and percussive instruments, play an important role, especially in contemporary popular music. Therefore, the performance of advanced music retrieval applications will benefit from mechanisms that allow the search for rhythmic styles, particular rhythmic features, or rhythmic patterns in general when determining a relation between a test rhythmic pattern and one or more reference rhythmic patterns, which are, for example, stored in a rhythmic pattern database.
  • The first version of MPEG-7 Audio (ISO/IEC 15938-4) does not, however, cover high-level features in a significant way. Therefore, the standardization committee agreed to extend this part of the standard.
  • The work contributing high-level tools is currently being assembled in MPEG-7 Audio Amendment 2 (ISO/IEC 15938-4 AMD 2).
  • One of its features is "rhythmicpatternsDS".
  • The internal structure of its representation depends on the underlying rhythmic structure of the considered pattern.
  • One way is to start from the time-domain PCM representation of a piece of music, such as a file which is stored on a compact disc or which is generated by an audio decoder working in accordance with the well-known MP3 algorithm (MPEG-1 Layer 3) or advanced audio algorithms such as MPEG-4 AAC.
  • The detection and classification of percussive events is carried out using a spectrogram representation of the audio signal. Differentiation and half-wave rectification of this spectrogram representation result in a non-negative difference spectrogram, from which the times of occurrence and the spectral slices related to percussive events are deduced.
  • PCA: Principal Component Analysis
  • NICA: Non-Negative Independent Component Analysis
  • The spectral characteristics of un-pitched percussive instruments allow separation using an un-mixing matrix to obtain spectral profiles, which can be used to extract the spectrogram's amplitude basis, also termed the "amplitude envelopes".
  • This procedure is closely related to the principle of Prior Sub-space Analysis (PSA), as described in "Prior sub-space analysis for drum transcription", Fitzgerald, D., Lawlor, B. and Coyle, E., Proceedings of the 114th AES Convention, Amsterdam, Netherlands, 2003.
  • PSA: Prior Sub-space Analysis
  • The extracted components are classified using a set of spectral-based and time-based features.
  • The classification provides two sources of information.
  • Spectral profiles are compared by a k-nearest-neighbor classifier with spectral profiles of single instruments from a training database.
  • Additionally, features describing the shape of the spectral profile, e.g. centroid, spread and tunes, are extracted.
  • Other features are the center frequencies of the most prominent local peaks, their intensities, spreads and skewnesses.
  • Onsets are detected in the amplitude envelopes using conventional peak picking methods.
  • The intensity of the onset candidate is estimated from the magnitude of the envelope signal. Onsets with intensities exceeding a predetermined dynamic threshold are accepted. This procedure reduces cross-talk influences of harmonic sustained instruments as well as concurrent percussive instruments (a sketch of such a peak-picking step is given below).
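  • As an illustration of such peak picking with a dynamic threshold, the following sketch accepts an onset candidate only when the envelope magnitude exceeds a local moving-average threshold; the window length and threshold factor are illustrative assumptions, not values taken from this description.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_onsets(envelope, window=32, factor=1.5):
    """Peak picking on an amplitude envelope with a dynamic threshold.

    window and factor are illustrative assumptions; an onset candidate is
    accepted only if its magnitude exceeds factor times the local mean."""
    envelope = np.asarray(envelope, dtype=float)
    candidates, _ = find_peaks(envelope)                     # local maxima = onset candidates
    local_mean = np.convolve(envelope, np.ones(window) / window, mode="same")
    return [i for i in candidates if envelope[i] > factor * local_mean[i]]
```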
  • The audio signal is segmented into similar and characteristic regions using a self-similarity method initially proposed by Foote, J., "Automatic Audio Segmentation using a Measure of Audio Novelty", Proceedings of the IEEE International Conference on Multimedia and Expo, vol. 1, pages 452-455, 2000.
  • The segmentation is motivated by the assumption that within each region not more than one representative drum pattern occurs, and that the rhythmic features are nearly invariant.
  • The temporal positions of the events are quantized on a tatum grid.
  • The tatum grid describes a pulse series on the lowest metric level.
  • Tatum period and tatum phase are computed by means of a two-way mismatch error procedure, as described in "Pulse-dependent analysis of percussive music", Gouyon, F., Herrera, P., Cano, P., Proceedings of the AES 22nd International Conference on Virtual, Synthetic and Entertainment Audio, 2002.
  • The pattern length or bar length is estimated by searching for the prominent periodicity in the quantized score with periods equaling an integer multiple of the bar length.
  • A periodicity function is obtained by calculating a similarity measure between the signal and its time-shifted version. The similarity between the two score representations is calculated as a weighted sum of the number of simultaneously occurring notes and rests in the score.
  • An estimate of the bar length is obtained by comparing the derived periodicity function to a number of so-called metric models, each of them corresponding to a bar length.
  • A metric model is defined here as a vector describing the degree of periodicity per integer multiple of the tatum period, and is illustrated as a number of pulses, where the height of a pulse corresponds to the degree of periodicity. The best match between the periodicity function derived from the input data and the predefined metric models is computed by means of their correlation coefficient (a minimal sketch of this matching step follows below).
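  • A minimal sketch of this model matching is given below; the periodicity function and the metric model vectors used in the example are made-up placeholders, and the bar length is chosen by the highest correlation coefficient as described above.

```python
import numpy as np

def estimate_bar_length(periodicity, metric_models):
    """Return the bar length (in tatum periods) of the metric model whose
    pulse pattern correlates best with the measured periodicity function."""
    best_length, best_corr = None, -np.inf
    for bar_length, model in metric_models.items():
        n = min(len(periodicity), len(model))
        corr = np.corrcoef(periodicity[:n], model[:n])[0, 1]   # Pearson correlation
        if corr > best_corr:
            best_length, best_corr = bar_length, corr
    return best_length

# Hypothetical periodicity function; index i stands for a lag of i + 1 tatum periods
periodicity = np.array([0.2, 0.4, 0.3, 0.9, 0.2, 0.5, 0.3, 0.8])
models = {
    4: np.array([0.1, 0.5, 0.1, 1.0, 0.1, 0.5, 0.1, 1.0]),  # strong periodicity at lags 4 and 8
    3: np.array([0.1, 0.1, 1.0, 0.1, 0.1, 1.0, 0.1, 0.1]),  # strong periodicity at lags 3 and 6
}
print(estimate_bar_length(periodicity, models))
```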
  • The tatum period is also related to the term "microtime".
  • The tatum period is the period of a grid, i.e., the tatum grid, which is dimensioned such that each stroke in a bar can be positioned on a grid position.
  • When a bar only has main strokes, the tatum period is the time period between two main strokes, and the microtime, i.e., the metric division of this bar, is 1.
  • When, however, the bar has exactly one additional stroke between two main strokes, the microtime is two and the tatum period is half of the period between two main strokes. In the 4/4 example, the bar therefore has 8 grid positions, while in the first example, the bar only has 4 grid positions.
  • When there are two additional strokes between two main strokes, the microtime is 3, the tatum period is 1/3 of the time period between two main strokes, and the grid describing one bar has 12 grid positions.
  • Fig. 2a shows one bar having a meter of 4/4, a microtime equal to 2, and a resulting size or pattern length of 4 × 2 = 8.
  • Fig. 2a also includes a line 20c showing the main strokes 1, 2, 3, and 4 corresponding to the 4/4 meter and showing additional strokes 1+, 2+, 3+, and 4+ at grid positions 2, 4, 6, and 8.
  • The velocity indicates an intensity value of an instrument at a certain grid position, main stroke or additional stroke (part of the bar), wherein, in the present example, a high velocity value indicates a high sound level, while a low velocity value indicates a low sound level. The term velocity can therefore be attributed to harmonic sustained instruments as well as un-pitched instruments (drums or percussion). In the case of a drum, the term "velocity" would describe a measure of the velocity a drumstick has when hitting the drum.
  • The prior art rhythmic pattern of Fig. 2a can not only be obtained by an automatic drum transcription algorithm as described above; it is also similar to the well-known MIDI description of instruments as used for music synthesizers such as electronic keyboards, etc.
  • The Fig. 2a rhythmic pattern uniquely describes the rhythmic situation of an instrument.
  • This rhythmic pattern is in line with the way of playing the rhythmic pattern, i.e., a note to be played later in time is positioned after a note to be played earlier in time. This also becomes clear from the grid position index starting at a low value (1) and, after having monotonically increased, ending at a high value (8).
  • Such a rhythmic pattern, although uniquely and optimally giving information to a player of the bar, is not suited for efficient database retrieval. This is because the pattern is quite long and, therefore, memory-consuming. Additionally and importantly, the more important and the less important information in the rhythmic pattern of Fig. 2a is distributed over the whole pattern.
  • A search engine in a database using test and reference rhythmic patterns as shown in Fig. 2a therefore has to compare the complete test rhythmic pattern to the complete reference rhythmic patterns to finally find out a relation between the test rhythmic pattern and the reference rhythmic patterns.
  • The rhythmic pattern in Fig. 2a only describes a single bar of a piece of music, which can, for example, have 2000 bars.
  • Moreover, the number of pieces of music in a reference database is to be as large as possible to cover as many pieces of music as possible.
  • Consequently, the size of the database storage can explode to a value of the number of pieces of music multiplied by the number of bars per piece of music multiplied by the number of bits for representing a single-bar rhythmic pattern.
  • While the storage might not be a large issue for personal computers, it can raise the size and costs of portable music processors such as music players. Additionally, the size of the rhythmic pattern in Fig. 2a becomes even more important when one tries to have a reasonable time frame for the search engine correlating a test rhythmic pattern to the reference rhythmic patterns. In the case of high-end workstations having nearly unlimited computational resources, the Fig. 2a rhythmic pattern might not be too problematic. The situation, however, becomes critical when one has limited computational resources, such as in personal computers or, once again, portable players, whose price has to be significantly lower than the price of a personal computer if such an audio retrieval system is to survive in the highly competitive marketplace.
  • This object is achieved by an apparatus for generating an encoded rhythmic pattern in accordance with claim 1, a method of generating an encoded rhythmic pattern in accordance with claim 13, an encoded rhythmic pattern in accordance with claim 14, an apparatus for determining a relation between a test piece of music and a reference piece of music in accordance with claim 15, a method for determining a relation between a test piece of music and a reference piece of music in accordance with claim 18, an apparatus for decoding a rhythmic pattern in accordance with claim 19, a method of decoding the rhythmic pattern in accordance with claim 20, or a computer program in accordance with claim 21.
  • The present invention is based on the finding that an efficient representation of a rhythmic pattern is obtained by encoding a normal rhythmic pattern so that the encoded rhythmic pattern has a first group of velocity values followed by a second group of velocity values, the first group of velocity values being associated with grid positions at a first rhythmic level, and the second group of velocity values being associated with grid positions at a second rhythmic level.
  • Velocity values associated with grid positions at the same rhythmic level are in one group, so that the encoded rhythmic pattern is not ordered in accordance with the correct time sequence for playing the bar associated with the rhythmic pattern, but is sorted in accordance with the importance of the grid positions in the bar.
  • This representation of a piece of music allows a database search engine to process the encoded rhythmic patterns sequentially, i.e., to process the first group of velocity values, which has a higher importance for the rhythmic gist of the piece of music, before processing the second group and further groups of velocity values, which have lower importance for the rhythmic gist of a piece of music.
  • Processing the first group of velocity values allows the database search engine to recognize that several reference rhythmic patterns, which do not match the test rhythmic pattern with respect to their first groups of velocity values, can be eliminated from further consideration before velocity values of lower rhythmic levels, i.e., velocity values having a lower importance for the rhythmic gist of a piece of music, are processed by the search engine.
  • The present invention therefore does not attribute maximum attention to the magnitude of a velocity value, i.e., the loudness or intensity, but attributes maximum importance to the rhythmic level to which a certain velocity value belongs.
  • This is in line with human perception, which finds or feels a rhythm in a piece of music irrespective of whether the beats that make up the rhythm are very loud or not.
  • It is not the loudness or intensity or, generally stated, the velocity of a note or of notes in a sequence of notes that makes a listener move his hands, feet or body in a rhythmic way, but the importance of a note in a rhythmic frame that determines the rhythm.
  • The inventive encoded rhythmic pattern also has a very storage-efficient form, since all grid positions, irrespective of their rhythmic levels, are completely eliminated when their velocity values are lower than a certain threshold, which is, for example, the lowest velocity quantization level or velocity quantization step size.
  • The grid positions to be eliminated in this case are, therefore, the grid positions having a value of e.g. zero in a quantized rhythmic pattern.
  • The determination of which grid positions are more important than others, i.e., which grid positions are to be attributed to the first group and which grid positions are to be attributed to the second group, is performed based on a prime decomposition of the nominator of the meter fraction, which is given by the term (meter nominator) / (meter denominator). It has been found that the hierarchical grid position/rhythmic level determination is automatically performed by using the prime numbers resulting from a prime decomposition of the nominator. In accordance with the present invention, the grid position index is replaced by a prime index derived by using a prime factor decomposition of the meter nominator.
  • The prime index is determined such that higher-importance grid positions have lower prime index values, while lower-importance grid positions have higher prime index values.
  • The present invention is advantageous in that it provides a compact representation of a rhythmic pattern, which can be automatically extracted from an audio signal or can consist of excerpts taken from an existing music notation as well. Based on this compact representation, which is constructed in accordance with the importance of a velocity value for the rhythmic impression rather than the time sequence of velocity values or even the magnitude of velocity values, an efficient comparison of patterns, e.g. for classification purposes, can be performed with minimum memory for the search engine on the one hand and minimum computational resources for the search engine on the other hand.
  • A particular application is the field of music content retrieval, in particular for contemporary popular music, in which the rhythmic information is characteristic for a piece of music and, therefore, provides a characteristic fingerprint of this piece of music.
  • The inventive encoded rhythmic pattern allows setting several grades of resolution, such as rhythmic hierarchies or rhythmic levels, based on the velocity value re-sorting, which is especially suitable for further classification or matching of rhythmic patterns.
  • Fig. 1 shows a preferred embodiment of the inventive concept for generating an encoded rhythmic pattern;
  • Fig. 2a illustrates a prior art rhythmic pattern;
  • Fig. 2b illustrates an output of the processor of Fig. 1;
  • Fig. 2c illustrates an output of the zero eliminator in Fig. 1;
  • Fig. 2d illustrates an output of the sorter of Fig. 1;
  • Fig. 3a illustrates an output of the processor for an example of a music piece having a ternary feeling;
  • Fig. 3b illustrates an output of the zero eliminator for the Fig. 3a example;
  • Fig. 3c illustrates an output of the sorter for the Fig. 3a example;
  • Fig. 4a illustrates an MPEG-7 conformant description of the audio pattern data syntax;
  • Fig. 4b illustrates the semantics for the Fig. 4a example;
  • Fig. 5a illustrates an MPEG-7 conformant example of the audio rhythmic pattern syntax element;
  • Fig. 5b illustrates the semantics for the Fig. 5a embodiment;
  • Fig. 6 illustrates an example instance of audio rhythmic pattern type metadata for a plurality of instruments;
  • Fig. 7 illustrates a preferred method embodied by the processor based on prime factorizations of the nominator of the meter and the microtime;
  • Fig. 8 illustrates another preferred method embodied by the processor based on prime factor decompositions of the nominator of the meter and the microtime;
  • Fig. 9 illustrates a block diagram of an inventive apparatus for determining a relation between a test piece of music and a reference piece of music;
  • Fig. 10 illustrates an encoded test rhythmic pattern and an encoded reference rhythmic pattern used in the apparatus of Fig. 9;
  • Fig. 11 illustrates a preferred method embodied by the search engine of Fig. 9;
  • Fig. 12a illustrates a query pattern before zero elimination for a plurality of instruments;
  • Fig. 12b illustrates a query pattern after zero elimination for several instruments.
  • The inventive encoded rhythmic pattern is suitable for a flexible syntax having semantic information on an underlying piece of music, so that a maximum scope of musical styles and a maximum scope of rhythmic complexity can be incorporated.
  • The bar represented by the encoded rhythmic pattern can relate to several bars of a piece of music as well.
  • In this case, the encoded rhythmic pattern is the result of several raw rhythmic patterns of the same meter, which have been combined using any kind of statistical method, such as forming an arithmetic average value, a geometric average value, or a median value for each grid position.
  • Fig. 1 illustrates an inventive apparatus for generating an encoded rhythmic pattern from a rhythmic pattern, which includes a sequence of velocity values associated with grid positions, wherein the rhythmic pattern input into the inventive apparatus of Fig. 1 can be ordered time-wise, by velocity magnitude, or in any other manner.
  • The rhythmic pattern input into the Fig. 1 apparatus, therefore, has a set of velocity values, each velocity value being associated with a grid position from a plurality of grid positions, the plurality of grid positions further having grid positions at at least two different rhythmic levels, a grid position at the first rhythmic level having a higher significance than a grid position at a second rhythmic level.
  • Such an (uncoded) rhythmic pattern is input into a processor 10 for determining grid positions having a first rhythmic level and for determining grid positions having a second rhythmic level.
  • Processor 10 outputs an illustrative representation as shown in Fig. 2b.
  • The main purpose of the processor is to generate line 21, which indicates the rhythmic level for each grid position.
  • The beats at grid positions 1 and 5, i.e., parts 1 and 3 of the bar, have the highest rhythmic level, which is illustrated by three stars in Fig. 2b.
  • The off-beats at grid positions 3 and 7, or parts 2 and 4 of the bar, have the second rhythmic level, which is indicated by two stars in Fig. 2b.
  • The grid positions 2, 4, 6, and 8 all have the third rhythmic level, which is indicated by a single star in Fig. 2b.
  • The processor 10 is also operative to generate a prime index for each grid position, as indicated by line 22 in Fig. 2b.
  • The prime index includes a value for each grid position: grid positions belonging to the highest rhythmic level have the low indices 1 and 2, grid positions belonging to the second rhythmic level have the higher indices 3 and 4, and grid positions at the third rhythmic level have the even higher indices 5, 6, 7, and 8.
  • The inventive prime index determination, as illustrated in Figs. 7 and 8, results in a prime index which has a twofold meaning.
  • First, the prime index for a velocity value at a grid position having a higher rhythmic level is lower than the prime index for a velocity value at a grid position having a lower rhythmic level.
  • The second characteristic of the prime index is that, when there are several velocity values for grid positions at the same rhythmic level, the order of the prime index also reflects the time sequence of the velocity values. This means that the velocity value for grid position 1 receives a lower prime index than the velocity value for grid position 5 having the same rhythmic level; the latter receives a higher prime index because grid position 5 appears after grid position 1 when the rhythmic pattern of Fig. 2b is played.
  • In one embodiment, the inventive processor does not have to generate the prime index. In this embodiment, the processor does not even have to change anything in the description of Fig. 2a, as long as the processor provides information on the rhythmic level to a sorter 11 for sorting the velocity values associated with the grid positions at different levels to obtain first and second groups of grid positions.
  • The sorter is operative to sort the velocity values so that the velocity values associated with the grid positions at the first rhythmic level form a first group, the velocity values associated with the grid positions at the second rhythmic level form a second group, and the first and second groups are in sequence to each other, so that at the output of the sorter 11 the inventive encoded rhythmic pattern having a sequence of velocity values according to the groups is obtained.
  • The processor is operative to generate the prime index 22 in Fig. 2b, which replaces the grid position index 20a of Fig. 2a, as can be seen in Figs. 2c and 2d.
  • The inventive apparatus further comprises a zero eliminator 12 for eliminating grid positions having a zero velocity.
  • The "zero" velocity can be a velocity having indeed a value of zero, or a velocity which is below a minimum threshold, which can correspond to a quantization step size for the lowest quantization bin.
  • The zero eliminator 12 is operative to eliminate grid positions having a zero velocity. In the Fig. 2b example, the zero eliminator would eliminate positions 2, 4, and 8 from further consideration. In the case in which the zero eliminator is positioned after the processor 10, but before the sorter 11, the zero eliminator 12 would output the processed rhythmic pattern as shown in Fig. 2c, which only has the prime index 22 and the velocity values 20b. It has to be noted here that the rhythmic level 21 is only shown for illustration purposes, but would not appear in the processing of Fig. 1, since the prime index 22 includes the information on the rhythmic level and, in addition, includes information on the time sequence of velocity values, as has been outlined above. The Fig. 2c representation is input into the sorter 11, so that the representation given in Fig. 2d is obtained.
  • The sorter 11 includes a simple sorting algorithm, which outputs an encoded rhythmic pattern in which velocity values having lower prime index values are positioned closer to the start of the encoded rhythmic pattern, while velocity values having higher prime index values are positioned closer to the end of the encoded rhythmic pattern.
  • In the Fig. 2d example, the velocity values for prime index 1 and prime index 2 form the first group, the velocity values for prime index 3 and prime index 4 form the second group, and the third group, having only prime index 7, has a single velocity value.
  • The zero eliminator 12 can also be positioned before the processor 10, or after the sorter 11. Positioning the zero eliminator before the processor would result in slightly better computational performance, since the processor 10 would not have to consider zero-valued grid positions when determining the rhythmic levels for the positions. On the other hand, the preferred positioning of the zero eliminator between the processor 10 and the sorter 11 allows the application of one of the preferred algorithms of Fig. 7 or Fig. 8, which rely on the time sequence of the un-coded rhythmic pattern. Finally, the zero eliminator can also be positioned after the sorter. Since, however, sorting algorithms exist which do not necessarily need a full sequence of integers, the zero eliminator 12 is preferably positioned before the sorter 11 (the whole encoding chain is sketched below).
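  • The complete encoding chain of Fig. 1 (processor, zero eliminator, sorter) can be sketched roughly as follows; the velocity values in the usage example are hypothetical, and only the prime index vector corresponds to the 4/4, microtime-2 case of Fig. 2b.

```python
def encode_rhythmic_pattern(velocities, prime_index, threshold=0):
    """Sketch of the Fig. 1 chain: zero eliminator followed by the sorter.

    velocities:  one velocity value per grid position (time order)
    prime_index: prime index per grid position, lower value = higher rhythmic level
    threshold:   velocities at or below this value are eliminated"""
    pairs = [(p, v) for p, v in zip(prime_index, velocities) if v > threshold]
    pairs.sort(key=lambda pv: pv[0])          # group by importance, then by time
    prime_vec = [p for p, _ in pairs]
    velocity_vec = [v for _, v in pairs]
    return prime_vec, velocity_vec

# Hypothetical velocities for the 8 grid positions of a 4/4 bar with microtime 2
velocities = [100, 0, 60, 0, 90, 40, 70, 0]
prime_index = [1, 5, 3, 6, 2, 7, 4, 8]        # as in line 22 of Fig. 2b
print(encode_rhythmic_pattern(velocities, prime_index))
# -> ([1, 2, 3, 4, 7], [100, 90, 60, 70, 40])
```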
  • Figs. 3a to 3c show another example, also having a 4/4 meter.
  • The microtime in Figs. 3a to 3c is 3, which results in a larger pattern length or size of the rhythmic pattern in Fig. 3a and also means that the music piece has a kind of ternary feeling.
  • Fig. 3a shows the output of the processor, since the rhythmic level is marked. Nevertheless, the Fig. 3a embodiment still includes the grid position index, which is the "equivalent element index" in Fig. 3a.
  • In Fig. 3b, the output after the zero eliminator 12 of Fig. 1 is shown, i.e., a situation in which all velocity values equal to zero and the corresponding prime indices (elements) are deleted.
  • Fig. 3c finally shows the encoded rhythmic pattern as output by the sorter 11, having, again, three groups, wherein the first group has two elements, the second group has two elements, and the third group has four elements.
  • The Fig. 3c embodiment is remarkable in that both members of the first group have lower velocity values than both members of the second group, while the velocity values of the third group are all lower than the velocity values of the first and second groups.
  • Fig. 4a shows an MPEG-7 conformant description of the audio pattern description syntax (DS).
  • The audio pattern data syntax includes information on the meter of the corresponding bar of the piece of music and can be included in the MPEG-7 syntax.
  • The Fig. 4a embodiment includes information on the tempo of the drum pattern in beats per minute (BPM).
  • Emphasis is drawn to line 40 in Fig. 4a, which has the element name "pattern"; a further description of the pattern 40 is given in subsequent Figs. 5a and 5b.
  • barNum indicates the position of the bar or the rhythmic pattern in the piece of music.
  • The barNum for the first bar would be one, the barNum for the tenth bar would be ten, and the barNum for the five hundredth bar would be five hundred, for example.
  • Alternatively, when a pattern represents a group of bars, the barNum for the first ten bars would be one, the barNum for bars eleven to twenty would be two, etc.
  • Fig. 5a illustrates a more detailed representation of an audio rhythmic pattern.
  • The Fig. 5a embodiment preferably includes an instrument ID field.
  • The Fig. 5a description further includes the prime index vector, which is, for example, shown in line 22 of Fig. 2c, and a velocity vector, which is shown in line 23 of Fig. 2c.
  • The Fig. 5a embodiment also includes the microtime and the tempo in beats per minute. It is unnecessary to include the tempo both in the Fig. 5a description and in the Fig. 4a description. Additionally, the Fig. 4a description includes information on the meter, from which the prime factor decomposition is derived.
  • Fig. 6 illustrates an example for several instruments, i.e., for instruments having instrument IDs 10, 13 and 14. Additionally, as has been outlined above with respect to Fig. 5a, the Fig. 6 embodiment also includes a barNum field as well as the microtime and, naturally, the prime index vector and the velocity vector. The Fig. 6 example also illustrates a similar description for the next bar having barNum 2, i.e., for the bar following the bar having the barNum equal to zero.
  • Fig. 7 illustrates a preferred implementation of the inventive processor 10 for determining the grid positions having several rhythmic levels.
  • The processor is operative to calculate each prime index of a rhythmic pattern by using a prime factorization of the nominator of the meter, which is, in the Fig. 2 example, a vector having only the elements (2, 2).
  • Additionally, a prime factorization of the microtime is performed, which results in a vector having a single component of two.
  • Then, an iterative calculation of the prime indices for the grid positions is performed.
  • A first iteration index i is incremented up to the length of the prime factorization vector, i.e., up to two in the present embodiment.
  • An iteration parameter j is iterated from one to the number of components in the prime factorization vector of the microtime. In the present embodiment, this value is equal to one, so that the inner iteration loop is only processed once.
  • A certain grid position is then determined by the variable "count".
  • Finally, each grid position has a certain prime index value, as illustrated in the Fig. 2 and Fig. 3 embodiments.
  • The vector primeVec is, therefore, completed.
  • An alternative embodiment is shown in Fig. 8, which receives as an input the vector nomVec, i.e., the vector having the prime factors of the nominator of the meter. Additionally, the embodiment of Fig. 8 also receives the microtime prime factor vector mtVec.
  • The first iterative processing step is then performed using the prime factorization vector of the nominator of the meter, followed by a second iteration process determined by the prime factorization vector of the microtime.
  • The function entitled "Prod" outputs the product of all components in a vector. Alternate embodiments for calculating the prime index values for the associated velocity values can be devised; one such variant is sketched below.
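  • The exact loop structure of Figs. 7 and 8 is not reproduced here, but one plausible realization of the hierarchical indexing, consistent with the examples of Figs. 2b and 3a, walks down the prime factors of the nominator followed by those of the microtime and assigns increasing prime indices level by level:

```python
from math import prod

def prime_index_vector(nom_factors, mt_factors):
    """Assign a prime index to each grid position of one bar.

    nom_factors: prime factorization of the meter nominator, e.g. [2, 2] for 4/4
    mt_factors:  prime factorization of the microtime,       e.g. [2]
    Grid positions on a higher rhythmic level receive lower prime indices;
    within one level the indices follow the playing order."""
    factors = list(nom_factors) + list(mt_factors)
    n = prod(factors)                          # pattern length (grid positions per bar)
    prime_index = [0] * n
    next_index, step = 1, n
    for f in factors:                          # one pass per hierarchy level
        step //= f                             # finer subdivision of the bar
        for pos in range(0, n, step):
            if prime_index[pos] == 0:          # not yet covered by a coarser level
                prime_index[pos] = next_index
                next_index += 1
    return prime_index

# 4/4 meter with microtime 2 (Fig. 2b): beats at positions 1 and 5 get indices
# 1 and 2, off-beats at 3 and 7 get 3 and 4, the remaining positions get 5..8
print(prime_index_vector([2, 2], [2]))         # -> [1, 5, 3, 6, 2, 7, 4, 8]
```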
  • The inventive encoded rhythmic pattern is based on a non-linear indexing of the velocity values with the help of the prime index vector.
  • The prime index vector indicates the rhythmic significance (rhythmic level) within the pattern.
  • Velocity values that occur on the beat will be indicated by a prime index with a lower integer value than velocity values occurring between two beats (off-beat).
  • An off-beat is a beat that occurs between two beats.
  • Depending on the underlying rhythmic structure, different numbers of rhythmic hierarchies will result.
  • Fig. 9 illustrates a preferred embodiment of an apparatus for determining a relation between a test piece of music and a reference piece of music.
  • A test piece of music is processed to obtain a test rhythmic pattern, which is input into an encoder 90 to obtain an encoded rhythmic pattern such as shown in Fig. 2d or Fig. 3c.
  • The encoder 90 is structured as shown in Fig. 1 and as has been described above.
  • The inventive apparatus further includes an input interface 91 for providing an encoded rhythmic pattern of the test piece of music.
  • This encoded rhythmic pattern is input into a search engine 92 for correlating the encoded rhythmic pattern of the test piece of music to an encoded rhythmic pattern included in database 93.
  • The correlation between the encoded rhythmic patterns is performed such that the first group of velocity values of the test rhythmic pattern is compared to the first group of velocity values of the reference rhythmic pattern before the comparison is continued for the second and further groups.
  • Each group can have only a single group member, or more than one or even more than two group members, as has been described above with respect to Figs. 2c and 2d.
  • The search engine 92 is operative to provide a correlation result to an output interface 94 for indicating the relation between the test piece of music and the reference piece of music based on the correlation result.
  • The database will include varying numbers of reference pieces of music.
  • The database 93 does not have to include the whole reference piece of music from which the encoded rhythmic pattern under consideration is derived.
  • The database may only include an identification of the corresponding piece of music, which, for example, can be used by another database from which the user can retrieve the final piece of music identified by the output interface.
  • The relation to be determined by the output interface 94 is therefore a statement that the test piece of music is or is not equal to the (single) reference piece of music, that the test piece of music belongs to a certain music genre, that the test piece of music is similar to one reference piece of music among several reference pieces of music (qualitative statement), or that the test piece of music matches one or several reference pieces of music with certain matching degrees (quantitative statement).
  • Fig. 10 shows the situation in which two encoded rhythmic patterns having the same meter (4/4) are compared to each other.
  • In this example, the zero eliminator was active, so that the two rhythmic patterns have different lengths.
  • The search engine 92 of Fig. 9 only has to compare the first and second prime index vector components. Thus, only the number of elements of the shorter representation is taken into account. Since the patterns are sorted so that more important grid positions come first and less important grid positions come later, comparing only the number of elements of the shorter representation is sufficient for obtaining a useful comparison result.
  • At this point, two steps have already been performed, which are illustrated by reference to Fig. 11.
  • A meter matching 110 has been performed in the database, so that only encoded rhythmic patterns which are based on the same meter are considered for comparison purposes. Therefore, all encoded rhythmic reference patterns having a meter different from 4/4 are deleted from further consideration by step 110.
  • Additionally, the functionality of the zero eliminator from Fig. 1 is advantageously used.
  • All reference patterns that have zero values in the first group are deleted from further consideration when the test pattern does not have a zero value in the first group at the same grid position. In other words, all reference patterns whose first two or three prime index vector components do not completely match those of the test pattern are deleted from further consideration in the search.
  • After the sorting-out in step 111, the step of comparing is performed so that the best candidates among the remaining reference patterns are determined, as indicated by step 112 in Fig. 11. Based on these remaining candidates, the step 113 of comparing the second group of the encoded test rhythmic pattern to the corresponding second groups of the reference rhythmic patterns is performed; this procedure can be repeated until all groups have been processed. Then, at the end of the process, the search engine 92 generates a quantitative or qualitative search result 114.
  • The inventive encoded rhythmic pattern allows performing a sequential database search, in which the first component of the encoded test rhythmic pattern is compared to the first component of an encoded reference rhythmic pattern, so that after each velocity value many reference patterns can be discarded, and one never has to simultaneously compare many velocity values of the test pattern to many velocity values of the reference pattern.
  • This sequential processing in the search engine is made possible by the sorting of the velocity values in accordance with their importance for the rhythmic gist of a piece of music; a compact sketch of such a search is given below.
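  • A compact sketch of such a sequential narrowing (steps 110 to 114 of Fig. 11) is given below; the representation of the encoded patterns as dictionaries, the group size, and the distance measure are illustrative assumptions rather than details taken from this description.

```python
def sequential_search(test, references, first_group_size=2):
    """Sketch of the sequential search: meter matching, sorting out references
    whose first group does not match, then ranking the survivors."""
    # Step 110: only reference patterns with the same meter are kept
    candidates = [r for r in references if r["meter"] == test["meter"]]
    # Step 111: drop references whose first prime index group does not match
    head = test["prime_vec"][:first_group_size]
    candidates = [r for r in candidates
                  if r["prime_vec"][:first_group_size] == head]
    # Steps 112/113: rank remaining candidates by a simple velocity distance,
    # computed over the length of the shorter representation only
    def distance(ref):
        n = min(len(ref["velocity_vec"]), len(test["velocity_vec"]))
        return sum(abs(a - b) for a, b in
                   zip(ref["velocity_vec"][:n], test["velocity_vec"][:n]))
    return sorted(candidates, key=distance)    # step 114: best matches first
```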
  • Figs. 12a and 12b illustrate the situation of a query pattern which does not only consist of a single encoded rhythmic pattern for a single instrument, but which includes several encoded rhythmic patterns from several instruments.
  • The encoded rhythmic patterns for the instruments have already been re-expanded, so that the effect of the zero eliminator from Fig. 1 is reversed.
  • A run of zeros at the end of the pattern, i.e., from prime index 3 to prime index 8 of the second instrument, has not been re-expanded.
  • This re-expansion has to take place up to the highest prime index among all music instruments.
  • The highest prime index is given by the instrument having instrument ID 4.
  • Fig. 12a, therefore, shows an expanded but ordered representation of the rhythmic patterns in accordance with the order of the prime index, wherein the matrix of Fig. 12a is obtained for more than one instrument (a sketch of this re-expansion follows below).
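  • A small sketch of this re-expansion into the matrix of Fig. 12a (one row per instrument, one column per prime index) is given below; the dictionary-based representation of the encoded patterns is an illustrative assumption.

```python
def expand_query(instrument_patterns):
    """Re-expand several encoded instrument patterns into one matrix.

    instrument_patterns: list of dicts with "prime_vec" and "velocity_vec";
    the matrix is filled up to the highest prime index among all instruments,
    and eliminated positions are restored as zero velocities."""
    max_index = max(max(p["prime_vec"]) for p in instrument_patterns)
    matrix = []
    for pat in instrument_patterns:
        lookup = dict(zip(pat["prime_vec"], pat["velocity_vec"]))
        matrix.append([lookup.get(i, 0) for i in range(1, max_index + 1)])
    return matrix
```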
  • It becomes clear from Fig. 12b that one only has to search the fields shown in Fig. 12b in the database and can fully ignore the other fields. This also reduces the search overhead in a database.
  • The inventive concept of encoded rhythmic patterns allows describing rhythmic pattern information in a very flexible and general way.
  • The microtime is defined as an element within the audio rhythmic pattern type.
  • The description of audio rhythmic pattern types is expanded to the representation of several consecutive rhythmic patterns and an arbitrary number of rhythmic patterns that occur in parallel at similar time instances.
  • Thus, a very flexible representation of rhythmic patterns is made possible by the inventive rhythmic pattern encoding.
  • A quantization of the velocity values to seven loudness states, as used in classical music notation, can be employed for conformance with classical music notation, but leads to a loss of information, for example in comparison to standard MIDI notation, since the velocity values degenerate to only seven different quantized states (a minimal sketch of such a quantization follows below).
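  • A minimal sketch of such a quantization, assuming MIDI-style velocity values from 0 to 127 as input, might look as follows:

```python
def quantize_velocity(velocity, states=7, v_max=127):
    """Map a velocity (0..v_max, e.g. MIDI) onto a small number of loudness
    states, here seven as in classical dynamics; lossy by design."""
    return min(states - 1, velocity * states // (v_max + 1))

print([quantize_velocity(v) for v in (0, 20, 64, 100, 127)])   # -> [0, 1, 3, 5, 6]
```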
  • The inventive rhythmic pattern encoding is a lossless encoding scheme which can, therefore, be reversed or decoded, so that a decoded rhythmic pattern is obtained.
  • To this end, the functionalities of the sorter, the zero eliminator and the processor from the Fig. 1 encoder scheme have to be reversed.
  • First of all, one would perform a prime index/grid position index re-sorting step. In this case, one would mark the beat positions and the off-beat positions in an empty grid. One would then start with the highest-ranking prime index, i.e., prime index 1. When this prime index has a non-zero velocity value, its velocity value is sorted into the grid position having the first beat in the grid.
  • The second-ranking prime index is then used which, when it has a velocity value not equal to zero, is attributed to the second beat in the bar.
  • When, however, the second prime index value does not exist in the encoded pattern, i.e., its velocity value was zero and was eliminated, the grid position for the second beat receives a velocity value of zero, etc.
  • Whether a grid position receives a zero velocity or not is determined by checking whether the sequence of prime index values is an undisturbed sequence from one to the pattern length in increments of one. When one encounters a missing prime index value, this indicates that the grid position associated with this missing prime index value receives a zero velocity value (see the decoding sketch below).
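  • A minimal sketch of this decoding step is given below; it uses the full prime index vector of the bar (e.g. as returned by the prime_index_vector() sketch above) to place every transmitted velocity value back on its grid position and to fill missing prime indices with zero velocities.

```python
def decode_rhythmic_pattern(prime_vec, velocity_vec, full_prime_index):
    """Reverse the encoding: every transmitted velocity is written back to the
    grid position carrying its prime index; missing indices become zero."""
    lookup = dict(zip(prime_vec, velocity_vec))           # prime index -> velocity
    return [lookup.get(p, 0) for p in full_prime_index]   # time-ordered velocities

# Decoding the hypothetical encoded pattern from the encoder sketch above
print(decode_rhythmic_pattern([1, 2, 3, 4, 7], [100, 90, 60, 70, 40],
                              [1, 5, 3, 6, 2, 7, 4, 8]))
# -> [100, 0, 60, 0, 90, 40, 70, 0]
```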
  • The inventive methods can be implemented in hardware, software or firmware. Therefore, the invention also relates to a computer-readable medium having stored thereon a program code which, when running on a computer, performs one of the inventive methods.
  • In other words, the present invention is also a computer program having a program code which, when running on a computer, performs an inventive method.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

The invention relates to an encoded rhythmic pattern having several groups of velocity values, the velocity values being sorted so that the groups are contained in a sequence within the encoded rhythmic pattern. Velocity values concentrated at the beginning of the encoded rhythmic pattern have a greater importance for characterizing the rhythmic gist of a piece of music than the velocity values contained in the further groups of velocity values. Using this encoded rhythmic pattern, efficient database access can be implemented.
PCT/EP2004/011266 2004-10-08 2004-10-08 Appareil et procede destines a generer un motif rythmique code WO2006037366A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP04790208A EP1797507B1 (fr) 2004-10-08 2004-10-08 Appareil et procede destines a generer un motif rythmique code
PCT/EP2004/011266 WO2006037366A1 (fr) 2004-10-08 2004-10-08 Appareil et procede destines a generer un motif rythmique code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2004/011266 WO2006037366A1 (fr) 2004-10-08 2004-10-08 Appareil et procede destines a generer un motif rythmique code

Publications (1)

Publication Number Publication Date
WO2006037366A1 true WO2006037366A1 (fr) 2006-04-13

Family

ID=34958991

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/011266 WO2006037366A1 (fr) 2004-10-08 2004-10-08 Appareil et procede destines a generer un motif rythmique code

Country Status (2)

Country Link
EP (1) EP1797507B1 (fr)
WO (1) WO2006037366A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5918223A (en) * 1996-07-22 1999-06-29 Muscle Fish Method and article of manufacture for content-based analysis, storage, retrieval, and segmentation of audio information
US20040094019A1 (en) * 2001-05-14 2004-05-20 Jurgen Herre Apparatus for analyzing an audio signal with regard to rhythm information of the audio signal by using an autocorrelation function

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHRISTIAN UHLE, JUERGEN HERRE: "Estimation of Tempo, Micro Time and Time Signature from Percussive Music", PROC OF THE 6TH INT CONFERENCE ON DIGITAL AUDIO EFFECTS (DAFX-03), 11 September 2003 (2003-09-11), LONDON, UK, XP002321810 *
SCHEIRER ERIC D: "Tempo and beat analysis of acoustic musical signals", JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, AMERICAN INSTITUTE OF PHYSICS. NEW YORK, US, vol. 103, no. 1, January 1998 (1998-01-01), pages 588 - 601, XP012000051, ISSN: 0001-4966 *
SEPPÄNEN J: "Tatum grid analysis of musical signals", APPLICATIONS OF SIGNAL PROCESSING TO AUDIO AND ACOUSTICS, 2001 IEEE WORKSHOP, OCT. 21-24, 2001, PISCATAWAY, NJ, USA, IEEE, 21 October 2001 (2001-10-21), pages 131 - 134, XP010566892, ISBN: 0-7803-7126-7 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011051279A1 (fr) 2009-10-30 2011-05-05 Dolby International Ab Estimation perceptuelle de tempo adaptable en complexité
CN102754147A (zh) * 2009-10-30 2012-10-24 杜比国际公司 复杂度可缩放的感知节拍估计
JP2013508767A (ja) * 2009-10-30 2013-03-07 ドルビー インターナショナル アーベー 複雑さがスケーラブルな知覚的テンポ推定
RU2507606C2 (ru) * 2009-10-30 2014-02-20 Долби Интернешнл Аб Перцептивная оценка темпа с масштабируемой сложностью
KR101370515B1 (ko) 2009-10-30 2014-03-06 돌비 인터네셔널 에이비 복합 확장 인지 템포 추정 시스템 및 추정방법
CN102754147B (zh) * 2009-10-30 2014-10-22 杜比国际公司 复杂度可缩放的感知节拍估计
CN104157280A (zh) * 2009-10-30 2014-11-19 杜比国际公司 复杂度可缩放的感知节拍估计
TWI484473B (zh) * 2009-10-30 2015-05-11 Dolby Int Ab 用於從編碼位元串流擷取音訊訊號之節奏資訊、及估算音訊訊號之知覺顯著節奏的方法及系統
EP2988297A1 (fr) * 2009-10-30 2016-02-24 Dolby International AB Estimation de tempo perceptive échelonnable en complexité
US9466275B2 (en) 2009-10-30 2016-10-11 Dolby International Ab Complexity scalable perceptual tempo estimation
JP2014515124A (ja) * 2011-04-28 2014-06-26 ドルビー・インターナショナル・アーベー 効率的なコンテンツ分類及びラウドネス推定
US9135929B2 (en) 2011-04-28 2015-09-15 Dolby International Ab Efficient content classification and loudness estimation

Also Published As

Publication number Publication date
EP1797507A1 (fr) 2007-06-20
EP1797507B1 (fr) 2011-06-15

Similar Documents

Publication Publication Date Title
US7342167B2 (en) Apparatus and method for generating an encoded rhythmic pattern
Burred et al. Hierarchical automatic audio signal classification
US9313593B2 (en) Ranking representative segments in media data
Dixon et al. Towards Characterisation of Music via Rhythmic Patterns.
KR100838674B1 (ko) 오디오 핑거프린팅 시스템 및 방법
Baluja et al. Waveprint: Efficient wavelet-based audio fingerprinting
Typke Music retrieval based on melodic similarity
KR100659672B1 (ko) 핑거프린트를 생성하는 방법과 장치 및 오디오 신호를 식별하는 방법과 장치
CN100454298C (zh) 旋律数据库搜索
Casey et al. The importance of sequences in musical similarity
US20050247185A1 (en) Device and method for characterizing a tone signal
KR20080054393A (ko) 음악 분석
US20060155399A1 (en) Method and system for generating acoustic fingerprints
Arzt et al. Fast Identification of Piece and Score Position via Symbolic Fingerprinting.
Welsh et al. Querying large collections of music for similarity
JP2004530153A6 (ja) 信号を特徴付ける方法および装置、および、索引信号を生成する方法および装置
EP1797507B1 (fr) Appareil et procede destines a generer un motif rythmique code
Paulus Signal processing methods for drum transcription and music structure analysis
Cherla et al. Automatic phrase continuation from guitar and bass guitar melodies
JP3934556B2 (ja) 信号識別子の抽出方法及びその装置、信号識別子からデータベースを作成する方法及びその装置、及び、検索時間領域信号を参照する方法及びその装置
JP2004531758A5 (fr)
CN120183441B (zh) 基于数据码头部识别的音乐信息分类方法及系统
JP2007536586A (ja) 音信号の特徴を記述する装置および方法
Gruhne et al. Extraction of Drum Patterns and their Description within the MPEG-7 High-Level-Framework.
Kendrick Music Data Mining

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2004790208

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWP Wipo information: published in national office

Ref document number: 2004790208

Country of ref document: EP