Title of Invention

"AN INFORMATION PROCESSING APPARATUS AND A METHOD THEREOF"

Abstract The present invention relates to an information processing apparatus which creates a feature detection algorithm for detecting features from content data, said apparatus comprising a low-level feature extraction expression list creation means for creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, said low-level feature extraction expressions being expressions to which either said content data or metadata corresponding to said content data is input and from which low-level features are output; computation means for computing said low-level features using said next-generation expression lists created by said low-level feature extraction expression list creation means; and high-level feature extraction expression creation means for creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to said content data, said high-level feature extraction expressions being expressions to which said low-level features computed by said computation means are input and from which high-level features characteristic of said content data are output.
Full Text [Name of Document] Description
[Title of the Invention] INFORMATION PROCESSING APPARATUS,
INFORMATION PROCESSING METHOD, AND PROGRAM
[Technical Field]
[0001]
The present invention relates to an information processing apparatus, an information processing method, and a program. More particularly, the invention relates to an information processing apparatus, an information processing method, and a program for automatically creating an algorithm used to extract features from content data such as song data.
[Background Art]
[0002]
There have been proposed inventions regarding the automatic creation of algorithms to which song data is input and from which features (speed, brightness, gaiety, etc. of songs) of the song data are output (e.g., see Patent Document 1). [0003]
[Patent Document 1] U.S. Patent No. 0181401A1 [Disclosure of the Invention] [Problems to be Solved by the Invention]

[0004]
The above-cited patent application, as shown in Fig. 1, proposes creating feature extraction algorithms for extracting features by feature type. The computations involved in extracting features are enormous and include numerous redundancies. [0005]
It has thus been desired to develop a method for creating an algorithm capable of quickly extracting features from song data through computations with a minimum of redundancies. [0006]
The present invention has been made in view of the above circumstances and provides arrangements for creating an algorithm capable of quickly extracting features with high accuracy from input content data such as song data.
[Means for Solving the Problems] [0007]
In carrying out the present invention and according to one embodiment thereof, there is provided an information processing apparatus which creates a feature detection algorithm for detecting features from content data, the information processing apparatus including:

low-level feature extraction expression list creation means for creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, the low-level feature extraction expressions being expressions to which either the content data or metadata corresponding to the content data is input and from which low-level features are output; computation means for computing the low-level features using the next-generation expression lists created by the low-level feature extraction expression list creation means; and high-level feature extraction expression creation means for creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to the content data, the high-level feature extraction expressions being expressions to which the low-level features computed by the computation means are input and from which high-level features characteristic of the content data are output. [0008]
Preferably, the high-level feature extraction expression creation means may compute at least either accuracy levels of the created high-level feature

extraction expressions or contribution ratios of the low-level features in the high-level feature extraction expressions; and the low-level feature extraction expression list creation means may update the low-level feature extraction expressions constituting the low-level feature extraction expression lists at least on the basis of either the accuracy levels of the high-level feature extraction expressions or the contribution ratios of the low-level features in the high-level feature extraction expressions, the accuracy levels and the contribution ratios having been computed by the high-level feature extraction expression creation means. [0009]
Preferably, the low-level feature extraction expression list creation means may randomly create first-generation expression lists. [0010]
Preferably, the low-level feature extraction expression list creation means may create the next-generation expression lists using a genetic algorithm based on the latest-generation expression lists through at least one of a selection process, a cross process, and a mutation process. [0011]

Preferably, the low-level feature extraction expression list creation means may create the next-generation expression lists each constituted by a predetermined constant number of low-level feature extraction expressions. [0012]
Preferably, the low-level feature extraction expression list creation means may create the next-generation expression lists each constituted by a predetermined constant number of low-level feature extraction expressions randomly determined every time each of the lists is created. [0013]
Preferably, the high-level feature extraction expression creation means may compute at least either evaluation values of the created high-level feature extraction expressions or contribution ratios of the low-level features in the high-level feature extraction expressions; and the low-level feature extraction expression list creation means may update the low-level feature extraction expressions constituting the low-level feature extraction expression lists at least on the basis of either the evaluation values of the high-level feature extraction expressions or the contribution ratios of the

low-level features in the high-level feature extraction expressions, the evaluation values and the contribution ratios having been computed by the high-level feature extraction expression creation means. [0014]
According to another embodiment of the present invention, there is provided an information processing method for use with an information processing apparatus which creates a feature detection algorithm for detecting features from content data, the information processing method including the steps of: creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, the low-level feature extraction expressions being expressions to which either the content data or metadata corresponding to the content data is input and from which low-level features are output; computing the low-level features using the created next-generation expression lists; and creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to the content data, the high-level feature extraction expressions being expressions to which the

computed low-level features are input and from which high-level features characteristic of the content data are output. [0015]
According to a further embodiment of the present invention, there is provided a program for controlling an information processing apparatus which creates a feature detection algorithm for detecting features from content data, the program causing a computer of the information processing apparatus to carry out a procedure including the steps of: creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, the low-level feature extraction expressions being expressions to which either the content data or metadata corresponding to the content data is input and from which low-level features are output; computing the low-level features using the created next-generation expression lists; and creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to the content data, the high-level feature extraction expressions being expressions to which the computed low-

level features are input and from which high-level features characteristic of the content data are output. [0016]
Where any one of the above-outlined embodiments of the present invention is in use, next-generation expression lists each constituted by a plurality of low-level feature extraction expressions are first created through learning based on latest-generation expression lists, the low-level feature extraction expressions being expressions to which either the content data or metadata corresponding to the content data is input and from which low-level features are output. The low-level features are computed using the created next-generation expression lists. High-level feature extraction expressions are then created through learning based on training data constituted by previously furnished true high-level features corresponding to the content data, the high-level feature extraction expressions being expressions to which the computed low-level features are input and from which high-level features characteristic of the content data are output. [Effects of the Invention] [0017]
The embodiments of the present invention, as

outlined above, make it possible to create an algorithm
capable of quickly extracting features with high accuracy
from input content data such as song data.
[Brief Description of the Drawings]
[0018]
[Fig. 1] Fig. 1 is a schematic view explanatory of
feature extraction algorithms in the past.
[Fig. 2] Fig. 2 is a schematic view outlining a feature
extraction algorithm created by a feature extraction
algorithm creation apparatus according to the present
invention.
[Fig. 3] Fig. 3 is a schematic view showing typical low-level feature extraction expressions.
[Fig. 4] Fig. 4 is a schematic view showing typical high-level feature extraction expressions.
[Fig. 5] Fig. 5 is a block diagram showing a first structural example of the feature extraction algorithm creation apparatus according to the present invention. [Fig. 6] Fig. 6 is a block diagram showing a structural example of a high-level feature computation section constituting part of the feature extraction algorithm creation apparatus.
[Fig. 7] Fig. 7 is a flowchart of steps constituting a feature extraction algorithm learning process.

[Fig. 8] Fig. 8 is a schematic view showing typical low-level feature extraction expression lists. [Fig. 9] Fig. 9 is a flowchart of steps constituting a low-level feature extraction expression list creation process.
[Fig. 10] Fig. 10 is a flowchart of steps constituting a first-generation list random creation process performed by a low-level feature extraction expression list creation section as part of the structure in Fig. 5. [Fig. 11] Fig. 11 is a schematic view showing how a low-level feature extraction expression is typically described.
[Fig. 12] Fig. 12 is a tabular view listing examples of input data.
[Fig. 13] Fig. 13 is a schematic view explanatory of input data "Wav".
[Fig. 14] Fig. 14 is a schematic view explanatory of input data "Chord".
[Fig. 15] Fig. 15 is a schematic view explanatory of input data "Key".
[Fig. 16] Fig. 16 is a schematic view explanatory of holding dimensions for a low-level feature extraction expression. [Fig. 17] Fig. 17 is a flowchart of steps constituting a

next-generation list genetic creation process; [Fig. 18] Fig. 18 is a flowchart of steps constituting a selection creation process performed by the low-level feature extraction expression list creation section as part of the structure in Fig. 5.
[Fig. 19] Fig. 19 is a flowchart of steps constituting a cross creation process performed by the low-level feature extraction expression list creation section as part of the structure in Fig. 5.
[Fig. 20] Fig. 20 is a flowchart of steps constituting a mutation creation process performed by the low-level feature extraction expression list creation section as part of the structure in Fig. 5.
[Fig. 21] Fig. 21 is a schematic view explanatory of computations using an operator "Mean". [Fig. 22] Fig. 22 is a schematic view explanatory of processing by a low-level feature computation section. [Fig. 23] Fig. 23 is a schematic view explanatory of examples of training data.
[Fig. 24] Fig. 24 is a flowchart of steps constituting a high-level feature extraction expression learning process performed by a high-level feature extraction expression learning section as part of the structure in Fig. 5. [Fig. 25] Fig. 25 is a graphic representation explanatory

of a learning algorithm.
[Fig. 26] Fig. 26 is a graphic representation explanatory
of another learning algorithm.
[Fig. 27] Fig. 27 is a graphic representation explanatory
of another learning algorithm.
[Fig. 28] Fig. 28 is a graphic representation explanatory
of another learning algorithm.
[Fig. 29] Fig. 29 is a graphic representation explanatory
of another learning algorithm.
[Fig. 30] Fig. 30 is a graphic representation explanatory of another learning algorithm.
[Fig. 31] Fig. 31 is a graphic representation explanatory of another learning algorithm.
[Fig. 32] Fig. 32 is a graphic representation explanatory of another learning algorithm.
[Fig. 33] Fig. 33 is a graphic representation explanatory of another learning algorithm.
[Fig. 34] Fig. 34 is a flowchart of steps constituting a learning process based on a learning algorithm and performed by the high-level feature extraction expression learning section as part of the structure in Fig. 5.
[Fig. 35] Fig. 35 is a schematic view showing a typical combination of operators.
[Fig. 36] Fig. 36 is a schematic view showing another

typical combination of operators.
[Fig. 37] Fig. 37 is a flowchart of steps constituting a new operator creation process.
[Fig. 38] Fig. 38 is a flowchart of steps constituting a high-accuracy high-level feature computation process.
[Fig. 39] Fig. 39 is a flowchart of steps constituting a high-accuracy reject process.
[Fig. 40] Fig. 40 is a block diagram showing a second structural example of the feature extraction algorithm creation apparatus according to the present invention. [Fig. 41] Fig. 41 is a flowchart of steps constituting a first-generation list random creation process performed by a low-level feature extraction expression list creation section as part of the structure in Fig. 40. [Fig. 42] Fig. 42 is a flowchart of steps constituting a selection creation process performed by the low-level feature extraction expression list creation section as part of the structure in Fig. 40.
[Fig. 43] Fig. 43 is a flowchart of steps constituting a cross creation process performed by the low-level feature extraction expression list creation section as part of the structure in Fig. 40.
[Fig. 44] Fig. 44 is a flowchart of steps constituting a mutation creation process performed by the low-level

feature extraction expression list creation section as part of the structure in Fig. 40.
[Fig. 45] Fig. 45 is a flowchart of steps constituting a high-level feature extraction expression learning process performed by a high-level feature extraction expression learning section as part of the structure in Fig. 40. [Fig. 46] Fig. 46 is a flowchart of steps constituting a learning process based on a learning algorithm and performed by the high-level feature extraction expression learning section as part of the structure in Fig. 5. [Fig. 47] Fig. 47 is a block diagram showing a typical structure of a general-purpose personal computer. [Description of Reference Numerals] [0019]
20 ••• feature extraction algorithm creation apparatus, 21 ••• low-level feature extraction expression list creation section, 22 ••• operator combination detection section, 23 ••• operator creation section, 24 ••• low-level feature computation section, 25 ••• high-level feature extraction expression learning section, 26 ••• high-level feature computation section, 27 ••• control section, 41 ••• low-level feature computation section, 42 ••• high-level feature computation section, 43 ••• square error computation section, 44 ••• reject area extraction

expression learning section, 45 ••• feature extraction accuracy computation section, 60 ••• feature extraction algorithm creation apparatus, 61 ••• low-level feature extraction expression list creation section, 62 ••• high-level feature computation section, 100 ••• personal computer, 101 ••• CPU, 111 ••• recording medium [Best Mode for Carrying out the Invention] [0020]
Preferred embodiments of the present invention will now be described in reference to the accompanying drawings. [0021]
Fig. 2 is a schematic view outlining a feature extraction algorithm created by a feature extraction algorithm creation apparatus 20 (Fig. 5) or 60 (Fig. 40) according to the present invention. [0022]
This feature extraction algorithm 11 is made up of two sections: a low-level feature extraction section 12 to which content data (song data) and metadata (attribute data) corresponding to the content data are input and from which low-level features are output; and a high-level feature extraction section 14 to which the low-level features are input and from which high-level

features are output. [0023]
The low-level feature extraction section 12 has a low-level feature extraction expression list 13 composed of as many as "m" low-level feature extraction expressions each having a combination of at least one operator by which to perform predetermined operations on input data. The low-level feature extraction section 12 thus outputs "m" low-level features to the high-level feature extraction section 14. [0024]
The number "m" of low-level feature extraction expressions constituting the low-level feature extraction expression list 13 is a predetermined constant in the case of the feature extraction algorithm creation apparatus 20 shown in ,Fig. 5. For the feature extraction algorithm creation apparatus 60 in Fig. 40, the number "m" is a randomly determined number. [0025]
Fig. 3 is a schematic view showing typical low-level feature extraction expressions. [0026]
Illustratively, a low-level feature extraction expression f1 in part A of Fig. 3 inputs waveform data as

one type of song data, computes mean values of the input waveform data between channels involved (e.g., left and right), subjects the computed mean values to fast Fourier transformation (FFT) on the temporal axis, acquires a standard deviation (StDev) of frequency from the FFT result, and outputs the resulting standard deviation as a low-level feature "a." [0027]
As another example, a low-level feature extraction expression f2 in part B of Fig. 3 inputs chord progression data as another type of song data, acquires the incidence (ratio) of minor chords on the temporal axis, and outputs the resulting incidence as a low-level feature "b." [0028]
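Illustratively, the two example expressions of Fig. 3 may be sketched as follows in Python (assuming the NumPy library; the function names and the stereo/chord input formats are assumptions made only for this illustration and do not limit the embodiment):

import numpy as np

def low_level_feature_f1(waveform):
    # Expression f1 of Fig. 3: waveform data of shape (channels, samples)
    mono = waveform.mean(axis=0)            # Mean over the channel axis
    spectrum = np.abs(np.fft.rfft(mono))    # FFT along the temporal axis
    return float(np.std(spectrum))          # StDev over frequency -> feature "a"

def low_level_feature_f2(chords):
    # Expression f2 of Fig. 3: chord progression such as ["C", "Am", "F", "G"]
    minors = sum(1 for chord in chords if chord.endswith("m"))
    return minors / len(chords)             # Ratio of minor chords -> feature "b"

# Example usage with stand-in data
stereo = np.random.randn(2, 1024)           # two-channel PCM stand-in
print(low_level_feature_f1(stereo))
print(low_level_feature_f2(["C", "Am", "F", "G", "Em", "C"]))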
The low-level feature output by the low-level feature extraction section 12 need not be a significant value in itself. [0029]
The high-level feature extraction section 14 possesses "k" high-level feature extraction expressions that input "m" low-level features, perform relatively simple computations (arithmetic operations, raising to powers, etc.) on at least one of the input "m" low-level

features, and output the results of the operations as high-level features. The high-level feature extraction section 14 thus outputs "k" high-level features. [0030]
Fig. 4 is a schematic view showing typical high-level feature extraction expressions. [0031]
Illustratively, a high-level feature extraction expression F1 in part A of Fig. 4 performs arithmetic operations on low-level features "a," "b," "c," "d" and "e," and outputs the result of the operations as a speed value constituting a single high-level feature. [0032]
As another example, a high-level feature extraction expression F2 in part B of Fig. 4 performs arithmetic operations and raising to powers on low-level features "a," "c," "d" and "e," and outputs the result of the operations as a brightness value constituting another single high-level feature. [0033]
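Illustratively, the forms of the expressions in Fig. 4 may be sketched as follows in Python (the coefficients below are hypothetical placeholders; the actual values are obtained through the learning described later):

def high_level_speed(a, b, c, d, e):
    # Form of expression F1 of Fig. 4: a linear combination of low-level features
    return 0.5 * a + 1.2 * b - 0.3 * c + 0.8 * d - 0.1 * e

def high_level_brightness(a, c, d, e):
    # Form of expression F2 of Fig. 4: arithmetic operations and raising to powers
    return 0.7 * a + 0.2 * (c ** 2) - 0.5 * d + 0.9 * (e ** 2)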
Fig. 5 is a block diagram showing the first structural example of the feature extraction algorithm creation apparatus 20 according to the present invention. [0034]

This feature extraction algorithm creation apparatus 20 creates optimal low-level and high-level feature extraction expressions through learning based on a genetic algorithm. [0035]
The feature extraction algorithm creation apparatus 20 is made up of five major sections: a low-level feature extraction expression list creation section 21 that creates "n" low-level feature extraction expression lists each composed of "m" low-level feature extraction expressions; a low-level feature computation section 24 that substitutes input data (content data and metadata) of "j" songs into the "n" low-level feature extraction expression lists supplied by the low-level feature extraction expression list creation section 21 and computes "n" combinations of "m" low-level features corresponding to each input data item; a high-level feature extraction expression learning section 25 that estimates high-level feature extraction expressions through learning based on training data ("k" high-level features corresponding to each of "j" songs) corresponding to the "n" combinations of low-level features output by the low-level feature computation section 24; a high-level feature computation section 26 that computes a high-level feature using a

high-level feature extraction expression eventually created through genetic learning; and a control section 27 that controls loops performed by the sections involved. [0036]
In the description that follows, the learning based on a genetic algorithm may also be referred to as genetic learning. [0037]
The low-level feature extraction expression list creation section 21 randomly creates low-level feature extraction expression lists each composed of "m" (a predetermined constant) low-level feature extraction expressions for the first generation. For each of the second and subsequent generations, the low-level feature extraction expression list creation section 21 typically creates low-level feature extraction expression lists through learning using the low-level features based on the low-level feature extraction expression lists of the latest generation. [0038]
An operator combination detection section 22 incorporated in the low-level feature extraction expression list creation section 21 detects a combination of a plurality of operators that frequently appear in the

created low-level feature extraction expressions. An operator creation section 23 registers the combination of multiple operators detected by the operator combination detection section 22 as a new operator. [0039]
The high-level feature extraction expression learning section 25 creates "k" high-level feature extraction expressions corresponding to each of the "n" combinations of low-level features, computes an estimated accuracy level of each of the high-level feature extraction expressions and the contribution ratio of each of the low-level features in each of the high-level feature extraction expressions, and outputs the results of the computations to the low-level feature extraction expression list creation section 21. For the latest generation of learning, the high-level feature extraction expression learning section 25 selects from the "n" low-level feature extraction expression lists "m" low-level feature extraction expressions constituting the low-level feature extraction expression list having the highest mean accuracy of the acquired high-level features, and supplies the high-level feature computation section 26 with the selected "m" low-level feature extraction expressions along with "k" high-level feature extraction expressions corresponding to the list.

[0040]
The high-level feature computation section 26
computes a high-level feature using the low-level and
high-level feature extraction expressions eventually
supplied by the high-level feature extraction expression
learning section 25.
[0041]
Fig. 6 is a block diagram showing a detailed
structural example of the high-level feature computation
section 26.
[0042]
The high-level feature computation section 26 is made up of five major sections: a low-level feature computation section 41 that computes low-level features by substituting input data (content data and metadata corresponding to the content data) into the eventually acquired low-level feature extraction expression list; a high-level feature computation section 42 that computes a high-level feature by substituting the low-level features computed by the low-level feature computation section 41 into the eventually acquired high-level feature extraction expression; a square error computation section 43 that computes a square error between the high-level feature computed by the high-level feature computation

section 42 on the one hand, and training data (true high-level feature corresponding to the input data) on the other hand; a reject area extraction expression learning section 44 that creates through learning a reject area extraction expression to which the low-level features computed by the low-level feature computation section 41 are input and from which the square error computed by the square error computation section 43 is output; and a feature extraction accuracy computation section 45 that substitutes the input data into the reject area extraction expression created by the reject area extraction expression learning section 44, estimates the feature extraction accuracy level (square error) of the high-level feature computed with regard to the input data, and causes the high-level feature computation section 42 to compute the high-level feature only if the estimated feature extraction accuracy level is higher than a predetermined threshold value. [0043]
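Illustratively, the flow of Fig. 6 may be sketched as follows in Python (every name below is a hypothetical stand-in for the corresponding section, the reject area extraction expression is treated as an arbitrary callable returning an estimated square error, and the threshold value is an assumption):

def compute_high_level_feature(input_data, low_level_list,
                               high_level_expression, reject_area_expression,
                               threshold=0.1):
    # Low-level feature computation section 41
    low_feats = [expression(input_data) for expression in low_level_list]
    # Feature extraction accuracy computation section 45: estimate the square
    # error that the high-level expression would incur for this input data.
    estimated_square_error = reject_area_expression(low_feats)
    if estimated_square_error > threshold:
        return None    # reject: estimated accuracy too low for this input
    # High-level feature computation section 42
    return high_level_expression(low_feats)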
Described below are the workings of the feature extraction algorithm creation apparatus 20 as a first embodiment of the present invention. [0044]
Fig. 7 is a flowchart of steps constituting a

feature extraction algorithm creation process. This is the basic operation of the feature extraction algorithm creation apparatus 20. [0045]
In step S1, the control section 27 starts a learning loop by initializing a learning loop parameter G to "1." The learning loop is repeated as many times as a predetermined learning count "g." [0046]
In step S2, the low-level feature extraction expression list creation section 21 creates "n" low-level feature extraction expression lists each composed of "m" low-level feature extraction expressions and outputs the created lists to the low-level feature computation section 24, as shown in Fig. 8. [0047]
The process of step S2 (i.e., low-level feature extraction expression list creation process) is described below in detail with reference to the flowchart of Fig. 9. [0048]
In step S11, the low-level feature extraction expression list creation section 21 checks to determine whether the low-level feature extraction expression lists to be created are of the first generation. If the learning

loop parameter G is set to "1," then the low-level feature extraction expression lists to be created are found to be of the first generation. [0049]
When the low-level feature extraction expression lists to be created are found to be of the first generation because the learning loop parameter G is set to "1," step S12 is reached. In step S12, the low-level feature extraction expression list creation section 21 randomly creates low-level feature extraction expression lists of the first generation. [0050]
By contrast, if the low-level feature extraction expression lists to be created are not found to be of the first generation, then step S13 is reached. In step S13, the low-level feature extraction expression list creation section 21 genetically creates low-level feature extraction expression lists of the next generation using a genetic algorithm based on the low-level feature extraction expression lists of the latest generation. [0051]
The process of step S12 (i.e., first-generation list random creation process) is described below in detail with reference to the flowchart of Fig. 10.

[0052]
In step S21, the control section 27 starts a list loop by initializing a list loop parameter N to "1." The list loop is repeated as many times as a predetermined list count "n." [0053]
In step S22, the control section 27 starts an expression loop by initializing an expression loop parameter M to "1." The expression loop is repeated as many times as the number "m" of low-level feature extraction expressions constituting a single low-level feature extraction expression list. [0054]
Explained below in reference to Fig. 11 is how a low-level feature extraction expression created during the expression loop is typically described. [0055]
Input data is described in the leftmost position of the low-level feature extraction expression. To the right of the input data are one or a plurality of operators described in the order in which they are subject to computation. Each operator may include a process symmetry axis and a parameter as needed. [0056]

In the case of the low-level feature extraction
expression shown in Fig. 11, the input data is
"12TonesM," and the operators are "32#Differential,"
"32#MaxIndex," "16#LPF_;0.861," etc. Illustratively, the
input data "12TonesM" indicates that monaural PCM (pulse
coded modulation sound source) waveform data appears in
the temporal axis direction. The notation "48#" stands
for the channel axis, "32#" for the frequency and
interval axes, and "16#" for the temporal axis. The
parameter "0.861" in an operator applies to low-pass
filtering and illustratively represents a threshold
frequency for the filtering process.
[0057]
In step S23 back in Fig. 10, the low-level feature extraction expression list creation section 21 randomly determines the input data for the M-th low-level feature extraction expression (also referred to as the low-level feature extraction expression M hereunder) in the N-th low-level feature extraction expression list (also called the list N hereunder) to be created. [0058]
The typical input data types may include "Wav," "12Tones," "Chord," and "Key," as shown in Fig. 12. [0059]

The input data "WAV" is PCM waveform data such as that shown in Fig. 13, and the holding dimensions involved are the temporal and channel axes. The input data "12Tones" is PCM waveform data analyzed by interval on the temporal axis, and the holding dimensions are the temporal and interval axes. The input data "Chord" represents a song chord progression (C, C#, D, ..., Bm) such as that shown in Fig. 14, and the holding dimensions are the temporal and interval axes. The input data "Key" denotes song keys (C, C#, D, . . . , B), and the holding dimensions are the temporal and interval axes. [0060]
In step S24 back in Fig. 10, the low-level feature extraction expression list creation section 21 randomly determines one operator, along with a process symmetry axis and a parameter as needed, for the low-level feature extraction expression M in the list N to be created. [0061]
The operator types may include mean value (Mean), fast Fourier transformation (FFT), standard deviation (StDev), incidence (Ratio), low-pass filter (LPF), high-pass filter (HPF), absolute value (ABS), differentiation (Differential), maximum value (MaxIndex), and unbiased variance (UVariance). The process symmetry axis may turn

out to be fixed for a particular operator that has been determined. In such a case, the process symmetry axis fixed for the operator of interest is adopted. If a determined operator turns out to demand a particular parameter, that parameter is set to either a random value or a predetermined value. [0062]
In step S25, the low-level feature extraction expression list creation section 21 checks to determine whether the result of the computation by the low-level feature extraction expression M in the list N created up to that point in time is either a scalar (one-dimensional) or at most a predetermined number of dimensions (e.g., a small number such as 1 or 2). If the result of the check in step S25 is negative, then step S24 is reached again and one more operator is added. With steps S24 and S25 repeated, the number of holding dimensions in the computed result decreases as shown in Fig. 16. If in step S25 the result of the computation by the low-level feature extraction expression M in the list N is either a scalar or at most a predetermined small number of dimensions (e.g., 1 or 2), then step S26 is reached. [0063]
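Illustratively, the loop of steps S23 through S25 may be sketched as follows in Python: operators are appended at random until the running result collapses to a scalar or to a small number of holding dimensions. The operator table and the dimension bookkeeping below are simplified assumptions made only for this illustration.

import random

# Hypothetical table: operator name -> number of holding dimensions it removes
OPERATORS = {"Mean": 1, "StDev": 1, "Ratio": 1, "MaxIndex": 1,
             "FFT": 0, "LPF": 0, "HPF": 0, "ABS": 0, "Differential": 0}

INPUT_DATA = {"Wav": 2, "12Tones": 2, "Chord": 2, "Key": 2}   # holding dimensions

def create_low_level_expression(max_dims=1):
    # Step S23: randomly determine the input data
    name, dims = random.choice(list(INPUT_DATA.items()))
    expression = [name]
    while dims > max_dims:                            # check of step S25
        op, removed = random.choice(list(OPERATORS.items()))
        axis = random.choice(["16#", "32#", "48#"])   # process symmetry axis
        expression.append(axis + op)                  # step S24: add one operator
        dims -= removed
    return expression

print(create_low_level_expression())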

In step S26, the control section 27 checks to determine whether the expression loop parameter M is smaller than a maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the parameter M is incremented by "1" and step S23 is reached again. Conversely, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S27 is reached. The N-th low-level feature extraction expression list is thus created by the processing up to this point. [0064]
In step S27, the control section 27 checks to determine whether the list loop parameter N is smaller than a maximum value "n." If the list loop parameter N is found to be smaller than the maximum value "n," then the parameter N is incremented by "1" and step S22 is reached again. On the other hand, if the list loop parameter N is not found to be smaller than the maximum value "n" (i.e., if the parameter N is equal to the maximum value "n"), then the list loop is exited and the first-generation list random creation process is terminated. As many as "n" low-level feature extraction expression lists of the

first generation are thus created by the processing up to
this point.
[0065]
Described below in detail with reference to the flowchart of Fig. 17 is the process for creating low-level feature extraction expression lists of the second and subsequent generations (i.e., next-generation list genetic creation process) in step S13 of Fig. 9. [0066]
In step S31, the low-level feature extraction expression list creation section 21 randomly determines three numbers: a selection count "ns" representing the number of lists to which genetic algorithm selection is applied out of the "n" low-level feature extraction expression lists to be created; a cross count "nx" denoting the number of lists to which genetic algorithm crossing is applied; and a mutation count "nm" indicating the number of lists to which genetic algorithm mutation is applied. The total sum of the selection count "ns," cross count "nx," and mutation count "nm" is equal to "n." Alternatively, the selection count "ns," cross count "nx," and mutation count "nm" may each be a predetermined constant. [0067]

In step S32, the low-level feature extraction expression list creation section 21 creates as many as "ns" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the selection count "ns" determined out of the "n" low-level feature extraction expression lists of the latest generation. In step S33, the low-level feature extraction expression list creation section 21 creates as many as "nx" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the cross count "nx" determined out of the "n" low-level feature extraction expression lists of the latest generation. In step S34, the low-level feature extraction expression list creation section 21 creates as many as "nm" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the mutation count "nm" determined out of the "n" low-level feature extraction expression lists of the latest generation. [0068]
The process in each of steps S32 through S34 is described below in detail. [0069]
The selection creation process of step S32 will now

be described in detail with reference to the flowchart of Fig. 18. This selection creation process creates the low-level feature extraction expression lists amounting to the selection count "ns" out of the "n" low-level feature extraction expression lists of the next generation. [0070]
In step S41, the low-level feature extraction expression list creation section 21 sorts the "n" low-level feature extraction expression lists of the latest generation in descending order of the mean estimated accuracy levels of the high-level feature extraction expressions input from the high-level feature extraction expression learning section 25. In step S42, the low-level feature extraction expression list creation section 21 adopts the top "ns" low-level feature extraction expression lists from the sorted "n" low-level feature extraction expression lists of the latest generation as the low-level feature extraction expression lists of the next generation. This brings the selection creation process to an end. [0071]
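Illustratively, the selection creation process of steps S41 and S42 may be sketched as follows in Python, assuming that each latest-generation list is paired with the mean estimated accuracy reported by the high-level feature extraction expression learning section 25 (the data layout is an assumption):

def selection_creation(latest_lists, mean_accuracies, ns):
    # Step S41: sort the latest-generation lists by mean estimated accuracy
    ranked = sorted(zip(latest_lists, mean_accuracies),
                    key=lambda pair: pair[1], reverse=True)
    # Step S42: carry the top ns lists over to the next generation unchanged
    return [lst for lst, _ in ranked[:ns]]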
The cross creation process of step S33 in Fig. 17 will now be described in detail with reference to the flowchart of Fig. 19. This cross creation process creates

the low-level feature extraction expression lists amounting to the cross count "nx" out of the "n" low-level feature extraction expression lists of the next generation. [0072]
In step S51, the control section 27 starts a cross loop by initializing a cross loop parameter NX to "1." The cross loop is repeated as many times as the cross count "nx." [0073]
In step S52, the low-level feature extraction expression list creation section 21 weights the low-level feature extraction expression lists of the latest generation so as to induce preferential selection of the lists in descending order of the mean estimated accuracy levels of the high-level feature extraction expressions output by the high-level feature extraction expression learning section 25, before randomly selecting two low-level feature extraction expression lists A and B. During this selection, the "ns" low-level feature extraction expression lists selected by the above-described selection creation process may be either excluded from the candidate lists or may be left intact as part of the candidate lists.

[0074]
In step S53, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the number "m" of expressions included in a single low-level feature extraction expression list.
[0075]
In step S54, the low-level feature extraction expression list creation section 21 weights the "2m" low-level feature extraction expressions in the low-level feature extraction expression lists A and B so as to induce preferential selection of the expressions in descending order of the contribution ratios of the low-level features in the high-level feature extraction expressions output by the high-level feature extraction expression learning section 25, before randomly selecting a single low-level feature extraction expression for inclusion in a low-level feature extraction expression list of the next generation. [0076]
In step S55, the control section 27 checks to determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the parameter M is incremented by "1" and

step S54 is reached again. Conversely, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S56 is reached. A single low-level feature extraction expression list is thus created in steps S53 through S55 constituting the expression loop. [0077]
In step S56, the control section 27 checks to determine whether the cross loop parameter NX is smaller than the maximum value "nx." If the cross loop parameter NX is found to be smaller than the maximum value "nx," then the parameter NX is incremented by "1" and step S52 is reached again. On the other hand, if the cross loop parameter NX is not found to be smaller than the maximum value "nx" (i.e., if the parameter NX is equal to the maximum value "nx"), then the cross loop is exited and the cross creation process is terminated. The low-level feature extraction expression lists amounting to the cross count "nx" are thus created by the cross loop that has been executed. [0078]
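Illustratively, the cross creation process of Fig. 19 may be sketched as follows in Python, under the assumption that accuracy-weighted and contribution-weighted random choices are implemented with a roulette-wheel helper; all names are illustrative, and the low-level feature extraction expressions are assumed to be represented by hashable strings:

import random

def weighted_pick(items, weights):
    # Roulette-wheel selection: a higher weight means a higher pick probability
    return random.choices(items, weights=weights, k=1)[0]

def cross_creation(latest_lists, mean_accuracies, contribution, nx, m):
    next_lists = []
    for _ in range(nx):                                   # cross loop (S51, S56)
        # Step S52: pick two parent lists, weighted by mean estimated accuracy
        list_a = weighted_pick(latest_lists, mean_accuracies)
        list_b = weighted_pick(latest_lists, mean_accuracies)
        candidates = list_a + list_b                      # "2m" expressions
        weights = [contribution[expression] for expression in candidates]
        # Steps S53-S55: pick m expressions, weighted by contribution ratio
        child = [weighted_pick(candidates, weights) for _ in range(m)]
        next_lists.append(child)
    return next_lists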
The mutation creation process of step S34 in Fig. 17 will now be described in detail with reference to the

flowchart of Fig. 20. This mutation creation process creates the low-level feature extraction expression lists amounting to the mutation count "nm" out of the "n" low-level feature extraction expression lists of the next generation. [0079]
In step S61, the control section 27 starts a mutation loop by initializing a mutation loop parameter NM to "1." The mutation loop is repeated as many times as the mutation count "nm." [0080]
In step S62, the low-level feature extraction expression list creation section 21 weights the low-level feature extraction expression lists of the latest generation so as to induce preferential selection of the lists in descending order of the mean estimated accuracy levels of the high-level feature extraction expressions output by the high-level feature extraction expression learning section 25, before randomly selecting a single low-level feature extraction expression list A. During this selection, the "ns" low-level feature extraction expression lists selected by the above-described selection creation process may be either excluded from the candidate lists or may be left intact

as part of the candidate lists. Likewise, the low-level feature extraction expression lists selected in step S52 of the cross creation process above may be either excluded from the candidate lists or may be left intact as part of the candidate lists. [0081]
In step S63, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the number "m" of expressions included in a single low-level feature extraction expression list. [0082]
In step S64, the low-level feature extraction expression list creation section 21 checks to determine whether the contribution ratio of the low-level feature computed using the M-th low-level feature extraction expression out of "m" low-level feature extraction expressions in the low-level feature extraction expression list A is lower than the contribution ratios of the low-level features computed using the other low-level feature extraction expressions in the low-level feature extraction expression list A. More specifically, a check is made to determine whether the contribution ratio of the low-level feature computed

using the M-th low-level feature extraction expression ranks within a predetermined lowest proportion of the contribution ratios computed using the "m" low-level feature extraction expressions in the low-level feature extraction expression list A. [0083]
If in step S64 the contribution ratio of the low-level feature computed using the M-th low-level feature extraction expression is found to be lower than the others, then step S65 is reached. In step S65, the low-level feature extraction expression list creation section 21 randomly mutates the M-th low-level feature extraction expression for inclusion in the low-level feature extraction expression list of the next generation. [0084]
If in step S64 the contribution ratio of the low-level feature computed using the M-th low-level feature extraction expression is not found to be lower than the others, then step S66 is reached. In step S66, the low-level feature extraction expression list creation section 21 adds the M-th low-level feature extraction expression to the low-level feature extraction expression list of the next generation without mutating the expression in question.

[0085]
In step S67, the control section 27 checks to determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the expression loop parameter M is incremented by "1" and step S64 is reached again. On the other hand, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S68 is reached. A single low-level feature extraction expression list is thus created in steps S63 through S67 constituting the expression loop. [0086]
In step S68, the control section 27 checks to determine whether the mutation loop parameter NM is smaller than the maximum value "nm." If the mutation loop parameter NM is found to be smaller than the maximum value "nm," then the mutation loop parameter NM is incremented by "1" and step S62 is reached again. On the other hand, if the mutation loop parameter NM is not found to be smaller than the maximum value "nm" (i.e., if the parameter NM is equal to the maximum value "nm"),

then the mutation loop is exited and the mutation creation process is terminated. The low-level feature extraction expression lists amounting to the mutation count "nm" are thus created by the processing up to this point. [0087]
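Illustratively, the mutation creation process of Fig. 20 may be sketched as follows in Python, assuming the same accuracy-weighted pick as in the cross creation sketch and a hypothetical mutate_expression() that randomly alters one expression; the rule for judging a "low" contribution ratio is simplified here to a comparison against the median of the list:

import random
import statistics

def mutation_creation(latest_lists, mean_accuracies, contribution, nm,
                      mutate_expression):
    next_lists = []
    for _ in range(nm):                                   # mutation loop (S61, S68)
        # Step S62: pick one parent list, weighted by mean estimated accuracy
        parent = random.choices(latest_lists, weights=mean_accuracies, k=1)[0]
        ratios = [contribution[expression] for expression in parent]
        cutoff = statistics.median(ratios)                # simplified criterion
        child = []
        for expression, ratio in zip(parent, ratios):
            if ratio < cutoff:                            # step S64 (simplified)
                child.append(mutate_expression(expression))   # step S65
            else:
                child.append(expression)                  # step S66: keep as is
        next_lists.append(child)
    return next_lists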
According to the above-described next-generation list genetic creation process, the low-level feature extraction expression lists of the latest generation which have high estimated accuracy levels are transmitted to the next generation and so are the low-level feature extraction expressions of the latest generation which have high contribution ratios. Those with low estimated accuracy levels and contribution ratios are discarded without being transmitted to the next generation. That is, as generations progress, the estimated accuracy levels of the low-level feature extraction expression lists are expected to improve and so are the contribution ratios of the low-level feature extraction expressions. [0088]
Returning to Fig. 7, the low-level feature extraction expression lists of the next generation created as discussed above are output by the low-level feature extraction expression list creation section 21 to

the low-level feature computation section 24. In step S3, the low-level feature computation section 24 computes low-level features by substituting the input data (content data and metadata) of "j" songs C1 through Cj into each of "n" low-level feature extraction expression lists input from the low-level feature extraction expression list creation section 21. [0089]
It is assumed that the input data items of the "j" songs are each furnished beforehand with as many as "k" training data items (i.e., corresponding high-level features). [0090]
Illustratively, suppose that the low-level feature computation section 24 operates on the input data whose holding dimensions are the interval and temporal axes through computations involving the "16#Mean" operator, as shown in part A of Fig. 21. In this case, a mean value at each interval is computed on the temporal axis used as the process symmetry axis, as depicted in part B of Fig. 21. The computations result in "n" combinations of low-level features (each combination being made up of "m" low-level features computed using one low-level feature extraction expression list), each combination corresponding to each input data item, as indicated in

Fig. 22. The resulting "n" combinations of low-level features are output to the high-level feature extraction expression learning section 25. [0091]
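Illustratively, the "16#Mean" computation of Fig. 21 may be sketched as follows in Python with NumPy (the axis layout of the stand-in data is an assumption): the mean is taken along the temporal axis, which serves as the process symmetry axis, leaving one value per interval.

import numpy as np

# Hypothetical 12Tones-style input: rows = intervals, columns = time frames
data = np.random.rand(12, 100)           # holding dimensions: (interval, time)

# "16#Mean": mean over the temporal axis, one value remaining per interval
mean_per_interval = data.mean(axis=1)    # shape (12,)
print(mean_per_interval)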
In step S4 back in Fig. 7, the high-level feature extraction expression learning section 25 estimates (i.e., creates) "n" high-level feature extraction expression combinations each composed of "k" high-level feature extraction expressions through learning based on the "n" low-level features input from the low-level feature computation section 24 and on corresponding training data (i.e., "k" types of high-level features corresponding to each input data item (about songs C1 through Cj) as shown in Fig. 23). The high-level feature extraction expression learning section 25 further computes the estimated accuracy level of each high-level feature extraction expression and the contribution ratio of each low-level feature in each high-level feature extraction expression, and outputs the results of the computations to the low-level feature extraction expression list creation section 21. [0092]
The high-level feature extraction expression learning process of step S4 is described below in detail

with reference to the flowchart of Fig. 24. [0093]
In step S71, the control section 27 starts the list loop by initializing the list loop parameter N to "1." The list loop is repeated as many times as a predetermined list count "n." In step S72, the control section 27 starts a training data loop by initializing a training data loop parameter K to "1." The training data loop is repeated as many times as a predetermined training data type count "k." [0094]
In step S73, the control section 27 starts an algorithm loop by initializing an algorithm loop parameter A to "1." The algorithm loop is repeated as many times as a predetermined learning algorithm type count "a." [0095]
Illustratively, there may be conceived four learning algorithm types: Regression (Regression analysis), Classify (Classification), SVM (Support Vector Machine), and GP (Genetic Programming). [0096]
There may be provided two "Regression" type learning algorithms. One algorithm, as shown in Fig. 25,

involves learning a parameter "bn" in such a manner that the square error between training data and an item Y is minimized on the assumption that the training data is in linear relation to low-level features. Another algorithm, as shown in Fig. 26, involves learning a parameter "bnm" in such a manner that the square error between training data and an item Y is minimized on the assumption that the training data is in nonlinear relation to low-level features. [0097]
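Illustratively, the linear "Regression" type learning of Fig. 25 may be sketched as follows in Python with NumPy: the parameters are fitted so as to minimize the square error between the training data and a linear combination of the low-level features (the data shapes below are assumptions):

import numpy as np

def fit_linear_regression(low_level_features, training_targets):
    # low_level_features: array of shape (songs, m); training_targets: (songs,)
    X = np.hstack([low_level_features,
                   np.ones((low_level_features.shape[0], 1))])   # bias term
    coefficients, *_ = np.linalg.lstsq(X, training_targets, rcond=None)
    return coefficients[:-1], coefficients[-1]    # weights b1..bm and intercept

# Example usage with stand-in data: 10 songs, 5 low-level features each
X = np.random.rand(10, 5)
y = np.random.rand(10)
b, b0 = fit_linear_regression(X, y)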
There may be provided three major "Classify" type learning algorithms. One algorithm, as shown in Fig. 27, involves computing a Euclidean distance "d" of a given item from the center of each of different classes (a male vocal class and a female vocal class in the example of Fig. 27) and classifying the item into the class to which the computed Euclidean distance "d" is the shortest. Another algorithm, as shown in Fig. 28, involves computing a correlation coefficient "correl" of a given item relative to the mean vector of each of different classes (a male vocal class and a female vocal class in the example of Fig. 28) and classifying the item into the class relative to which the correlation coefficient "correl" is the largest. Another algorithm, as shown in

Fig. 29, involves computing a Mahalanobis distance "d" of
a given item from the center of each of different classes (a male vocal class and a female vocal class in the
example of Fig. 29) and classifying the item into the
class to which the computed Mahalanobis distance "d" is
the shortest.
[0098]
It is also possible to conceive two "Classify" type
learning algorithm variations. One variation, shown in part A of Fig. 30, involves having the distribution of each of different class groups (male and female vocal class groups in the example of Fig. 30) represented by a plurality of classes, computing the Euclidean distance "d" of a given item from the center of each of the different classes, and classifying the item into the class to which the computed Euclidean distance "d" is the shortest. Another variation, as shown in part B of Fig. 30, involves having the distribution of each of different class groups (male and female vocal class groups in the example of Fig. 30) represented by a plurality of classes, computing the Mahalanobis distance "d" of a given item from the center of each of the different classes, and classifying the item into the class to which the computed Mahalanobis distance "d" is the shortest.
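Illustratively, the distance-based "Classify" variants of Figs. 27 through 29 may be sketched as follows in Python with NumPy; only the Euclidean and Mahalanobis nearest-center rules are shown, and the class data layout is an assumption:

import numpy as np

def classify_euclidean(item, class_samples):
    # Fig. 27: assign the item to the class whose mean vector is nearest
    centers = {name: samples.mean(axis=0)
               for name, samples in class_samples.items()}
    return min(centers, key=lambda name: np.linalg.norm(item - centers[name]))

def classify_mahalanobis(item, class_samples):
    # Fig. 29: assign the item to the class with the smallest Mahalanobis distance
    best_name, best_distance = None, np.inf
    for name, samples in class_samples.items():
        center = samples.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(samples, rowvar=False))
        diff = item - center
        distance = float(np.sqrt(diff @ cov_inv @ diff))
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name

# Example usage: two hypothetical classes with 20 samples of 5 features each
classes = {"male vocal": np.random.randn(20, 5),
           "female vocal": np.random.randn(20, 5) + 1.0}
print(classify_euclidean(np.zeros(5), classes))
print(classify_mahalanobis(np.zeros(5), classes))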

[0099]
There may be provided one "SVM" type learning algorithm. This algorithm, as shown in Fig. 31, involves having the boundary plane between classes (a male and a female vocal class in the example of Fig. 31) represented by support vectors, and learning the parameter "bnm" in such a manner as to maximize the distance (margin) between the plane separating the classes on the one hand and the vectors near the boundary plane on the other hand. [0100]
There may be provided three "GP" type learning algorithms. One algorithm, as shown in Fig. 32, involves creating an expression combining low-level features through genetic programming. Another algorithm, as depicted in part A of Fig. 33, involves crossing expressions each combining low-level features. Another algorithm, as indicated in part B of Fig. 33, involves mutating an expression combining low-level features. [0101]
If the above-outlined learning algorithms are all adopted, there will be 11 learning algorithms for use. [0102]
In step S74 back in Fig. 24, the control section 27

starts a cross validation loop by initializing a cross validation loop parameter C to "1." The cross validation loop is repeated as many times as a predetermined cross validation count "c." [0103]
In step S75, the high-level feature extraction expression learning section 25 randomly bisects K-th type training data (true high-level features) about "j" songs out of as many as "k" training data types, into learning-use data and evaluation-use data. In the description that follows, the training data classified for learning use will be referred to as the learning data and the training data classified for evaluation use will be called the evaluation data. [0104]
In step S76, the high-level feature extraction expression learning section 25 estimates a high-level feature extraction expression through learning that involves the application to an a-th learning algorithm of corresponding learning data as well as the combination of "m" low-level features computed using the N-th low-level feature extraction expression list. During the learning, some of the "m" low-level features are genetically selected for use in order to reduce the amount of

computations and to suppress over-learning (over-fitting). [0105]
As the criterion for selecting the low-level features, either an information quantity criterion called AIC (Akaike Information Criterion) or an information quantity criterion called BIC (Bayesian Information Criterion) is utilized. The AIC and the BIC are each used as the criterion for selecting a learning model (here, the selected low-level features). The smaller the criterion value of a learning model, the better (i.e., the more highly evaluated) the model.
[0106]
The AIC is defined by the following expression:
AIC = -2 × maximum logarithmic likelihood + 2 × free parameter count [0107]
If a "Regression (linear)" type learning algorithm is adopted (as in the case of Fig. 25), then
free parameter count = n + 1 and
maximum logarithmic likelihood = -0.5 × learning data count × (log(2π) + 1 + log(mean square error)), so that
AIC = learning data count × (log(2π) + 1 + log(mean square error)) + 2 × (n + 1) [0108]
The BIC is defined by the following expression:
BIC = -2 × maximum logarithmic likelihood + log(learning data count) × free parameter count [0109]
Illustratively, if a "Regression (linear)" type learning algorithm is adopted (as in the case of Fig. 25), then
BIC = learning data count × (log(2π) + 1 + log(mean square error)) + log(learning data count) × (n + 1) [0110]
Compared with the AIC, the BIC is noted for a negligible increase in its value despite the growing number of learning data. [0111]
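Illustratively, the AIC and BIC computations for the linear regression case may be sketched as follows in Python with NumPy, following the expressions in paragraphs [0107] and [0109] (the numeric values in the example are arbitrary):

import numpy as np

def aic_linear(learning_data_count, mean_square_error, n):
    # AIC for the linear case, with n + 1 free parameters
    return (learning_data_count
            * (np.log(2 * np.pi) + 1 + np.log(mean_square_error))
            + 2 * (n + 1))

def bic_linear(learning_data_count, mean_square_error, n):
    # BIC: the 2 * (n + 1) penalty of AIC becomes log(data count) * (n + 1)
    return (learning_data_count
            * (np.log(2 * np.pi) + 1 + np.log(mean_square_error))
            + np.log(learning_data_count) * (n + 1))

# Example: 100 learning data items, mean square error 0.25, 5 selected features
print(aic_linear(100, 0.25, 5), bic_linear(100, 0.25, 5))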
The learning process based on a learning algorithm in step S76 will now be described in reference to Fig. 34. For this learning process, as discussed above, some of the computed "m" low-level features are genetically selected for use in order to reduce the amount of computations and to suppress over-learning (over-fitting). [0112]
In step S91, the high-level feature extraction

expression learning section 25 creates as many as "p" initial populations of low-level features through random selection from the "m" low-level features (for use in learning). [0113]
In step S92, the high-level feature extraction
expression learning section 25 starts a feature selection
loop using a genetic algorithm (GA). The GA-based feature
selection loop is repeated until a particular condition
is met in a subsequent step S98.
[0114]
In step S93, the control section 27 starts an initial population loop by initializing an initial population loop parameter P to "1." The initial population loop is repeated as many times as the number "p" of initial populations of low-level features created in step S91. [0115]
In step S94, the high-level feature extraction expression learning section 25 estimates a high-level feature extraction expression through learning that involves the application to an A-th learning algorithm of the low-level features included in the P-th initial population and corresponding learning data out of

training data. [0116]
In step S95, the high-level feature extraction expression learning section 25 computes the information quantity criterion AIC or BIC as the evaluation value for the high-level feature obtained as a result of step S94. [0117]
In step S96, the control section 27 checks to determine whether the initial population parameter P is smaller than the maximum value "p." If the initial population parameter P is found to be smaller than the maximum value "p," then the initial population loop parameter P is incremented by "1" and step S94 is reached again. On the other hand, if the initial population parameter P is not found to be smaller than the maximum value "p" (i.e., if the parameter P is equal to the maximum value "p"), then the initial population loop is exited and step S97 is reached. The information quantity criterion AIC or BIC is thus acquired as the evaluation value for the high-level feature extraction expression learned from each initial population in steps S93 through S96 constituting the initial population loop. [0118]
In step S97, the high-level feature extraction

expression learning section 25 genetically updates "p" initial populations made up of low-level features to be used for learning. More specifically, as in steps S32 through S34 of Fig. 17, the initial populations are updated through selection, crossing, and mutation. The update is intended to improve the evaluation value for the high-level feature extraction expression acquired from the initial populations created initially in random fashion. [0119]
In step S98, the control section 27 checks to determine whether the high-level feature extraction expression having the highest evaluation value (i.e., the smallest information quantity criterion) among the high-level feature extraction expressions corresponding to the "p" initial populations is found to have its evaluation value improved (i.e., information quantity criterion is lowered) following each GA-based feature selection loop. If the result of the check in step S98 is affirmative, then step S93 is reached again and the feature selection loop is repeated. On the other hand, it might happen that the high-level feature extraction expression having the highest evaluation value among the expressions corresponding to the "p" initial populations is not found

to have its evaluation value improved (i.e., information quantity criterion is not reduced) following the repeated GA-based feature selection loop. In this case, the GA-based feature selection loop is exited, and the high-level feature extraction expression having the highest evaluation value is output to a downstream process (i.e., to step S77 in Fig. 24). This brings the learning process based on a learning algorithm to an end. [0120]
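A minimal sketch of this GA-based feature selection loop (steps S91 through S98) is given below; the population size, crossing and mutation rules, and the stand-in evaluation function are illustrative assumptions, with the evaluation function playing the role of the information quantity criterion (smaller is better).

    import random

    def ga_feature_selection(feature_indices, evaluate, p=8, subset_size=5, max_rounds=50):
        # Step S91: create "p" initial populations by random selection of low-level features.
        populations = [random.sample(feature_indices, subset_size) for _ in range(p)]
        best = float("inf")
        for _ in range(max_rounds):                              # GA-based feature selection loop
            populations.sort(key=evaluate)                       # steps S94-S95: learn and score
            if evaluate(populations[0]) >= best:                 # step S98: stop when no improvement
                break
            best = evaluate(populations[0])
            survivors = populations[: p // 2]                    # step S97: selection
            children = []
            while len(survivors) + len(children) < p:            # step S97: crossing and mutation
                a, b = random.sample(survivors, 2)
                child = list(dict.fromkeys(a[: subset_size // 2] + b))[:subset_size]
                if random.random() < 0.3:
                    child[random.randrange(len(child))] = random.choice(feature_indices)
                children.append(child)
            populations = survivors + children
        return min(populations, key=evaluate)

    # Hypothetical usage: the sum of the selected indices stands in for AIC/BIC.
    print(ga_feature_selection(list(range(30)), evaluate=sum))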
The number of low-level features selected in step S91 may be fixed. In this case, the necessary number of low-level features may be obtained using the entropy of relevant training data. Likewise, the training data may be analyzed for principal components and the number of low-level features may be set equal to the number of the analyzed principal components. [0121]
In step S77 back in Fig. 24, the high-level feature extraction expression learning section 25 evaluates the high-level feature extraction expression having the highest evaluation value acquired in step S76, using relevant evaluation data. More specifically, the high-level feature extraction expression learning section 25 computes the high-level feature using the acquired high-

level feature extraction expression to find the square
error between the result of the computation and the
evaluation data in question.
[0122]
In step S78, the control section 27 checks to determine whether the cross validation loop parameter C is smaller than the maximum value "c." If the cross validation loop parameter C is found to be smaller than the maximum value "c," then the parameter C is incremented by "1" and step S75 is reached again. On the other hand, if the cross validation loop parameter C is not found to be smaller than the maximum value "c" (i.e., if the parameter C is equal to the maximum value "c"), then the cross validation loop is exited and step S79 is reached. As many as "c" high-level feature extraction expressions are thus acquired as a result of the learning in steps S74 through S78 constituting the cross validation loop. Because the learning data and evaluation data are randomly re-partitioned through the cross validation loop, it is possible to verify that the high-level feature extraction expressions are not over-learned. [0123]
In step S79, the high-level feature extraction expression learning section 25 selects the high-level

feature extraction expression having the highest evaluation value derived from step S77 out of the "c" high-level feature extraction expressions acquired through the cross validation loop. [0124]
In step S80, the control section 27 checks to determine whether the algorithm loop parameter A is smaller than the maximum value "a." If the algorithm loop parameter A is found to be smaller than the maximum value "a," then the parameter A is incremented by "1" and step S74 is reached again. On the other hand, if the algorithm loop parameter A is not found to be smaller than the maximum value "a" (i.e., if the parameter A is equal to the maximum value "a"), then the algorithm loop is exited and step S81 is reached. As many as "a" high-level feature extraction expressions of the K-th type learned using "a" learning algorithms are thus acquired in steps S73 through S80 constituting the algorithm loop. [0125]
In step S81, the high-level feature extraction expression learning section 25 selects the high-level feature extraction expression having the highest evaluation value derived from step S77 out of the "a" high-level feature extraction expressions learned through

the algorithm loop.
[0126]
In step S82, the control section 27 checks to determine whether the training data loop parameter K is smaller than the maximum value "k." If the training data loop parameter K is found to be smaller than the maximum value "k," then the parameter K is incremented by "1" and step S73 is reached again. On the other hand, if the training data loop parameter K is not found to be smaller than the maximum value "k" (i.e., if the parameter K is equal to the maximum value "k"), then the training data loop is exited and step S83 is reached. As many as "k" high-level feature extraction expressions corresponding to the N-th low-level feature extraction expression list are thus acquired in steps S72 through S82 constituting the training data loop. [0127]
In step S83, the control section 27 checks to determine whether the list loop parameter N is smaller than the maximum value "n." If the list loop parameter N is found to be smaller than the maximum value "n," then the parameter N is incremented by "1" and step S72 is reached again. On the other hand, if the list loop parameter N is not found to be smaller than the maximum

value "n" (i.e., if the parameter N is equal to the maximum value "n"), then the list loop is exited and step S84 is reached. As many as "k" high-level feature extraction expressions corresponding to each of "n" low-level feature extraction expression lists are thus acquired in steps S71 through S83 constituting the list loop. [0128]
In step S84, the high-level feature extraction expression learning section 25 computes the estimated accuracy level of each of "k" high-level feature extraction expressions corresponding to each of the acquired "n" low-level feature extraction expression lists and the contribution ratio of each low-level feature in each high-level feature extraction expression, and outputs the results of the computations to the low-level feature extraction expression list creation section 21. This brings the high-level feature extraction expression learning process to an end. [0129]
In step S5 back in Fig. 7, the control section 27 checks to determine whether the learning loop parameter G is smaller than the maximum value "g." If the learning loop parameter G is found to be smaller than the maximum

value "g," then the parameter G is incremented by "1" and
step S2 is reached again. On the other hand, if the
learning loop parameter G is not found to be smaller than
the maximum value "g" (i.e., if the parameter G is equal
to the maximum value "g"), then the learning loop is
exited and step S6 is reached. Steps S1 through S5
constitute the feature extraction algorithm learning
process. Step S6 following that process is a process for
computing high-level features using feature extraction
algorithms.
[0130]
In step S6, the high-level feature extraction expression learning section 25 supplies the high-level feature computation section 26 with "m" low-level feature extraction expressions constituting the list having the highest mean accuracy level of the acquired high-level features out of "n" low-level feature extraction expression lists of the latest generation obtained through learning, along with "k" high-level feature extraction expressions corresponding to that list. [0131]
In step S7, the high-level feature computation section 26 computes a high-level feature with high accuracy using the low-level feature extraction

expressions and high-level feature extraction expressions supplied most recently from the high-level feature extraction expression learning section 25. The high-accuracy high-level feature computation process of step S7 will be discussed later in reference to Fig. 38 and subsequent drawings. [0132]
The foregoing paragraph completes the description of the feature extraction algorithm creation process performed by the feature extraction algorithm creation apparatus 20. [0133]
What follows is a description of a new operator creation process to be carried out as generations of low-level feature extraction expression lists progress following the repeated learning loop constituted by steps S1 through S6 in the above-described feature extraction algorithm creation process, such as when the contribution ratio of a low-level feature extraction expression has improved or when the estimated accuracy level of a corresponding high-level feature extraction expression has been boosted. [0134]
When the current generation of low-level feature

extraction expression lists is replaced by a new generation, permutations of a plurality of operators (hereinafter simply called operator combinations) may frequently appear in different low-level feature expressions as shown in Fig. 35. In that case, a combination of a plurality of operators appearing frequently in the different low-level feature expressions may be regarded as a single new operator and registered as such for use by the low-level feature extraction expression list creation section 21. [0135]
In the example of Fig. 35, a three-operator combination "32#FFT, Log, 32#FFT" is shown appearing in five low-level feature extraction expressions. If the operator combination "32#FFT, Log, 32#FFT" is registered illustratively as one operator "NewOperator 1," then low-level feature extraction expressions of the next and subsequent generations will include the operator "NewOperator 1" as shown in Fig. 36. [0136]
The new operator creation process is described below in reference to the flowchart of Fig. 37. In step S101, the operator combination detection section 22 creates permutations (sequenced combinations) each made up of a predetermined

number of operators (e.g., 1 to 5 operators). The number
of operator combinations to be created in this process is
assumed to be "og."
[0137]
In step S102, the control section 27 starts a combination loop by initializing a combination loop parameter OG to "1." The combination loop is repeated as many times as an operator combination count "og."
[0138]
In step S103, an operator combination occurrence frequency "Count" of the og-th operator combination is initialized to "1." In step S104, the control section 27 starts the list loop by initializing the list loop parameter N to "0." The list loop is repeated as many times as the predetermined list count "n." In step S105, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the number "m" of low-level feature extraction expressions constituting a single low-level feature extraction expression list. [0139]
In step S106, the operator combination detection section 22 checks to determine whether the og-th operator

combination exists in the M-th low-level feature extraction expression as part of the N-th low-level feature extraction expression list. If the og-th operator combination is found to exist in the expression, then step S107 is reached and the occurrence frequency "Count" is incremented by "1." On the other hand, if the og-th operator combination is not found to exist, then step S107 is skipped and step S108 is reached. [0140]
In step S108, the control section 27 checks to determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the parameter M is incremented by "1" and step S106 is reached again. On the other hand, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S109 is reached. [0141]
In step S109, the control section 27 checks to determine whether the list loop parameter N is smaller than the maximum value "n." If the list loop parameter N is found to be smaller than the maximum value "n," then

the list loop parameter N is incremented by "1" and step S105 is reached again. On the other hand, if the list loop parameter N is not found to be smaller than the maximum value "n" (i.e., if the parameter N is equal to the maximum value "n"), then the list loop is exited and step S110 is reached. [0142]
In step S110, the control section 27 checks to determine whether the combination loop parameter OG is smaller than the maximum value "og." If the combination loop parameter OG is found to be smaller than the maximum value "og," then the parameter OG is incremented by "1" and step S103 is reached again. On the other hand, if the combination loop parameter OG is not found to be smaller than the maximum value "og" (i.e., if the parameter OG is equal to the maximum value "og"), then the combination loop is exited and step S111 is reached. The occurrence frequency "Count" of each of all operator combinations is thus detected by the processing up to this point. [0143]
In step S111, the operator combination detection section 22 extracts the operator combination of which the occurrence frequency "Count" is higher than a

predetermined threshold value, and outputs the extracted combination to the operator creation section 23. In step S112, the operator creation section 23 registers the operator combination input from the operator combination detection section 22 as a new operator. This brings the new operator creation process to an end. [0144]
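The following is a minimal sketch of the counting performed in steps S101 through S112, with each low-level feature extraction expression represented as an operator sequence; the operator names, the list contents, and the threshold are illustrative assumptions.

    from collections import Counter

    def detect_new_operators(expression_lists, max_len=5, threshold=4):
        counts = Counter()
        for expression_list in expression_lists:            # list loop (steps S104, S109)
            for operators in expression_list:                # expression loop (steps S105, S108)
                for length in range(1, max_len + 1):
                    for i in range(len(operators) - length + 1):
                        counts[tuple(operators[i:i + length])] += 1   # steps S106-S107
        # Step S111: extract combinations whose occurrence frequency exceeds the threshold.
        return [combo for combo, count in counts.items() if count > threshold]

    lists = [[["32#FFT", "Log", "32#FFT", "Mean"],
              ["HPF", "32#FFT", "Log", "32#FFT"],
              ["Sqrt", "32#FFT", "Log", "32#FFT"]]]
    print(detect_new_operators(lists, threshold=2))   # includes ("32#FFT", "Log", "32#FFT")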
According to the new operator creation process, as described above, the combination of frequently occurring operators, i.e., the combination of operators considered effective in computing high-level features, is registered as a single operator for use in low-level feature extraction expressions of the next and subsequent generations. This boosts the speed at which to create low-level feature extraction expressions as well as the speed of progress of expression generations. It is thus possible to detect effective low-level feature extraction expressions at early stages. Another benefit of the new operator creation process is automatic detection of the effective operator combinations that used to be detected manually. [0145]
The above-outlined high-accuracy high-level feature computation process in step S7 of Fig. 7 is described

below in more detail with reference to the flowchart of
Fig. 38.
[0146]
In step S141, the high-level feature computation section 26 performs a high-accuracy reject process for selecting only the high-level feature extraction expressions yielding high-accuracy results out of those expressions eventually supplied from the high-level feature extraction expression learning section 25. [0147]
The high-accuracy reject process is based on the assumption that there is a causal relationship between the accuracy levels of high-level features on the one hand and the values of low-level features on the other hand. On that premise, the process involves acquiring through learning a reject area extraction expression to which low-level features are input and from which the accuracy level of a high-level feature is output. The high-accuracy reject process is described below in detail with reference to the flowchart of Fig. 39. [0148]
In step S151, the low-level feature computation section 41 in the high-level feature computation section 26 acquires an eventual low-level feature extraction

expression list. The high-level feature computation section 42 in the high-level feature computation section 26 obtains an eventual high-level feature extraction expression. [0149]
In step S152, the control section 27 starts a content loop by initializing a content loop parameter J to "1." The content loop is repeated as many times as the number "j" of input data items (content data and metadata) that may be provided beforehand for carrying out the high-accuracy reject process. It is assumed that the high-level features corresponding to the prepared input data are also provided as training data. [0150]
In step S153, the low-level feature computation section 41 substitutes the J-th input data item into the eventual low-level feature extraction expression list acquired in step S151, and outputs "m" low-level features resulting from the computations to the high-level feature computation section 42 and to the reject area extraction expression learning section 44. The high-level feature computation section 42 substitutes the "m" low-level features input from the low-level feature computation section 41 into the eventual high-level feature

extraction expression acquired in step S151, and outputs
the high-level feature resulting from the computations to
the square error computation section 43.
[0151]
In step S154, the square error computation section 43 computes a square error between the high-level feature input from the high-level feature computation section 42 and the training data (i.e., true high-level feature corresponding to the input data), and outputs the result of the computation to the reject area extraction expression learning section 44. The square error thus computed represents the accuracy of the high-level feature extraction expression computed by the high-level feature computation section 42 (the computed accuracy is called the feature extraction accuracy). [0152]
In step S155, the control section 27 checks to determine whether the content loop parameter J is smaller than the maximum value "j." If the content loop parameter J is found to be smaller than the maximum value "j," then the parameter J is incremented by "1" and step S153 is reached again. On the other hand, if the content loop parameter J is not found to be smaller than the maximum value "j" (i.e., if the parameter J is equal to the

maximum value "j"), then the content loop is exited and step S156 is reached. The square error between the computed high-level feature and the training data corresponding to each input data item is thus acquired in steps S151 through S155 constituting the content loop. [0153]
In step S156, the reject area extraction expression learning section 44 creates a reject area extraction expression to which the low-level features are input and from which the feature extraction accuracy of the high-level feature computed using the input quantities is output, through learning based on the low-level features input from the low-level feature computation section 41 and on the square error input from the square error computation section 43. The reject area extraction expression thus created is supplied to the feature extraction accuracy computation section 45. This completes the high-accuracy reject process, and control is passed on to step S142 in Fig. 38. [0154]
In step S142, the low-level feature computation section 41 substitutes the input data of the song of which the high-level feature is desired to be acquired into the eventual low-level feature extraction expression

list so as to obtain the low-level features. The low-level feature computation section 41 outputs the results of the computations to the high-level feature computation section 42 and to the feature extraction accuracy computation section 45. [0155]
In step S143, the feature extraction accuracy computation section 45 substitutes the low-level features input from the low-level feature computation section 41 into the reject area extraction expression supplied from the reject area extraction expression learning section 44, so as to compute the feature extraction accuracy of the high-level feature computed using the input low-level features (i.e., square error estimated in regard to the high-level feature computed by the high-level feature computation section 42). [0156]
In step S144, the feature extraction accuracy computation section 45 checks to determine whether the feature extraction accuracy computed in step S143 is higher than a predetermined threshold value. If the computed feature extraction accuracy is found to be higher than the threshold value, then step S145 is reached. In step S145, the feature extraction accuracy

computation section 45 causes the high-level feature computation section 42 to compute the high-level feature. The high-level feature computation section 42 substitutes the "m" low-level features input from the low-level feature computation section 41 in step S142 into the eventual high-level feature extraction expression so as to compute the high-level feature. The high-level feature computed at this point is the eventual output. This completes the high-accuracy high-level feature computation process. [0157]
If in step S144 the computed feature extraction accuracy is found to be lower than the predetermined threshold value, then step S145 is skipped and the high-accuracy high-level feature computation process is brought to an end. [0158]
The high-accuracy high-level feature computation process described above makes it possible to estimate the accuracy of the high-level feature using the high-level feature extraction expression. Because those high-level features that are not expected to achieve high accuracy levels are excluded from computations, wasteful computations are eliminated.
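A minimal sketch of this reject mechanism is shown below, assuming scikit-learn and a linear reject area extraction expression learned from synthetic low-level features and square errors; thresholding on the estimated square error stands in for the accuracy check of step S144.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    low_level_features = rng.normal(size=(200, 8))   # "m" low-level features per input item
    square_errors = np.abs(rng.normal(size=200))     # square errors from steps S153-S154

    # Step S156: learn a reject area extraction expression mapping low-level features to
    # the estimated feature extraction accuracy, expressed here as a square error.
    reject_expression = LinearRegression().fit(low_level_features, square_errors)

    # Steps S142-S145: for a new song, estimate the error first and compute the
    # high-level feature only when high accuracy can be expected.
    new_song_features = rng.normal(size=(1, 8))
    estimated_error = reject_expression.predict(new_song_features)[0]
    ERROR_THRESHOLD = 1.0   # corresponds to the predetermined threshold of step S144
    if estimated_error < ERROR_THRESHOLD:
        print("compute the high-level feature for this song")
    else:
        print("reject: high accuracy is not expected, so the computation is skipped")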

[0159]
As described, the feature extraction algorithm learning process performed by the feature extraction algorithm creation apparatus 20 practiced as the first embodiment of the present invention permits quick and highly accurate creation of algorithms for extracting relevant features from song data. The learning process also makes it possible to acquire only the high-level features of high accuracy levels with a minimum of computations. [0160]
Described below are the workings of the feature extraction algorithm creation apparatus 60 practiced as a second embodiment of the present invention. Fig. 40 is a block diagram showing a typical structure of the feature extraction algorithm creation apparatus 60. [0161]
The feature extraction algorithm creation apparatus
60 replaces the low-level feature extraction expression
list creation section 21 and high-level feature
extraction expression learning section 25 in the feature
extraction algorithm creation apparatus 20 with a low-
level feature extraction expression list creation section
61 and a high-level feature extraction expression

learning section 65 respectively. [0162]
There are some differences between the feature extraction algorithm creation apparatus 60 and the feature extraction algorithm creation apparatus 20. One difference is that whereas the number "m" of expressions in the low-level feature extraction expression list created by the low-level feature extraction expression list creation section 21 of the feature extraction algorithm creation apparatus 20 is a constant, the number "m" of expressions in each low-level feature extraction expression list created by the low-level feature extraction expression list creation section 61 of the feature extraction algorithm creation apparatus 60 is randomly determined. [0163]
Another difference is that while the high-level feature extraction expression learning section 25 of the feature extraction algorithm creation apparatus 20 outputs estimated accuracy levels and contribution ratios as a feedback to the low-level feature extraction expression list creation section 21, the high-level feature extraction expression learning section 65 of the feature extraction algorithm creation apparatus 60

outputs evaluation values and contribution ratios as a
feedback to the low-level feature extraction expression
list creation section 61.
[0164]
The components of the feature extraction algorithm creation apparatus 60 other than the low-level feature extraction expression list creation section 61 and high-level feature extraction expression learning section 65 have substantially similar counterparts in the feature extraction algorithm creation apparatus 20. The similar components are designated by like reference numerals and will not be described further. [0165]
The workings of the feature extraction algorithm creation apparatus 60 will now be described by again referring to some of the drawings used earlier to explain the feature extraction algorithm creation apparatus 20 above. [0166]
The feature extraction algorithm creation process, a basic operation of the feature extraction algorithm creation apparatus 60, is described below in reference to the flowchart of Fig. 7. [0167]

In step S1, the control section 27 starts the learning loop by initializing the learning loop parameter G to "1." The learning loop is repeated as many times as the learning count "g" determined beforehand illustratively by the user. [0168]
In step S2, the low-level feature extraction
expression list creation section 61 creates "n" low-level
feature extraction expression lists and outputs the
created lists to the low-level feature computation
section 24.
[0169]
The process of step S2 (low-level feature extraction expression list creation process) is described below in detail by referring to the flowchart of Fig. 9. [0170]
In step S11, the low-level feature extraction expression list creation section 61 checks to determine whether the low-level feature extraction expression lists to be created are of the first generation. If the learning loop parameter G is set to "0," then the low-level feature extraction expression lists to be created are found to be of the first generation. [0171]

When the low-level feature extraction expression lists to be created are found to be of the first generation because the learning loop parameter G is set to "0," step S12 is reached. In step S12, the low-level feature extraction expression list creation section 61 randomly creates low-level feature extraction expression lists of the first generation. [0172]
By contrast, if the low-level feature extraction expression lists to be created are not found to be of the first generation, then step S13 is reached. In step S13, the low-level feature extraction expression list creation section 61 genetically creates low-level feature extraction expression lists of the next generation using a genetic algorithm based on the low-level feature extraction expression lists of the latest generation.
[0173]
The first-generation list random creation process in step S12 performed by the low-level feature extraction expression list creation section 61 is described below in detail with reference to the flowchart of Fig. 41. [0174]
In step S171, the control section 27 starts the list loop by initializing the list loop parameter N to "1." The list loop is repeated as many times as the predetermined list count "n." [0175]
In step S172, the low-level feature extraction expression list creation section 61 randomly determines the number "m" of low-level feature extraction expressions constituting the N-th low-level feature extraction expression list to be created for the first generation. [0176]
In step S173, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the number "m" of low-level feature extraction expressions constituting a single low-level feature extraction expression list. [0177]
In step S174, the low-level feature extraction expression list creation section 61 randomly determines the input data for the M-th low-level feature extraction expression (also referred to as the low-level feature extraction expression M) in the N-th low-level feature extraction expression list (also called the list N) to be created.

[0178]
In step S175, the low-level feature extraction expression list creation section 61 randomly determines a process symmetry axis and a parameter for the low-level feature extraction expression M in the list N to be created. [0179]
In step S176, the low-level feature extraction expression list creation section 61 checks to determine whether the result of the computation by the low-level feature extraction expression M in the list N created up to that point in time is either a scalar (one-dimensional) or at most a predetermined number of dimensions (e.g., a small number such as 1 or 2). If the result of the check in step S176 is negative, then step S175 is reached again and one more operator is added. With steps S175 and S176 repeated, the number of holding dimensions in the computed result decreases as shown in Fig. 16. If in step S176 the result of the computation by the low-level feature extraction expression M in the list N is either a scalar or at most a predetermined small number of dimensions (e.g., 1 or 2), then step S177 is reached. [0180]

In step S177, the control section 27 checks to determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the parameter M is incremented by "1" and step S174 is reached again. Conversely, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S178 is reached. The N-th low-level feature extraction expression list of the first generation is thus created in steps S173 through S177 above. [0181]
In step S178, the control section 27 checks to determine whether the list loop parameter N is smaller than the maximum value "n." If the list loop parameter N is found to be smaller than the maximum value "n," then the parameter N is incremented by "1" and step S172 is reached again. On the other hand, if the list loop parameter N is not found to be smaller than the maximum value "n" (i.e., if the parameter N is equal to the maximum value "n"), then the list loop is exited and the first-generation list random creation process is

terminated. As many as "n" low-level feature extraction expression lists of the first generation, each list having a different number "m" of expressions, are thus created by the processing up to this point. [0182]
Described below in detail with reference to the flowchart of Fig. 17 is the process performed by the low-level feature extraction expression list creation section 61 for creating low-level feature extraction expression lists of the second and subsequent generations (i.e., next-generation list genetic creation process),in step S13 of Fig. 9. [0183]
In step S31, the low-level feature extraction expression list creation section 61 randomly determines three numbers: a selection count "ns" representing the number of lists to which genetic algorithm selection is applied out of the "n" low-level feature extraction expression lists to be created; a cross count "nx" denoting the number of lists to which genetic algorithm crossing is applied; and a mutation count "nm" indicating the number of lists to which genetic algorithm mutation is applied. The total sum of the selection count "ns," cross count "nx," and mutation count "nm" is equal to "n." Alternatively, the selection count "ns," cross count "nx," and mutation count "nm" may each be a predetermined constant. [0184]
In step S32, the low-level feature extraction expression list creation section 61 creates as many as "ns" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the selection count "ns" determined out of the "n" low-level feature extraction expression lists of the latest generation. In step S33, the low-level feature extraction expression list creation section 61 creates as many as "nx" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the cross count "nx" determined out of the "n" low-level feature extraction expression lists of the latest generation. In step S34, the low-level feature extraction expression list creation section 61 creates as many as "nm" low-level feature extraction expression lists using the low-level feature extraction expression lists amounting to the mutation count "nm" determined out of the "n" low-level feature extraction expression lists of the latest generation. [0185]

The process in each of steps S32 through S34 is described below in detail. [0186]
The selection creation process of step S32
performed by the low-level feature extraction expression
list creation section 61 will now be described in detail
with reference to the flowchart of Fig. 42. This
selection creation process creates the low-level feature
extraction expression lists amounting to the selection
count "ns" out of the "n" low-level feature extraction
expression lists of the next generation.
[0187]
In step S181, the low-level feature extraction expression list creation section 61 sorts the "n" low-level feature extraction expression lists of the latest generation in descending order of the mean evaluation values of the high-level feature extraction expressions input from the high-level feature extraction expression learning section 65. In step S182, the low-level feature extraction expression list creation section 61 adopts the top "ns" low-level feature extraction expression lists from the sorted "n" low-level feature extraction expression lists of the latest generation as the low-level feature extraction expression lists of the next

generation. This brings the selection creation process to
an end.
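A minimal sketch of this selection creation process, assuming each latest-generation list comes with a mean evaluation value supplied by the high-level feature extraction expression learning section 65:

    def selection_creation(latest_lists, mean_evaluation_values, ns):
        # Step S181: sort the lists in descending order of mean evaluation value.
        ranked = sorted(zip(latest_lists, mean_evaluation_values),
                        key=lambda pair: pair[1], reverse=True)
        # Step S182: adopt the top "ns" lists for the next generation.
        return [selected_list for selected_list, _ in ranked[:ns]]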
[0188]
The cross creation process of step S33 in Fig. 17 performed by the low-level feature extraction expression list creation section 61 will now be described in detail with reference to the flowchart of Fig. 43. This cross creation process creates the low-level feature extraction expression lists amounting to the cross count "nx" out of the "n" low-level feature extraction expression lists of the next generation. [0189]
In step S191, the control section 27 starts the cross loop by initializing the cross loop parameter NX to "1." The cross loop is repeated as many times as the cross count "nx." [0190]
In step S192, the low-level feature extraction expression list creation section 61 weights the low-level feature extraction expression lists of the latest generation so as to induce preferential selection of the lists in descending order of the mean evaluation values of the high-level feature extraction expressions output
by the high-level feature extraction expression learning

section 65, before randomly selecting two low-level feature extraction expression lists A and B. During the selection of this process, the "ns" low-level feature extraction expression lists selected by the above-described selection creation process may be either excluded from the candidate lists or may be left intact as part of the candidate lists. [0191]
In step S193, the low-level feature extraction expression list creation section 61 randomly determines the number "m" of expressions in each low-level feature extraction expression list to be created through a subsequent expression loop, within the range defined as follows:
m = ((number of expressions in list A + number of
expressions in list B)/2)±mr
where, "mr" is a predetermined value.
[0192]
In step S194, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the expression count "m" randomly determined in step S193.
[0193]

In step S195, the low-level feature extraction
expression list creation section 61 weights all low-level
feature extraction expressions in the low-level feature
extraction expression lists A and B so as to induce
preferential selection of the expressions in descending
order of the contribution ratios of the high-level
feature extraction expressions output by the high-level
feature extraction expression learning section 65, before
randomly selecting a single low-level feature extraction
expression for inclusion in a low-level feature
extraction expression list of the next generation.
[0194]
In step S196, the control section 27 checks to

determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the parameter M is incremented by "1" and step S195 is reached again. Conversely, if the expression loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S197 is reached. A single low-level feature extraction expression list is thus created in steps S194 through S196 constituting the expression loop.

[0195]
In step S197, the control section 27 checks to determine whether the cross loop parameter NX is smaller than the maximum value "nx." If the cross loop parameter NX is found to be smaller than the maximum value "nx," then the parameter NX is incremented by "1" and step S192 is reached again. On the other hand, if the cross loop parameter NX is not found to be smaller than the maximum value "nx" (i.e., if the parameter NX is equal to the maximum value "nx"), then the cross loop is exited and the cross creation process is terminated. The low-level feature extraction expression lists amounting to the cross count "nx" are thus created by the cross loop that has been executed. [0196]
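The following sketch illustrates the cross creation process under the assumption that each list is a sequence of (expression, contribution ratio) pairs and that weighted random selection with replacement is acceptable; both assumptions are for illustration only.

    import random

    def cross_creation(latest_lists, mean_evaluation_values, nx, mr=2):
        new_lists = []
        for _ in range(nx):
            # Step S192: pick two parent lists, weighted by their mean evaluation values.
            list_a, list_b = random.choices(latest_lists, weights=mean_evaluation_values, k=2)
            # Step S193: m = ((number of expressions in A + number in B) / 2) +/- mr.
            m = max(1, (len(list_a) + len(list_b)) // 2 + random.randint(-mr, mr))
            # Step S195: pick expressions from A and B, weighted by contribution ratio.
            pool = list_a + list_b
            weights = [contribution for _, contribution in pool]
            new_lists.append(random.choices(pool, weights=weights, k=m))
        return new_lists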
The mutation creation process of step S34 in Fig. 17 performed by the low-level feature extraction expression list creation section 61 will now be described in detail with reference to the flowchart of Fig. 44. This mutation creation process creates the low-level feature extraction expression lists amounting to the mutation count "nm" out of the "n" low-level feature extraction expression lists of the next generation. [0197]

In step S201, the control section 27 starts the mutation loop by initializing the mutation loop parameter NM to "1." The mutation loop is repeated as many times as the mutation count "nm." [0198]
In step S202, the low-level feature extraction expression list creation section 61 weights the low-level feature extraction expression lists of the latest generation so as to induce preferential selection of the lists in descending order of the mean evaluation values of the high-level feature extraction expressions output by the high-level feature extraction expression learning section 65, before randomly selecting a single low-level feature extraction expression list A. During the selection of this process, the "ns" low-level feature extraction expression lists selected by the above-described selection creation process may be either excluded from the candidate lists or may be left intact as part of the candidate lists. Likewise, the low-level feature extraction expression lists selected in step S192 of the cross creation process above may be either excluded from the candidate lists or may be left intact as part of the candidate lists. [0199]

In step S203, the low-level feature extraction expression list creation section 61 randomly determines the number "m" of expressions in each low-level feature extraction expression list to be created through a subsequent expression loop. [0200]
More specifically, the low-level feature extraction expression list creation section 61 randomly determines in step S203 the expression count "m" for each low-level feature extraction expression list within the range defined as follows:
m = number of expressions in list A ± mr
where "mr" is a predetermined value. [0201]
In step S204, the control section 27 starts the expression loop by initializing the expression loop parameter M to "1." The expression loop is repeated as many times as the expression count "m" randomly determined in step S203. [0202]
In step S205, the low-level feature extraction expression list creation section 61 checks to determine whether the contribution ratio of the low-level feature computed using the M-th low-level feature extraction

expression out of all low-level feature extraction expressions in the low-level feature extraction expression list A is lower than the contribution ratios of the low-level features computed using the other low-level feature extraction expressions in the low-level feature extraction expression list A. More specifically, a check is made to determine whether the contribution ratio of the low-level feature computed using the M-th low-level feature extraction expression is lower than a predetermined ratio in sequence of the contribution ratios computed using all low-level feature extraction expressions in the low-level feature extraction expression list A. [0203]
If in step S205 the contribution ratio of the low-level feature computed using the M-th low-level feature extraction expression is found to be lower than the others, then step S206 is reached. In step S206, the low-level feature extraction expression list creation section 61 randomly mutates the M-th low-level feature extraction expression for inclusion in the low-level feature extraction expression list of the next generation. [0204]
If in step S205 the contribution ratio of the low-

level feature computed using the M-th low-level feature extraction expression is not found to be lower than the others, then step S207 is reached. In step S207, the low-level feature extraction expression list creation section 61 adds the M-th low-level feature extraction expression to the low-level feature extraction expression list of the next generation without mutating the expression in question. [0205]
If the randomly determined expression count "m" is found to be larger than the number of low-level feature extraction expressions constituting the low-level feature extraction expression list A, then the check of step S205 is omitted when the expression loop parameter M exceeds the low-level feature extraction expression count for the list A. At this point, a low-level feature extraction expression is randomly created and added to the low-level feature extraction expression list of the next generation. [0206]
In step S208, the control section 27 checks to determine whether the expression loop parameter M is smaller than the maximum value "m." If the expression loop parameter M is found to be smaller than the maximum value "m," then the expression loop parameter M is

incremented by "1" and step S205 is reached again. On the other hand, if the expression.loop parameter M is not found to be smaller than the maximum value "m" (i.e., if the parameter M is equal to the maximum value "m"), then the expression loop is exited and step S209 is reached. [0207]
A single low-level feature extraction expression list is thus created in steps S203 through S208 above. [0208]
In step S209, the control section 27 checks to determine whether the mutation loop parameter NM is smaller than the maximum value "nm." If the mutation loop parameter NM is found to be smaller than the maximum value "nm," then the mutation loop parameter NM is incremented by "1" and step S202 is reached again. On the other hand, if the mutation loop parameter NM is not found to be smaller than the maximum value "nm" (i.e., if the parameter NM is equal to the maximum value "nm"), then the mutation loop is exited and the mutation creation process is terminated. The low-level feature extraction expression lists amounting to the mutation count "nm" are thus created by the processing up to this point. [0209]
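Finally, a minimal sketch of the mutation creation process, using the same (expression, contribution ratio) representation; the cutoff rule for "low" contribution ratios and the stand-in expression generator are illustrative assumptions.

    import random

    def random_expression():
        # Stand-in for a randomly created low-level feature extraction expression.
        return ["Wav", "32#FFT", "Log", "Mean"]

    def mutation_creation(latest_lists, mean_evaluation_values, nm, mr=2, low_fraction=0.25):
        new_lists = []
        for _ in range(nm):
            # Step S202: pick one list A, weighted by mean evaluation value.
            list_a = random.choices(latest_lists, weights=mean_evaluation_values, k=1)[0]
            # Step S203: m = number of expressions in list A +/- mr.
            m = max(1, len(list_a) + random.randint(-mr, mr))
            cutoff = sorted(c for _, c in list_a)[int(low_fraction * (len(list_a) - 1))]
            new_list = []
            for expression, contribution in list_a[:m]:          # steps S205 through S207
                if contribution <= cutoff:
                    new_list.append((random_expression(), 0.0))  # step S206: mutate (replace)
                else:
                    new_list.append((expression, contribution))  # step S207: keep unchanged
            while len(new_list) < m:                             # m exceeds the size of list A
                new_list.append((random_expression(), 0.0))
            new_lists.append(new_list)
        return new_lists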

According to the above-described next-generation list genetic creation process performed by the low-level feature extraction expression list creation section 61, the low-level feature extraction expression lists of the latest generation which have high evaluation values are transmitted to the next generation and so are the low-level feature extraction expressions of the latest generation which have high contribution ratios. Those with low evaluation values and contribution ratios are discarded without being transmitted to the next generation. That is, as generations progress, the evaluation values of the low-level feature extraction expression lists are expected to improve and so are the contribution ratios of the low-level feature extraction expressions. [0210]
Returning to Fig. 7, the low-level feature extraction expression lists of the next generation created as discussed above are output by the low-level feature extraction expression list creation section 61 to the low-level feature computation section 24. In step S3, the low-level feature computation section 24 computes low-level features by substituting the input data (content data and metadata) of "j" songs C1 through Cj

into each of "n" low-level feature extraction expression lists input from the low-level feature extraction expression list creation section 61. The resulting "n" low-level features are output to the high-level feature extraction expression learning section 65. [0211]
In step S4, the high-level feature extraction expression learning section 65 estimates (i.e., creates) "n" high-level feature extraction expression combinations each composed of "k" high-level feature extraction expressions through learning based on the "n" low-level features input from the low-level feature computation section 24 and on corresponding training data. The high-level feature extraction expression learning section 65 further computes the evaluation value of each high-level feature extraction expression and the contribution ratio of each low-level feature in each high-level feature extraction expression, and outputs the results of the computations to the low-level feature extraction expression list creation section 61. [0212]
The high-level feature extraction expression learning process of step S4 performed by the high-level feature extraction expression learning section 65 is

described below in detail with reference to the flowchart
of Fig. 45.
[0213]
In step S211, the control section 27 starts the list loop by initializing the list loop parameter N to "1." The list loop is repeated as many times as the predetermined list count "n." In step S212, the control section 27 starts the training data loop by initializing the training data loop parameter K to "1." The training data loop is repeated as many times as the predetermined training data type count "k." [0214]
In step S213, the control section 27 starts the algorithm loop by initializing the algorithm loop parameter A to "1." The algorithm loop is repeated as many times as the predetermined learning algorithm type count "a." Used in the algorithm loop are the same kinds of algorithms as discussed above in connection with the feature extraction algorithm creation apparatus 20. [0215]
In step S214, the control section 27 starts the cross validation loop by initializing the cross validation loop parameter C to "1." The cross validation loop is repeated as many times as the predetermined cross

validation count "c."
[0216]
In step S215, the high-level feature extraction
expression learning section 65 randomly bisects K-th type training data (true high-level features) about "j" songs out of as many as "k" training data types, into learning-use data and evaluation-use data. In the ensuing description, the training data classified for learning use will be referred to as the learning data and the training data classified for evaluation use will be called the evaluation data. [0217]
In step S216, the high-level feature extraction expression learning section 65 estimates a high-level feature extraction expression through learning that involves the application to an a-th learning algorithm of corresponding learning data as well as a combination of multiple low-level features computed using the N-th low-level feature extraction expression list. [0218]
The learning process above, unlike the one carried out by the high-level feature extraction expression learning section 25, utilizes all low-level features computed using the N-th low-level feature extraction

feature extraction expression list (the result of the computation is called the low-level feature combination). This turns the evaluation value of the high-level feature extraction expression into a value that gives due consideration to whether or not the number of original low-level features is sufficiently large. [0219]
Described below with reference to Fig. 46 is the learning process based on a learning algorithm in step S216 performed by the high-level feature extraction expression learning section 65. [0220]
In step S231, the high-level feature extraction expression learning section 65 estimates a high-level feature extraction expression through learning that involves the application to an A-th learning algorithm of the low-level feature combination and the learning data out of the training data. [0221]
In step S232, the high-level feature extraction expression learning section 65 computes the information quantity criterion AIC or BIC as the evaluation value of the high-level feature obtained from the preceding step. This brings the learning process based on a learning

algorithm to an end. [0222]
In step S217 back in Fig. 45, the high-level feature extraction expression learning section 65 evaluates the high-level feature extraction expression acquired in step S216 using the evaluation data. More specifically, the high-level feature extraction expression learning section 65 computes the high-level feature using the acquired high-level feature extraction expression and finds the square error between the computed high-level feature and the evaluation data. [0223]
In step S218, the control section 27 checks to determine whether the cross validation loop parameter C is smaller than the maximum value "c." If the cross validation loop parameter C is found to be smaller than the maximum value "c," then the parameter C is incremented by "1" and step S215 is reached again. On the other hand, if the cross validation loop parameter C is not found to be smaller than the maximum value "c" (i.e., if the parameter C is equal to the maximum value "c"), then the cross validation loop is exited and step S219 is reached. As many as "c" high-level feature extraction expressions are thus acquired as a result of

the learning in steps S214 through S218 constituting the cross validation loop. Because the learning data and evaluation data are randomly re-partitioned through the cross validation loop, it is possible to verify that the high-level feature extraction expressions are not over-learned. [0224]
In step S219, the high-level feature extraction expression learning section 65 selects the high-level feature extraction expression having the highest evaluation value derived from step S217 out of the "c" high-level feature extraction expressions acquired through the cross validation loop. [0225]
In step S220, the control section 27 checks to determine whether the algorithm loop parameter A is smaller than the maximum value "a." If the algorithm loop parameter A is found to be smaller than the maximum value "a," then the parameter A is incremented by "1" and step S214 is reached again. On the other hand, if the algorithm loop parameter A is not found to be smaller than the maximum value "a" (i.e., if the parameter A is equal to the maximum value "a"), then the algorithm loop is exited and step S221 is reached. As many as "a" high-

level feature extraction expressions of the K-th type learned using "a" learning algorithms are thus acquired in steps S213 through S220 constituting the algorithm loop. [0226]
In step S221, the high-level feature extraction expression learning section 65 selects the high-level feature extraction expression having the highest evaluation value derived from step S217 out of the "a" high-level feature extraction expressions learned through the algorithm loop. [0227]
In step S222, the control section 27 checks to determine whether the training data loop parameter K is smaller than the maximum value "k." If the training data loop parameter K is found to be smaller than the maximum value "k," then the parameter K is incremented by "1" and step S213 is reached again. On the other hand, if the training data loop parameter K is not found to be smaller than the maximum value "k" (i.e., if the parameter K is equal to the maximum value "k"), then the training data loop is exited and step S223 is reached. As many as "k" high-level feature extraction expressions corresponding to the N-th low-level feature extraction expression list

are thus acquired in steps S212 through S222 constituting
the training data loop.
[0228]
In step S223, the control section 27 checks to determine whether the list loop parameter N is smaller than the maximum value "n." If the list loop parameter N is found to be smaller than the maximum value "n," then the parameter N is incremented by "1" and step S212 is reached again. On the other hand, if the list loop parameter N is not found to be smaller than the maximum value "n" (i.e., if the parameter N is equal to the maximum value "n"), then the list loop is exited and step S224 is reached. As many as "k" high-level feature extraction expressions corresponding to each of "n" low-level feature extraction expression lists are thus acquired in steps S211 through S223 constituting the list loop. [0229]
In step S224, the high-level feature extraction expression learning section 65 computes the contribution ratio of each of "k" high-level feature extraction expressions corresponding to each of the acquired "n" low-level feature extraction expression lists, and outputs the results of the computations to the low-level

feature extraction expression list creation section 61 together with the evaluation value of the high-level feature extraction expression computed in step S217. This brings the high-level feature extraction expression learning process to an end. [0230]
In step S5 back in Fig. 7, the control section 27 checks to determine whether the learning loop parameter G is smaller than the maximum value "g." If the learning loop parameter G is found to be smaller than the maximum value "g," then the parameter G is incremented by "1" and step S2 is reached again. On the other hand, if the learning loop parameter G is not found to be smaller than the maximum value "g" (i.e., if the parameter G is equal to the maximum value "g"), then the learning loop is exited and step S6 is reached. Steps S1 through S5 constitute the feature extraction algorithm learning process. Step S6 following that process is a process for computing high-level features using feature extraction algorithms. [0231]
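The learning loop of steps S1 through S5 repeats list creation, low-level feature computation, and high-level expression learning for "g" generations; the sketch below captures that flow, with hypothetical function parameters standing in for the respective sections of the apparatus.

```python
def feature_extraction_algorithm_learning(g, create_next_generation_lists,
                                          compute_low_level_features,
                                          learn_high_level_expressions):
    """Sketch of the learning loop (steps S1 through S5): each generation
    creates new low-level feature extraction expression lists, computes their
    low-level features, and learns high-level expressions whose evaluation
    values and contribution ratios feed the next generation."""
    lists, feedback = None, None
    for _ in range(g):                                 # "g" generations
        lists = create_next_generation_lists(lists, feedback)
        low_level_features = compute_low_level_features(lists)
        feedback = learn_high_level_expressions(low_level_features)
    return lists, feedback
```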
In step S6, the high-level feature extraction expression learning section 65 supplies the high-level feature computation section 26 with the low-level feature extraction expressions constituting the list having the highest mean evaluation value of the acquired high-level features out of "n" low-level feature extraction expression lists of the latest generation obtained through learning, along with "k" high-level feature extraction expressions corresponding to that list. [0232]
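Choosing the list to hand over in step S6 amounts to a simple arg-max over mean evaluation values, as in the sketch below; the nested-list data layout is an assumption made only for illustration.

```python
def select_best_list(evaluation_values_per_list):
    """Sketch of step S6's selection: return the index of the low-level
    feature extraction expression list whose "k" high-level feature extraction
    expressions have the highest mean evaluation value."""
    def mean(values):
        return sum(values) / len(values)
    return max(range(len(evaluation_values_per_list)),
               key=lambda n: mean(evaluation_values_per_list[n]))
```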
In step S7, the high-level feature computation section 26 computes a high-level feature with high accuracy using the low-level feature extraction expressions and high-level feature extraction expressions supplied most recently from the high-level feature extraction expression learning section 65. The high-accuracy high-level feature computation process of step S7 was already discussed in connection with the feature extraction algorithm creation apparatus 20 and thus will not be described further. [0233]
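Conceptually, the computation in step S7 evaluates the final low-level feature extraction expressions once and feeds the results to each high-level feature extraction expression, as in this minimal sketch (callable expressions and the dictionary layout are assumptions of the sketch, not of the specification):

```python
def compute_high_level_features(content_data, low_level_expressions, high_level_expressions):
    """Sketch of step S7: evaluate the supplied low-level feature extraction
    expressions on the content data, then apply each high-level feature
    extraction expression to the resulting low-level features."""
    low_level_features = [f(content_data) for f in low_level_expressions]
    return {name: g(low_level_features) for name, g in high_level_expressions.items()}
```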
This completes the description of the feature extraction algorithm creation process performed by the feature extraction algorithm creation apparatus 60. [0234]
According to the above-described feature extraction algorithm learning process performed by the feature extraction algorithm creation apparatus 60 as the second embodiment of the present invention, it is possible to create, quickly and accurately, algorithms for extracting relevant features from song data. The process also permits acquisition of highly accurate high-level features with a significantly small amount of computation. [0235]
Particularly noteworthy is the ability to randomly determine the number "m" of expressions constituting each low-level feature extraction expression list. This, unlike the feature extraction algorithm learning process performed by the feature extraction algorithm creation apparatus 20, helps avoid handling too many low-level features through onerous and potentially inaccurate processing and thereby contributes to yielding results with high accuracy. [0236]
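A minimal sketch of this random sizing, with purely illustrative bounds and a hypothetical create_random_expression helper, might look as follows:

```python
import random

def create_expression_list(create_random_expression, m_min=1, m_max=100):
    """Sketch of the second embodiment's list creation: the number "m" of
    low-level feature extraction expressions is drawn at random each time a
    list is created (the bounds used here are illustrative assumptions)."""
    m = random.randint(m_min, m_max)
    return [create_random_expression() for _ in range(m)]
```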
The present invention can be applied not only to acquiring high-level features from song data but also to obtaining high-level features from content data in diverse categories including video data. [0237]

The series of steps and processes described above may be executed either by hardware or by software. For the software-based processing to take place, the programs constituting the software may be either incorporated beforehand in dedicated hardware of a computer for program execution or installed upon use from a suitable recording medium into a general-purpose personal computer such as one shown in Fig. 47 or like equipment capable of executing diverse functions based on the installed programs. [0238]
A personal computer 100 shown in Fig. 47 incorporates a CPU (Central Processing Unit) 101. An input/output interface 105 is connected to the CPU 101 through a bus 104. The bus 104 is connected to a ROM (Read Only Memory) 102 and a RAM (Random Access Memory) 103. [0239]
The input/output interface 105 is connected to an input device 106, an output device 107, a storage device 108, and a communication device 109. The input device 106 is made up of such input device elements as a keyboard and a mouse to be operated by the user to enter operation commands. The output device 107 is composed of a display device such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display) on which to display operation screens and other information. The storage device 108 is typically constituted by a hard disk drive for accommodating programs and data. The communication device 109 is formed by a modem and/or a LAN (Local Area Network) adapter for conducting communications over networks such as the Internet. The input/output interface 105 is further connected to a drive 110 that writes and reads data to and from a recording medium 111 such as magnetic disks (including flexible disks), optical disks (including CD-ROM (Compact Disc-Read Only Memory) and DVD (Digital Versatile Disc)), magneto-optical disks (including MD (Mini Disc)), or semiconductor memories. [0240]
The programs for causing the personal computer 100 to execute the above-described steps and processes are stored on the recording medium 111 before being provided to the computer 100. The programs are read by the drive 110 from the recording medium 111 and installed onto the hard disk drive in the storage device 108. The programs held in the storage device 108 are loaded from there into the RAM 103 for execution based on instructions issued by the CPU 101 in response to the commands entered by the user through the input device 106.
[0241]
In this description, the steps to be executed on the basis of the stored programs represent not only the processes that are to be carried out in the depicted sequence (i.e., on a time series basis) but also processes that may be performed in parallel or individually rather than chronologically.
[0242]
The programs may be processed either by a single
computer or by a plurality of computers on a distributed basis. The programs may also be transferred to a remotely located computer or computers for execution.
[0243]
In this specification, the term "system" refers to an entire configuration made up of a plurality of component devices.
[0244]
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors in so far as they are within the scope of the appended claims or the equivalents thereof.




We Claim:
1. An information processing apparatus (20) which creates a feature detection algorithm for
detecting features from content data, said information processing apparatus comprising:
low-level feature extraction expression list creation means (21) for creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, said low-level feature extraction expressions being expressions to which either said content data or metadata corresponding to said content data is input and from which low-level features are output;
computation means (24) for computing said low-level features using said next-generation expression lists created by said low-level feature extraction expression list creation means (21); and
high-level feature extraction expression creation means (25) for creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to said content data, said high-level feature extraction expressions being expressions to which said low-level features computed by said computation means (24) are input and from which high-level features characteristic of said content data are output.
2. The information processing apparatus as claimed in claim 1, wherein said high-level
feature extraction expression creation means (25) computes at least either accuracy levels of the
created high-level feature extraction expressions or contribution ratios of said low-level features
in said high-level feature extraction expressions; and
wherein said low-level feature extraction expression list creation means (21) updates said low-level feature extraction expressions constituting said low-level feature extraction expression lists at least on the basis of either the accuracy levels of said high-level feature extraction expressions or the contribution ratios of said low-level features in said high-level feature extraction expressions, said accuracy levels and said contribution ratios having been computed by said high-level feature extraction expression creation means (25).
3. The information processing apparatus as claimed in claim 1, wherein said low-level feature extraction expression list creation means (21) randomly creates first-generation expression lists.
4. The information processing apparatus as claimed in claim 1, wherein said low-level feature extraction expression list creation means (21) creates said next-generation expression lists using a genetic algorithm based on said latest-generation expression lists through at least one of a selection process, a cross process, and a mutation process.
5. The information processing apparatus as claimed in claim 1, wherein said low-level feature extraction expression list creation means (21) creates said next-generation expression lists each constituted by a predetermined constant number of low-level feature extraction expressions.
6. The information processing apparatus as claimed in claim 1, wherein said low-level feature extraction expression list creation means (21) creates said next-generation expression lists each constituted by a predetermined constant number of low-level feature extraction expressions randomly determined every time each of said lists is created.
7. The information processing apparatus as claimed in claim 6, wherein said high-level feature extraction expression creation means (25) computes at least either evaluation values of the created high-level feature extraction expressions or contribution ratios of said low-level features in said high-level feature extraction expressions; and
wherein said low-level feature extraction expression list creation means updates said low-level feature extraction expressions constituting said low-level feature extraction expression lists at least on the basis of either the evaluation values of said high-level feature extraction expressions or the contribution ratios of said low-level features in said high-level feature extraction expressions, said evaluation values and said contribution ratios having been computed by said high-level feature extraction expression creation means.
8. An information processing method for use with an information processing apparatus
which creates a feature detection algorithm for detecting features from content data, said
information processing method comprising the steps of:
creating next-generation expression lists each constituted by a plurality of low-level feature extraction expressions through learning based on latest-generation expression lists, said low-level feature extraction expressions being expressions to which either said content data or metadata corresponding to said content data is input and from which low-level features are output;
computing said low-level features using the created next-generation expression lists; and
creating high-level feature extraction expressions through learning based on training data constituted by previously furnished true high-level features corresponding to said content data, said high-level feature extraction expressions being expressions to which the computed low-level features are input and from which high-level features characteristic of said content data are output.


Patent Number 258506
Indian Patent Application Number 4277/DELNP/2007
PG Journal Number 03/2014
Publication Date 17-Jan-2014
Grant Date 16-Jan-2014
Date of Filing 05-Jun-2007
Name of Patentee SONY CORPORATION
Applicant Address 1-7-1 KONAN, MINATO-KU, TOKYO 108-0075, JAPAN
Inventors:
# Inventor's Name Inventor's Address
1 KOBAYASHI YOSHIYUKI C/O SONY CORPORATION, 1-7-1 KONAN, MINATO-KU, TOKYO 108-0075, JAPAN
PCT International Classification Number G10L 11/00
PCT International Application Number PCT/JP2006/321260
PCT International Filing date 2006-10-25
PCT Conventions:
# PCT Application Number Date of Convention Priority Country
1 2006-281267 2006-10-16 Japan
2 2005-310410 2005-10-25 Japan