
WO2018180499A1 - Neural network structure, electronic circuit, information processing system, method and program - Google Patents


Info

Publication number
WO2018180499A1
Authority
WO
WIPO (PCT)
Prior art keywords
neuron
unit
neural network
network structure
synapse
Application number
PCT/JP2018/010011
Other languages
English (en)
Japanese (ja)
Inventor
正之 廣口
Original Assignee
株式会社日本人工知能研究開発機構
Application filed by 株式会社日本人工知能研究開発機構
Priority to JP2019509227A (granted as patent JP6712399B2)
Publication of WO2018180499A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/044: Recurrent networks, e.g. Hopfield networks
    • G06N3/0442: Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/042: Knowledge-based neural networks; logical representations of neural networks
    • G06N3/08: Learning methods
    • G06N5/00: Computing arrangements using knowledge-based models
    • G06N5/02: Knowledge representation; symbolic representation

Definitions

  • The present invention relates to a neural network structure, an electronic circuit, an information processing system, a method, and a program.
  • In the human brain, nerve cells (hereinafter referred to as neurons) support functions such as recognition, judgment, and memory.
  • A neuron memorizes an event that is repeatedly input, recognizes whether a new input is the same event, and outputs a signal in reaction when it is.
  • As the input is repeated, memorization becomes easier, and when the same information is input again the neuron reacts and outputs a signal more readily.
  • A neuron is connected to many other neurons via synapses, and transmission from neuron to neuron occurs according to the input information.
  • When input information matches stored information, the neuron outputs a signal to other neurons.
  • With repetition, the connection between particular neurons becomes stronger. For this reason, when previously repeated information is input again, transmission between those neurons occurs easily, and the neuron readily reacts and outputs signals to other neurons.
  • A neural network models the function of neurons using mathematical formulas and circuits.
  • A neuron has the function of emitting a pulse (firing) when the potential built up by input signals exceeds a threshold value.
  • This function is realized using a nonlinear function, a capacitor, or the like. The synapses that connect neurons differ in temporal and spatial propagation efficiency, and both the connections between neurons and their propagation efficiencies change according to the results of recognition, judgment, and memory.
  • In the model, this is done by weighting the output value of the nonlinear function, or the charge level of the capacitor, with the coupling strength (weighting factor), and substituting the result into the nonlinear function of the subsequent neuron or adding it to that neuron's charge level.
  • A hierarchical neural network is a typical model.
  • A hierarchical neural network includes an input layer of neurons that receive signals from the outside, an output layer of neurons that transmit signals to the outside, and, between them, one or more intermediate layers each containing a plurality of intermediate-layer neurons.
  • Patent Document 1 discloses a learning system and method for a hierarchical neural network that realizes accurate, high-speed learning.
  • In this learning system, for each neuron of each map constituting each layer, the route whose propagating signal best matches the coupling-strength value is selected from among the routes connected to that neuron, and all other routes are treated as sparse in units of the map. An input signal is propagated forward through the network to obtain an output signal, the output signal is compared with the teacher signal paired with that input signal, and the degree of matching between the signal propagating on the selected route and the coupling-strength value is adjusted according to the degree of coincidence.
  • The back-propagation method is known as a learning method for the synaptic weighting factors that yields a high correct-answer rate.
  • However, a hierarchical neural network must learn from a large amount of data, so applying it to a new field is difficult and time-consuming.
  • The application area of hierarchical neural networks is therefore limited to specific domains that can be handled by individual algorithms, such as games, image recognition, automated driving, and investment judgment (so-called weak AI, essentially limited to arithmetic and mathematical operations).
  • In a conventional neural network, one neuron can be associated with a concept, but the neuron cannot hold the so-called meaning of that concept, such as its shape, properties, features, and name (the symbol grounding problem).
  • The excited state (also called firing or stimulation) can be propagated in sequence from the input to the output of the entire system, but if propagation stops before reaching the final output, no further result can be obtained.
  • Nor can a conventional neural network perform sequential execution, in which subsequent steps are selected according to whether a procedure has been executed and whether its result follows a series of procedures with the required timing. That is, procedural processing can be executed by a program, but not by a conventional neural network.
  • The present invention therefore provides a neural network structure that simulates the mechanism of a neural circuit in the human brain, together with an electronic circuit including that structure, and an information processing system, method, and program.
  • According to one aspect, a neural network structure is provided that has a plurality of neuron units and synapse units connecting them, in which another neuron unit connected to one neuron unit via a synapse unit fires according to the sum, accumulated one or more times, of the product of the firing or non-firing output value of the one neuron unit and the weighting coefficient of the synapse unit. The structure has a loop structure in which the output value produced by the firing of one neuron unit propagates back to that neuron unit itself via other neuron units and synapse units.
  • When a neuron unit included in the loop structure receives an output value, that is, a stimulus produced by firing, the stimulus propagates around the loop one neuron unit after another. The stimulus therefore continues to circulate, and each neuron unit in the loop is repeatedly excited at a fixed time interval on every pass of the circulating stimulus.
  • A neural network structure that can function as described above can thus be provided.
  • The structure may further comprise, besides the loop structure described above, another neural network structure that receives as an input value the output value emitted each time a neuron unit fires as the stimulus propagates around the loop. In other neural network structures not included in the loop that receive a stimulus from a neuron unit in the loop, the excitation of their neuron units can then be raised each time the stimulus is repeatedly received.
  • A series of neuron units may receive inputs from the output values of firing and express a predetermined function when their sums reach the threshold. Repeatedly receiving the stimulus gradually raises the excitation of neuron units outside the loop; more of them exceed the threshold and fire, so a series of neuron units over which the stimulus propagates more widely is formed, and a predetermined function can be expressed.
  • When connected series of neuron units are formed in the neural network structure and express their corresponding functions, a neuron unit associated with a concept becomes connected to neuron units associated with related concepts.
  • This group of related concepts can give the neural network structure a so-called "meaning".
  • The structure may include a logical product (AND) calculation unit that computes the logical product of one neuron unit and one or more other neuron units.
  • It may include a logical sum (OR) calculation unit that computes the logical sum of one neuron unit and one or more other neuron units.
  • It may include a logical negation (NOT) calculation unit that computes the negation of one neuron unit.
  • A series of neuron units expressing higher-order functions may be formed from any one or a combination of these AND, OR, and NOT calculation units; by combining such operations on neuron units, series of neuron units that express more complicated functions can be formed.
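As a rough illustration of how threshold neurons can realize these calculation units (the weights and thresholds below are our own assumptions, not values from the patent):

```python
def fire(inputs, weights, threshold):
    """Threshold neuron: fires (1) when the weighted input sum reaches the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def and_unit(a, b):
    # both inputs must fire for the sum to reach the threshold of 2
    return fire([a, b], [1, 1], 2)

def or_unit(a, b):
    # either firing input is enough to reach the threshold of 1
    return fire([a, b], [1, 1], 1)

def not_unit(a):
    # an always-on bias input is cancelled by a negative weight when `a` fires
    return fire([1, a], [1, -1], 1)
```

Composing these units then gives the more complicated functions mentioned above, e.g. `or_unit(and_unit(a, b), not_unit(c))`.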
  • A series of neuron units expressing one function may be pre-connected to one or more series expressing other functions, firing when the sum exceeds the threshold. Firing a "meaning", that is, a group of related concepts, then makes it easier to fire a higher-order "meaning" as a wider concept group.
  • The structure may include a first neuron unit, a second neuron unit, and other neuron units, forming an oscillating neuron unit in which the output value generated when the second neuron unit fires propagates back to the second neuron unit itself via the other neuron units. The synapse unit connecting the first neuron unit toward the second neuron unit has a firing weight coefficient that fires the second neuron unit in a single summation; the synapse units connecting neuron units within the oscillating unit likewise have firing weight coefficients; and the synapse unit connecting from a neuron unit in the oscillating unit back to the first neuron unit has a non-firing weight coefficient, below the threshold, that does not fire the first neuron unit. Such an oscillating unit can maintain, for a certain period, the potentially excited state of a first neuron unit that has once been excited by a stimulus.
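A minimal sketch of this oscillating unit, with illustrative numbers of our own choosing (threshold 8, firing weight 9, non-firing weight 3, retention factor 0.5): the ring keeps firing itself, and each lap nudges the first neuron unit without ever firing it.

```python
THRESHOLD = 8
FIRING_W = 9      # fires the next ring neuron in a single summation
NONFIRING_W = 3   # feeds the first neuron unit but stays below threshold
DECAY = 0.5       # fraction of potential the first neuron retains between laps

def run_oscillating_unit(laps, ring_size=4):
    """Simulate the ring; return the first neuron unit's final potential."""
    active = 0
    potential = 0.0   # internal potential of the first neuron unit
    for _ in range(laps):
        # the firing weight guarantees propagation to the next ring neuron
        assert FIRING_W >= THRESHOLD
        active = (active + 1) % ring_size
        # each lap adds a sub-threshold stimulus to the first neuron unit
        potential = potential * DECAY + NONFIRING_W
    return potential
```

With these numbers the potential converges toward NONFIRING_W / (1 - DECAY) = 6, so the first neuron unit stays primed (potentially excited) without crossing the firing threshold of 8.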
  • A long-term memory unit composed of a series of neuron units, and a short-term memory unit composed of a loop structure and one or more oscillating units, may be provided, allowing both long-term and short-term memory to be held by neuron units.
  • The structure may further include a first neural network structure having a plurality of neuron units, a second neural network structure having a plurality of neuron units, and a synapse unit connecting a neuron unit of the first structure with a neuron unit of the second. Synapse units connecting neuron units of different neural network structures that hold long-term or short-term memory make it possible to link neural network structures of different systems.
  • The first neural network structure may be a master having the function of controlling one or more second neural network structures, each of which functions as a slave under its control.
  • One main neural network structure can then control other neural network structures, forming a larger, hierarchically controlled neural network structure.
  • The first neural network structure may have a general-purpose or planning function, and the second an individual or execution function.
  • Individual or execution functions can then be expressed under a neural network structure that expresses general-purpose or planning functions.
  • The weighting factor of a synapse unit connecting a neuron unit of the first structure with a neuron unit of the second may increase when the neuron units at both of its ends fire simultaneously. This promotes strong connection between different series of neuron units or between different neural network structures, making it easier to form a series of neuron units that expresses a predetermined function.
  • A series of neuron units may also receive inputs from the firing output values of neuron units belonging to neural network structures other than its own, and express a predetermined function when the sum reaches the threshold.
  • By giving related information as an additional stimulus to a series of neuron units in a quasi-excited state, just short of producing output, output information can be obtained: an associative function.
  • The structure may compare the neuron units fired in a series when inputs are received only from firings within its own neural network structure with those fired when inputs are also received from firings of neuron units belonging to other structures, and evaluate whether the proportion of coinciding fired neuron units is at least a predetermined value. This ensures the validity of the output information produced by the associative function.
  • When at least one neuron unit in a first series of neuron units expressing a predetermined function fires simultaneously with at least one neuron unit in a second series expressing a different function, the weight coefficient of the synapse unit connecting the first and second series may increase. This promotes strong connection between different series of neuron units or between different neural network structures, making it easier to link neural network structures to one another.
  • An electronic circuit including the above-described neural network structure includes arithmetic elements that constitute the neuron units and synapse units, storage elements that store the weight coefficients, and adder circuits that calculate the sums.
  • An electronic circuit capable of forming a neural network structure including a loop structure can thus be provided.
  • The storage element may be implemented as a memristor element.
  • A single memristor, a passive element, can form a memory element that takes not just the digital values 0 and 1 but a plurality of values, with the stored value held as an electrical resistance.
  • The stored value can be raised or lowered according to the accumulated current passed through the element, and because the memristor's memory is non-volatile, it is retained even when the power is turned off.
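A toy software model of such a storage element (the update rule, rate, and bounds are illustrative assumptions, not device physics): the stored value is an analogue quantity that drifts with the accumulated current and persists between uses.

```python
class MemristorModel:
    """Multi-valued, non-volatile storage element modelled as a conductance."""
    def __init__(self, value=0.5, lo=0.0, hi=1.0, rate=0.01):
        self.value = value      # stored weight, held as a conductance
        self.lo, self.hi = lo, hi
        self.rate = rate        # how strongly current moves the stored value
    def apply_current(self, charge):
        """Accumulated current raises (positive) or lowers (negative) the value."""
        self.value = min(self.hi, max(self.lo, self.value + self.rate * charge))
    def read(self):
        return self.value       # reading does not alter the stored value
```

In a synapse unit, `read()` would supply the weight coefficient, and learning would call `apply_current()` to strengthen or weaken it.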
  • The electronic circuit may include an input unit that receives optical, acoustic, electrical, or physical-operation input from the outside and converts it into an electrical signal, and an output unit that converts signals into optical, acoustic, electrical, or physical-operation output. The input unit feeds the electrical signal to a neuron unit included in the neural network structure, and the output unit emits the result.
  • So-called conceptual neuron units, carrying visual, auditory, conceptual, or physical meanings corresponding to the various external inputs, are formed and connected to the neuron units related to them.
  • This gives the conceptual neuron units broader and richer meaning, realizes a neural network structure grounded in real-world meaning, and provides an information processing system in which the output of the neural network structure can act on the outside world.
  • According to another aspect, a method is provided for propagating firing through neuron units and synapse units: the first neuron unit fires and the firing propagates via a synapse unit; the (N-1)th neuron unit fires when the firing from the (N-2)th neuron unit is propagated to it, and propagates the firing to the Nth neuron unit via a synapse unit; and the Nth neuron unit fires when the firing from the (N-1)th neuron unit is propagated to it (N being a natural number of 3 or more), and propagates the firing back to the first neuron unit via a synapse unit.
  • The stimulus thus continues to propagate, and each neuron unit in the loop structure can be made to fire repeatedly at a fixed time interval every time the circulating stimulus is propagated to it.
  • According to another aspect, a program is provided for executing a simulation of the neural network structure on a computer, comprising a storage step that stores the weighting factors, an addition step that calculates the sums, and a processing step that, based on the stored weighting factors and the calculated sums, simulates whether one neuron unit propagates its firing to another neuron unit via a synapse unit. This allows a neural network structure including a loop structure to be simulated on an ordinary von Neumann computer.
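The three steps can be sketched on an ordinary computer as follows (the network, weights, and threshold here are illustrative assumptions):

```python
# storage step: weights of the synapse units, keyed by (source, destination)
weights = {("n1", "n2"): 9, ("n1", "n3"): 4, ("n2", "n3"): 5}
THRESHOLD = 8

def step(fired):
    """From the set of fired neuron units, compute which units fire next."""
    sums = {}
    for (src, dst), w in weights.items():
        if src in fired:
            sums[dst] = sums.get(dst, 0) + w   # addition step
    # processing step: propagate firing only where the sum reaches the threshold
    return {dst for dst, s in sums.items() if s >= THRESHOLD}
```

Firing n1 alone fires n2 (weight 9 reaches the threshold of 8) but leaves n3 sub-threshold (weight 4); only when n1 and n2 fire together does n3 receive 4 + 5 = 9 and fire.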
  • As described above, the present invention provides a neural network structure that mimics the mechanism of a neural circuit in the human brain, an electronic circuit including that structure, and an information processing system, method, and program.
  • FIG. 4 contains (C) a schematic diagram showing a NOT circuit, (D) a schematic diagram showing a latch structure (latch circuit), and (E) a schematic diagram showing a flip-flop structure (flip-flop circuit).
  • Schematic diagram showing an example of a latch structure in the neural network structure of the second embodiment of the present invention.
  • Schematic diagram showing an example of reinforcement in the neural network structure of the third embodiment of the present invention.
  • Schematic diagrams showing the learning function and the associative function in the neural network structure of the fourth embodiment of the present invention.
  • Schematic diagram showing the learning function in the neural network structure of the fourth embodiment of the present invention, taking cooperation between brain regions as an example.
  • The term "neural network structure" refers to an architecture configured to simulate biological neurons: a structure approximately functionally equivalent to the neurons and synapses of a biological brain, that is, to the connections between elements. The electronic circuit and information processing system including the neural network structure according to embodiments of the present invention can therefore include various elements and electronic circuits that model biological neurons, and a computer including such a structure can include various processing elements and algorithms (including computer simulations) that model biological neurons.
  • The neural network structure is an engineering model of the brain's information processing function: a structure in which many elements corresponding to nerve cells, called neuron units, are connected.
  • The neural network structure includes a plurality of neuron units and a plurality of synapse units. For clarity, the figure shows only two neuron units connected to each other via a synapse unit; in reality, a large number of neuron units are interconnected via a large number of synapse units.
  • The figure likewise shows one neuron unit with inputs from three synapse units and outputs to three synapse units, but it goes without saying that the invention is not limited to this.
  • When a neuron unit receives inputs In1 to In3 (also referred to as stimuli) from other neuron units via synapse units, it accumulates a potential internally.
  • The inputs In1 to In3 arrive asynchronously from a plurality of other neuron units.
  • When the accumulated potential exceeds the threshold, the neuron unit fires and generates outputs Out1 to Out3.
  • A neuron unit that receives, for example, output Out2 as its input (In2 in the figure) likewise accumulates a potential internally and can propagate the stimulus further to the neuron units connected to it.
  • This neuron unit includes, for example, a capacitor and an operational amplifier (not shown) constituting an integration circuit, and stores a potential internally by charging the capacitor with the current output from the synapse unit.
  • When the internal potential exceeds a predetermined threshold, the neuron unit fires, producing outputs Out1 to Out3, and then resets its internal potential or lets it attenuate over time.
  • When a neuron unit frequently receives inputs In1 to In3, its internal potential stays close to the threshold; in such a state it fires easily even on small input, and can be said to be easily excited (a potentially excited state).
  • A similar integration circuit may be provided for each synapse unit, either in the synapse unit itself or at the neuron unit's receiving port. Through the capacitance of the integration circuit and its charge level, these function as the coupling strength (also called the weighting coefficient or weight) of the synapse unit. The product of a neuron unit's output value and the synapse unit's weighting coefficient is then repeatedly added into the sum, affecting how much and how fast the internal potential of the receiving neuron unit accumulates.
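The capacitor-based behaviour just described can be sketched as a leaky integrate-and-fire model (the threshold and decay values are illustrative assumptions):

```python
class NeuronUnit:
    """Accumulates weighted inputs; fires and resets at the threshold;
    otherwise the internal potential attenuates over time."""
    def __init__(self, threshold=8.0, decay=0.9):
        self.threshold = threshold
        self.decay = decay
        self.potential = 0.0
    def receive(self, weighted_input):
        self.potential += weighted_input        # charge the "capacitor"
        if self.potential >= self.threshold:
            self.potential = 0.0                # reset after firing
            return 1                            # fired: outputs emitted
        return 0
    def tick(self):
        self.potential *= self.decay            # attenuation over time
```

A unit that keeps receiving sub-threshold input hovers near the threshold, which is exactly the potentially excited state: one more small input makes it fire.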
  • In summary, the neural network structure according to the present invention has a plurality of neuron units and synapse units connecting them, and another neuron unit connected to one neuron unit via a synapse unit fires according to the sum, accumulated one or more times, of the product of the one neuron unit's firing or non-firing output value and the synapse unit's weighting coefficient.
  • The same behaviour can also be realized as processing elements in a software algorithm executed on a computer.
  • The function F indicating the internal potential of the neuron unit can be expressed by Expression (1).
  • F = In1*w1 + In2*w2 + In3*w3   (1)   (where In1 to In3 are each 0 or 1)
  • Software including such an algorithm can be executed on an ordinary von Neumann computer.
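Expression (1) translates directly into code; this is a sketch in which the weights and threshold are arbitrary choices of ours:

```python
def internal_potential(inputs, weights):
    """F = In1*w1 + In2*w2 + In3*w3, with each input 0 (unfired) or 1 (fired)."""
    assert all(x in (0, 1) for x in inputs)
    return sum(i * w for i, w in zip(inputs, weights))

def fires(inputs, weights, threshold=8):
    """The neuron unit fires when F reaches the threshold."""
    return internal_potential(inputs, weights) >= threshold
```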
  • This neural network structure has a loop structure.
  • The output value due to firing (1) of neuron unit 1 propagates back to neuron unit 1 itself through the other neuron units 2 to 10 and their synapse units.
  • In this example the loop consists of ten neuron units connected in a ring via synapse units, but the invention does not limit the number.
  • Neuron unit 1 becomes excited by some trigger and performs firing (1).
  • Neuron unit 2, receiving firing (1) as input, becomes excited and performs firing (2); each subsequent neuron unit N likewise becomes excited and performs firing (N).
  • When firing (10) is performed by neuron unit 10, it is input to neuron unit 1.
  • Neuron unit 1 then becomes excited again and performs firing (1) once more. Thus, once any neuron unit in the loop structure enters the excited state, the excitation propagates around the loop again and again, and the excited state is maintained.
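The ten-neuron loop can be simulated in a few lines (a sketch under the assumption that every in-loop synapse carries a firing weight, so each firing deterministically excites the next unit):

```python
RING_SIZE = 10

def simulate_loop(start, steps):
    """Return the 1-based indices of the neuron units that fire, in order."""
    fired = []
    current = start
    for _ in range(steps):
        fired.append(current)
        current = current % RING_SIZE + 1   # firing propagates to the next unit
    return fired
```

Once neuron unit 1 fires, it fires again every ten steps: the excited state circulates indefinitely at a fixed interval.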
  • The human brain is known to have four such loops: a motor loop, an oculomotor loop, a limbic loop, and a prefrontal loop.
  • The loop structure in the neural network structure of the present invention is considered to correspond to the prefrontal loop. If the prefrontal loop corresponds to the beta wave of the electroencephalogram, it circulates about 20 times per second; with a transmission time of about 2 milliseconds per neuron, this corresponds to roughly 25 neurons.
  • For the synapse units between neuron units included in the loop structure, a weighting factor (shown as "9" in the figure) is set so that the subsequent neuron unit is always brought into the excited state and fires.
  • In this example, a neuron unit becomes excited when the sum, added one or more times, of the products of output values and synapse weighting factors reaches 8 or more.
  • A neuron unit in the loop (neuron unit 1 in part (A) of the figure) is also connected to neuron units not included in the loop, and when it becomes excited and fires, the firing is transmitted to those outer neuron units as well.
  • Neuron unit 1 propagates firing (1), second-lap firing (11), third-lap firing (21), and so on, both to neuron unit 2 and to the neuron units outside the loop. The loop structure thus lets firing be propagated repeatedly, at fixed time intervals, to neuron units not included in the loop.
  • On the first lap, the excitation outside the loop may stop after firings (1) and (2); on the second lap firings (3) and (4) occur, on the third lap firings (5) and (6), and so the number of excited neuron units grows and the firing spreads further and further.
  • As the number of neuron units that are excited, or easily excited, increases, a series of neuron units expressing a predetermined function becomes easy to form.
  • The connections between the neuron units also strengthen, further easing the formation of such a series.
  • In short, when a neural network structure includes a loop structure, a stimulus received by any neuron unit in the loop propagates around it one neuron unit after another, so the stimulus continues to circulate, and the structure can function so that each neuron unit in the loop is repeatedly excited at a fixed time interval on every pass.
  • The neural network structure in this embodiment comprises the loop-structured neural network described above together with another neural network structure that receives as an input value the output value emitted when a neuron unit in the loop fires as the stimulus circulates.
  • In that other structure, repeatedly receiving the stimulus at a fixed interval increases the number of neuron units that are excited or easily excited, making it easier to express a predetermined function.
  • The excitation of each neuron unit can thus be raised every time the stimulus is repeatedly received.
  • A series of neuron units receives inputs from firing output values and, when the sums reach the threshold, expresses a predetermined function.
  • For example, when the neuron units of the series reliably fire whenever their sums reach the threshold, they enter a so-called connected state and form a group of concepts.
  • A series of neuron units in this connected (fired) state carries a "meaning".
  • a “Zebra” neuron portion is formed.
  • the “Zebra” neuron is a series of Japanese “Sound” neuron, “Shi” neuron, “Ma” neuron, “U” neuron, and “Ma” neuron.
  • the Japanese word “word” neuron part is connected to the “sumama” neuron part.
  • the “stripes” neuron part is a “word” neuron part with a Japanese name in which a “shi” neuron part that is a Japanese “sound” neuron part and a “ma” neuron part are connected in series. Shimashima "is connected to the neuron.
  • if it has been taught more than a certain number of times that a zebra is an animal, the “zebra” neuron unit becomes connected to the “animal” neuron unit carrying the concept of “animal”. And if it has been taught more than a certain number of times that concepts such as “lion” and “giraffe” are animals, the “animal” neuron unit carrying the concept of “animal” becomes connected to the “lion” neuron unit and the “giraffe” neuron unit.
  • the “animal” neuron unit also becomes connected to the Japanese “word” neuron unit for animal, itself a series of the Japanese “sound” neuron units “do”, “u”, “bu”, and “tsu”.
  • in other words, what a zebra is, is determined by the other neuron units connected to the “zebra” neuron unit.
  • concept and meaning are given by such connections between neuron units, which is dramatically more efficient than the prior art. For example, to achieve the same thing with a conventional program on a von Neumann computer, one would have to check whether each possible split of a word is meaningful, or store in advance the features of concepts such as “pattern” or “shape” and search them. However, since there are many possible ways to split a word, the amount of computation is enormous, and storing the features comprehensively requires a large storage area. Either way, it is not very efficient.
  • the excitation of a neuron unit is raised gradually by repeated stimulation until it reaches the threshold value.
  • the number of neuron units that have fired then increases over a wider range, so a series of neuron units through which the stimulus propagates more widely can be formed, and a predetermined function can be expressed.
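The gradual rise of excitation toward the threshold described above can be sketched as a minimal simulation. The threshold value of 8 (taken from the weight-coefficient examples later in this document) and the per-stimulus increment of 3 are illustrative assumptions, not values fixed by the embodiment:

```python
# Minimal sketch of a neuron unit whose internal excitation is raised a
# little by each repeated sub-threshold stimulus and which fires once the
# accumulated value reaches the threshold. Threshold 8 and increment 3
# are assumed values for illustration.

THRESHOLD = 8

class NeuronUnit:
    def __init__(self):
        self.excitation = 0

    def stimulate(self, weight):
        """Accumulate one stimulus; return True if the unit fires."""
        self.excitation += weight
        if self.excitation >= THRESHOLD:
            self.excitation = 0       # reset after firing
            return True
        return False

n = NeuronUnit()
results = [n.stimulate(3) for _ in range(3)]   # 3, 6, 9 -> fires on the third
print(results)   # [False, False, True]
```

A single weak stimulus never fires the unit, but the third repetition pushes the accumulated excitation over the threshold, mirroring the text's "repeated stimulation until the threshold is reached".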
  • a series of connected neuron units is formed in the neural network structure, and the predetermined function corresponding to that series is expressed, so that a neuron unit related to one concept becomes linked to neuron units related to associated concepts.
  • such a group of related concepts can give the neural network structure so-called “meaning”.
  • in this way, a series of neuron units through which a stimulus propagates over a wider range can be formed in the neural network structure, and so-called “meaning” can be provided within it.
  • the long-disputed symbol grounding problem can thereby be solved.
  • as shown in FIG. 5A, in a hierarchical neural network one neuron unit can be associated with a certain concept, but the shape, properties, features, and name associated with that concept cannot be attached to that neuron unit; such a network is said to suffer from the symbol grounding problem of being unable to give so-called meaning.
  • the neural network structure of the present embodiment includes the neural network structure of the above embodiment.
  • the neural network structure of this embodiment can have a logical product structure that computes the AND of two neuron units, a logical sum structure that computes their OR, a logical negation structure of a neuron unit, a latch structure that holds state, and a flip-flop neuron structure. These structures, too, are expressions of a predetermined function by a series of neuron units that fire when the sum reaches the threshold.
  • FIG. 6A shows a logical product structure (AND circuit).
  • neuron unit 1 and neuron unit 3, and neuron unit 2 and neuron unit 3, are connected by synapse units having the weight coefficient 4; only when both neuron unit 1 and neuron unit 2 are in an excited state and fire does neuron unit 3 fire. When either one does not fire, neuron unit 3 does not fire.
  • FIG. 6B shows a logical sum structure (OR circuit).
  • neuron unit 1 and neuron unit 3, and neuron unit 2 and neuron unit 3, are connected by synapse units having the weight coefficient 9; when either neuron unit 1 or neuron unit 2 becomes excited and fires, neuron unit 3 fires. If neither fires, neuron unit 3 does not fire.
  • FIG. 6C shows a logical negation structure (NOT circuit).
  • neuron unit 1 and neuron unit 2 are connected by a synapse unit having the weight coefficient -9; while neuron unit 1 is excited and firing, neuron unit 2 is hard to fire even if stimuli arrive from other neuron units. If neuron unit 1 does not fire, neuron unit 2 may fire when stimulated by another neuron unit.
  • FIGS. 6(D) and 6(E) show a latch structure (latch circuit) and a flip-flop structure (flip-flop circuit) for holding values and states.
  • in the latch structure, even if neuron unit 1 fires, neuron unit 2 does not fire unless a firing is also transmitted to neuron unit 2 from elsewhere.
  • in the flip-flop structure, the excited state of neuron unit 3 is maintained when neuron unit 1 fires, and the excited state of neuron unit 4 is maintained when neuron unit 2 fires.
  • if the neural network structure has the logical product, logical sum, and logical negation structures, structures equivalent to the latch structure and the flip-flop structure can be formed.
  • with a logical product operation unit and a logical sum operation unit between neuron units, and a logical negation operation unit on a neuron unit, all logical processing by neuron units can be performed.
  • by combining logical product, logical sum, and logical negation operation units, series of neuron units expressing complex logical structures and higher-order functions, such as latch structures and flip-flop structures, can be formed.
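As a concrete illustration, the three logic structures of FIG. 6 can be sketched using the weight coefficients given above (4 for the AND synapses, 9 for the OR synapses, -9 for the inhibitory NOT synapse) and an assumed firing threshold of 8, consistent with the "weight coefficient of 8 or more fires in a single sum" rule used later in this document:

```python
# Sketch of the AND / OR / NOT structures of FIG. 6, with weight
# coefficients 4, 9 and -9 as given in the text and an assumed threshold 8.

THRESHOLD = 8

def fires(weighted_inputs):
    """A neuron unit fires when the one-time sum of its weighted inputs
    reaches the threshold. weighted_inputs: list of (active, weight)."""
    return sum(w for active, w in weighted_inputs if active) >= THRESHOLD

def and_unit(n1, n2):              # FIG. 6(A): both synapses have weight 4
    return fires([(n1, 4), (n2, 4)])

def or_unit(n1, n2):               # FIG. 6(B): both synapses have weight 9
    return fires([(n1, 9), (n2, 9)])

def not_unit(n1, other_stimulus):  # FIG. 6(C): inhibitory synapse of weight -9
    return fires([(n1, -9), (other_stimulus, 9)])

assert and_unit(True, True) and not and_unit(True, False)   # 4+4=8; 4<8
assert or_unit(True, False) and not or_unit(False, False)   # 9>=8 alone
assert not_unit(False, True)       # fires from the other stimulus alone
assert not not_unit(True, True)    # inhibited: 9 - 9 = 0 < 8
```

Note how the AND gate works only because two weight-4 synapses together exactly reach the threshold, while a single weight-9 synapse suffices for the OR gate, matching the figure descriptions.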
  • FIG. 7A illustrates an example of the logical product structure.
  • when the neuron unit A having the concept of “stripes” and the neuron unit B having the concept of “horse” are both input to the logical product structure, the neuron unit C having the concept of “zebra” is excited.
  • the “zebra” neuron unit is not excited by the “stripes” neuron unit alone or by the “horse” neuron unit alone.
  • FIG. 7B is an example of the logical sum structure.
  • when either of its input neuron units fires, the neuron unit C having the concept of “zebra” is excited.
  • FIG. 7C is an example of the logical negation structure.
  • while the neuron unit C having the concept of “zebra” is input, the neuron unit F having the concept of “a real horse” is inhibited and not excited, expressing that a zebra is not a real horse.
  • FIG. 7D is an example of the flip-flop structure.
  • when the neuron unit D having the sound “zebra” is input, the excitation of the neuron unit C having the concept of “zebra” is sustained; when the neuron unit G having the sound “horse” is input, the excitation of the neuron unit B having the concept of “horse” is sustained.
  • FIG. 8 shows an example of the latch structure, illustrating the addition of two numbers.
  • the neuron unit 11 corresponding to the word “2”, the neuron unit 12 corresponding to the word “3”, and the neuron unit 13 corresponding to the word “add (sum)” excite, respectively, the neuron unit 21 of the number concept “2”, the neuron unit 22 of the number concept “3”, and the neuron unit 23 of the operation concept “add”; a timing signal is sent at the moment all of them are excited, and the neuron unit 3 having the concept of the number “5” in the latter stage is excited.
  • a series of neuron units expressing a certain function may be connected in advance to one or more series of neuron units expressing other functions, and may fire when the sum reaches the threshold value.
  • for example, when the “word” neuron unit for “stripes”, consisting of the “sound” neuron units “shi” and “ma”, and the “word” neuron unit for “horse”, consisting of the “sound” neuron units “u” and “ma”, are input to such a logical structure, a new “word” neuron unit having the sound of “zebra” can be formed. In this way, by firing a “meaning”, that is, a group of related concepts, a situation can be formed that facilitates the firing of a higher-order “meaning” as a wider group of concepts.
  • the neural network structure of the present embodiment may further include the oscillation circuit (oscillation unit) shown in the figure.
  • the oscillation circuit is composed of a first neuron unit 1, a second neuron unit 2, and another neuron unit 3.
  • although a single other neuron unit 3 is shown in this figure, the number is not limited to this; there may be one or more.
  • the oscillation circuit consists of the neuron unit 1 and an oscillation neuron unit composed of neuron unit 2 and neuron unit 3, structured so that the output value generated by the firing of the second neuron unit 2 propagates back to neuron unit 2 itself via neuron unit 3.
  • the synapse unit connected in the direction from neuron unit 1 to neuron unit 2 has the firing weight coefficient 9, a connection strength at which neuron unit 2 fires in a single sum.
  • the synapse units connecting the neuron units within the oscillation neuron unit, that is, the synapse units connecting neuron unit 2 and neuron unit 3, likewise have the firing weight coefficient 9, a connection strength at which each neuron unit fires in a single sum.
  • the synapse units connected in the direction from neuron unit 2 and neuron unit 3 toward neuron unit 1 have the unfired weight coefficient 4, a connection strength at which the first neuron unit does not fire because the single sum is below the threshold.
  • when neuron unit 1 receives a stimulus (1) from the outside, it fires (2) toward neuron unit 2.
  • the neuron unit 2 that has received the firing (2), i.e., the stimulus (2), always becomes excited, because the synapse unit connected from neuron unit 1 to neuron unit 2 has the weight coefficient 9, which fires a neuron unit in a single sum.
  • neuron unit 2 then fires (3) toward neuron unit 3 and neuron unit 1; since the synapse unit linked to neuron unit 1 has the unfired weight coefficient 4, a single sum does not fire it.
  • neuron unit 1 therefore does not enter an excited state by this alone.
  • the neuron unit 3 that has received the firing (3), i.e., the stimulus (3), becomes excited in turn.
  • it then fires (4) toward neuron unit 2 and neuron unit 1.
  • the neuron unit 2 that has received the stimulus (4) fires (5) toward neuron unit 1 and neuron unit 3, while the neuron unit 1 that has received the stimulus (4) does not enter an excited state by itself; this cycle repeats.
  • in this way, this neural network structure includes an oscillation unit composed of the first neuron unit 1, the second neuron unit 2, and the other neuron unit 3, in which the output value due to the firing of the second neuron unit propagates back to the second neuron unit itself via the other neuron unit.
  • the synapse unit connected in the direction from the first neuron unit to the second neuron unit has a firing weight coefficient that fires the second neuron unit in a single sum.
  • the synapse units connecting the neuron units within the oscillation neuron unit likewise have firing weight coefficients that fire each neuron unit in a single sum.
  • the synapse units connected in the direction from the oscillation neuron unit to the first neuron unit have an unfired weight coefficient, so that the first neuron unit does not fire because the single sum is below the threshold. According to this, a latent excited state, that is, a state of being easily excited, can be maintained for a certain period.
  • this neural network structure can thus be provided with a short-term memory unit capable of storing events and states in neuron units for a certain period. Further, by combining the oscillation circuit with a loop structure, short-term memory lasting several tens of seconds can be realized by repeating the fixed period over which the oscillation circuit oscillates, and this can be connected to the thinking described later.
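A discrete-time sketch of this oscillation circuit (firing threshold 8 assumed; weights 9 and 4 as given above) shows neuron units 2 and 3 firing alternately while neuron unit 1 only ever receives the sub-threshold feedback, i.e., it is held in a latent, easily excited state:

```python
# Discrete-time sketch of the oscillation circuit: neuron 1 -> neuron 2
# (weight 9), neuron 2 <-> neuron 3 (weight 9 each way), and weak feedback
# from neurons 2 and 3 back to neuron 1 (weight 4). Threshold assumed 8.

THRESHOLD = 8
WEIGHTS = {          # (source, target): weight coefficient
    (1, 2): 9,
    (2, 3): 9,
    (3, 2): 9,
    (2, 1): 4,       # unfired weight: below threshold in a single sum
    (3, 1): 4,
}

def step(fired):
    """Given the set of neurons that fired this tick, return next tick's set."""
    nxt = set()
    for target in (1, 2, 3):
        total = sum(w for (s, t), w in WEIGHTS.items()
                    if t == target and s in fired)
        if total >= THRESHOLD:
            nxt.add(target)
    return nxt

fired = {1}                      # external stimulus fires neuron 1 once
trace = [fired]
for _ in range(5):
    fired = step(fired)
    trace.append(fired)

print(trace)   # neurons 2 and 3 keep firing alternately after the one
               # external stimulus; neuron 1 never reaches its threshold
```

A single external stimulus is enough to keep the 2-3 pair oscillating indefinitely, which is exactly the "latent excited state maintained for a certain period" the text describes (a real implementation would presumably add decay to end the oscillation).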
  • the neural network structure of the present embodiment may further include a mechanism for strengthening the connection strength of synapse units.
  • this strengthening mechanism includes a connection strength conversion table; when the input-side neuron unit 1 and the output-side neuron unit 2 are excited simultaneously within a predetermined time, the connection strength of the synapse unit between those neuron units is strengthened. Neurons in the human brain memorize an event when it is input repeatedly; they then recognize whether a new input is the same event and, if it is, react to it and output a signal.
  • this strengthening mechanism is a model of that function in the brain.
  • it operates when a certain event is repeatedly input.
  • the fact that memorization becomes easier as the same information is repeated corresponds to a state in which the connection strength is strengthened, so that the neuron units become more excitable and fire more easily.
  • long-term memory can be formed by a series of neuron units whose synapse units have had their connection strength strengthened in this way and which are thus prone to excitation.
  • in the human brain, long-term memory is said to be formed by approximately seven stimulations.
  • in this embodiment, long-term memory is set to be formed by three stimulations.
  • the setting here is that long-term memory is formed after three repetitions, but it is not limited to three.
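The strengthening mechanism with its connection strength conversion table can be sketched as follows. The table values below are assumptions for illustration; the text only fixes that, in this embodiment, long-term memory forms after three repetitions:

```python
# Sketch of the synapse-strengthening mechanism: each time the input-side
# and output-side neuron units are excited simultaneously within a given
# window, the connection strength is raised one step via a conversion
# table. The table values (2, 4, 6, 9) are assumed; 9 = fires in one sum.

STRENGTH_TABLE = [2, 4, 6, 9]

class Synapse:
    def __init__(self):
        self.level = 0

    @property
    def weight(self):
        return STRENGTH_TABLE[self.level]

    def simultaneous_firing(self):
        """Called when both end neuron units fire within the time window."""
        if self.level < len(STRENGTH_TABLE) - 1:
            self.level += 1

    def is_long_term(self):
        # In this embodiment, three repetitions form a long-term memory.
        return self.level >= 3

s = Synapse()
for _ in range(3):
    s.simultaneous_firing()
print(s.weight, s.is_long_term())   # strongest coupling after 3 repetitions
```

After three simultaneous excitations, the synapse reaches the strongest level, i.e., a strong coupling that fires the downstream unit in a single sum, which is how the text characterizes a long-term memory.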
  • this neural network structure thus includes a long-term memory unit composed of a series of neuron units.
  • the neural network structure also includes a short-term memory unit comprising a loop structure and one or more oscillation circuits. According to this, both long-term and short-term memory can be held by neuron units. The human brain is said to have sensory memory in addition to short-term and long-term memory.
  • this sensory memory can be realized as an input buffer in the input unit that feeds information into the neural network structure.
  • the neural network structure shown in FIG. 11 includes a loop structure and another neural network structure that receives, as an input value, the output value that a neuron unit fires and outputs as the stimulus propagates around the loop.
  • the synapse units in the loop structure have weight coefficients of 8 or more, which produce an excited state in a single sum, so all the neuron units in the loop are connected with strong connection strength (strong coupling).
  • the neuron units 1 to 6 in the other neural network structure are each provided with the oscillation circuit described above.
  • the neuron unit in the loop structure and the neuron units 1 to 6 in the other neural network structure are weakly coupled via synapse units whose weight coefficients produce an excited state only by summation over a plurality of times. Further, neuron unit 1 and neuron unit 2, neuron unit 2 and neuron unit 3, neuron unit 4 and neuron unit 5, and neuron unit 6 and neuron unit 7 are assumed to be strongly coupled, while the rest are weakly coupled.
  • first, a strong stimulus at or above the threshold is given to neuron unit 1 of the other neural network structure through a strongly coupled synapse unit. Neuron unit 1 then becomes excited and fires toward neuron unit 2; neuron unit 2 also fires, and this first stimulus is propagated to neuron unit 3. However, since the firing timings of neuron unit 1 and neuron unit 3 are shifted, neuron unit 4 does not fire. Note that neuron units 1 to 3 have oscillation circuits and therefore hold a latent excited state for a certain period.
  • next, a weak stimulus below the threshold is received from the neuron unit in the loop structure via the weakly coupled synapse units.
  • the stimulus is transmitted to neuron units 1 to 6, and since neuron units 1 to 3 are held in the latent excited state, neuron unit 1 and neuron unit 3 fire simultaneously even though the stimulus is weak.
  • although neuron unit 4 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, neuron unit 1, and neuron unit 3, and therefore becomes excited and fires.
  • when neuron unit 4 fires, the stimulus propagates to neuron unit 5.
  • however, since the firing timings of neuron unit 3 and neuron unit 5 are shifted, neuron unit 6 does not fire. Since neuron units 1 to 6 have oscillation circuits, they hold a latent excited state for a certain period.
  • next, a weak stimulus below the threshold is again received from the neuron unit in the loop structure via the weakly coupled synapse units.
  • the stimulus is transmitted to neuron units 1 to 6, and since neuron units 1 to 5 are held in the latent excited state, neuron unit 3 and neuron unit 5 fire simultaneously even though the stimulus is weak.
  • although neuron unit 6 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, neuron unit 3, and neuron unit 5, and therefore becomes excited and fires.
  • when neuron unit 6 fires, the stimulus propagates to neuron unit 7, the last neuron unit in the other neural network structure.
  • in this way, when a neural network structure includes a loop structure, once a neuron unit in the loop receives an output value, i.e., a stimulus, by firing, the stimulus propagates around the loop from one neuron unit to the next, so the loop functions so that the stimulus keeps propagating continuously, and the neuron units in the loop are repeatedly excited at a constant time interval each time the stimulus goes around. Moreover, each time an input arrives at a certain neuron unit, that unit fires and passes the output value, i.e., the stimulus, onward.
  • as a result, the number of neuron units in an excited state or a latent excited state increases, making it easier to express a predetermined function.
  • in terms of the brain's information-processing function, this corresponds to the fact that it becomes easier to reach a conclusion as thinking deepens; the structure can also be regarded as thinking in the same way humans do.
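The role of the latent excited state in the walkthrough above can be sketched as a coincidence detector: a neuron unit reachable only through weak synapses (weight 4 assumed, threshold 8, matching the coefficients used elsewhere in this document) fires only when the loop stimulus and the latently excited units arrive in the same tick:

```python
# Sketch of the coincidence firing above: neuron unit 4 sums the weak
# stimuli arriving in the same tick from the loop structure and from
# neuron units 1 and 3. Weak weight 4 and threshold 8 are assumptions
# consistent with the weight coefficients used in this document.

THRESHOLD = 8
WEAK = 4

def unit4_fires(loop_stimulus, unit1, unit3):
    """True if the one-tick sum of weak inputs reaches the threshold."""
    total = WEAK * sum([loop_stimulus, unit1, unit3])
    return total >= THRESHOLD

# The weak loop stimulus alone cannot fire unit 4 ...
print(unit4_fires(True, False, False))   # False
# ... but when units 1 and 3, held latently excited by their oscillation
# circuits, fire in the same tick as the loop stimulus, unit 4 fires.
print(unit4_fires(True, True, True))     # True
```

This is why the staggered firings earlier in the walkthrough fail to advance the signal, while the loop's periodic weak stimulus, arriving on top of the latent excitation, lets the propagation proceed one stage per loop cycle.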
  • the neural network structure of this embodiment includes the neural network structure of the above embodiment.
  • the neural network structure of the present embodiment includes a first neural network structure and a second neural network structure, together with a synapse unit that connects a neuron unit included in the first neural network structure with a neuron unit included in the second.
  • that is, the present neural network structure includes two or more different neural network structures and has synapse units that connect those neural network structures.
  • the neural network structure shown in FIG. 12 includes a loop structure, a first neural network structure that receives as an input value the output value fired and output by a neuron unit as the stimulus propagates around the loop, and a second neural network structure including neuron units connected by synapse units to neuron units in the first neural network structure.
  • the synapse units in the loop structure have weight coefficients of 8 or more, which produce an excited state in a single sum along the loop, and all the neuron units forming the loop are connected to their adjacent neuron units with strong connection strength (strong coupling).
  • each of the neuron units 1 to 10 in the first neural network structure includes the oscillation circuit described above. The neuron unit in the loop structure and the neuron units 2 to 9 in the first neural network structure are weakly coupled via synapse units whose weight coefficients produce an excited state only by summation over a plurality of times.
  • the second neural network structure includes procedures 1 to 4.
  • in each of procedures 1 to 4, the several neuron units involved are all strongly coupled; procedures 1 to 4 are therefore states in which a series of neuron units expressing a predetermined function has been formed, holding memory and meaning corresponding to each procedure.
  • first, a strongly coupled stimulus is applied to neuron unit 1 of the first neural network structure. Neuron unit 1 then becomes excited and fires toward neuron unit 2, and this first stimulus is propagated to the neuron units of procedure 1 in the second neural network structure. Since all the neuron units in procedure 1 are strongly coupled, the stimulus propagates back to neuron unit 3 in the first neural network structure. However, since the firing timings of neuron unit 2 and neuron unit 3 are shifted, neuron unit 4 does not fire. Note that neuron unit 2 and neuron unit 3 have oscillation circuits and therefore hold a latent excited state for a certain period.
  • next, a weak-coupling stimulus is received from the neuron unit in the loop structure.
  • the stimulus is transmitted to neuron units 2 to 9, and since neuron unit 2 and neuron unit 3 are held in the latent excited state, they fire simultaneously even though the stimulus is weak.
  • although neuron unit 4 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, neuron unit 2, and neuron unit 3, and therefore becomes excited and fires.
  • when neuron unit 4 fires, the stimulus propagates to the neuron units of procedure 2 in the second neural network structure. Since all the neuron units in procedure 2 are strongly coupled, the stimulus propagates back to neuron unit 5 in the first neural network structure. However, since the firing timings of neuron unit 4 and neuron unit 5 are shifted, neuron unit 6 does not fire. Since neuron units 2 to 9 have oscillation circuits, they hold a latent excited state for a certain period.
  • next, a weak-coupling stimulus is again received from the neuron unit in the loop structure.
  • the stimulus is transmitted to neuron units 2 to 9, and since neuron unit 4 and neuron unit 5 are held in the latent excited state, they fire simultaneously even though the stimulus is weak.
  • although neuron unit 6 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, neuron unit 4, and neuron unit 5, and therefore becomes excited and fires.
  • when neuron unit 6 fires, the stimulus propagates to the neuron units of procedure 3 in the second neural network structure. Since all the neuron units in procedure 3 are strongly coupled, the stimulus propagates back to neuron unit 7 in the first neural network structure. However, since the firing timings of neuron unit 6 and neuron unit 7 are shifted, neuron unit 8 does not fire. Since neuron units 2 to 9 have oscillation circuits, they hold a latent excited state for a certain period.
  • next, a weak-coupling stimulus is again received from the neuron unit in the loop structure.
  • the stimulus is transmitted to neuron units 2 to 9, and since neuron unit 6 and neuron unit 7 are held in the latent excited state, they fire simultaneously even though the stimulus is weak.
  • although neuron unit 8 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, neuron unit 6, and neuron unit 7, and therefore becomes excited and fires.
  • when neuron unit 8 fires, the stimulus propagates to the neuron units of procedure 4 in the second neural network structure. Since all the neuron units in procedure 4 are strongly coupled, the stimulus propagates back to neuron unit 9 in the first neural network structure. Neuron unit 9 then also fires, and the stimulus propagates to neuron unit 10, the last neuron unit in the first neural network structure.
  • in this way, when a neural network structure includes a loop structure, once a neuron unit in the loop receives an output value, i.e., a stimulus, by firing, the stimulus propagates around the loop from one neuron unit to the next, so the loop can function so that the stimulus continues to propagate continuously.
  • by means of synapse units connecting neuron units included in different neural network structures in which long-term memory, short-term memory, and the like are held, cooperation between neural network structures of different systems becomes possible.
  • this neural network structure is therefore suitable for executing procedural processing.
  • here, the first neural network structure is called the plan-level neural network structure.
  • the second neural network structure is called the execution-level neural network structure.
  • the neural network structure shown in the figure makes it possible to execute detailed actions (procedures) based on a large plan.
  • first, firing (1) is performed in the first neuron unit at the plan level.
  • then, firings (2), (3), and (4) are performed to search the execution-level neuron units for algorithms that might be conceived to solve the puzzle.
  • firing (2) is directed to the neuron unit holding the procedure “raise straight up”.
  • firing (3) is directed to the neuron unit holding the procedure “twist”.
  • firing (4) is directed to the neuron unit holding the procedure “rotate”.
  • ⁇ Stimulation of weak connection is given from the neuron part at the planning level to the neuron part at the execution level.
  • among them, the connection strength from the neuron unit holding the procedure “rotate” back to the plan-level neuron unit is the highest (strong coupling).
  • therefore firing (6) is performed from the execution level back to the plan level, returning the “rotate” algorithm.
  • the leading algorithm for solving the puzzle is thus to “rotate”.
  • the stimulus then propagates to the “execute” neuron unit, and the “execute” neuron unit fires toward the execution-level “execute rotation” neuron unit.
  • the predetermined procedure is executed at the execution level, and the result (“disconnected” in the figure) is propagated back to the plan-level neuron unit.
  • in this way, the first neural network structure has a general-purpose or planning function, and thereby expresses the general-purpose or planned function.
  • the second neural network structure expresses individual or executable functions.
  • the plan-level neural network structure can thus select an algorithm from the execution-level neural network structure.
  • the conventional information processing system suffers from the frame problem: it cannot select an algorithm suited to the current situation from among a plurality of algorithms and execute it.
  • because considering everything that could happen would take an infinite amount of time, a conventional information processing system attaches a frame around a specific theme or range and performs processing only within that frame.
  • the first neural network structure may be a master having a function of controlling one or more second neural network structures.
  • the second neural network structure may be a slave that functions under the control of the first neural network structure.
  • one main neural network structure can thus control other neural network structures, forming a larger, hierarchically controlled neural network structure.
  • alternatively, a dual master-slave configuration may be employed in which each of two neural network structures regards itself as the master and the other as the slave. According to this, each can achieve the predetermined purpose of its own neural network structure while mutually controlling the other's neuron units.
  • one of the master and slave may be a conventional program instead of a neural network structure.
  • in that case, the program provides instinctive “unconscious” behavior as a built-in function, while a series of neuron units expressing a predetermined function provides “conscious” behavior as a function.
  • the learning function is a function for increasing the connection strength of synapse units in the neural network structure; that is, the weight coefficients of the synapse units connecting neuron units are increased, promoting strong connections between neuron units or between neural network structures and facilitating the formation of series of neuron units that express predetermined functions.
  • the associative function is a function for making it easier to obtain an output when a series of neuron units cannot be formed from the original input alone, by receiving an additional stimulus from a different neural network structure or a different series of neuron units.
  • the evaluation function is a function for evaluating the validity of the output information obtained by the associative function.
  • the learning function in this embodiment is that, even between two unrelated neural networks, if there are neuron units that are excited (fire) at the same time, the structure regards them as somehow related, forms a synapse unit connecting both neuron units, and increases the weight coefficient of that synapse unit. In this way, by connecting neuron units that happened to be excited simultaneously, outputs that could not be produced by one neural network structure alone become possible.
  • when at least one neuron unit in a first series of neuron units expressing a predetermined function fires simultaneously with at least one neuron unit in a second series of neuron units expressing a different predetermined function, the weight coefficient of the synapse unit connecting the first series and the second series increases.
  • such a learning function works in the same way between a neuron unit included in one neural network structure described in the above embodiment and a neuron unit included in a different neural network structure.
  • that is, the weight coefficient of the synapse unit connecting a neuron unit included in the first neural network structure and a neuron unit included in a second, different neural network structure may be increased when the neuron units at both ends of the synapse unit fire simultaneously.
  • such a learning function promotes strong connections between different series of neuron units or between different neural network structures, making it easy to form series of neuron units that express predetermined functions and to relate neural network structures to one another. For example, in the human cerebrum, it is said that when an input-side neuron and an output-side neuron are excited simultaneously, the synaptic connection between them is strengthened. By doing this, new “discoveries” become possible even where one system alone could not reach understanding; for example, knowledge about the physical relationship between the earth and the moon can be connected to the phenomenon of an apple falling.
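The cross-network learning function described above can be sketched as a simple Hebbian-style rule. The class and unit names below, the window representation (one call per time window), and the weight increment are all illustrative assumptions:

```python
# Sketch of the learning function: when neuron units belonging to two
# otherwise unrelated neural network structures fire in the same time
# window, a synapse unit between them is created (weight 0 -> 1) or its
# weight coefficient is raised. Names and increment are hypothetical.

from collections import defaultdict

class LearningFabric:
    def __init__(self, increment=1):
        self.weights = defaultdict(int)   # (unit_a, unit_b) -> weight
        self.increment = increment

    def observe(self, fired_units):
        """fired_units: ids of units (from any network) firing together
        within one time window; strengthen every pair among them."""
        fired = sorted(fired_units)
        for i, a in enumerate(fired):
            for b in fired[i + 1:]:
                self.weights[(a, b)] += self.increment

fabric = LearningFabric()
# An "earth-moon gravitation" unit and a "falling apple" unit from two
# different networks happen to fire together twice:
fabric.observe({"net1:gravitation", "net2:falling_apple"})
fabric.observe({"net1:gravitation", "net2:falling_apple"})
print(fabric.weights[("net1:gravitation", "net2:falling_apple")])  # 2
```

Repeated coincidences raise the weight step by step, which, combined with the conversion-table mechanism described earlier, would eventually turn the chance coincidence into a strong, long-term coupling between the two networks.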
  • the screen used in this verification and shown in this figure was created on an integrated development environment called Unity.
  • a neuron unit in the concept network joining the visual cortex and the auditory cortex integrates the attributes attached to neuron units by regarding them as the same concept when they fire at the same time.
  • the upper part of the screen shows the visual cortex; from the left, shape neuron units corresponding to apple, mandarin orange, melon, persimmon, banana, and other fruits and vegetables such as carrot, radish, and eggplant are already formed and displayed.
  • the lower part of the screen shows the auditory cortex, in which word neuron units corresponding to the same items, such as apple, mandarin orange, melon, persimmon, banana, carrot, and radish, are already formed and displayed.
  • between the visual cortex and the auditory cortex, many neuron units of the conceptual network that connects them are displayed.
  • the concept network prepares unconnected intermediate neuron units for linking equivalent or different concepts; triggered by the simultaneous firing of a plurality of neuron units belonging to different neural network structures, it is a mechanism for linking multiple neuron units to form new concepts and promote learning.
  • In step 1, the infant is shown a banana while hearing the word "banana", so that the neuron part corresponding to the shape of the banana in the visual cortex and the neuron part corresponding to the word "banana" in the auditory cortex are excited (fired) simultaneously.
  • In step 2, one of the intermediate neuron parts in the concept network changes into a neuron part corresponding to the concept of banana.
  • In the figure, input neuron parts are surrounded by ellipses, and output neuron parts are surrounded by rectangular dotted lines.
  • In step 2, the infant is shown a banana while being taught that the taste of a banana is sweet, so that the neuron part corresponding to the shape of the banana in the visual cortex and the neuron part corresponding to the concept of sweet (under the concept of taste) are excited simultaneously.
  • In step 2, one of the intermediate neuron parts in the concept network changes into a neuron part corresponding to the concept of sweet taste, and a synapse part is formed between the neuron part corresponding to the concept of sweet taste and the neuron part corresponding to the concept of banana.
  • In this way, the concept of banana and the concept of sweet taste are linked in both directions.
  • In step 3, the infant is asked, "What is the taste of a banana?" This stimulates (excites) the neuron part corresponding to the word "banana" in the auditory cortex and the neuron part corresponding to taste. Because a series of neuron parts expressing "the taste of a banana is sweet" has already been formed by the earlier stimulation of the visual cortex, excitation propagates from the neuron part for the word "banana" in the auditory cortex, through the neuron part for the concept of banana, to the concept of sweet taste, and the answer "the taste of a banana is sweet" is obtained.
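The chain traversed in step 3 (word "banana" to the banana concept to the sweet-taste concept) can be illustrated with a small propagation sketch; the link table below is assumed example data, not part of the specification:

```python
# adjacency of already-learned synaptic links (assumed example data)
links = {
    "word:banana": ["concept:banana"],
    "concept:banana": ["concept:sweet"],
}

def propagate(start, links):
    """Follow excitation along learned synapse links, breadth-first,
    returning the neuron parts in the order they are excited."""
    visited, frontier = [], [start]
    while frontier:
        n = frontier.pop(0)
        if n in visited:
            continue
        visited.append(n)
        frontier.extend(links.get(n, []))  # excite downstream neuron parts
    return visited
```

Starting from the auditory word neuron, the excitation reaches the sweet-taste concept via the banana concept, which is the answer path described above.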
  • FIG. 19 shows the flow of the associative function.
  • The associative function receives, in addition to the input to one neural network structure, an input (additional stimulus) from a different neural network structure or series of neuron parts, making it easier to obtain an output. For example, given an image of a banana among food images, features such as "a long, slightly curved, yellow-green food" can be extracted, but the series of neuron parts may remain semi-excited and unable to output the concept of banana. By inputting the concept of sweet taste as an additional stimulus, the series of neuron parts that outputs the concept of banana is completed.
  • Part (A) of this figure shows an attempt to excite a series of neuron parts by giving, in addition to the input information to a certain neural network structure, a further stimulus.
  • In other words, with the associative function, a series of neuron parts receives input from the output values produced by the firing of neuron parts belonging to a neural network structure other than the one to which the series itself belongs. When the sum of the products of those output values and the weighting coefficients of the synapse parts becomes equal to or greater than the threshold value, the series fires and develops its predetermined function. In this way, output information can be obtained by giving related information as an additional stimulus to a series of neuron parts in a semi-excited state that needs only a little more input to produce an output.
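A minimal sketch of this thresholding behaviour, assuming a simple sum-of-products neuron model: an additional stimulus from another network lifts a semi-excited neuron over its firing threshold. The function name and values are illustrative:

```python
def fires(inputs, weights, threshold, extra=0.0):
    """A neuron part fires when the sum of products of its input values and
    synapse weights, plus any additional stimulus from another network,
    reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights)) + extra
    return total >= threshold

# semi-excited: image features alone sum to 0.8, below the threshold of 1.0
below = fires([1, 1, 0], [0.4, 0.4, 0.5], threshold=1.0)
# the additional "sweet taste" stimulus (0.3) pushes the sum over the threshold
above = fires([1, 1, 0], [0.4, 0.4, 0.5], threshold=1.0, extra=0.3)
```

This mirrors the banana example: the image input alone leaves the series semi-excited, and the related-concept stimulus completes the firing.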
  • The evaluation function collates the output information obtained by the associative function with the input information, determines the accuracy of the output information, and extracts the reason why that output information was determined.
  • Specifically, the excitation pattern of the neuron parts excited by the input information (as in the neural network structure on the left side of this figure) is compared with the excitation pattern obtained by tracing backwards from the neuron part corresponding to the output information produced after the additional stimulus was received (as in the neural network structure on the right side of this figure). The higher the degree of coincidence between the excited neuron parts, the higher the accuracy of the output information is evaluated.
  • That is, the evaluation function applies to a series of neuron parts whose sum reaches the threshold by receiving input from the output values produced by the firing of neuron parts belonging to a neural network structure other than the one to which the series belongs. It compares the neuron parts that fire when input is received only from the neuron parts of the series' own neural network structure with the neuron parts that fire when input is also received from the other neural network structure, and evaluates whether the ratio of coinciding firing neuron parts is equal to or greater than a predetermined value. With such an evaluation function, the validity of the output information produced by the associative function can be ensured.
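The comparison just described can be sketched as an overlap ratio between two firing sets. The function name, the ratio definition, and the default acceptance value of 0.8 are assumptions for illustration:

```python
def evaluate(forward_fired, backward_fired, min_ratio=0.8):
    """Compare the neuron parts excited by the input alone (forward_fired)
    with those reached by tracing back from the output (backward_fired).
    Returns the coincidence ratio and whether it meets the acceptance value."""
    backward = set(backward_fired)
    if not backward:
        return 0.0, False
    overlap = len(set(forward_fired) & backward)  # coinciding firing neurons
    ratio = overlap / len(backward)
    return ratio, ratio >= min_ratio
```

A high ratio means the backward trace from the output retraces the neurons the input actually excited, supporting the output's validity.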
  • All the neural network structures described above can be broadly realized in two ways.
  • One is a method of building a simulator in software on a von Neumann computer.
  • the other is a method of constructing hardware in which a neuron is modeled with an electronic circuit and a large number of such neurons are mounted.
  • the former is suitable for small-scale and trial neural network structures, and the latter is suitable for large-scale and full-scale neural network structures.
  • An electronic circuit for realizing the above-described neural network structure by hardware includes an arithmetic element that forms a neuron part and a synapse part, a storage element that stores a weight coefficient, and an adder circuit that calculates a sum.
  • Arithmetic elements constituting the neuron part and the synapse part are multi-input single-output type semiconductor elements, and a large number of these elements are integrated to constitute a neural network structure.
  • the adder circuit for calculating the sum is a circuit for adding the weighting coefficients of the synapse part when the neuron part fires.
  • This circuit can be configured from a known circuit that adds digital values expressed in binary (0 and 1).
  • the storage element that stores the weighting coefficient is a semiconductor element having a function of holding a value, and there exists at least one corresponding to each synapse portion.
  • This storage element may be implemented as a memristor element. A memristor is a passive element; a single memristor can form a storage element that takes not only the digital values 0 and 1 but also multiple values, holding the stored value as an electrical resistance. The stored value can be increased or decreased according to the accumulated value of the current passed through it, and because the memristor's memory is non-volatile, the stored value is retained even when the power is turned off.
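A rough behavioural model of such a memristor-based storage element follows; the value bounds and the linear charge-to-value update are illustrative assumptions, not a device specification:

```python
class Memristor:
    """Behavioural sketch: the stored value (modelled on conductance) rises or
    falls with the accumulated charge driven through the element, is clamped
    to physical limits, and persists with no power applied (non-volatile)."""

    def __init__(self, value=0.0, lo=0.0, hi=1.0):
        self.value, self.lo, self.hi = value, lo, hi

    def apply_current(self, current, duration):
        # accumulated charge (current x time) shifts the stored value;
        # the sign of the current sets the direction of change
        self.value += current * duration
        self.value = min(self.hi, max(self.lo, self.value))  # clamp to limits
        return self.value
```

One such element per synapse part would hold that synapse's weighting coefficient as an analog value.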
  • An electronic circuit including the neural network structure can be provided by configuring a neural network structure including the loop structure described above in a part of the plurality of arithmetic elements constituting the neuron part and the synapse part.
  • the electronic circuit may be a so-called neurochip, which is a circuit in which a neural network structure is mounted by a large number of electronic elements and wirings on a substrate made of a semiconductor such as silicon.
  • The information processing system 100 includes: neurochips 10, which are the electronic circuits described above; neuroboards 20, each provided with a plurality of neurochips 10; area function units 30, each a logical unit gathering a plurality of neuroboards 20; a back panel 60 that allows the plurality of area function units 30 to function in an integrated manner; an input unit 40 that accepts optical, acoustic, electrical, or physical-operation input from the outside and converts it into an electric signal; and an output unit 50 that converts electric signals into optical, acoustic, electrical, or physical-operation output to the outside.
  • The area function unit 30 is a logical unit that realizes a function using a plurality of neuroboards 20.
  • There are various areas, such as an area with a voice analysis function or an image analysis function, a field area for words or shapes, a field for concepts, a field for grammar functions, and an area with a decision-making function. Since each area requires a large number of neurochips 10, it is realized using a plurality of neuroboards 20.
  • The input unit 40 is a device that receives any stimulus given from outside the information processing system 100, such as an optical, acoustic, electrical, or physical stimulus, and converts it into an electric signal. The output unit 50 converts the electric signals output from the neurochips 10 into optical, acoustic, electrical, or physical outputs.
  • The relevant area function unit 30 operates in response to the external stimuli input from the input unit 40, and the reaction is expressed through the output unit 50.
  • The information processing system 100 is integrated by the loop structures formed in its neural network structures, so that the neuron parts in the information processing system 100 are easily excited and stimuli propagate easily between neuron parts.
  • A neural network structure according to the present invention can also be realized as a software simulator on a von Neumann computer. This simulator software implements, in a neural network structure having a plurality of neuron parts and synapse parts connecting them, a method of propagating firing through the neuron parts and synapse parts. The method includes: causing the first neuron part to fire and propagating the firing to the second neuron part through a synapse part; the (N-1)th neuron part firing when the firing from the (N-2)th neuron part is propagated, and propagating to the Nth neuron part via a synapse part; and the Nth neuron part firing when the firing from the (N-1)th neuron part is propagated (N being a natural number of 3 or more), and propagating to the first neuron part via a synapse part. According to this, when a stimulus is received, it is repeatedly transmitted back to the structure itself: the stimulus continues to propagate, and it is possible to provide simulator software, and a method thereof, in which the neural network structure (loop structure) is repeatedly excited at a certain time interval each time the continuously propagated stimulus arrives.
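The loop propagation described above (neuron 1 excites neuron 2, and so on, until neuron N excites neuron 1 again, so a stimulus keeps circulating) can be sketched as a ring of neuron parts; the step function and ring size are illustrative:

```python
def step_ring(fired, n):
    """One propagation step in an n-neuron loop structure: each firing
    neuron part excites the next neuron part in the ring via its synapse."""
    return {(i + 1) % n for i in fired}

# a single external stimulus to neuron 0 circulates through a 3-neuron loop
state = {0}
history = [state]
for _ in range(6):
    state = step_ring(state, 3)
    history.append(state)
```

The firing returns to neuron 0 every three steps, which is the repeated self-excitation at a fixed interval that the loop structure provides.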
  • This program may be a program for executing a simulation of the above neural network structure on a computer, and includes a memory enhancement/degradation step of increasing or decreasing the weighting coefficients of the synapse parts.
  • 10 Neurochip (electronic circuit)
  • 20 Neuroboard
  • 30 Area function unit
  • 40 Input unit
  • 50 Output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Image Analysis (AREA)

Abstract

The object of the present invention is to realize a neural network structure, and the like, that models the structure of the neural circuits of the human brain. To this end, the invention provides a neural network structure that comprises a plurality of neuron parts and a synapse part connecting the plurality of neuron parts, and in which a second neuron part connected to a first neuron part via the synapse part fires according to a sum obtained by adding, at least once, the product of the output value of the firing or non-firing of the neuron part and a weighting coefficient of the synapse part. The neural network structure is provided with a loop structure in which an output value according to the firing of the first neuron part is propagated back to that same neuron part via the second neuron part and the synapse part.
PCT/JP2018/010011 2017-03-28 2018-03-14 Structure de réseau neuronal, circuit électronique, système de traitement d'informations, procédé et programme WO2018180499A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019509227A JP6712399B2 (ja) 2017-03-28 2018-03-14 ニューラルネットワーク構造、電子回路、情報処理システム、方法、およびプログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-062748 2017-03-28
JP2017062748 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018180499A1 true WO2018180499A1 (fr) 2018-10-04

Family

ID=63675435

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010011 WO2018180499A1 (fr) 2017-03-28 2018-03-14 Structure de réseau neuronal, circuit électronique, système de traitement d'informations, procédé et programme

Country Status (2)

Country Link
JP (1) JP6712399B2 (fr)
WO (1) WO2018180499A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109657785A (zh) * 2018-11-28 2019-04-19 北京工业大学 可变神经元兴奋度的神经网络电路结构
JP2022538592A (ja) * 2019-07-03 2022-09-05 マイクロン テクノロジー,インク. ニューラルネットワークメモリ
CN116542291A (zh) * 2023-06-27 2023-08-04 北京航空航天大学 一种记忆环路启发的脉冲记忆图像生成方法和系统
CN120065877A (zh) * 2025-04-25 2025-05-30 南京师范大学 大规模离散神经网络同步控制的mcu电路实现方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05128081A (ja) * 1991-01-29 1993-05-25 Wacom Co Ltd ニユ―ロンの結合方法及びニユ―ラルネツトワ―ク
US6424961B1 (en) * 1999-12-06 2002-07-23 AYALA FRANCISCO JOSé Adaptive neural learning system
WO2015001697A1 (fr) * 2013-07-04 2015-01-08 パナソニックIpマネジメント株式会社 Circuit de réseau neuronal et procédé d'apprentissage associé

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05128081A (ja) * 1991-01-29 1993-05-25 Wacom Co Ltd ニユ―ロンの結合方法及びニユ―ラルネツトワ―ク
US6424961B1 (en) * 1999-12-06 2002-07-23 AYALA FRANCISCO JOSé Adaptive neural learning system
WO2015001697A1 (fr) * 2013-07-04 2015-01-08 パナソニックIpマネジメント株式会社 Circuit de réseau neuronal et procédé d'apprentissage associé

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Deep Learning Programing", vol. 89, 25 November 2015, ISBN: 978-4-7741-7638-3, article MURATA, KENTA, pages: 47 - 52 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109657785A (zh) * 2018-11-28 2019-04-19 北京工业大学 可变神经元兴奋度的神经网络电路结构
JP2022538592A (ja) * 2019-07-03 2022-09-05 マイクロン テクノロジー,インク. ニューラルネットワークメモリ
CN116542291A (zh) * 2023-06-27 2023-08-04 北京航空航天大学 一种记忆环路启发的脉冲记忆图像生成方法和系统
CN116542291B (zh) * 2023-06-27 2023-11-21 北京航空航天大学 一种记忆环路启发的脉冲记忆图像生成方法和系统
CN120065877A (zh) * 2025-04-25 2025-05-30 南京师范大学 大规模离散神经网络同步控制的mcu电路实现方法

Also Published As

Publication number Publication date
JP6712399B2 (ja) 2020-06-24
JPWO2018180499A1 (ja) 2020-02-13

Similar Documents

Publication Publication Date Title
Graupe Principles of artificial neural networks: basic designs to deep learning
US11055609B2 (en) Single router shared by a plurality of chip structures
Denning Computational thinking in science
Doncieux et al. Evolutionary robotics: what, why, and where to
WO2018180499A1 (fr) Structure de réseau neuronal, circuit électronique, système de traitement d'informations, procédé et programme
US9697462B1 (en) Synaptic time multiplexing
Bak et al. Adaptive learning by extremal dynamics and negative feedback
Taiji et al. Dynamics of internal models in game players
Sher Handbook of neuroevolution through Erlang
KR20160138002A (ko) 스파이킹 dbn (deep belief network) 에서의 트레이닝, 인식, 및 생성
TW201535277A (zh) 以陰影網路監視神經網路
Negnevitsky The history of artificial intelligence or from the
Goldstone Becoming cognitive science
Chivers How to train an all-purpose robot: DeepMind is tackling one of the hardest problems for AI
Luger Modern AI and how we got here
Lukyanova et al. Neuronal topology as set of braids: information processing, transformation and dynamics
CN112352248A (zh) 用于利用表征神经网络连接的参数基因组作为构建块来构造具有前馈和反馈路径的神经网络的装置和方法
Ikegami et al. Joint attention and dynamics repertoire in coupled dynamical recognizers
Lisovskaya et al. Processing of Neural System Information with the Use of Artificial Spiking Neural Networks
Durkin History and applications
JPH09185596A (ja) パルス密度型信号処理回路網における結合係数更新方法
Doughan et al. Biomimetic Cells: A New Frontier in Brain Informatics
Teuscher Turing’s connectionism
Cabessa et al. Neural computation with spiking neural networks composed of synfire rings
Piskur et al. Braincrafter: An investigation into human-based neural network engineering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777758

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019509227

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777758

Country of ref document: EP

Kind code of ref document: A1