From: Seth Johnson
Subject: [DMCA-Activists] Patent on Autonomous Robots based on Abstract Self-Training Neural Networks
Date: Sun, 19 Sep 2004 08:20:20 -0400

> http://www.emediawire.com/releases/2004/9/emw159636.htm


A Fully Autonomous Robot Builds Its Own Brain and Learns from
Scratch 

For the first time in history, a robot has built its own
synthetic central nervous system and then learned not only to
walk but also to autonomously enter and navigate the corridors of
complex buildings.

(PRWEB) September 19, 2004 -- This dramatic experiment was
recently conducted at Imagination Engines, Inc. (IEI) in St.
Louis, Missouri. Company President and CEO Dr. Stephen Thaler
points out that, heretofore, scientists in the field of
artificial intelligence have grossly exaggerated claims that
their robots are autonomous when, in fact, immense scholarly
effort has been poured into writing what he calls “if-then-else”
computer programs. Alternatively, he points out, genetic
programmers have devised schemes wherein neural circuitry evolves
to enable robots to perform moderately challenging tasks.
However, close analysis of the engineering results reveals that
these feats are neither so amazing nor accomplished on convenient
time scales. A task as simple as navigating a basic racetrack
maze typically requires about 48 hours, not to mention the month
invested in writing and perfecting the underlying computer
program!

In stark contrast, Thaler and his assistants simply sit back,
fold their arms and watch neural networks spontaneously connect
themselves in a matter of seconds into the neural circuitry
required for extremely ambitious robotic brains (US Patent
5,852,815, “Neural Network Based Prototyping System and Method”).
The resulting neural network architecture both resembles and
functions like a brain: a collection of individual neural
networks fused into a contemplative system that can form complete
models of its world, consider alternative scenarios, and finally
choose the alternative best suited to a given problem.
…Thaler quickly points out that the neural circuitry developed
through genetic programming is only “reactive.” It is tantamount
to reflex reactions in the brain or spinal cord, wherein a
stimulus simply triggers a response. The self-forming brains of
IEI’s robots are entirely different. Like human brains, they
think, experiment, and automatically perfect their behaviors to
produce downright unexpected results, what can only be called
creativity.

Even more exciting is the methodology used to accelerate the
learning and the bootstrap toward creative behaviors. Rather than
carry robots to various world settings and wait for different
kinds of challenging scenarios to arise in order to enrich their
learning experience, IEI’s robots effectively go to sleep and
enter a virtual dream world wherein they experience myriad
settings and situations against which they may pit their
accumulating knowledge and creativity. When they awake, they may
apply their dream state training to the real world, or use such
experience to devise even more ambitious responses to newly
arising environmental scenarios.

Recently, in a dramatic experiment conducted for the DoD, IEI
scientists and engineers built a complex hexapod robot that
effectively began life as a kind of “cybernetic road kill,”
essentially a heap of tangled legs and electronics that learned
how to walk in a period of only minutes. Continuing its learning
in virtual reality, it self-originated new methodologies for
navigating complex facilities and landscapes, as well as novel
kinds of locomotion wherein it assumed bipedal stances to quickly
evade threats. Awakened from its virtual reality test
environment, it could then carry out similar behaviors in
reality. The military is likewise considering such creative
robots as sensor platforms for force protection and urban warfare
scenarios. Visionary military thinkers see them fulfilling roles
ranging from that of brilliant swarm munitions to the fully
autonomous neural network based cyber-warriors anticipated by
science fiction.

All of this truly revolutionary robotic technology is based upon
IEI’s expansive suite of fundamental neural network patents.

---

> http://www.imagination-engines.com/corporate.htm

Imagination Engines Inc. has a series of related Neural Net
products 

 The Creativity Machine™ (CM, US Patent 5,659,666) - A whole new
paradigm in neural network technology that learns known
principles by example and then ventures off to develop startling
innovations. Here is the broadest suite of patented neural
network optimization techniques in the world. If you produce
neural network products or offer consulting in this area, you
need to be aware of this intellectual property.  

 The Self-Training Artificial Neural Network™ (STANN, US Patent
5,845,271) - Neural nets that require absolutely no training
algorithms. Training is as simple as cutting and pasting the
STANN among its target data. This patent completely obviates the
need for conventional neural network trainers. STANNs are so
automated that they form the backbone of many of the interactive
demos on this site.  

 The DataBot™ (US Patents 5,852,815 and 5,852,816) - A novel
form of neural network that has developed its own forms of
locomotion and reproduction. It may patrol databases,
autonomously learning and extracting every conceivable discovery.
Most importantly, DataBots may autonomously link themselves into
successively larger brain structures that harness the Creativity
Machine Paradigm to generate ideas.  

 Plus Bleeding Edge Data Mining and Knowledge Discovery Tools -
We're sincere in saying that conventional data-mining
methodologies can't hold a candle to our patented technologies.
Besides, why do humans need to savor the "whys and wherefores"
when a machine can simply recommend and then implement the very
best products and services?

. . .

The World's First Company Dedicated to Producing Neural Networks
Capable of Human Level Invention, Discovery, and Artistic
Creativity

In explaining IEI's primary mission we note a very important
characteristic which human minds and neural networks share. Both
dream. That is, cut off from any kind of sensory inputs from the
external world, both systems generate impressions borrowed from
their surroundings, without the relevant features actually being
present. In waking humans this virtual reality is termed
'internal imagery' to distinguish it from observed reality and
the corresponding process of 'perception'. Humans use this
phenomenon to their advantage by modifying, exaggerating, and
combining such internal images to arrive at discoveries,
concepts, and inventions. Likewise, artificial neural networks
may be coaxed to carry out this very same creative process when
subjected to controlled perturbations to their internal
architectures. Thus, isolated from any kind of meaningful input,
a neural network exposed to internal noise may indefinitely
produce a continuous parade of diverse images, concepts, or
general impressions that can be gradually contorted and combined
into new and potentially useful juxtapositional notions. This is
a landmark discovery made at IEI, elaborated into a profound
invention:

Imagine now that we allow one particular neural network to freely
dream in this way, while allowing a waking network to monitor the
dreams of the first, ever alert to the appearance of some useful
or interesting concept. In this way we produce a so-called
'Creativity Machine', a device which autonomously creates useful
information, all in exchange for a constant stream of
unintelligible noise fed into the system. As these systems
inexorably crank out one idea after another, the alert partner
network ultimately captures that long-sought solution or finds
an application for some novel concept or image. Focusing
techniques are employed to deliberately narrow dream content to
a stream of only those concepts most relevant to solving a given
problem.
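The two-network scheme described here (a noise-perturbed dreaming network whose outputs are screened by an alert partner network) can be sketched in miniature. Everything below is an illustrative toy, not IEI's implementation: the "network" is just a two-weight sum, and the monitor is reduced to a simple usefulness test with an assumed target.

```python
import random

random.seed(0)  # deterministic toy run

# "Dreaming" network: a trained weight vector whose controlled perturbation
# emits a stream of candidate outputs (here each candidate is one number).
trained_weights = [0.5, 0.5]

def dream(weights, noise=0.3):
    """Perturb the trained weights and return the resulting network output."""
    perturbed = [w + random.uniform(-noise, noise) for w in weights]
    return perturbed[0] + perturbed[1]  # toy network: output = sum of weights

def monitor(candidate, target=1.4, tol=0.05):
    """Alert partner network, reduced here to a test for 'useful' outputs."""
    return abs(candidate - target) < tol

# Crank out ideas until the monitor captures one it deems useful.
useful = None
for _ in range(10000):
    candidate = dream(trained_weights)
    if monitor(candidate):
        useful = candidate
        break

print(useful is not None)  # True: a "useful" concept emerged from noise
```

The noise amplitude plays the role of the controlled perturbations mentioned above: too little and the dreams never leave the trained domain, too much and they degenerate into outputs the monitor never accepts.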

We anticipate that the Creativity Machine will be a powerful new
paradigm in both software and hardware design, allowing computers
to tackle problems involving not only exceedingly complex
systems but also issues of aesthetics and emotion. As an added
asset, the Creativity Machine is capable of displaying all the
characteristics of free will and the accompanying initiative to
take new directions in the course of discovery. As a result, we
are now approaching a major crossroads in technology where
machines will use these principles to attain a potential
independence and autonomy never before seen. At last, mankind
will have that long-sought conversational partner that can not
only converse verbally but also generate whole, vivid alternative
realities, fulfilling every purpose imaginable.

---

> http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=/netahtml/srchnum.htm&r=1&f=G&l=50&s1=5,845,271.WKU.&OS=PN/5,845,271&RS=PN/5,845,271

Non-algorithmically implemented artificial neural networks and
components thereof 

Abstract

Constructing and simulating artificial neural networks and
components thereof within a spreadsheet environment results in
user friendly neural networks which do not require algorithmic
based software in order to train or operate. Such neural networks
can be easily cascaded to form complex neural networks and neural
network systems, including neural networks capable of
self-organizing so as to self-train within a spreadsheet, neural
networks which train simultaneously within a spreadsheet, and
neural networks capable of autonomously moving, monitoring,
analyzing, and altering data within a spreadsheet. Neural
networks can also be cascaded together in self training neural
network form to achieve a device prototyping system. 
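The abstract's central idea, neurons built from spreadsheet cells whose operation is driven purely by recalculation rather than by algorithmic training code, can be imitated with a minimal formula-driven grid. The cell layout, names, and sigmoid activation below are illustrative assumptions, not the patent's:

```python
import math

# A minimal "spreadsheet": each cell holds either a number or a formula
# (a function of the sheet), standing in for spreadsheet cell formulas.
sheet = {
    "A1": 1.0, "A2": 0.5,      # imaging cells holding the training inputs
    "B1": 0.8, "B2": -0.4,     # numeric weight cells of the neuron
    # Activation cell: relatively references the weight and input cells.
    "C1": lambda s: 1.0 / (1.0 + math.exp(-(s["A1"] * s["B1"] + s["A2"] * s["B2"]))),
}

def calculate(s):
    """One spreadsheet 'calculate' pass: evaluate every formula cell."""
    return {name: (cell(s) if callable(cell) else cell) for name, cell in s.items()}

values = calculate(sheet)
# Activation level = sigmoid(1.0*0.8 + 0.5*(-0.4)) = sigmoid(0.6)
print(round(values["C1"], 4))  # 0.6457
```

A full sheet would cascade many such activation cells through relative references, so that each "calculate" pass propagates the inputs through the whole network at once.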

. . .

Claims

What is claimed is: 

1. A computer based neural network training system, comprising: 

a computer including a spreadsheet application program operable
therewith for electronically generating a spreadsheet including a
plurality of spreadsheet cells arranged in a column and row
format such that each spreadsheet cell is identifiable by a
column and row designation, said computer and spreadsheet
application program operable to enable interrelating of said
plurality of spreadsheet cells through relative cell referencing; 

a first functional neural network constructed within said
spreadsheet and including a plurality of imaging cells for
relatively referencing a set of training inputs to said first
neural network, said first neural network further including at
least one hidden layer including a first plurality of neurons and
an output layer including a second plurality of neurons, wherein
each neuron of said hidden layer and said output layer is formed
by a first plurality of cells each containing a numeric weight
value of said neuron and an activation cell containing an
activation function which activation function relatively
references each of said first plurality of cells such that when a
calculate function of said spreadsheet is performed a numeric
value which is representative of an activation level of said
neuron is determined, said hidden layer and output layer neurons
interrelated through relative cell referencing to form said first
neural network; 

a training network constructed within said spreadsheet, said
training network including a second functional neural network
constructed within said spreadsheet and having substantially the
same configuration as the first neural network; and 

wherein, when a calculate function of said spreadsheet is
performed, a given set of training inputs is applied to said
first neural network and each training input of the given set of
training inputs is adjusted by a predetermined incremental amount
before being applied to said second neural network. 

2. A computer based neural network training system in accordance
with claim 1 wherein at least a portion of said training network
is integrated with said first neural network within said
spreadsheet. 

3. A computer based neural network training system in accordance
with claim 1 wherein said training network further includes a
derivative module constructed within said spreadsheet such that
when the calculate function of said spreadsheet is performed said
derivative module is operable to determine, for each of said
hidden layer neurons and each of said output layer neurons, a
partial derivative of the activation level thereof with respect
to a net input thereto based at least in part on a difference in
the activation levels of corresponding activation cells of said
first neural network and said second neural network. 

4. A computer based neural network training system in accordance
with claim 3 wherein said training network further includes an
error module constructed within said spreadsheet such that when
the calculate function of the spreadsheet is performed said
error module is operable to determine an error vector associated
with said given set of training inputs applied to said first
neural network. 

5. A computer based neural network training system in accordance
with claim 4 further comprising a program associated with said
training network and said first neural network, at least a
portion of said program operable to effect alteration of said
numeric weight values of said first neural network based upon
weight update terms calculated by said training network. 

6. A computer based neural network training system in accordance
with claim 5 wherein sets of training inputs are stored as
numeric values associated with cells of said spreadsheet and at
least a portion of said program is operable to effect movement of
both said first neural network and said training network to a new
location within said spreadsheet such that for a given movement
of said neural network and said training network to a given new
location a calculate function of said spreadsheet is performed
and at least some of said numeric weight values of each neuron of
said first neural network are altered to incorporate a knowledge
domain represented by a given set of training inputs associated
with said given new location within said spreadsheet. 

7. A computer based neural network training system in accordance
with claim 1, further comprising means for providing a dynamic
data exchange between said spreadsheet and an external system so
that sets of training inputs are input into predetermined cells
within said spreadsheet, and, as said sets of training inputs
flow through said spreadsheet a calculate function of said
spreadsheet is repeatedly performed. 

8. A computer based neural network training system in accordance
with claim 1, further comprising means for dynamically pruning at
least one of said hidden layer neurons from said first neural
network in an automatic manner during training. 

9. A computer based neural network training system in accordance
with claim 8 wherein said means for dynamically pruning at least
one hidden layer neuron from said first neural network includes a
program associated with said first neural network, said program
effecting determination of whether said at least one hidden layer
neuron is significantly involved in training, and, if said at
least one hidden layer neuron is not significantly involved in
training, to set the activation function associated with said at
least one hidden layer neuron to zero (0). 
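Claim 9's pruning rule, zeroing the activation function of any hidden neuron not significantly involved in training, reduces to a simple gate. The significance measure (maximum outgoing weight magnitude) and the 0.01 threshold below are assumed for illustration; the patent leaves the test to the associated program:

```python
# Outgoing weights of each hidden neuron toward the output layer.
hidden_out_weights = {
    "H1": [0.9, -1.2],      # contributes strongly: kept
    "H2": [0.004, -0.002],  # negligible contribution: a pruning candidate
}

THRESHOLD = 0.01  # assumed significance threshold

# Whether each neuron's activation function is still live.
activation_enabled = {name: True for name in hidden_out_weights}

def prune(weights, enabled, threshold=THRESHOLD):
    """Disable neurons not significantly involved in training."""
    for name, ws in weights.items():
        if max(abs(wv) for wv in ws) < threshold:
            enabled[name] = False  # activation function set to zero (0)
    return enabled

prune(hidden_out_weights, activation_enabled)
print(activation_enabled)  # {'H1': True, 'H2': False}
```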

10. A computer based neural network training system in accordance
with claim 1, further comprising means for adding a new hidden
layer neuron to said first neural network in an automatic manner
during training. 

11. A computer based neural network training system in accordance
with claim 10 wherein said means for adding a new hidden layer
neuron to said first neural network includes a program associated
with said first neural network, said program effecting
determination of whether an error value associated therewith
exceeds a predetermined threshold. 

12. A computer based neural network training system in accordance
with claim 11 wherein said program further effects, at
predetermined intervals during a training operation, addition of
a new hidden layer neuron to said first neural network if said
error value exceeds said predetermined threshold. 

13. A self training neural network object implemented utilizing a
computer including processing means operable to run a spreadsheet
application, comprising: 

a first functional neural network constructed in a spreadsheet of
the spreadsheet application, said first neural network including
a plurality of neurons each formed of a plurality of spreadsheet
cells including a first plurality of cells each with an
associated numeric weighting value of such neuron entered therein
and an activation cell having an activation function of such
neuron entered therein which activation function makes relative
reference to each of said first plurality of cells, wherein said
neurons are interrelated through relative cell referencing to
form said first neural network; 

a training network constructed in the spreadsheet, said training
network including a second functional neural network having the
same configuration as said first neural network, said training
network further including at least one other module constructed
within the spreadsheet for calculating weight update terms, 

a program associated with said training network and said first
neural network, said training network operable in conjunction
with said program during a training operation to alter said
numeric weighting value associated with at least some of said
first plurality of cells of each neuron of said neural network
based upon the weight update terms calculated by said training
network, 

wherein a given set of training inputs is applied to said self
training neural network object by initiating a calculate function
of said spreadsheet and said numeric weighting value associated
with at least some of said plurality of cells of each neuron is
altered to incorporate into said neural network a knowledge
domain represented by said given set of applied training inputs. 

14. A self training neural network object in accordance with
claim 13 wherein, for said given set of applied training inputs
said program is operable to effect addition of one of said weight
update terms to said numeric weighting value associated with each
cell of said first plurality of cells of each neuron of said
first neural network. 

15. A self training neural network object in accordance with
claim 13 wherein, for said given set of applied training inputs
said program is operable to effect replacement of said numeric
weighting value associated with each cell of said first plurality
of cells of each neuron of said first neural network with one of
said calculated weight update terms. 

16. A method of training a neural network, utilizing a computer
including a processing means and an associated spreadsheet
application operable therewith, said method comprising the steps
of: 

(a) constructing a first neural network to be trained within a
spreadsheet of the spreadsheet application by interrelating cells
of the spreadsheet through relative cell referencing, wherein
each hidden layer neuron and each output layer neuron of the
constructed first neural network is formed by a plurality of
cells each having a respective weight value of such neuron
associated therewith and an activation cell containing an
activation function of such neuron, such that for a given
calculate operation of the spreadsheet the first neural network
functions to produce outputs in accordance with its then current
structure; 

(b) constructing a training network within the spreadsheet of the
spreadsheet application, the training network including a second
neural network constructed within the spreadsheet and having the
same configuration as the first neural network, the training
network further including a plurality of interrelated cells
containing equations for calculating weight update terms for the
first neural network being trained, such that for a given
calculate operation of the spreadsheet during a training
operation the training network functions to produce such weight
update terms; 

(c) applying a set of training inputs to the first neural network
being trained, 

(d) adjusting each training input of the plurality of training
inputs by an incremental amount and applying each of the adjusted
training inputs to the second neural network; 

(e) establishing weight update terms within the training network
based at least in part upon a difference in activation levels
between corresponding activation cells of the first and second
neural networks; 

(f) altering the weight values associated with each neuron of the
first neural network being trained based upon the weight update
terms established by the training network to reflect a knowledge
domain represented by the set of training inputs. 
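Steps (a) through (f) amount to a delta-rule update in which the derivative of activation with respect to net input is estimated numerically, by comparing the first network against a duplicate fed incrementally adjusted inputs, rather than computed analytically. A single-neuron sketch, with assumed values for the increment, learning rate, and training pair:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Step (a): a one-neuron "first network" with weight cells w.
w = [0.3, 0.2]
x = [1.0, 0.5]   # set of training inputs
t = 0.8          # corresponding training output
eps = 1e-4       # the "predetermined incremental amount" (assumed)
lr = 0.5         # learning rate (assumed)

for _ in range(2000):
    # Step (c): apply the training inputs to the first network.
    net1 = sum(wi * xi for wi, xi in zip(w, x))
    y1 = sigmoid(net1)

    # Step (d): the duplicate "training" network shares the weights but
    # sees each input adjusted by the incremental amount.
    net2 = sum(wi * (xi + eps) for wi, xi in zip(w, x))
    y2 = sigmoid(net2)

    # Step (e): the difference in activation levels between corresponding
    # activation cells yields a numeric estimate of d(activation)/d(net).
    dy_dnet = (y2 - y1) / (net2 - net1)

    # Step (f): delta-rule weight update built from that estimate.
    err = t - y1
    w = [wi + lr * err * dy_dnet * xi for wi, xi in zip(w, x)]

final = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
print(round(final, 2))  # 0.8: the network has absorbed the training pair
```

Because only finite differences are needed, every quantity can live in an ordinary spreadsheet cell and each training pass is just one recalculation, which is the point of the claim.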

17. A method of training a neural network in accordance with
claim 16 wherein step (f) includes adding each weight update term
to one of the weight values of the neural network being trained. 

18. A method of training a neural network in accordance with
claim 16 wherein step (f) includes replacing each weight value of
the neural network being trained with one of the weight update
terms. 

19. A method of training a neural network in accordance with
claim 16 wherein the training network includes a derivative
module constructed in the spreadsheet and in step (e) the
derivative module calculates, for each of the activation cells, a
derivative of activation level with respect to net input. 

20. A method of training a neural network according to claim 16
wherein the training network includes an error module constructed
within the spreadsheet and in step (e) the error module
calculates an error representative of a difference between a set
of outputs produced by the first neural network being trained and
a set of training outputs corresponding to the set of training
inputs applied thereto. 

21. A method of training a neural network in accordance with
claim 20, further comprising the step of: 

(g) repeating steps (c), (d), (e) and (f) until said error falls
below a predetermined value. 

22. A method of training a neural network in accordance with
claim 16 wherein step (c) includes providing relative movement
within the spreadsheet between the first neural network and a
plurality of sets of training data located within the
spreadsheet. 

23. A method of training a neural network in accordance with
claim 16 further comprising the step of scanning the spreadsheet
for a set of training data prior to initiating the calculate
function. 

24. A method of simultaneously training at least two neural
networks, utilizing a computer including processing means and an
associated spreadsheet application operable therewith, said
method comprising the steps of: 

(a) constructing a first functional neural network to be trained
within a spreadsheet produced by the spreadsheet application by
interrelating cells of the spreadsheet through relative cell
referencing, wherein each hidden layer neuron and each output
layer neuron of the first neural network is formed by a plurality
of cells each having a respective weight value of such neuron
associated therewith and an activation cell containing an
activation function of such neuron, 

(b) constructing a first training network within the spreadsheet
of the spreadsheet application for use in training the first
neural network, the first training network including a plurality
of interrelated cells containing equations for calculating weight
update terms for the first neural network, 

(c) constructing a second functional neural network to be trained
within the spreadsheet produced by the spreadsheet application by
interrelating cells of the spreadsheet through relative cell
referencing, wherein each hidden layer neuron and each output
layer neuron of the second neural network is formed by a plurality
of cells each having a respective weight value of such neuron
associated therewith and an activation cell containing an
activation function of such neuron, 

(d) constructing a second training network within the spreadsheet
of the spreadsheet application for use in training the second
neural network, the second training network including a plurality
of interrelated cells containing equations for calculating weight
update terms for the second neural network, 

(e) simultaneously applying training data located within the
spreadsheet to both the first neural network and the second
neural network by initiating a calculate function of the
spreadsheet, 

(f) altering at least a portion of the first neural network in
accordance with weight update terms produced by the first
training network, and 

(g) altering at least a portion of the second neural network in
accordance with weight update terms produced by the second
training network. 

25. A method of simultaneously training at least two neural
networks in accordance with claim 24 wherein step (e) includes
applying a first set of training data to the first neural network
and simultaneously applying a second set of training data to the
second neural network, said first set of training data and said
second set of training data having at least one variable in
common. 

26. A method of simultaneously training at least two neural
networks in accordance with claim 24 wherein step (e) includes
applying a first set of training data to the first neural network
and simultaneously applying a second set of training data to the
second neural network, said first set of training data and said
second set of training data made up of distinct variables. 

27. A computer based neural network training system, comprising: 

processing means operable to electronically generate a data space
including a plurality of cells; 

means associated with said data space and said processing means
for maintaining a numeric value associated with each cell, 

means associated with said data space and said processing means
for interrelating said cells through relative cell referencing, 

a neural network constructed within said data space, said neural
network including a plurality of imaging cells for relatively
referencing a plurality of training inputs to said neural
network, at least one hidden layer including a plurality of
neurons, and an output layer including a plurality of neurons,
each neuron of said hidden layer and said output layer formed by
a plurality of cells including a first plurality of cells each
for containing a numeric weight value of said neuron and an
activation cell containing an activation function which makes
relative reference to each of said first plurality of cells to
establish a numeric value which is dependent upon said numeric
weight values and is representative of an activation level of
said neuron, 

means associated with said neural network for altering said
numeric weight values of said neurons during training of said
neural network, 

whereby, for a given set of training inputs and corresponding
training outputs on which said neural network is being trained,
at least some of said numeric weight values of each neuron are
altered to incorporate into said neural network a knowledge
domain represented by said given set; and 

a data filtering neural network including an autoassociative
neural network constructed in said data space, said
autoassociative neural network having been trained on a plurality
of control sets of inputs thereto, whereby, for a given set of
inputs within a knowledge domain represented by said plurality of
control sets of inputs, said autoassociative neural network is
operable to map said given set of inputs to themselves. 

28. A computer based neural network training system in accordance
with claim 27, further comprising a program associated with said
data filtering neural network, said neural network and said
training network, at least a portion of said data filtering
neural network operable to determine an error between a given set
of inputs and a resulting set of outputs of said autoassociative
neural network, at least a portion of said program operable to
determine if said error exceeds a predetermined value, and, only
if said error exceeds said predetermined value, to alter at least
some of said numeric weight values of each neuron of said neural
network, so that said neural network is trained on only novel
sets of training inputs and corresponding training outputs. 

29. A self training neural network object implemented utilizing a
computer including processing means operable to run a spreadsheet
application, comprising: 

a neural network constructed in a spreadsheet of the spreadsheet
application, said neural network including a plurality of neurons
each formed of a plurality of cells including a first plurality
of cells each with an associated numeric weighting value entered
therein and an activation cell having a function entered therein
which makes relative reference to each of said first plurality of
cells, 

a training network constructed in the spreadsheet, 

a program associated with said training network and said neural
network, said training network operable in conjunction with said
program during a training operation to alter said numeric
weighting value associated with at least some of said first
plurality of cells of each neuron of said neural network, 

whereby, for a given set of training inputs and corresponding
training outputs applied to said self training neural network
object, said numeric weighting value associated with at least
some of said plurality of cells of each neuron is alterable to
incorporate into said neural network a knowledge domain
represented by said given set of applied training inputs and
corresponding training outputs; and 

an autoassociative neural network constructed in said
spreadsheet, a plurality of the variables making up said given
set of training inputs and corresponding training outputs being
applied as inputs to said autoassociative neural network, said
autoassociative neural network operable during training to
determine, for a given set of inputs thereto, an error value,
said error value representing a difference between said given set
of inputs thereto and a resulting set of outputs therefrom,
wherein said program is operable to effect determination of
whether said error exceeds a predetermined value and, if said
error is less than said predetermined value, to prevent
alteration of said numeric weighting value associated with each
cell of said plurality of cells of each neuron of said neural
network. 
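The autoassociative filter of claims 27 through 29 can be sketched with the simplest possible autoassociator: a linear layer whose weights are the Hebbian outer product of one stored "control" pattern, an assumed stand-in for a fully trained network. Inputs inside the knowledge domain map to themselves; novel inputs reconstruct poorly, and that large error is what allows training to proceed:

```python
# Stored "control" pattern; the autoassociator's weight matrix is its
# normalized outer product, W[i][j] = p[i]*p[j] / ||p||^2.
control = [1.0, 0.0, 1.0, 0.0]
norm2 = sum(c * c for c in control)
W = [[ci * cj / norm2 for cj in control] for ci in control]

def reconstruct(x):
    """Pass x through the linear autoassociative network."""
    return [sum(W[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]

def error(x):
    """Reconstruction error: small means x lies inside the knowledge domain."""
    r = reconstruct(x)
    return sum((xi - ri) ** 2 for xi, ri in zip(x, r)) ** 0.5

threshold = 0.5  # the claims' "predetermined value" (assumed)

familiar = [1.0, 0.0, 1.0, 0.0]  # inside the trained domain: maps to itself
novel = [0.0, 1.0, 0.0, 1.0]     # orthogonal to the stored pattern

print(error(familiar) < threshold)  # True: weight alteration is prevented
print(error(novel) > threshold)     # True: novel data, so training proceeds
```

In claim 29's terms, the familiar input's error falls below the predetermined value, so alteration of the weights is prevented; the novel input's error exceeds it, so the main network trains only on genuinely new data.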
--------------------------------------------------------------------------------

Description

--------------------------------------------------------------------------------


FIELD OF THE INVENTION 

This invention relates generally to artificial neural networks
and more particularly, to artificial neural networks implemented
in a non-algorithmic fashion in a data space, such as a
spreadsheet, so as to facilitate cascading of such artificial
neural networks and so as to facilitate artificial neural
networks capable of operating within the data space, including
networks which move through the data space and self-train on data
therewithin. 

BACKGROUND OF THE INVENTION 

This application is related to applicant's co-pending application
Ser. No. 08/323,238 filed Oct. 13, 1994, entitled Device For The
Autonomous Generation Of Useful Information, in which the
"creativity machine" paradigm was introduced. The creativity
machine paradigm involves progressively perturbing a first neural
network having a predetermined knowledge domain such that the
perturbed network continuously outputs a stream of concepts, and
monitoring the outputs or stream of concepts with a second neural
network which is trained to identify only useful concepts. The
perturbations may be achieved by different means, including the
introduction of noise to the network, or degradation of the
network. Importantly, the present application provides an
excellent system for constructing such creativity machines, and
further builds upon the creativity machine invention to achieve
self training neural networks. 
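The two-network arrangement described above can be sketched in a few lines: a perturbed "imagination" network emits a stream of candidate outputs, and a second network admits only those it judges useful. Here both networks are toy stand-ins (a noisy fixed mapping and a simple range test) chosen purely to illustrate the generate-and-filter structure; they are not taken from the patent.

```python
import random

def imagination(x, noise):
    """First network: a fixed mapping whose weights are perturbed by noise."""
    w = [0.8 + random.uniform(-noise, noise),
         0.2 + random.uniform(-noise, noise)]
    return w[0] * x[0] + w[1] * x[1]

def alert(concept):
    """Second network: passes only concepts it deems useful (here, near 1.0)."""
    return 0.9 <= concept <= 1.1

random.seed(0)
# Perturbation makes the first network emit a stream of varied concepts;
# the second network filters that stream down to the useful ones.
stream = [imagination([1.0, 1.0], noise=0.3) for _ in range(1000)]
useful = [c for c in stream if alert(c)]
print(len(useful), "useful concepts out of", len(stream))
```

Raising the noise level widens the stream of concepts while the filter keeps the yield of useful ones bounded, which mirrors the "progressive perturbation" idea.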

The current explosion of information has made it necessary to
develop new techniques for handling and analyzing such
information. In this regard, it would be helpful to be able to
effectively discover regularities and trends within data and to
be able to effectively sort and/or organize data. Currently,
various algorithmic techniques and systems may be utilized to
analyze data; however, such techniques and systems generally lack
the creativity needed to organize the data and to exhaust data
sets of all potential discoveries. The use of neural networks for
such tasks would be advantageous.

Further, the advantages of new artificial neural networks (ANNs)
are ever increasing. Currently, such artificial neural networks
are often trained and implemented algorithmically. These
techniques require the skills of a neural network specialist who
may spend many hours developing the training and/or
implementation software for such algorithms. Further, when using
algorithms to train artificial neural networks, once new training
data is obtained, the new training data must be manually appended
to the preexisting set of training data and network training must
be reinitiated, requiring additional man hours.
Disadvantageously, if the newly acquired training data does not
fit the pattern of preexisting training data, the generalization
capacity of the network may be lowered. 

An additional drawback to traditional algorithm implemented
training and operation of artificial neural networks is that
within such schemes, individual activation levels are only
momentarily visible and accessible, as when the governing
algorithm evaluates the sigmoidal excitation of any given node or
neuron. Except for this fleeting appearance during program
execution, a neuron's excitation, or activation level, is quickly
obscured by redistribution among downstream processing elements. 
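The spreadsheet implementation avoids this drawback because every activation occupies its own cell and persists after evaluation. A minimal sketch, assuming a dictionary of named cells standing in for the spreadsheet grid (the cell names and the 2-2-1 topology are invented for illustration):

```python
import math

def sigmoid(z):
    """Sigmoidal excitation of a node, as referenced in the text above."""
    return 1.0 / (1.0 + math.exp(-z))

cells = {"A1": 0.5, "A2": -0.3}          # input cells
weights = {("A1", "B1"): 1.0, ("A2", "B1"): -2.0,
           ("A1", "B2"): 0.5, ("A2", "B2"): 0.5,
           ("B1", "C1"): 1.5, ("B2", "C1"): -1.0}

# Evaluate hidden then output cells; each result remains in the grid.
for target in ("B1", "B2", "C1"):
    net = sum(cells[src] * w
              for (src, dst), w in weights.items() if dst == target)
    cells[target] = sigmoid(net)

# Unlike a transient algorithmic pass, every activation is still inspectable:
print({name: round(v, 3) for name, v in cells.items()})
```

Because the activations are ordinary cell values rather than temporaries inside a loop, they remain visible for monitoring, cascading, or gating by other networks in the same data space.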

Accordingly, it is desirable and advantageous to provide a
simpler method of training, implementing, and simulating
artificial neural networks. It is further desirable to provide
artificial neural networks which can be easily cascaded together
to facilitate the construction of more complex artificial neural
network systems. It also is desirable and advantageous to provide
neural networks which can be configured to perform a variety of
tasks, including self training artificial neural networks, as
well as networks capable of analyzing, sorting, and organizing
data. 

A principal object of the present invention is to provide a user
friendly system of implementing or simulating neural networks in
which movement of such networks and cascading of such networks is
facilitated. 

Another object of the present invention is to provide self
training artificial neural networks. 

A further object of the present invention is to provide
artificial neural networks capable of analyzing data within a
data space. 

Yet another object of the present invention is to provide
artificial neural networks which are mobile within a data space. 

Still another object of the present invention is to provide
artificial neural networks which can be easily duplicated within
a data space and which can be easily interconnected to facilitate
the construction of more complex artificial neural network
systems. 

---

> http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=/netahtml/srchnum.htm&r=1&f=G&l=50&s1=5,852,815.WKU.&OS=PN/5,852,815&RS=PN/5,852,815

Neural network based prototyping system and method 

Abstract

Constructing and simulating artificial neural networks and
components thereof within a spreadsheet environment results in
user friendly neural networks which do not require algorithmic
based software in order to train or operate. Such neural networks
can be easily cascaded to form complex neural networks and neural
network systems, including neural networks capable of
self-organizing so as to self-train within a spreadsheet, neural
networks which train simultaneously within a spreadsheet, and
neural networks capable of autonomously moving, monitoring,
analyzing, and altering data within a spreadsheet. Neural
networks can also be cascaded together in self training neural
network form to achieve a device prototyping system.




