
O

 

observation language
Term used in analysing machine learning methods. The observation language refers to the notation used by the learning method to represent the data it uses for training. For example, in ID3, the observation language would be the notation used to represent the training instances, including attributes and their allowable values, and the way instances are described using attributes. In backprop, the observation language would be the notation used to represent the training patterns. In Aq, the observation language would again be the notation used to represent the instances, much as in ID3.

See also hypothesis language.

output unit
An output unit in a neural network is a neuron with no output connections of its own. Its activation thus serves as one of the output values of the neural net.
over-fitting
See the article on generalization in backprop.

 

P

 

perceptron
A perceptron is a simple artificial neuron whose activation function consists of taking the total net input and outputting 1 if this is above a threshold T, and 0 otherwise.

Perceptrons were originally used as pattern classifiers, where the term pattern is here used not in the sense of training pattern, but just in the sense of an input pattern that is to be put into one of several classes. Perceptual pattern classifiers of this sort (not based on perceptrons!) occur in simple animal visual systems, which can distinguish between prey, predators, and neutral environmental objects.

See also perceptron learning, and the XOR problem.

perceptron learning
The perceptron learning algorithm:
  1. All weights are initially set to zero.
  2. For each training example:
    • if the perceptron outputs 0 when it should output 1, then add the input vector to the weight vector.
    • if the perceptron outputs 1 when it should output 0, then subtract the input vector from the weight vector.
  3. Repeat step 2 until the perceptron yields the correct result for each training example.
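The three steps above can be sketched in code. This is my own minimal illustration, not code from the text; it folds the threshold in as a fixed bias input of 1 and demonstrates the algorithm on logical AND, which is linearly separable.

```python
# Perceptron learning sketch (illustration only). The bias is folded in as a
# fixed extra input of 1, so the threshold test becomes "net input > 0".

def predict(weights, x):
    # Output 1 if the total net input exceeds the threshold, else 0.
    return 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0

def train(examples, n_inputs, max_epochs=100):
    # Step 1: all weights start at zero (one extra weight for the bias input).
    weights = [0.0] * (n_inputs + 1)
    for _ in range(max_epochs):
        converged = True
        # Step 2: adjust the weights on each misclassified example.
        for x, target in examples:
            x = x + [1]                     # append the fixed bias input
            out = predict(weights, x)
            if out == 0 and target == 1:    # add the input vector to the weights
                weights = [w + xi for w, xi in zip(weights, x)]
                converged = False
            elif out == 1 and target == 0:  # subtract the input vector from the weights
                weights = [w - xi for w, xi in zip(weights, x)]
                converged = False
        # Step 3: stop once every training example is classified correctly.
        if converged:
            return weights
    return weights

# Learn logical AND, which is linearly separable (unlike XOR).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = train(data, 2)
```

By the perceptron convergence theorem, this loop terminates whenever the classes are linearly separable; on XOR it would cycle until max_epochs.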
propositional learning systems
A propositional learning system represents what it learns about the training instances by expressions equivalent to sentences in some form of logic, e.g.
class1 ← size=large and colour in {red, orange}

See Aq and covering algorithm.
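The example rule above can be read as a predicate over attribute-value instances. A minimal illustration of this (my own, with instances assumed to be attribute-value dictionaries):

```python
# The rule "class1 <- size=large and colour in {red, orange}" read as a
# predicate over attribute-value instances (illustration only).

def rule_class1(instance):
    return instance["size"] == "large" and instance["colour"] in {"red", "orange"}

rule_class1({"size": "large", "colour": "red"})    # instance covered by the rule
rule_class1({"size": "small", "colour": "red"})    # instance not covered
```

A covering algorithm such as Aq builds up a set of rules of this form until every positive training instance is covered.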
pruning decision trees
The data used to generate a decision tree, using an algorithm like the ID3 tree induction algorithm, can include noise - that is, instances that contain errors, either in the form of a wrong classification, or in the form of a wrong attribute value. There could also be instances whose classification is problematical because the attributes available are not sufficient to discriminate between some cases.

If, for example, a node of the tree contains, say, 99 items in class C1 and 1 in class C2, it is plausible that the 1 item in class C2 is there because of an error either of classification or of feature value. There can thus be an argument for regarding this node as a leaf node of class C1. This is termed pruning the decision tree.

The algorithm given in lectures for deciding when to prune is as follows:
At a branch node that is a candidate for pruning:

  1. Approximate the expected error for the node.
  2. Approximate the backed-up error from the children, assuming that we do not prune.
  3. If the expected error is less than the backed-up error, then prune.
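The rule above can be sketched in code. The glossary does not give a formula for the error estimates, so this sketch assumes the Laplace error estimate, one common choice: E = (N - n + k - 1) / (N + k), where N is the number of examples at the node, n the count of the majority class, and k the number of classes. The backed-up error weights each child's expected error by the fraction of examples it receives.

```python
# Pruning-decision sketch (illustration only; assumes the Laplace error
# estimate, which the glossary itself does not specify).

def expected_error(counts):
    # Step 1: expected error if the node is made a leaf labelled with its
    # majority class, via the Laplace estimate (N - n + k - 1) / (N + k).
    n_total = sum(counts)
    n_major = max(counts)
    k = len(counts)
    return (n_total - n_major + k - 1) / (n_total + k)

def backed_up_error(children_counts):
    # Step 2: error of keeping the subtree - the children's expected errors,
    # weighted by the fraction of examples each child receives.
    n_total = sum(sum(c) for c in children_counts)
    return sum(sum(c) / n_total * expected_error(c) for c in children_counts)

def should_prune(counts, children_counts):
    # Step 3: prune when a leaf would be expected to do no worse.
    return expected_error(counts) < backed_up_error(children_counts)

# The (99, 1) node from above: prune if the split barely separates the classes,
# keep the subtree if the split is clean.
should_prune([99, 1], [[49, 1], [50, 0]])   # noisy split: pruning favoured
should_prune([99, 1], [[99, 0], [0, 1]])    # clean split: subtree kept
```

Note that the clean split is kept even though one child holds only the single C2 item, because its small weight barely raises the backed-up error.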

 

Q

 

R

 

recurrent network
A recurrent network is a neural network in which there is at least one cycle of activation flow. To put it another way, underlying any neural network there is a directed graph, obtained by regarding the neurons as nodes in the graph and the weights as directed edges in the graph. If this graph is not acyclic, the network is recurrent.

A recurrent connection is one that is part of a directed cycle, although the term is sometimes reserved for a connection that is clearly going in the "wrong" direction in an otherwise feedforward network.

Recurrent networks include fully recurrent networks in which each neuron is connected to every other neuron, and partly recurrent networks in which greater or lesser numbers of recurrent connections exist. See also simple recurrent network.

This article is included for general interest - recurrent networks are not part of the syllabus of COMP9414 Artificial Intelligence.


