
logistic function
The function
φ(x) = 1/(1 + exp(–x))
which, when graphed, looks rather like a smoothed version of the step function
step(x) = 0 if x < 0, = 1 if x ≥ 0

[graph of the logistic function]

It is used to transform the total net input of an artificial neuron in some implementations of backprop-trained networks.
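As a sketch of the definitions above, the logistic function and the step function it smooths can be written directly (Python is used here for illustration; the entry itself specifies no language):

```python
import math

def logistic(x):
    """Logistic function: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def step(x):
    """The hard threshold that the logistic function smooths out."""
    return 0 if x < 0 else 1
```

Note that logistic(0) = 0.5, the midpoint of the jump that step makes at x = 0.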

 

A related function, also sometimes used in backprop-trained networks, is 2φ(x)–1, which can also be expressed as tanh(x/2). tanh(x/2) is, of course, a smoothed version of the step function which jumps from –1 to 1 at x = 0, i.e. the function which = –1 if x < 0, and = 1 if x ≥ 0.
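The identity 2φ(x) – 1 = tanh(x/2) can be checked numerically; this sketch assumes nothing beyond the definitions above:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def scaled_logistic(x):
    # Rescales the logistic function's range from (0, 1) to (-1, 1);
    # this should coincide with tanh(x/2) at every x.
    return 2.0 * logistic(x) - 1.0
```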

M

 

machine learning
Machine learning is said to occur in a program that can modify some aspect of itself, often referred to as its state, so that on a subsequent execution with the same input, a different (hopefully better) output is produced. See unsupervised learning and supervised learning, and also function approximation algorithms and symbolic learning algorithms.
momentum in backprop
See article on generalized delta rule.
multilayer perceptron (MLP)
See also feedforward network. Such a neural network differs from earlier perceptron-based models in two respects:
  • most importantly, in its activation function, which transforms the total net input with a sigmoid function rather than simply thresholding it;
  • in its use of a multiple-layer architecture, with hidden units, rather than just a single input layer and a single output layer.
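A minimal sketch of these two points (the weights and biases below are arbitrary illustrations, not learned values):

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each unit forms its total net input (weighted sum of its inputs
    # plus a bias), then transforms it with the sigmoid function
    # rather than simply thresholding it.
    return [logistic(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Untrained example: 2 inputs -> 2 hidden units -> 1 output unit.
hidden = layer([1.0, 0.0], [[0.5, -0.3], [0.8, 0.2]], [0.1, -0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
```

Training such a network, e.g. with backprop, amounts to adjusting the weights and biases; the layered forward pass itself stays as above.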

 

N

 

neural network
An artificial neural network is a collection of simple artificial neurons connected by directed weighted connections. When the system is set running, the activation levels of the input units are clamped to desired values. After this, the activation is propagated, at each time step, along the directed weighted connections to other units. The activations of non-input neurons are computed using each neuron's activation function. The system might either settle into a stable state after a number of time steps, or, in the case of a feedforward network, the activation might flow through to output units.

Learning might or might not occur, depending on the type of neural network and the mode of operation of the network.
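The clamp-then-propagate behaviour described above can be sketched as follows; the three-unit network and its weights are invented purely for illustration:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def propagate(activations, weights, clamped, steps=20):
    """Synchronously update non-clamped units at each time step.

    weights[j] is a list of (source unit, weight) pairs for the directed
    weighted connections into unit j; clamped units keep their initial
    (input) activation levels throughout.
    """
    a = list(activations)
    for _ in range(steps):
        a = [a[j] if j in clamped
             else logistic(sum(w * a[i] for i, w in weights[j]))
             for j in range(len(a))]
    return a

# Unit 0 is a clamped input; units 1 and 2 feed each other, so the
# system settles into a stable state after a number of time steps.
weights = {0: [], 1: [(0, 1.0), (2, 0.5)], 2: [(1, 0.5)]}
state = propagate([1.0, 0.0, 0.0], weights, clamped={0})
```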

neurode
See neuron.
neuron (artificial)
A simple model of a biological neuron, used in neural networks to perform a small part of some overall computational problem. It has inputs from other neurons, each with an associated weight, that is, a number which indicates the degree of importance which this neuron attaches to that input. It also has an activation function and a bias. The bias acts like a threshold in a perceptron.

Also referred to as a neurode, node, or unit.
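The weighted inputs, bias, and activation function described above combine into a single computation; this sketch uses the logistic function as the activation, which is one common choice:

```python
import math

def neuron(inputs, weights, bias):
    # Total net input: the weighted sum of the inputs plus the bias,
    # which is then transformed by the activation function.
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))
```

With this activation, a bias of –t plays the role of a perceptron threshold t: the output crosses 0.5 exactly where the weighted sum of the inputs reaches t.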

node
See neuron for a node in a neural network.
See graph for a node in a graph.
noisy data in machine learning
The term "noise" in this context refers to errors in the training data for machine learning algorithms. If a problem is difficult enough and complicated enough to be worth doing with machine learning techniques, then any reasonable training set is going to be large enough that there are likely to be errors in it. This will of course cause problems for the learning algorithm.

See also decision tree pruning and generalization in backprop.


