Showing posts from March, 2019

Artificial neural network

Artificial neural networks (ANNs), or connectionist systems, are computing systems vaguely inspired by the biological neural networks that constitute animal brains. The data structures and functionality of neural nets are designed to simulate associative memory. Neural nets learn by processing examples, each of which contains a known "input" and "result", forming probability-weighted associations between the two, which are stored within the data structure of the net itself. (The "input" here is more accurately called an input set, since it generally consists of multiple independent variables rather than a single value.) Thus, the "learning" of a neural net from a given example is the difference in the state of the net before and after processing the example. After being given a sufficient number of examples, the net becomes capable of predicting results from inputs, using the associations built from the example set. If a feedback loop is provided to the net about the accuracy of its predictions, it can keep refining its associations and improve its accuracy over time.
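As a concrete illustration of this example-driven learning loop, here is a minimal sketch in Python using only NumPy: a tiny feedforward net is repeatedly shown four known input/result pairs (the XOR function), and the prediction error is fed back to refine the weights, which play the role of the stored associations. The layer sizes, learning rate, and task below are illustrative assumptions, not a prescribed design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known inputs (each an "input set" of two variables) and known results: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# The weights and biases are the associations stored within the net itself.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass: process the examples with the current associations.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Feedback loop: the prediction error drives the weight refinement.
    d_out = (y - out) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += lr * h.T @ d_out
    b2 += lr * d_out.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ d_h
    b1 += lr * d_h.sum(axis=0, keepdims=True)

# The net's "learning" is the cumulative change in its state; the
# predictions should now be close to the known results [0, 1, 1, 0].
print(np.round(out, 2))
```

The same loop scales to real networks: the forward pass, the error signal, and the weight update are exactly the pieces that deep learning libraries automate.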

Overfitting and underfitting

Figure: the green line represents an overfitted model and the black line a regularized model. While the green line best follows the training data, it is too dependent on that data and is likely to have a higher error rate on new, unseen data than the black line.

In statistics, overfitting is "the production of an analysis that corresponds too closely or exactly to a particular set of data, and may therefore fail to fit additional data or predict future observations reliably". An overfitted model is a statistical model that contains more parameters than can be justified by the data. The essence of overfitting is to have unknowingly extracted some of the residual variation (i.e. the noise) as if that variation represented underlying model structure. Underfitting occurs when a statistical model cannot adequately capture the underlying structure of the data. An under-fitted model is a model where some parameters or terms that would appear in a correctly specified model are missing.
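The trade-off between the two failure modes can be made concrete in a few lines of Python: polynomials of increasing degree are fitted to noisy samples of a smooth curve, and the error on the training points is compared with the error on fresh test points. The degrees, noise level, and sample sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_curve(x):
    # The underlying structure the model should capture.
    return np.cos(2 * np.pi * x)

# Samples of the structure plus residual variation (noise).
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = true_curve(x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.sort(rng.uniform(0, 1, 20))
y_test = true_curve(x_test) + rng.normal(0, 0.2, x_test.size)

for degree in (1, 4, 15):
    # Higher degree = more parameters; polyfit may warn at high degrees.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")

# Expected pattern: degree 1 underfits (high error on both sets),
# degree 15 overfits (very low training error but worse test error),
# while a moderate degree balances the two.
```

The degree-15 fit is the polynomial analogue of the green line in the figure: it chases the noise in the training points, so its training error collapses while its test error grows.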