A fourth issue is the degree of noise in the desired output values (the supervisory targets).
If the desired output values are often incorrect (because of human error or sensor errors), then the learning algorithm should not attempt to find a function that exactly matches the training examples.
Attempting to fit the data too carefully leads to overfitting.
You can overfit even when there are no measurement errors (stochastic noise) if the function you are trying to learn is too complex for your learning model.
In such a situation, the part of the target function that cannot be modeled "corrupts" your training data; this phenomenon has been called deterministic noise.
When either type of noise is present, it is better to go with a higher-bias, lower-variance estimator.
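This tradeoff can be illustrated with a short, self-contained numpy sketch (not part of the article; the sin(2*pi*x) target, the noise level, and the ridge penalty are illustrative assumptions): an unregularized high-degree polynomial chases the noisy training labels, while a ridge-penalized fit of the same degree, a higher-bias, lower-variance estimator, typically achieves lower error on noise-free test points.

import numpy as np

rng = np.random.default_rng(0)

# Assumed true target function; the training labels carry stochastic noise.
def target(x):
    return np.sin(2 * np.pi * x)

x_train = np.sort(rng.uniform(0.0, 1.0, 20))
y_train = target(x_train) + rng.normal(scale=0.3, size=x_train.shape)  # noisy supervisory targets

x_test = np.linspace(0.0, 1.0, 200)
y_test = target(x_test)  # noise-free ground truth, used only for evaluation

def fit_poly(degree, ridge=0.0):
    """Least-squares polynomial fit; ridge > 0 adds an L2 penalty (higher bias, lower variance)."""
    X = np.vander(x_train, degree + 1)
    if ridge > 0.0:
        # Standard trick: augment the design matrix so ordinary least squares solves the ridge problem.
        X = np.vstack([X, np.sqrt(ridge) * np.eye(degree + 1)])
        y = np.concatenate([y_train, np.zeros(degree + 1)])
    else:
        y = y_train
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.polyval(coeffs, x_test)

# A flexible, unpenalized fit chases every noisy label: low training error, poor generalization.
overfit = fit_poly(degree=15)
# The same model with a ridge penalty accepts some bias and typically generalizes better here.
regularized = fit_poly(degree=15, ridge=1e-2)

for name, pred in [("unregularized", overfit), ("ridge-penalized", regularized)]:
    print(f"{name:>15}: test MSE = {np.mean((pred - y_test) ** 2):.4f}")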