Theory

A core objective of a learner is to generalize from its experience. Generalization in this context is the ability of a learning machine to perform accurately on new, unseen examples or tasks after having experienced a learning data set. The training examples come from some generally unknown probability distribution (considered representative of the space of occurrences), and the learner has to build a general model of this space that enables it to produce sufficiently accurate predictions in new cases.
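As a rough empirical illustration of this idea, a model can be fit on one sample of data and then scored on data held out from training; the held-out score serves as a proxy for performance on new cases. The sketch below assumes scikit-learn and a synthetic data set, neither of which is part of the original text:

```python
# Minimal sketch: estimating generalization by evaluating on held-out data.
# Assumes scikit-learn; the data set and model choice are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Samples drawn from one (here, synthetic) distribution stand in for "experience";
# the held-out portion stands in for new, unseen cases.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("training accuracy:", model.score(X_train, y_train))
print("held-out accuracy:", model.score(X_test, y_test))  # proxy for generalization
```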


The computational analysis of machine learning algorithms and their performance is a branch of theoretical computer science known as computational learning theory.

Because training sets are finite and the future is uncertain, learning theory usually does not yield guarantees of the performance of algorithms. Instead, probabilistic bounds on the performance are quite common.
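One standard example of such a probabilistic bound is a Hoeffding-style result for a finite hypothesis class (a common textbook statement, not taken from this text): with m i.i.d. training examples and confidence parameter δ, with probability at least 1 − δ every hypothesis h in the class H satisfies

```latex
% Hoeffding-style uniform bound for a finite hypothesis class \mathcal{H}:
% with probability at least 1 - \delta over m i.i.d. training examples,
\[
  R(h) \;\le\; \hat{R}(h) \;+\; \sqrt{\frac{\ln\lvert\mathcal{H}\rvert + \ln(1/\delta)}{2m}}
  \qquad \text{for all } h \in \mathcal{H},
\]
% where R(h) is the true risk and \hat{R}(h) the empirical (training) risk.
```

The guarantee holds only with high probability rather than with certainty, which is exactly the flavor of result learning theory typically provides.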

The bias–variance decomposition is one way to quantify generalization error.
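For squared-error loss this decomposition takes the standard form below (a textbook identity, not spelled out in the original text), modelling the data as y = f(x) + ε with noise variance σ² and taking expectations over the random training set:

```latex
% Bias-variance decomposition of expected squared error at a point x,
% for a learner \hat{f} trained on a random data set:
\[
  \mathbb{E}\!\left[\bigl(y - \hat{f}(x)\bigr)^{2}\right]
  = \underbrace{\bigl(\mathbb{E}[\hat{f}(x)] - f(x)\bigr)^{2}}_{\text{bias}^{2}}
  + \underbrace{\mathbb{E}\!\left[\bigl(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\bigr)^{2}\right]}_{\text{variance}}
  + \underbrace{\sigma^{2}}_{\text{irreducible error}}
\]
```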

For the best performance in the context of generalization, the complexity of the hypothesis should match the complexity of the function underlying the data. If the hypothesis is less complex than the function, then the model has underfit the data. If the complexity of the model is increased in response, then the training error decreases. But if the hypothesis is too complex, then the model is subject to overfitting and generalization will be poorer.
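A minimal sketch of this trade-off, assuming NumPy and scikit-learn and an invented data-generating function (none of which appear in the original text):

```python
# Illustrative sketch: training vs. held-out error as model complexity grows.
# Assumes numpy and scikit-learn; the target function is made up for the example.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=60)   # true function plus noise
X_train, y_train = X[:40], y[:40]
X_test, y_test = X[40:], y[40:]

for degree in (1, 3, 15):                                # low, matched, excessive complexity
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    # Training error keeps falling as the degree rises; held-out error typically
    # improves and then degrades once the hypothesis is more complex than the target.
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```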

In addition to performance bounds, learning theorists study the time complexity and feasibility of learning. In computational learning theory, a computation is considered feasible if it can be done in polynomial time.

There are two kinds of time complexity results. Positive results show that a certain class of functions can be learned in polynomial time. Negative results show that certain classes cannot be learned in polynomial time.