In machine learning, instance-based learning (sometimes called memory-based learning[1]) is a family of learning algorithms that, instead of performing explicit generalization, compare new problem instances with instances seen in training, which have been stored in memory. Because computation is postponed until a new instance is observed, these algorithms are sometimes referred to as "lazy."[2]
It is called instance-based because it constructs hypotheses directly from the training instances themselves.[3] This means that the hypothesis complexity can grow with the data:[3] in the worst case, a hypothesis is a list of n training items and the computational complexity of classifying a single new instance is O(n). One advantage that instance-based learning has over other methods of machine learning is its ability to adapt its model to previously unseen data. Instance-based learners may simply store a new instance or throw an old instance away.
Examples of instance-based learning algorithms are the k-nearest neighbors algorithm, kernel machines and RBF networks.[2]: ch. 8 These store (a subset of) their training set; when predicting a value/class for a new instance, they compute distances or similarities between this instance and the training instances to make a decision.
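The prediction step described above can be sketched as a minimal k-nearest-neighbors classifier. This is an illustrative sketch, not a reference implementation: the function name `knn_predict`, the Euclidean distance metric, and the toy data are all assumptions made for the example.

```python
from collections import Counter
import math

def knn_predict(training, query, k=3):
    """Classify `query` by majority vote among its k nearest stored
    training instances, using Euclidean distance.
    `training` is a list of (feature_vector, label) pairs."""
    # "Lazy" learning: no model is built in advance; all computation
    # happens here, at prediction time, against the stored instances.
    ranked = sorted(training, key=lambda ex: math.dist(ex[0], query))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy data: two clusters on a line.
data = [((0.0,), "a"), ((0.1,), "a"), ((1.0,), "b"), ((1.1,), "b")]
knn_predict(data, (0.05,), k=3)  # -> "a"
```

Note that the classifier scans every stored instance for each query, which is the O(n) per-prediction cost mentioned above.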
To combat the memory cost of storing all training instances, as well as the risk of overfitting to noise in the training set, instance reduction algorithms have been proposed.[4]
See also

Analogical modeling

References

1. Walter Daelemans; Antal van den Bosch (2005). Memory-Based Language Processing. Cambridge University Press.
2. Tom Mitchell (1997). Machine Learning. McGraw-Hill.
3. Stuart Russell; Peter Norvig (2003). Artificial Intelligence: A Modern Approach (2nd ed.), p. 733. Prentice Hall. ISBN 0-13-080302-2.
4. D. Randall Wilson; Tony R. Martinez (2000). "Reduction techniques for instance-based learning algorithms". Machine Learning.