TY - CHAP
T1 - Bayesian Learning
AU - Vanneschi, Leonardo
AU - Silva, Sara
N1 - Vanneschi, L., & Silva, S. (2023). Bayesian Learning. In Natural Computing Series (pp. 259-270). Springer, Cham. https://doi.org/10.1007/978-3-031-17922-8_9
PY - 2023/1/13
AB - Bayesian learning (Tipping, 2004; Barber, 2012) is the name commonly used to identify a set of computational methods for supervised learning based on Bayes’ Theorem. Broadly speaking, Bayes’ Theorem deals with the modification of our perception of the probability of an event as a consequence of the occurrence of one or more facts. For instance, what probability would you assign, right now, to the event “somebody stole my car”? Of course, this can depend on many factors, but on a normal day one may argue that this probability is rather low. Now, imagine that you go looking for your car and it is not in the place where you remember parking it. What is now the probability of the event “somebody stole my car”? The fact that the car is not where it was parked clearly changes the probability that it was stolen. This property is general: the occurrence of some events can modify the probability of others. It can be exploited to tackle Machine Learning tasks, for instance classification: data, interpreted as events, can be used to change the probability that a given observation belongs to a given class. Before studying this mechanism in detail, let us first present Bayes’ Theorem and its most immediate use in Machine Learning.
UR - http://www.scopus.com/inward/record.url?scp=85147428594&partnerID=8YFLogxK
DO - 10.1007/978-3-031-17922-8_9
M3 - Chapter
AN - SCOPUS:85147428594
SN - 978-3-031-17921-1
SN - 978-3-031-17924-2
T3 - Natural Computing Series
SP - 259
EP - 270
BT - Natural Computing Series
PB - Springer, Cham
CY - Cham, Switzerland
ER -