PROBABILITY AND INFORMATION THEORY IN MACHINE LEARNING
Probabilistic tools for machine learning and analysis of real-world datasets. Introductory topics include classification, regression, probability theory, decision theory and quantifying information with entropy, relative entropy and mutual information. Additional topics include naive Bayes, probabilistic graphical models, discriminant analysis, logistic regression, expectation maximization, source coding and variational inference.
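The description mentions quantifying information with entropy, relative entropy and mutual information; the short sketch below illustrates those three quantities for a discrete joint distribution. It is not course material: the function names and the example distribution are invented here purely for illustration.

```python
# Illustrative sketch only: entropy, relative entropy (KL divergence), and
# mutual information for a small discrete joint distribution using NumPy.
# The joint table below is an invented example, not data from the course.
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability entries are skipped."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(joint):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    return kl_divergence(joint.ravel(), (px * py).ravel())

# Example joint distribution p(x, y) over two binary variables.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

print(f"H(X)   = {entropy(joint.sum(axis=1)):.3f} bits")   # marginal entropy
print(f"I(X;Y) = {mutual_information(joint):.3f} bits")    # shared information
```

For this joint table the marginals are uniform, so H(X) = 1 bit, while the dependence between X and Y gives a mutual information of about 0.278 bits.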