
Python for Probability, Statistics, and Machine Learning

Published by Willington Island, 2021-08-24 02:03:04

Description: This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python code provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python code, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples.


3.4 Estimation Using Maximum Likelihood

with corresponding logarithm

$$ J = \log L(p \mid \mathbf{x}) = \log(p) \sum_{i=1}^{n} x_i + \log(1-p) \left( n - \sum_{i=1}^{n} x_i \right) $$

Taking the derivative of this gives

$$ \frac{dJ}{dp} = \frac{1}{p} \sum_{i=1}^{n} x_i + \frac{-1}{1-p} \left( n - \sum_{i=1}^{n} x_i \right) $$

and solving this for $p$ leads to

$$ \hat{p} = \frac{1}{n} \sum_{i=1}^{n} x_i $$

This is our estimator for $p$. Up until now, we have been using Sympy to solve for this based on the data $x_i$, but now that we have it analytically we don't have to solve for it each time. To check whether this estimator is biased, we compute its expectation:

$$ \mathbb{E}(\hat{p}) = \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}(x_i) = \frac{1}{n} \, n \, \mathbb{E}(x_i) $$

by linearity of the expectation, and where $\mathbb{E}(x_i) = p$. Therefore,

$$ \mathbb{E}(\hat{p}) = p $$

This means that the estimator is unbiased. Similarly,

$$ \mathbb{E}(\hat{p}^2) = \frac{1}{n^2} \, \mathbb{E}\left[ \left( \sum_{i=1}^{n} x_i \right)^{2} \right] $$

and where $\mathbb{E}(x_i^2) = p$ and, by the independence assumption, $\mathbb{E}(x_i x_j) = \mathbb{E}(x_i)\mathbb{E}(x_j) = p^2$.
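Both steps can be checked in Python: Sympy recovers the sample-mean estimator symbolically, and a simulation illustrates the unbiasedness claim numerically. This is a minimal sketch; the sample size of five symbols, the parameters `p_true`, `n`, and `trials`, and the seed are hypothetical choices, not values from the text.

```python
import numpy as np
import sympy as S

# Symbolic check: maximize the log-likelihood J for five Bernoulli samples
# (five is an arbitrary illustrative size).
p = S.symbols('p', positive=True)
x = S.symbols('x1:6')  # symbolic samples x1..x5
J = sum(xi * S.log(p) + (1 - xi) * S.log(1 - p) for xi in x)
phat_sym = S.solve(S.diff(J, p), p)[0]  # the sample mean, matching the analytic result
print(phat_sym)

# Numerical check of unbiasedness: average p-hat over many simulated datasets.
rng = np.random.default_rng(0)
p_true, n, trials = 0.3, 100, 5000      # hypothetical parameters
samples = rng.binomial(1, p_true, size=(trials, n))
phat = samples.mean(axis=1)             # MLE for each simulated dataset
print(phat.mean())                      # close to p_true, consistent with E(p-hat) = p
```

The simulation draws many independent datasets of the same size so that the average of the per-dataset estimates approximates $\mathbb{E}(\hat{p})$.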

































































































