
Thanks for sharing. I also recommend "An Introduction to Statistical Learning - with Applications in R": http://www-bcf.usc.edu/~gareth/ISL/


A direct link to the PDF for ISL is here:

https://web.stanford.edu/~hastie/local.ftp/Springer/ISLR_pri...

The grownup version of that, ESL, is also available free:

https://web.stanford.edu/~hastie/local.ftp/Springer/ESLII_pr...

And for people who are genuinely curious about how this segues into graphical models, NNs, and the autoencoder (maybe the most interesting part of modern NNs), there's

https://web.stanford.edu/~hastie/StatLearnSparsity_files/SLS...

The more curious or research-oriented may appreciate

https://web.stanford.edu/~hastie/local.ftp/hastie_glmnet.pdf
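
If you want to poke at the ideas in that paper without installing R, here's a minimal sketch of a lasso regularization path. Caveats: glmnet itself is an R package, so scikit-learn's coordinate-descent lasso_path is only a rough stand-in for it, and the toy data below is made up purely for illustration.

    # Rough stand-in for glmnet's lasso path: coordinate descent
    # over a decreasing grid of penalty strengths (glmnet's lambda).
    import numpy as np
    from sklearn.linear_model import lasso_path

    rng = np.random.default_rng(0)
    n, p = 100, 20
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:3] = [3.0, -2.0, 1.5]      # only three "true" predictors
    y = X @ beta + rng.standard_normal(n)

    # coefs has shape (p, n_alphas); each column is the coefficient
    # vector at one penalty value, sparser at larger penalties.
    alphas, coefs, _ = lasso_path(X, y, n_alphas=50)
    for a, c in zip(alphas[::10], coefs.T[::10]):
        print(f"lambda={a:.3f}  nonzero coefficients={np.count_nonzero(c)}")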

I doubt Gareth or Daniela (the primary authors of ISL) would mind my pointing you towards Hastie's archives, since both of them were advised by Trevor Hastie during their PhDs.

Matloff is a great guy. The chapters on shrinkage and dimension reduction aren't written yet in his book, and since those are important topics, you should consider reading the other books linked above as well. These topics are mostly of interest to people who want to draw inferences about the underlying processes that may be generating the observed outcomes. If all you care about is prediction, fit a Random Forest, an xgboost GBM, or a DNN and be done with it. But if you're actually curious about how complex descriptions of rare events can be thoughtfully analyzed, this is the standard progression.
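
To make the prediction-only route concrete, here's a minimal sketch of the "fit a Random Forest and be done with it" option. This is scikit-learn on synthetic data invented for illustration, not anything from Matloff's book, and an xgboost GBM or a DNN would slot in the same way.

    # Pure prediction: fit an off-the-shelf ensemble and check held-out
    # accuracy; no claims about the underlying data-generating process.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=500, n_features=20,
                           n_informative=5, noise=5.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)

    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    rf.fit(X_tr, y_tr)
    print("held-out R^2:", rf.score(X_te, y_te))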

Matloff's book is a great introduction. I particularly like the example on page 204. /ducks



