This is the repository for PHYS 440/540 "Big Data Physics: Methods of Machine Learning" at Drexel University, taught by Prof. Gordon Richards. The course syllabus can be found at http://www.physics.drexel.edu/~gtr/teaching/phys_440_540/
The course is a series of Jupyter notebooks that build on previous versions of this course (https://github.com/gtrichards/PHYS_T480_F18 and https://github.com/gtrichards/PHYS_T480). I have drawn heavily on resources from the following people and places:
Jake Vanderplas (University of Washington) -- one of the primary code developers of scikit-learn and astroML. I originally drew much of the material from https://github.com/jakevdp/ESAC-stats-2014, but there is much more at https://github.com/jakevdp/.
Zeljko Ivezic (University of Washington) -- the lead author of the textbook that we use (https://press.princeton.edu/books/hardcover/9780691198309/statistics-data-mining-and-machine-learning-in-astronomy) and instructor (along with Mario Juric) for https://github.com/uw-astr-302-w18/astr-302-w18
Aurelien Geron's book "Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems": https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1492032646/ref=sr_1_5?dchild=1&keywords=machine+learning&qid=1596499152&sr=8-5
Andy Connolly (University of Washington), particularly http://cadence.lsst.org/introAstroML/
Karen Leighly (University of Oklahoma), particularly http://seminar.ouml.org/
Adam Miller (Northwestern University), particularly https://github.com/LSSTC-DSFP/LSSTC-DSFP-Sessions/
Jo Bovy (University of Toronto), particularly http://astro.utoronto.ca/~bovy/teaching.html
Thomas Wiecki, particularly http://twiecki.github.io/blog/2015/11/10/mcmc-sampling/
My thanks also to Maher Harb (Drexel University), Liam Coatman (Cambridge), Nathalie Thibert (UWO), and Kevin Footer (Deloitte).
I have tried to be careful about properly attributing anything drawn from these resources, but if it isn't clear where something comes from, it probably came from one of the sources above. Others are welcome to draw from this repository for their own Machine Learning courses. Please send any corrections to [email protected].
If you have any interest in using these materials for your own Machine Learning course, please e-mail me and I'll send you my post-lecture notes on what worked, what didn't, what took too long, and what went too quickly -- in short, what I would change next time.
Lecture 1 (9/21): Motivation.ipynb and InitialSetup.ipynb
Lecture 2 (9/23): HistogramExample.ipynb
Lecture 3 (9/28): BasicStats.ipynb
Lecture 4 (9/30): BasicStats2.ipynb
Lecture 5 (10/5): Inference.ipynb
Lecture 6 (10/7): Inference2.ipynb
Lecture 7 (10/14): Scikit-Learn-Intro.ipynb
Lecture 8 (10/19): KernelDensityEstimation.ipynb and NearestNeighbor.ipynb
Lecture 9 (10/21): MixtureModel.ipynb and Clustering.ipynb
Lecture 10 (10/26): DimensionReduction.ipynb
Lecture 11 (10/28): NonlinearDimensionReduction.ipynb
Lecture 12 (11/2): Regression.ipynb
Lecture 13 (11/4): Regression2.ipynb
Lecture 14 (11/9): Classification.ipynb
Lecture 15 (11/11): Classification2.ipynb
Lecture 16 (11/16): NeuralNetworks.ipynb
Lecture 17 (11/18): NeuralNetworks2.ipynb
Lecture 18 (11/23): TensorFlow.ipynb
Lecture 19 (11/30): TimeSeries.ipynb
Lecture 20 (12/2): TimeSeries2.ipynb
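To follow along with the notebooks, you will need a working scientific-Python environment; see InitialSetup.ipynb (Lecture 1) for the setup actually used in class. The snippet below is only a minimal sketch of an environment check. scikit-learn and astroML are named above; the other package names (NumPy, SciPy, Matplotlib) are assumptions based on a typical stack for this material.

```python
# Minimal environment check -- a sketch, not the course's official setup;
# see InitialSetup.ipynb for the real instructions.
import importlib

# numpy/scipy/matplotlib are assumed; sklearn and astroML are named in the README.
for name in ["numpy", "scipy", "matplotlib", "sklearn", "astroML"]:
    try:
        module = importlib.import_module(name)
        version = getattr(module, "__version__", "unknown version")
        print(f"{name:12s} {version}")
    except ImportError:
        print(f"{name:12s} MISSING -- install it before running the notebooks")
```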