12.S592, 54-1623, F 1000-1200, Fall 2018
Sai Ravela (email@example.com)
Topics: Foundations, Deep Learning
- For ML to be useful
- Gaussian Graphical Models
- Kernel Principal Components Part 1
- Kernel Principal Components Part 2
- Perceptrons, Associative Memories, Hopfield Networks
- Recurrent and Feedforward Networks
- Backpropagation and Multistage Two-Point Boundary Value Problems
- Adjoint Machinery for Training Neural Networks
- Optimization Approaches in Learning
- Bias and Variance, Invariance and Selectivity, Validation and Model Selection
- Convolutional Networks and Anisotropies in Convolution
- Recurrent Networks
- Time and Sequential Data, LSTM
- Restricted Boltzmann Machines
- (Learning to) Optimize Neural Structure
- Neural Networks as Stochastic Systems
- Neural Networks as Chaotic Systems
- Neural Model Reduction for Dynamical Systems
- Training for Uncertainty in Neural Networks
- Learning with the Ensemble Filter and Smoother
- Mutual Information and Entropy Functionals
Readings:
- T. Hastie et al., The Elements of Statistical Learning
- I. Goodfellow et al., Deep Learning
- S. Raschka, Python Machine Learning
12.S592, 54-1623, F 1000-1200, Spring 2018
Sai Ravela (firstname.lastname@example.org)
The first part of an advanced two-semester sequence in Machine Learning that enables participants to design learning-based approaches to Natural Systems Science.
Natural Systems Science:
We are motivated at present by problems across the breadth of Nature pertaining to Estimation, Prediction, Parameterization, Characterization, Discovery, and Decision Making. There is a focus on Geophysics this term: Data Assimilation, Autonomous Environmental Mapping, Model Reduction, Uncertainty Quantification, Sensor Planning, Prediction and Predictability, Planning for Risk Mitigation, Convective Super-parameterization, Radiative-Convective Equilibrium, Nonlocal Operators, Teleconnections, Particle Detection and Sizing, Species Characterization, Paleoclimate, Event Detection and Tracking (Volcanoes, Earthquakes, Hurricanes, Tsunamis, Storms, and Transits), Super-resolution/Downscaling, Coherent Structures in Turbulence, Seismic Imaging and Geomorphology, Porous Media, Reservoirs, and Exoplanets.
However, other Engineering, Science, and Finance applications may be included depending on participant interest. While our motivation and reach are broad, we will drill down each term to a few core applications set by participant interest.
Participants will be required to complete monthly assignments and a collaborative project. This course is different from a typical course: it is hands-on, for both of us. Think of it as a "learnathon": we come together to solve problems and applications better than either of us could alone. My job is to guide you to the fundamentals; your job is to take them to your application. Prior experience with Probability, Linear Algebra, Statistics, or Data Science courses would be helpful; otherwise, permission of the instructor is required.
The entire course is organized in three phases: Preparatory, Core, and Advanced Topics.
Preparatory Topics (both terms): Classification, Regression, Clustering, Density Estimation, Supervised vs. Unsupervised Learning, Loss functions, Training-Calibration-Validation, Sampling and Variational Inference, Nearest Neighbors, Generalization, Bias-Variance Dilemma, Model Selection, Feature Selection, Reproducing Kernel Hilbert Spaces, Regularization: Smoothness, Sparseness, Entropy and Information.
Core Material (Term 1): Non-Parametric Bayesian Inference, Graphical Models, Regression Machines, Kernel Machines, Ensemble Learning, Manifold Learning, Transfer Learning, Recurrent and Deep Learning.
Core Material (Term 2): Basis and Dictionary Learning, Gaussian Processes, Stochastic Model Reduction, Abstraction, Segmentation and Grouping, Markov Decision Processes, Reinforcement Learning, Incremental Online and Active Learning, Causal Learning, Information-Theoretic Learning.
Advanced Topics (both terms): Uncertain Probabilities and Ambiguity, Curse of Dimensionality, Sparse Data vs. Big Data, Rigged Hilbert Spaces, Parsimony Principles, Generic Chaining and Extremes, Predictability and Learnability, Model-based (Physically-based) Learning. As with applications, we will be selective with the Advanced Topics each term.