Machine Learning Foundations for Systems Science

12.S592, 54-1623, F 1000-1200, Fall 2018

Sai Ravela (ravela@mit.edu)


This is an "infinite course" that continues each term and round-robins among topics. The entire course is organized in three phases: Preparatory, Core, and Advanced Topics.

Follow the current course on Stellar by clicking here

Learning Topics that we typically cover

  • Basics
    • Definitions - Classification, Regression, Clustering, Density Estimation
    • Loss functions, Training-Validation-Testing, Generalization
    • Model Selection, Feature Selection, Data Selection
    • Sampling and Variational Inference, Nearest Neighbors
    • Bias-Variance Dilemma
    • Regularization: Smoothness, Sparseness, Entropy and Information.
  • Methodology
    • Graphical Models
    • Kernel Machines, Manifold Learning
    • Ensemble Learning
    • Transfer Learning
    • Incremental Online Learning and Relevance Feedback
    • Active Learning
    • Basis and Dictionary Learning
    • Reinforcement Learning
    • Deep (Neural) Learning
    • Markov Decision Processes
    • Information Theoretic Learning
    • Causal Learning
  • Problem Solving:
    • Detection
    • Model Reduction and Abstraction
    • Upscaling and Downscaling
    • State and Parameter Estimation
  • Advanced Topics that Emerge from Time to Time
    • Dynamics of Deep Networks
    • Neural Dynamical Systems and Stochastic Systems
    • Optimization in Learning
    • The role of Invariance and Learning Invariance
    • Information Transfers in Learning
    • Predictability and Learnability
    • Rigged Hilbert Spaces
    • Sparse and 
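To make the Basics items above concrete (loss functions, train/validation/test splits, model selection, and the bias-variance dilemma), here is a minimal sketch in Python with NumPy. The dataset and the polynomial model family are illustrative assumptions, not course material: a smooth synthetic signal is fit with polynomials of increasing degree, the validation loss selects the degree, and the held-out test set estimates generalization error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (an assumption for illustration):
# a smooth signal plus observation noise.
x = rng.uniform(-1, 1, 300)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)

# Train / validation / test split (60% / 20% / 20%).
idx = rng.permutation(x.size)
tr, va, te = idx[:180], idx[180:240], idx[240:]

def mse(y_true, y_pred):
    """Squared-error loss averaged over a dataset."""
    return np.mean((y_true - y_pred) ** 2)

# Model selection: choose the polynomial degree minimizing validation
# loss. Low degree -> high bias (underfit); high degree -> high
# variance (overfit). The validation set arbitrates the tradeoff.
best_deg, best_loss = None, np.inf
for deg in range(1, 12):
    coeffs = np.polyfit(x[tr], y[tr], deg)
    val_loss = mse(y[va], np.polyval(coeffs, x[va]))
    if val_loss < best_loss:
        best_deg, best_loss = deg, val_loss

# Generalization error is reported on the untouched test set.
coeffs = np.polyfit(x[tr], y[tr], best_deg)
print("chosen degree:", best_deg)
print("test MSE:", mse(y[te], np.polyval(coeffs, x[te])))
```

Note the design choice: the test set is consulted exactly once, after the degree is fixed, so the reported error is not biased by the selection procedure itself.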

Advanced Topics (both terms): Uncertain Probabilities and Ambiguity, Curse of Dimensionality, Sparse Data vs. Big Data, Rigged Hilbert Spaces, Parsimony Principles, Generic Chaining and Extremes, Predictability and Learnability, Model-based (Physically-based) Learning. As with applications, we will be selective with the Advanced Topics each term.

Books

  1. T. Hastie et al., The Elements of Statistical Learning
  2. I. Goodfellow et al., Deep Learning
  3. S. Raschka, Python Machine Learning


12.S592, 54-1623, F 1000-1200, Spring 2018

Sai Ravela (ravela@mit.edu)

This is the first part of an advanced two-semester sequence in Machine Learning that enables participants to design learning-based approaches for Natural Systems Science.

Natural Systems Science:

We are motivated at present by problems in applications across the breadth of Nature pertaining to Estimation, Prediction, Parameterization, Characterization, Discovery, and Decision Making. There is a focus on Geophysics this term: Data Assimilation, Autonomous Environmental Mapping, Model Reduction, Uncertainty Quantification, Sensor Planning, Prediction and Predictability, Planning for Risk Mitigation, Convective Super-parameterization, Radiative-Convective Equilibrium, Nonlocal Operators, Teleconnections, Particle Detection and Sizing, Species Characterization, Paleoclimate, Event Detection and Tracking in X (Volcanoes, Earthquakes, Hurricanes, Tsunamis, Storms, and Transits), Super-resolution/Downscaling, Coherent Structures in Turbulence, Seismic Imaging and Geomorphology, Porous Media, Reservoirs, and Exoplanets.

However, other Engineering, Science, and Finance applications may be included depending on participant interest. While our motivation and reach are broad, each term we will drill down to a few core applications selected by participant interest.

Participation

Participants will be required to complete monthly assignments and a collaborative project. This course is different from a typical course: it is hands-on, for both of us. Think of it as a "learnathon" in which we come together to solve problems and applications better than either of us could alone. My job is to steer you toward the fundamentals; your job is to carry them to your application. Prior experience with Probability, Linear Algebra, Statistics, or Data Science is helpful; otherwise, permission of the instructor is required.


Follow the course on Stellar by clicking here