Teaching

Teaching Assistant – Machine Learning (AI-511) & Visual Learning (AI-825)

In Fall 2023, I served as a Teaching Assistant for Machine Learning (AI-511), and in Spring 2024, for Visual Learning (AI-825).

My TA role in Machine Learning was highly hands-on. In addition to holding office hours and grading a cohort of more than 100 students, our team of five TAs jointly conducted weekly recap lectures paired with practical coding sessions covering the theory taught in class. We also organized multiple Kaggle competitions and were responsible for designing and evaluating the major course projects for our individual cohorts.

Our Course Material: Machine Learning TA Resources for Fall 2023 on GitHub

Here is a breakdown of our weekly material. Each week's folder in the repo also contains the corresponding coding-session notebooks.

Machine Learning (AI-511) – Weekly Breakdown

Week 1: Preprocessing & Exploratory Data Analysis

  • Data cleaning and feature scaling
  • Handling missing values
  • Visual EDA for insights
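As a small taste of the Week 1 session, here is a minimal sketch (on a made-up toy array, not the course data) of mean imputation followed by standardization:

```python
import numpy as np

# Toy dataset with a missing value (NaN), standing in for real course data.
X = np.array([[1.0, 200.0],
              [2.0, np.nan],
              [3.0, 600.0],
              [4.0, 800.0]])

# Mean imputation: replace NaNs with the column mean of the observed values.
col_means = np.nanmean(X, axis=0)
X_imputed = np.where(np.isnan(X), col_means, X)

# Standardization (feature scaling): zero mean, unit variance per column.
mu = X_imputed.mean(axis=0)
sigma = X_imputed.std(axis=0)
X_scaled = (X_imputed - mu) / sigma
```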

Week 2: Linear Regression

  • Theory and implementation
  • Gradient Descent optimization
  • Model evaluation
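The Week 2 session walks through this in full; a minimal sketch of batch gradient descent on the MSE loss (synthetic data with known slope 3 and intercept 2) looks like:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=100)

# Append a bias column, then fit w = (slope, intercept) by gradient descent.
Xb = np.hstack([X, np.ones((100, 1))])
w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = 2 / len(y) * Xb.T @ (Xb @ w - y)  # gradient of the MSE
    w -= lr * grad
```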

Week 3: Maximum Likelihood Estimation & Bayesian Methods

  • Frequentist vs Bayesian inference
  • Priors, posteriors, and likelihoods
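The frequentist/Bayesian contrast from Week 3 fits in a few lines for the coin-flip (Beta–Bernoulli) model, where the Beta prior is conjugate to the Bernoulli likelihood:

```python
# Coin-flip data: 7 heads out of 10 tosses.
heads, n = 7, 10

# Frequentist MLE for the Bernoulli parameter is just the sample mean.
theta_mle = heads / n

# Bayesian: with a Beta(a, b) prior, the posterior is
# Beta(a + heads, b + tails) by conjugacy.
a, b = 2.0, 2.0                      # prior pseudo-counts
a_post, b_post = a + heads, b + (n - heads)
theta_post_mean = a_post / (a_post + b_post)  # posterior mean estimate
```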

Week 4: Logistic Regression, K-Means, K-NN

  • Classification with Logistic Regression
  • Unsupervised learning: K-Means
  • Instance-based learning: K-NN
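As an illustration of the instance-based side of Week 4, here is a tiny k-NN classifier on hypothetical two-class data:

```python
import numpy as np

# Tiny two-class toy dataset (stand-in for the session's examples).
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])

def knn_predict(x, k=3):
    # Euclidean distance from the query to every training point.
    d = np.linalg.norm(X_train - x, axis=1)
    # Majority vote among the k nearest neighbours.
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()
```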

Week 5: PCA & Decision Trees

  • Dimensionality reduction with PCA
  • Interpretable models using Decision Trees
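For Week 5, PCA reduces to three steps: centre the data, eigendecompose the covariance matrix, and project onto the top eigenvectors. A sketch on synthetic correlated data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Correlated 2-D data: almost all variance lies along one direction.
z = rng.normal(size=(200, 1))
X = np.hstack([z, 0.5 * z]) + rng.normal(0, 0.05, size=(200, 2))

# PCA: centre, eigendecompose the covariance, project onto the top component.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # first principal component
X_proj = Xc @ pc1                        # 1-D projection
explained = eigvals[-1] / eigvals.sum()  # explained variance ratio
```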

Week 6: Random Forests & Boosting

  • Ensemble learning techniques
  • Bagging with Random Forests
  • Boosting methods (AdaBoost, Gradient Boosting, CatBoost, and some recent advances)
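To make the bagging idea from Week 6 concrete, here is a minimal sketch (toy 1-D data, decision stumps as the base learner) of bootstrap aggregation with majority voting:

```python
import numpy as np

rng = np.random.default_rng(2)
# 1-D classification: label is 1 when x > 0, with 10% label noise.
x = rng.uniform(-1, 1, size=200)
y = (x > 0).astype(int)
flip = rng.random(200) < 0.1
y[flip] = 1 - y[flip]

def fit_stump(xs, ys):
    # Exhaustive search over candidate thresholds for the best stump.
    best_t, best_err = 0.0, np.inf
    for t in xs:
        err = np.mean((xs > t).astype(int) != ys)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Bagging: fit each stump on a bootstrap resample of the training set.
thresholds = []
for _ in range(25):
    idx = rng.integers(0, len(x), len(x))
    thresholds.append(fit_stump(x[idx], y[idx]))

def bagged_predict(x_new):
    # Majority vote across the ensemble of stumps.
    votes = np.array([(x_new > t).astype(int) for t in thresholds])
    return (votes.mean(axis=0) > 0.5).astype(int)
```

Random Forests add feature subsampling on top of this; boosting instead fits learners sequentially on reweighted data.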

Week 7: Constrained Optimization

  • Lagrange multipliers
  • Regularization and constraints in ML
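The classic Week 7 warm-up is a Lagrange multiplier problem: maximise f(x, y) = xy subject to x + y = 1. Setting ∇f = λ∇g gives x = y = λ, so the constraint yields x = y = 1/2. A numerical sanity check along the constraint:

```python
import numpy as np

# Analytical stationary point from the Lagrangian L = xy - λ(x + y - 1).
x_star = y_star = 0.5

# Check numerically: evaluate f along the constraint x + y = 1.
xs = np.linspace(0, 1, 1001)
vals = xs * (1 - xs)          # f(x, 1 - x)
```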

Week 8: Support Vector Machines

  • Margin maximization
  • Kernel trick for non-linear separation
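The kernel trick from Week 8 can be verified directly: the polynomial kernel k(x, z) = (x·z)² equals an inner product under the explicit feature map φ(x) = (x₁², √2·x₁x₂, x₂²), so an SVM never needs to form φ at all:

```python
import numpy as np

def phi(x):
    # Explicit feature map for the degree-2 polynomial kernel in 2-D.
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def k(x, z):
    # Kernel evaluation: no feature map required.
    return (x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([0.5, -1.0])
```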

Week 9: Neural Networks & PyTorch

  • Neural network basics
  • Implementing models in PyTorch
  • Training loops and backpropagation
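The canonical PyTorch training loop from Week 9, sketched on a toy regression task (learning y = 2x + 1 with a single linear layer):

```python
import torch

torch.manual_seed(0)
# Toy regression data: y = 2x + 1.
X = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * X + 1

model = torch.nn.Linear(1, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

losses = []
for _ in range(200):
    opt.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(X), y)
    loss.backward()          # backpropagation
    opt.step()               # gradient descent update
    losses.append(loss.item())
```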

My Project

Conducted a Kaggle competition for my cohort on the task of Weather Forecasting, using a dataset with 23 features and 145,000 rows. The objective was to give students hands-on experience in feature selection through exploratory data analysis, applying model ensembling techniques, and performing hyperparameter tuning for optimal performance.
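The hyperparameter-tuning workflow the competition exercised can be sketched as a grid search with a held-out validation split. This uses synthetic stand-in data (not the actual competition dataset) and ridge regression as the model:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic stand-in for the competition data: 23 features, linear signal.
X = rng.normal(size=(500, 23))
w_true = rng.normal(size=23)
y = X @ w_true + rng.normal(0, 0.5, size=500)

# Train/validation split for tuning.
X_tr, X_val = X[:400], X[400:]
y_tr, y_val = y[:400], y[400:]

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam I)^-1 X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Grid search over the regularisation strength.
best_lam, best_mse = None, np.inf
for lam in [0.01, 0.1, 1.0, 10.0]:
    w = ridge_fit(X_tr, y_tr, lam)
    mse = np.mean((X_val @ w - y_val) ** 2)
    if mse < best_mse:
        best_lam, best_mse = lam, mse
```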