Highschool_ML_Course

A machine learning course designed specifically for high school students, based on the USAAIO course provided by Beaver-Edge AI.

IMPORTANT: For the best experience, download the whole repository and run the `.ipynb` notebooks locally so the visualizations render.
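A minimal sketch of that setup, assuming the repository lives at the GitHub path shown in this README and that Git and Python are already installed:

```shell
# Clone the course repository (URL assumed from the repository name above)
git clone https://github.com/SHSID-Data-science-Club/Highschool_ML_Course.git
cd Highschool_ML_Course

# Install Jupyter if it is not already available
pip install jupyter

# Launch Jupyter and open any .ipynb to run the interactive visualizations
jupyter notebook
```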


Table of Contents

    0. Prerequisites
    • 0.1 Environment Setup (Anaconda, CUDA, VS Code, Python)
    1. Mathematical Foundations for AI
    • 1.1 Linear Algebra
      • 1.1.1 Vector Spaces, Subspaces, Basis, Orthonormal Vectors
      • 1.1.2 Vector and Matrix Operations
      • 1.1.3 Eigenvalues and Eigenvectors
      • 1.1.4 Matrix Decompositions
    • 1.2 Calculus
      • 1.2.1 Single-Variable Derivatives
      • 1.2.2 Multivariable Derivatives and Gradients
      • 1.2.3 Chain Rule
    • 1.3 Probability & Statistics
      • 1.3.1 Discrete Distributions
      • 1.3.2 Continuous Distributions
      • 1.3.3 Expectation and Mean
      • 1.3.4 Variance and Covariance
      • 1.3.5 Bayes’ Rule
    • 1.4 Convex Optimization
      • 1.4.1 Convexity
      • 1.4.2 Gradient Descent
      • 1.4.3 Duality
    2. Python for AI
    • 2.1 Advanced Python Techniques
    • 2.2 NumPy
    • 2.3 Pandas
    • 2.4 Matplotlib
    • 2.5 Seaborn
    3. Core Machine Learning
    • 3.1 Terminology (Supervised, Unsupervised, Overfitting, etc.)
    • 3.2 Linear Regression
    • 3.3 Logistic Regression
    • 3.4 Regularization, Bias–Variance Trade-off, Kernel Methods
    • 3.5 Cross-Validation
    • 3.6 k-Nearest Neighbors
    • 3.7 K-Means Clustering
    • 3.8 Support Vector Machines
    • 3.9 Principal Component Analysis & Dimensionality Reduction (done)
    • 3.10 Decision Trees
    • 3.11 Random Forests
    • 3.12 Boosting
    4. PyTorch Fundamentals
    • 4.1 Tensors
    • 4.2 Autograd
    • 4.3 Devices (CPU/GPU)
    • 4.4 Modules
    • 4.5 Datasets
    • 4.6 DataLoader & Collation
    • 4.7 Loss Functions
    • 4.8 Optimizers
    5. Deep Learning & Computer Vision
    • 5.1 Multi-Layer Perceptron (MLP)
    • 5.2 Forward Propagation & Activation Functions
    • 5.3 Backpropagation & Gradient Descent (done)
    • 5.4 Adam & Other Adaptive Optimizers
    • 5.5 Parameter Initialization
    • 5.6 Batch Normalization
    • 5.7 Dropout
    • 5.8 Convolutional Layers & Pooling Layers (done)
    • 5.9 Convolutional Neural Networks (CNNs) (done)
    • 5.10 Image Data Augmentation
    • 5.11 VGG
    • 5.12 ResNet
    • 5.13 GoogLeNet (Inception)
    • 5.14 Transfer Learning
    6. Transformers
    • 6.1 Self-Attention
    • 6.2 Cross-Attention
    • 6.3 Masked Self-Attention
    • 6.4 Layer Normalization
    • 6.5 Word Embeddings
    • 6.6 Positional Encoding
    • 6.7 Batch Processing
    • 6.8 Training Procedures
    • 6.9 Inference & Deployment
    • 6.10 Pre-training
    • 6.11 Fine-tuning (Hugging Face)
    • 6.12 BERT, T5, GPT
    7. Natural Language Processing & Graph Neural Networks
    • 7.1 Character Tokenization
    • 7.2 Subword Tokenization
    • 7.3 Word Tokenization
    • 7.4 Word Embedding Methods
      • 7.4.1 Skip-Gram
      • 7.4.2 Continuous Bag-of-Words
      • 7.4.3 Global Vectors (GloVe)
    • 7.5 Encoder-Only Transformers (BERT)
    • 7.6 Decoder-Only Transformers (GPT)
    • 7.7 Message-Passing Neural Networks
    • 7.8 Graph Convolutional Networks
    • 7.9 Vision Transformers
    8. OpenCV & Generative AI
    • 8.1 Object Detection
    • 8.2 Adversarial Attacks
    • 8.3 U-Net
    • 8.4 Autoencoders
    • 8.5 Variational Autoencoders
    • 8.6 Generative Adversarial Networks
    • 8.7 Denoising Diffusion Probabilistic Models
    • 8.8 Stable Diffusion
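As a small taste of the material covered above, here is a minimal sketch of gradient descent (section 1.4.2). The objective f(x) = (x − 3)² and the learning rate are illustrative choices, not taken from the course notebooks:

```python
# Minimal gradient-descent sketch (cf. section 1.4.2).
# We minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3)
# and whose unique minimum sits at x = 3.

def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0          # starting point
lr = 0.1         # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)   # step against the gradient

print(f"x after 100 steps: {x:.4f}")  # → x after 100 steps: 3.0000
```

Each update multiplies the distance to the minimum by (1 − 2·lr) = 0.8, so the iterate converges geometrically to 3; the full course derives this behavior and extends it to the multivariable case.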
