
🌟 My Machine Learning A-Z Journey 🚀

Hi, I'm Sarthak Ingle, and this repository is a personal showcase of my hands-on learning journey through the Machine Learning A-Z™: AI, Python & R + ChatGPT Prize [2025] course on Udemy. Across its 43+ hours of content, I built practical projects covering fundamental to advanced concepts in machine learning: data preprocessing, classical models, deep learning, and reinforcement learning.

Each project taught me something new, and this repository is my way of reflecting on that journey and sharing what I built with the world.



📁 Projects & What I Learned

🔧 1. Data Preprocessing

  • About: Cleaned messy datasets, handled missing data, and encoded categorical variables.
  • Learnings: Realized the importance of data quality before modeling. Mastered Label Encoding, One-Hot Encoding, and Feature Scaling.
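The three preprocessing steps above can be sketched with scikit-learn; the tiny DataFrame here is hypothetical, not the course dataset:

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy dataset with a missing value and a categorical column
df = pd.DataFrame({
    "Country": ["France", "Spain", "Germany", "Spain"],
    "Age": [44.0, 27.0, np.nan, 38.0],
    "Salary": [72000.0, 48000.0, 54000.0, 61000.0],
})

# 1. Fill the missing Age with the column mean
num = SimpleImputer(strategy="mean").fit_transform(df[["Age", "Salary"]])

# 2. One-hot encode the categorical column (three countries -> three 0/1 columns)
cat = OneHotEncoder().fit_transform(df[["Country"]]).toarray()

# 3. Scale numeric features to zero mean and unit variance
num_scaled = StandardScaler().fit_transform(num)

X = np.hstack([cat, num_scaled])
print(X.shape)  # → (4, 5)
```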

📊 2. Simple Linear Regression

  • About: Predicted continuous outcomes with one independent variable.
  • Learnings: My first machine learning model! Learned about best-fit lines, error minimization, and visualization.
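A minimal sketch of that first model with made-up experience/salary numbers — the fitted line recovers the slope and intercept exactly because the data is noise-free:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: years of experience vs. salary (in $1000s), exactly linear
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([40, 50, 60, 70, 80])  # y = 30 + 10x

model = LinearRegression().fit(X, y)     # least-squares best-fit line
print(model.intercept_, model.coef_[0])  # → 30.0 10.0 (up to rounding)
print(model.predict(np.array([[6]])))    # → [90.]
```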

📊 3. Multiple Linear Regression

  • About: Extended to multiple variables and learned feature selection using Backward Elimination.
  • Learnings: Gained skills in feature importance analysis and the impact of irrelevant features.
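The course performs Backward Elimination by hand using p-values; a related automated sketch uses scikit-learn's RFE (recursive feature elimination) on synthetic data where only two of four features matter:

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
# Only features 0 and 2 drive the target; 1 and 3 are irrelevant noise
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=100)

# RFE repeatedly drops the feature with the smallest coefficient magnitude
selector = RFE(LinearRegression(), n_features_to_select=2).fit(X, y)
print(selector.support_)  # the two informative features survive
```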

📈 4. Polynomial Regression

  • About: Captured non-linear trends using polynomial features.
  • Learnings: Learned to visually compare linear vs. polynomial fits.
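R² makes the linear-vs-polynomial comparison quantitative; a sketch on a clean quadratic (toy data, not the course's salary dataset):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 0.5 * X.ravel() ** 2 + 1.0  # a clean quadratic trend

lin = LinearRegression().fit(X, y)
poly = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

print(round(lin.score(X, y), 3))   # ≈ 0.0 — a straight line misses the curve entirely
print(round(poly.score(X, y), 3))  # → 1.0
```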

⚙️ 5. Support Vector Regression (SVR)

  • About: Applied SVR to fit complex curves.
  • Learnings: Understood kernel tricks and the power of feature scaling in SVR.
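A sketch of why scaling matters for SVR, using a made-up smooth curve: the RBF kernel works on distances, so unscaled features distort it.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 5, 80)).reshape(-1, 1)
y = np.sin(X).ravel()

# SVR has no built-in scaling, so standardize X first — skipping this hurts the RBF kernel
Xs = StandardScaler().fit_transform(X)
model = SVR(kernel="rbf", C=10).fit(Xs, y)
print(model.score(Xs, y))  # high R² on this smooth curve
```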

🌳 6. Decision Tree Regression

  • About: Built tree-like models that split the dataset based on conditions.
  • Learnings: Realized the simplicity and interpretability of tree-based models.
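The split-on-conditions idea in its smallest form — a depth-1 tree ("stump") on a step-shaped toy target, where a single condition explains everything:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# A step-shaped target: one split condition explains it completely
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([10.0, 10.0, 10.0, 50.0, 50.0, 50.0])

tree = DecisionTreeRegressor(max_depth=1).fit(X, y)
print(tree.predict(np.array([[2.5]])))  # → [10.]
print(tree.predict(np.array([[5.5]])))  # → [50.]
```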

🌳 7. Random Forest Regression

  • About: Enhanced prediction accuracy using an ensemble of trees.
  • Learnings: Discovered the power of bagging and ensemble learning.
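A sketch of the bagging payoff on noisy synthetic data: a single deep tree memorizes the noise, while averaging many bootstrapped trees smooths it out in cross-validation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=300)

# Compare cross-validated R²: one tree vs. a bagged ensemble of 100
tree_cv = cross_val_score(DecisionTreeRegressor(random_state=0), X, y, cv=5).mean()
forest_cv = cross_val_score(RandomForestRegressor(n_estimators=100, random_state=0), X, y, cv=5).mean()
print(tree_cv, forest_cv)  # the ensemble generalizes better
```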

🔐 8. Logistic Regression

  • About: Used for binary classification problems.
  • Learnings: First step into classification. Learned about the sigmoid function and probability thresholds.
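The sigmoid and the 0.5 threshold in one tiny hypothetical example — right at the class boundary the model outputs a probability near one half:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 1-D data: the class flips from 0 to 1 around x = 0
X = np.array([[-3], [-2], [-1], [1], [2], [3]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)
# predict_proba applies the sigmoid; predict thresholds it at 0.5
print(clf.predict_proba(np.array([[0.0]]))[0, 1])  # ≈ 0.5 right at the boundary
print(clf.predict(np.array([[2.5]])))              # → [1]
```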

🤝 9. K-Nearest Neighbors (KNN)

  • About: Classified data points based on nearest neighbors.
  • Learnings: Improved my understanding of distance metrics and non-parametric models.
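KNN in miniature, with two made-up point groups: the class is simply the majority vote among the nearest neighbors under the chosen distance metric.

```python
from sklearn.neighbors import KNeighborsClassifier

# Two tight hypothetical groups in 2-D
X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]

# Classify by majority vote among the 3 nearest points (Euclidean distance)
knn = KNeighborsClassifier(n_neighbors=3, metric="euclidean").fit(X, y)
print(knn.predict([[1, 1]]))  # → [0]
print(knn.predict([[5, 5]]))  # → [1]
```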

✨ 10. Support Vector Machine (SVM)

  • About: Built classifiers with linear and non-linear kernels.
  • Learnings: Grasped the concepts of margin maximization and kernel methods.
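The kernel idea shows up clearly on XOR-style toy data, which no linear margin can separate but an RBF kernel handles easily:

```python
import numpy as np
from sklearn.svm import SVC

# XOR-style data: no straight line separates the classes
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf", gamma=2).fit(X, y)
print(linear.score(X, y))  # < 1.0 — a linear margin cannot fit XOR
print(rbf.score(X, y))     # → 1.0 — the kernel trick maps it to a separable space
```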

📊 11. Naive Bayes Classifier

  • About: Created a text classifier using probabilistic principles.
  • Learnings: Understood Bayes’ theorem and the role of independence assumptions.
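Bayes' theorem is worth computing by hand once; all numbers below are made up for illustration:

```python
# Bayes' theorem by hand with made-up numbers:
# P(spam | "free") = P("free" | spam) * P(spam) / P("free")
p_spam = 0.4                 # prior: 40% of all mail is spam
p_free_given_spam = 0.5      # "free" appears in half of spam messages
p_free_given_ham = 0.05      # ...and in 5% of legitimate mail

# Total probability of seeing the word "free"
p_free = p_free_given_spam * p_spam + p_free_given_ham * (1 - p_spam)
posterior = p_free_given_spam * p_spam / p_free
print(round(posterior, 3))  # → 0.87
```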

🌳 12. Decision Tree Classification

  • About: Built interpretable decision trees for classification.
  • Learnings: Learned about entropy, information gain, and visualizing decision paths.
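A sketch of the entropy criterion on four toy points — `export_text` prints the decision path the tree learned:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[0], [1], [2], [3]]
y = [0, 0, 1, 1]

# criterion="entropy" chooses splits by information gain
clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)
print(export_text(clf))  # shows the single learned rule: feature_0 <= 1.50
```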

🌳 13. Random Forest Classification

  • About: Improved accuracy using an ensemble of trees for classification.
  • Learnings: Learned how ensembles reduce overfitting and improve generalization.
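A sketch of the ensemble vote on synthetic data where only two of five features carry signal:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # only 2 of 5 features matter

forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
x_new = np.array([[2.0, 2.0, 0.0, 0.0, 0.0]])
# predict_proba averages the per-tree estimates — the ensemble's "vote"
print(forest.predict_proba(x_new), forest.predict(x_new))
```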

📊 14. K-Means Clustering

  • About: Used for unsupervised learning to form clusters.
  • Learnings: Learned the Elbow Method and cluster interpretation.
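The Elbow Method in code, on made-up blobs: inertia (within-cluster sum of squares) falls steeply until k reaches the true cluster count, then flattens.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Three well-separated hypothetical blobs
centers = [(0, 0), (5, 5), (0, 5)]
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in centers])

# Elbow Method: inertia falls sharply until k reaches the true count
inertias = {k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
            for k in range(1, 6)}
print({k: round(v, 1) for k, v in inertias.items()})  # big drops up to k=3, tiny ones after
```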

🔗 15. Hierarchical Clustering

  • About: Applied dendrograms and agglomerative clustering.
  • Learnings: Learned to read dendrograms — the merge heights show where to cut the tree and choose a sensible number of clusters.
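A sketch with SciPy on two made-up groups: Ward linkage builds the merge tree a dendrogram would draw, and `fcluster` cuts it into flat clusters.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(5)
# Two compact hypothetical groups
X = np.vstack([rng.normal(0, 0.2, (20, 2)), rng.normal(4, 0.2, (20, 2))])

# Ward linkage builds the merge tree that a dendrogram would draw
Z = linkage(X, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 clusters
print(sorted(set(labels)))  # → [1, 2]
```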

🔻 16. Principal Component Analysis (PCA)

  • About: Reduced dimensions while retaining variance.
  • Learnings: Understood dimensionality reduction and how to visualize high-dimensional data.
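A sketch of variance retention on synthetic 3-D data that actually lies near a 2-D plane — two components are enough:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# 3-D data near a 2-D plane: the third column is roughly the sum of the first two
base = rng.normal(size=(200, 2))
third = base[:, 0] + base[:, 1] + rng.normal(scale=0.05, size=200)
X = np.column_stack([base, third])

pca = PCA(n_components=2).fit(X)
print(pca.explained_variance_ratio_.sum())  # ≈ 1.0: two components keep almost all variance
```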

🛒 17. Association Rule Learning - Apriori

  • About: Discovered frequent itemsets from shopping basket data.
  • Learnings: Gained skills in rule mining and in interpreting the support, confidence, and lift of a rule.
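The course project uses the apyori package, but the three metrics are easy to compute by hand on toy baskets:

```python
# Toy basket data (made up for illustration)
baskets = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
n = len(baskets)

def support(itemset):
    """Fraction of baskets containing every item in the set."""
    return sum(itemset <= b for b in baskets) / n

# Rule: {bread} -> {milk}
sup = support({"bread", "milk"})   # 2/5 — how often the items co-occur
conf = sup / support({"bread"})    # 2/3 — P(milk | bread)
lift = conf / support({"milk"})    # < 1 here: bread buyers are *less* likely than average to buy milk
print(sup, round(conf, 3), round(lift, 3))  # → 0.4 0.667 0.833
```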

✉️ 18. Natural Language Processing (NLP)

  • About: Built a spam classifier from scratch.
  • Learnings: Explored text cleaning, Bag-of-Words model, and Naive Bayes in NLP.
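The clean → Bag-of-Words → Naive Bayes pipeline in miniature, on a made-up corpus (the course version adds NLTK stemming and stopword removal on a real dataset):

```python
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [
    "WIN a FREE prize now!!!", "free money, click now", "claim your prize today",
    "lunch at noon?", "see you at the meeting", "notes from today's class",
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = spam, 0 = ham

# Minimal cleaning: lowercase, letters only
clean = [re.sub(r"[^a-z ]", "", t.lower()) for t in texts]

# Bag-of-Words counts feed a Multinomial Naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB()).fit(clean, labels)
print(model.predict(["free prize now", "meeting at noon"]))  # → [1 0]
```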

🧠 19. Artificial Neural Networks (ANN)

  • About: Built an ANN using TensorFlow and Keras.
  • Learnings: Understood neuron layers, activation functions, and backpropagation.
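The project itself uses TensorFlow/Keras; the mechanics — layers, sigmoid activations, backpropagation — are easiest to see in a plain-NumPy sketch trained on XOR:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny network: 2 inputs -> 8 hidden units -> 1 output, trained on XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

initial_error = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2).mean()
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)    # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)  # forward pass, output layer
    # Backpropagation: chain rule applied layer by layer (squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out; b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h;   b1 -= d_h.sum(axis=0)

final_error = ((out - y) ** 2).mean()
print(initial_error, final_error)  # training drives the error down
print(out.round().ravel())         # typically recovers [0. 1. 1. 0.]
```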

🧠 20. Convolutional Neural Networks (CNN)

  • About: Classified images using CNNs.
  • Learnings: Explored image convolutions, pooling, and deep learning workflows.
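The two core CNN operations, written out in NumPy on a toy "image" containing a vertical edge:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution (really cross-correlation, as in most DL libraries)."""
    kh, kw = kernel.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling: keep the strongest response per block."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.zeros((6, 6)); img[:, 3] = 1.0    # a vertical edge in a toy "image"
kernel = np.array([[1.0, 0.0, -1.0]] * 3)  # a vertical-edge detector
fmap = conv2d(img, kernel)                 # strong responses flank the edge
print(fmap.shape, max_pool(fmap).shape)    # → (4, 4) (2, 2)
```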

🎮 21. Reinforcement Learning

  • About: Applied UCB and Thompson Sampling in simulated environments.
  • Learnings: Understood exploration vs. exploitation tradeoffs and probabilistic decision-making.
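The exploration/exploitation tradeoff in a minimal UCB1 sketch on a made-up three-armed bandit:

```python
import math
import random

random.seed(0)
# Three-armed bandit with hypothetical win rates; arm 2 is best
probs = [0.2, 0.5, 0.8]
counts = [0, 0, 0]        # times each arm was pulled
totals = [0.0, 0.0, 0.0]  # total reward per arm

for t in range(1, 2001):
    # UCB1: mean reward plus an exploration bonus that shrinks as an arm is sampled
    ucb = [totals[a] / counts[a] + math.sqrt(2 * math.log(t) / counts[a])
           if counts[a] else float("inf") for a in range(3)]
    arm = ucb.index(max(ucb))
    counts[arm] += 1
    totals[arm] += 1.0 if random.random() < probs[arm] else 0.0

print(counts)  # the best arm dominates the pull counts
```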

🛠️ Tech Stack & Tools

  • Languages: Python, R
  • ML Libraries: scikit-learn, TensorFlow, Keras, pandas, NumPy
  • Visualization: matplotlib, seaborn
  • Computer Vision: OpenCV
  • NLP: NLTK
  • Tools: Jupyter Notebook, Google Colab, VS Code

🚀 Key Skills I Gained

✅ Data Preprocessing & Cleaning
✅ Supervised Learning (Regression, Classification)
✅ Unsupervised Learning (Clustering, Dimensionality Reduction)
✅ Natural Language Processing (NLP)
✅ Deep Learning with ANN & CNN
✅ Reinforcement Learning Fundamentals
✅ Model Evaluation, Tuning, and Optimization
✅ Strong foundations in Python & Machine Learning Workflow


🔗 My Socials


Thanks for reading through my journey! 🌟 Feel free to explore each folder for the respective project code and notebooks.
