Lecture | File Name | Description |
---|---|---|
1 | Lecture-1.ipynb | Introduction to Python for mathematical and statistical analysis. Covers Numpy basics, arrays, and vectorized operations (see the NumPy/Pandas sketch after this table). |
2 | Lecture-2.ipynb | Advanced Numpy indexing, slicing, views vs. copies, array manipulations, and symbolic mathematics with SymPy. |
3 | Lecture-3.ipynb | Data visualization with Matplotlib: figures, axes, plotting, subplots, and basic Pandas introduction. |
4 | Lecture-4.ipynb | Pandas Series and DataFrame basics: indexing, selection, aggregation, and handling missing data. |
5 | Lecture-5.ipynb | Introduction to machine learning: supervised, unsupervised, and reinforcement learning. Data types, scales, and basic statistics. |
6 | Lecture-6.ipynb | Data cleaning: handling missing/invalid data, imputation, and introduction to scikit-learn's fit/transform paradigm. |
7 | Lecture-7.ipynb | Exploratory data analysis: descriptive statistics, visualization, and encoding categorical/ordinal data. |
8 | Lecture-8.ipynb | Encoding techniques: LabelEncoder, OrdinalEncoder, OneHotEncoder, and introduction to artificial neural networks (see the preprocessing sketch after this table). |
9 | Lecture-9.ipynb | Preparing datasets for neural networks: train/test split, Keras Sequential model basics, and dense layers. |
10 | Lecture-10.ipynb | Keras model compilation: optimizers, loss functions, metrics, batch/epoch concepts, and model training (see the Sequential-model sketch after this table). |
11 | Lecture-11.ipynb | Model evaluation, prediction, activation functions (ReLU, sigmoid), and batch processing in Keras. |
12 | Lecture-12.ipynb | Activation functions (linear, softmax), loss/metric functions, Keras callbacks, and model summary interpretation. |
13 | Lecture-13.ipynb | Feature scaling: standard, min-max, max-abs, normalization, robust scaling, and their impact on model performance. |
14 | Lecture-14.ipynb | Regression concepts: linear, polynomial, nonlinear, decision tree, random forest, SVR, and neural network regression. |
15 | Lecture-15.ipynb | Neural network regression example: Auto MPG dataset, feature scaling, model building, evaluation, and serialization. |
16 | Lecture-16.ipynb | Regression with Boston Housing dataset, multivariate regression, and introduction to multilabel classification. |
17 | Lecture-17.ipynb | Sentiment analysis with IMDB dataset: manual vectorization, binary classification, and model evaluation. |
18 | Lecture-18.ipynb | Text vectorization with CountVectorizer, binary/multiclass classification, and Reuters news topic classification. |
19 | Lecture-19.ipynb | Training with large datasets: generators, Sequence API, and partial data training in Keras. |
20 | Lecture-20.ipynb | Image classification: MNIST dataset, grayscale conversion, one-hot encoding, and dense neural network models. |
21 | Lecture-21.ipynb | Convolutional Neural Networks (CNNs): convolution operations, filters, padding, and image processing basics. |
22 | Lecture-22.ipynb | Building CNNs in Keras: Conv2D, Flatten, pooling layers, and training on MNIST for digit recognition (see the CNN sketch after this table). |
23 | Lecture-23.ipynb | Color image classification: CIFAR-10 dataset, CNN architecture, training, evaluation, and prediction. |
24 | Lecture-24.ipynb | Temporal data and 1D convolutions: Conv1D, word embeddings, and text data preprocessing for neural networks. |
25 | Lecture-25.ipynb | Recurrent Neural Networks (RNNs): theory, implementation, and sequence modeling with SimpleRNN in Keras. |
26 | Lecture-26.ipynb | RNNs for text: IMDB sentiment analysis, regularization (dropout), and introduction to LSTM layers (see the LSTM sketch after this table). |
27 | Lecture-27.ipynb | Advanced RNNs: Bidirectional LSTM, GRU layers, and performance comparison on text classification tasks. |
28 | Lecture-28.ipynb | RNNs for time series: LSTM for climate prediction, data generators, and sequence-to-value modeling. |
29 | Lecture-29.ipynb | Sequence generation: LSTM for text generation (Nietzsche example), one-hot encoding, and sampling strategies. |
30 | Lecture-30.ipynb | Overview of supervised/unsupervised learning, regression types, model selection, and regression evaluation metrics. |
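
The sketches below are minimal, illustrative examples of a few workflows from the lectures above. They are not the notebooks' exact code; the data, column names, and hyperparameters are made up for demonstration. First, the NumPy, Pandas, and Matplotlib basics from Lectures 1-4:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Vectorized NumPy operations: no explicit Python loop needed
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + 0.1 * np.random.randn(100)

# A small DataFrame with summary statistics and boolean indexing
df = pd.DataFrame({"x": x, "y": y})
print(df.describe())
print(df[df["y"] > 0].head())

# A basic Matplotlib figure with a single axes object
fig, ax = plt.subplots()
ax.plot(df["x"], df["y"], label="noisy sine")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.legend()
plt.show()
```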
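
Next, the fit/transform preprocessing pattern from Lectures 6-8 and 13: encoding a categorical column and scaling numeric ones. The toy data is invented, and the sparse_output argument assumes scikit-learn 1.2 or newer:

```python
import pandas as pd
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy data with one categorical and two numeric columns (made-up values)
df = pd.DataFrame({
    "color": ["red", "green", "red", "blue"],
    "age": [25, 32, 47, 51],
    "income": [3200.0, 4100.0, 5800.0, 6000.0],
})

# One-hot encode the categorical column
encoder = OneHotEncoder(sparse_output=False)
color_encoded = encoder.fit_transform(df[["color"]])

# Standardize the numeric columns to zero mean and unit variance
scaler = StandardScaler()
numeric_scaled = scaler.fit_transform(df[["age", "income"]])

print(color_encoded.shape, numeric_scaled.shape)  # (4, 3) (4, 2)
```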
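
The Sequential-model workflow from Lectures 9-12 (stacking Dense layers, then compile, fit, and evaluate), shown on randomly generated binary-classification data with arbitrary layer sizes and hyperparameters:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic data: 100 samples, 8 features, binary labels
X = np.random.rand(100, 8)
y = np.random.randint(0, 2, size=(100,))

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()

model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
loss, acc = model.evaluate(X, y, verbose=0)
print(f"loss={loss:.3f} accuracy={acc:.3f}")
```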
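
A small convolutional network for MNIST in the spirit of Lectures 20-23. The architecture is illustrative, and it uses sparse_categorical_crossentropy instead of the explicit one-hot encoding mentioned in Lecture 20:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST, scale pixels to [0, 1], and add a channel dimension
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.astype("float32")[..., np.newaxis] / 255.0
x_test = x_test.astype("float32")[..., np.newaxis] / 255.0

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

# Integer labels work directly with sparse_categorical_crossentropy
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))
```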
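
Finally, a compact Embedding + LSTM sentiment classifier for IMDB, touching the ideas in Lectures 24-28. Vocabulary size, sequence length, and layer sizes are arbitrary, and pad_sequences is taken from keras.utils, which assumes TensorFlow 2.9+ or Keras 3:

```python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, maxlen = 10000, 200

# IMDB reviews arrive as sequences of integer word indices
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.utils.pad_sequences(x_train, maxlen=maxlen)
x_test = keras.utils.pad_sequences(x_test, maxlen=maxlen)

model = keras.Sequential([
    keras.Input(shape=(maxlen,), dtype="int32"),
    layers.Embedding(vocab_size, 32),
    layers.LSTM(32, dropout=0.2),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128, validation_split=0.2)
print(model.evaluate(x_test, y_test, verbose=0))
```
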
Project | File Name | Description |
---|---|---|
Project 1 | Project-1.ipynb | Medical Insurance Cost Prediction using regression models. |
Project 2 | Project-2.ipynb | Netflix stock price prediction using regression models (Linear Regression, metrics, plots). |
Project 3 | Project-3.ipynb | Regression analysis on the Boston Housing dataset. |
Project 4 | Project-4.ipynb | Diamond Price Prediction. |
Project 5 | Project-5.ipynb | Advertising and sales analysis. |
Project 6 | Project-6.ipynb | Calories Burnt Prediction. |
Datasets in the Projects folder:
- Advertising.csv
- boston_data.csv
- Close_Prediction.csv
- diamonds.csv
- insurance.csv
- NFLX.csv
- calories.csv
- exercise.csv
All six project notebooks live in the Projects folder (Projects/Project-1.ipynb through Projects/Project-6.ipynb); the table above describes what each one covers. A minimal sketch of a typical regression workflow for these projects appears below.
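
This sketch is not taken from any of the notebooks; it only illustrates the load, split, fit, evaluate pattern with scikit-learn, using insurance.csv from the list above. The column names (age, bmi, children, charges) are assumptions and may need adjusting to match the actual file:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical example: predict insurance charges from numeric features.
# Path and column names are assumptions; adjust them to the actual CSV.
df = pd.read_csv("Projects/insurance.csv")
X = df[["age", "bmi", "children"]]
y = df["charges"]

# Hold out 20% of the rows for evaluation
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

print("MSE:", mean_squared_error(y_test, y_pred))
print("R^2:", r2_score(y_test, y_pred))
```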