# ForecastPro 📈
*A comparative study of deep-learning-based time series models.*

[![v0 Release](https://img.shields.io/badge/version-v0.0-blue)](#)

## Overview

1. **Models Implemented**:
   - **Stacked LSTMs**: Capture long-term patterns in time series data, using multiple LSTM layers for enhanced learning of temporal dependencies.
   - **LSTNet**: Combines CNNs and LSTMs to efficiently capture both short-term and long-term patterns in time series data, outperforming traditional stacked LSTMs.
   - **ConvLSTMs**: Integrate convolutional layers with LSTMs to capture spatiotemporal dependencies, making them suitable for tasks like air quality prediction.

2. **Key Advantages**:
- **Stacked LSTMs**:
- Enhanced ability to learn long-term dependencies.
- Improved modeling of intricate temporal patterns.
- **LSTNet**:
- Combines CNNs and LSTMs for better feature extraction and pattern learning.
     - Addresses the vanishing gradient problem common in deep recurrent architectures.
- **ConvLSTMs**:
- Captures both spatial and temporal dependencies.
- Effective for applications involving spatiotemporal data, such as environmental monitoring.

3. **Comparison with Statistical Models**:
   - Statistical models (e.g., ARIMA, ARCH) rely on predefined assumptions and perform well on smaller datasets, offering interpretable coefficients and computational efficiency.
- Deep learning models, including LSTNet and ConvLSTMs, require more data and computational resources but offer superior performance in capturing complex patterns.

4. **Key Features**:
- **Side-by-Side Model Comparison**: 🔍 Predictions from Stacked LSTMs, LSTNet, and ConvLSTMs can be compared side by side, allowing for comprehensive evaluation of model performance.

5. **Advancements and Results**:
- LSTNet and ConvLSTMs show promising results in addressing traditional challenges of deep learning models in time series forecasting, demonstrating improved accuracy and performance.

For detailed information and implementation, refer to our comprehensive articles and resources listed in the [References](#references) section.
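The stacked-LSTM idea above can be sketched as a minimal NumPy forward pass, where each layer's hidden-state sequence becomes the next layer's input. This is an illustrative sketch only; the layer sizes, weights, and function names here are assumptions for demonstration, not this repository's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate order in the stacked weights: input, forget, cell, output."""
    H = h.shape[0]
    z = W @ x + U @ h + b               # all four gates at once, shape (4H,)
    i = sigmoid(z[0:H])                 # input gate
    f = sigmoid(z[H:2*H])               # forget gate
    g = np.tanh(z[2*H:3*H])             # candidate cell state
    o = sigmoid(z[3*H:4*H])             # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def stacked_lstm(seq, params):
    """Run a sequence through several LSTM layers; each layer's hidden
    states are fed upward as the next layer's input sequence."""
    for (W, U, b, H) in params:
        h, c = np.zeros(H), np.zeros(H)
        outputs = []
        for x in seq:
            h, c = lstm_step(x, h, c, W, U, b)
            outputs.append(h)
        seq = outputs                   # stacked: outputs become next layer's inputs
    return seq[-1]                      # final hidden state of the top layer

rng = np.random.default_rng(0)
def make_layer(in_dim, H):
    return (0.1 * rng.standard_normal((4 * H, in_dim)),
            0.1 * rng.standard_normal((4 * H, H)),
            np.zeros(4 * H), H)

params = [make_layer(1, 8), make_layer(8, 8)]        # two stacked layers
series = [np.array([np.sin(0.1 * t)]) for t in range(50)]
print(stacked_lstm(series, params).shape)            # (8,)
```

In a real model, the top layer's final hidden state would feed a dense output head that produces the forecast.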

## LSTNet (iterated forecast)
Apple stock price (data used till June 30, 2023); index 3000+ is the forecast.
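An iterated (recursive) forecast rolls a one-step-ahead model forward, feeding each prediction back in as input for the next step — which is how the forecast beyond the last observed index is produced. A minimal sketch with a placeholder one-step model (`one_step` below is a stand-in assumption, not the trained LSTNet):

```python
import numpy as np

def iterated_forecast(history, one_step, horizon, window):
    """Roll a one-step-ahead model forward `horizon` steps, appending each
    prediction to the input window (recursive / iterated forecasting)."""
    buf = list(history[-window:])
    preds = []
    for _ in range(horizon):
        y_hat = one_step(np.array(buf[-window:]))
        preds.append(y_hat)
        buf.append(y_hat)               # the prediction becomes a future input
    return np.array(preds)

# Stand-in model: predicts the mean of the window (the real project would
# call the trained LSTNet here instead).
one_step = lambda w: float(w.mean())

history = np.sin(0.1 * np.arange(3000))      # e.g. a series observed up to the cutoff
forecast = iterated_forecast(history, one_step, horizon=100, window=24)
print(forecast.shape)                        # (100,)
```

Note that iterated forecasts compound errors: each step consumes earlier predictions, so accuracy typically degrades with the horizon.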
Key advantages of LSTNet over stacked LSTMs:
- Improved feature extraction capabilities, allowing for better representation learning.
- Addressing the vanishing gradient problem, which is common in deep recurrent architectures.
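The CNN-then-RNN idea behind LSTNet can be sketched in NumPy: a 1-D convolution extracts short-term patterns, and a recurrent pass over those features captures longer-term structure. This is a simplified sketch with random placeholder weights; the real LSTNet also has recurrent-skip and autoregressive components not shown here.

```python
import numpy as np

def conv1d_features(x, kernels):
    """Slide each kernel over the series to extract short-term patterns,
    as LSTNet's convolutional layer does (ReLU activation)."""
    k = kernels.shape[1]
    windows = np.stack([x[t:t + k] for t in range(len(x) - k + 1)])  # (T-k+1, k)
    return np.maximum(windows @ kernels.T, 0.0)                      # (T-k+1, F)

def gru_aggregate(feats, Wz, Uz, Wr, Ur, Wh, Uh):
    """A plain GRU pass over the conv features to capture longer-term
    dependencies (LSTNet's recurrent component, simplified)."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    h = np.zeros(Uz.shape[0])
    for f in feats:
        z = sigmoid(Wz @ f + Uz @ h)             # update gate
        r = sigmoid(Wr @ f + Ur @ h)             # reset gate
        h_tilde = np.tanh(Wh @ f + Uh @ (r * h)) # candidate state
        h = (1 - z) * h + z * h_tilde
    return h

rng = np.random.default_rng(1)
x = np.sin(0.1 * np.arange(200))                 # toy series
kernels = 0.5 * rng.standard_normal((4, 6))      # 4 filters of width 6
feats = conv1d_features(x, kernels)
H, F = 8, 4
mats = [0.1 * rng.standard_normal((H, F)) if i % 2 == 0
        else 0.1 * rng.standard_normal((H, H)) for i in range(6)]
h = gru_aggregate(feats, *mats)
print(h.shape)                                   # (8,)
```

The gating structure is also what mitigates vanishing gradients relative to a plain deep RNN: the additive state update lets gradients flow through the `(1 - z) * h` path.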

## Why Deep-Learning Based Time Series Models?

### 1. **Handling Complex Patterns**

Deep learning models, such as ConvLSTM, Stacked LSTM, and RNNs, excel at capturing complex and non-linear patterns in time series data that traditional statistical models might miss. These models can learn intricate temporal dependencies and relationships that are essential for accurate forecasting in many real-world applications.

### 2. **Feature Learning**

Deep learning models automatically learn relevant features from raw data without the need for manual feature engineering. This is particularly beneficial for time series data, where identifying the right features can be challenging and time-consuming. Models like ConvLSTM combine convolutional layers with LSTM layers to extract spatial and temporal features simultaneously.
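The ConvLSTM mentioned above replaces the matrix multiplications in the LSTM gates with convolutions, so the hidden and cell states stay spatial grids. A toy single-step sketch on a 2-D field (random 3×3 kernels; all shapes and names are illustrative assumptions, not this project's code):

```python
import numpy as np

def conv2d_same(x, k):
    """'Same'-padded 2-D convolution (cross-correlation) of grid x with kernel k."""
    kh, kw = k.shape
    pad = np.pad(x, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * k)
    return out

def convlstm_step(x, h, c, K):
    """One ConvLSTM step: every gate uses convolutions instead of matmuls,
    so spatial structure is preserved through time."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sigmoid(conv2d_same(x, K["xi"]) + conv2d_same(h, K["hi"]))   # input gate
    f = sigmoid(conv2d_same(x, K["xf"]) + conv2d_same(h, K["hf"]))   # forget gate
    g = np.tanh(conv2d_same(x, K["xg"]) + conv2d_same(h, K["hg"]))   # candidate
    o = sigmoid(conv2d_same(x, K["xo"]) + conv2d_same(h, K["ho"]))   # output gate
    c = f * c + i * g
    return o * np.tanh(c), c

rng = np.random.default_rng(2)
K = {name: 0.1 * rng.standard_normal((3, 3))
     for name in ["xi", "hi", "xf", "hf", "xg", "hg", "xo", "ho"]}
h = c = np.zeros((5, 5))
for _ in range(4):                       # e.g. 4 frames of a spatial field
    x = rng.standard_normal((5, 5))
    h, c = convlstm_step(x, h, c, K)
print(h.shape)                           # (5, 5)
```

For something like air quality prediction, each grid cell could be a monitoring location, so the model learns how patterns propagate across space as well as time.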

### 3. **Scalability**

Deep learning models are highly scalable and can handle large datasets effectively. This is crucial for applications involving big data, where traditional statistical models might struggle with computational efficiency and performance.

### 4. **Flexibility and Adaptability**

Deep learning models can be adapted to various types of time series data, including non-stationary and highly non-linear data. They do not rely on strong assumptions about the data distribution, making them more flexible than statistical models.

### 5. **Robustness to Noise**

Deep learning models, particularly those with multiple layers and complex architectures, are more robust to noise in the data. They can learn to filter out irrelevant variations and focus on the underlying patterns, leading to more reliable predictions.

### 6. **Transfer Learning and Pre-trained Models**

The availability of pre-trained models and the concept of transfer learning in deep learning allow for the application of knowledge gained from one domain to another. This can significantly reduce training time and improve performance, especially when dealing with limited data in specific time series applications.

## Statistical Autoregressive Models

Statistical autoregressive models such as ARIMA and ARCH rely on predefined assumptions about the underlying distribution of the data. ARIMA, for example, utilizes past values of a time series to predict future values, assuming the (differenced) series is stationary and can be modeled with autoregressive (AR), differencing (I), and moving-average (MA) components.

Advantages of statistical autoregressive models:
- Strong mathematical foundation and well-defined assumptions.
- Interpretable coefficients and standard statistical tests.
- Efficient performance on smaller datasets.
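As a concrete baseline, the AR part of such a model can be fit by ordinary least squares on lagged values. A hedged toy example (not this project's code; in practice a library such as `statsmodels` would be used):

```python
import numpy as np

def fit_ar(y, p):
    """Fit an AR(p) model y_t = c + a1*y_{t-1} + ... + ap*y_{t-p}
    by ordinary least squares; returns (intercept c, coefficients a)."""
    n = len(y)
    # Column k holds y_{t-1-k} for targets y_p .. y_{n-1}.
    X = np.column_stack([y[p - 1 - k : n - 1 - k] for k in range(p)])
    X = np.column_stack([np.ones(n - p), X])          # intercept column
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta[0], beta[1:]

# Toy series generated by a known AR(2) process plus small noise:
# y_t = 0.5*y_{t-1} - 0.3*y_{t-2} + eps_t
rng = np.random.default_rng(3)
y = np.zeros(2000)
y[0], y[1] = 1.0, 0.5
for t in range(2, 2000):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + 0.01 * rng.standard_normal()

c, a = fit_ar(y, 2)
print(a)        # coefficients close to the true [0.5, -0.3]
```

The interpretability advantage is visible here: the fitted coefficients directly quantify how much each lag contributes, something a deep model's weights do not offer.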

## Challenges Faced by DL Models

DL models have traditionally encountered challenges in outperforming statistical autoregressive models in time series forecasting.

However, advancements in DL techniques, such as LSTNet, have shown promising results in addressing these challenges and outperforming traditional statistical autoregressive models in certain scenarios.

<!-- ## Results -->

<!-- Placeholder for the results of the project. -->

<!-- ## Directions of Use -->

<!-- Placeholder for the updated directions of use as the project proceeds. -->

### About us
You can find more such projects at: https://dsgiitr.in/

## References

The following references provide more information about the concepts and techniques used in this project: