
Commit 3870df9

Updated README with TFT info

1 parent 443feb3 commit 3870df9

File tree

3 files changed: +14 -6 lines changed

README.md

+14 -6
@@ -34,7 +34,6 @@ An example of a bid stack is shown in the figure above. Historical bid stacks fo
 Price transparency is ensured, and the final Day-Ahead Price is published on OMIE's website. The market also includes an Intraday Market, where participants can adjust their positions after the Day-Ahead results. If consumers fail to secure sufficient energy in the DAM or if their energy needs change, they can use the intraday market to purchase additional electricity. Similarly, producers can sell any surplus they may have or adjust their commitments to match their real-time production capabilities. Prices are influenced by supply, demand, and the availability of generation sources, ensuring efficient electricity trading.
 
-
 ### Motivation For Accurately Predicting Energy Demand & Price
 
 - **Improved Grid Stability & Reliability** - Accurate demand forecasts help grid operators balance supply and demand, reducing blackouts and inefficiencies.
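To make the clearing mechanism described above concrete, here is a toy merit-order sketch; the bids and demand figures are hypothetical, and this is not OMIE's actual algorithm:

```python
# Toy merit-order clearing: accept supply bids from cheapest upward until
# cumulative quantity covers demand; the marginal bid sets the price.
# Bids and demand below are hypothetical.
supply_bids = [(30.0, 500), (45.0, 400), (60.0, 300)]  # (EUR/MWh, MW)
demand_mw = 800

accepted_mw = 0
clearing_price = None
for price, quantity in sorted(supply_bids):
    accepted_mw += quantity
    if accepted_mw >= demand_mw:
        clearing_price = price  # every accepted producer is paid this price
        break

print(clearing_price)  # 45.0 EUR/MWh
```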
@@ -90,7 +89,6 @@ The complete work can be found in : [XGBoost Jupyter Notebook](xgboost_demand.ip
 - This represents a substantial 47-53% improvement in both absolute and relative error compared to the week-ahead benchmark, demonstrating the model's ability to capture complex patterns in the data.
 - The model is saved under `./models/demand_xgboost_model.pkl`
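A minimal sketch of reloading this saved model, assuming it was serialized with Python's standard pickle module (joblib would look similar); `X_future` is a hypothetical feature frame:

```python
import pickle

# Reload the persisted demand model (assumes plain pickle serialization).
with open("./models/demand_xgboost_model.pkl", "rb") as f:
    model = pickle.load(f)

# X_future must contain the same engineered features used during training.
# y_pred = model.predict(X_future)
```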
 
-
 ### Transformer Model
 
 The aim of the Transformer model is to leverage self-attention mechanisms to capture both short- and long-range dependencies in energy demand data, allowing for more flexible feature interactions than traditional time-series models.
@@ -100,7 +98,7 @@ The complete work can be found in : [Transformer Jupyter Notebook](transformer_d
 #### Summary
 
 - The Transformer was selected for energy demand forecasting because its self-attention mechanism efficiently captures both short- and long-term dependencies, handles multiple input features simultaneously, and scales well for multi-output tasks, making it more suitable than other neural network-based methods.
-- `Encoder-only Transformer`, `Seq2Seq`, `Data Restructuring`, `Model Architecture`, `Learnable Positional Encoding`, `Hyperparameter Tuning`
+- `Encoder-only Transformer`, `Seq2Seq`, `Deep Learning`, `Data Restructuring`, `Model Architecture`, `Learnable Positional Encoding`, `Hyperparameter Tuning`
 - **Model Evaluation**
   - RMSE = 899.11 MW, MAE = 681.94 MW, and MAPE = 2.57%.
 <img src="./images/transformer_performance.png" alt="Transformer Performance" width="800"/>
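For context, a minimal sketch of what an encoder-only, seq2seq forecasting Transformer with learnable positional encoding can look like in PyTorch; the dimensions and layer sizes are illustrative, not the notebook's actual configuration:

```python
import torch
import torch.nn as nn

class DemandTransformer(nn.Module):
    # Encoder-only Transformer mapping a history window to a multi-step forecast.
    def __init__(self, n_features=8, d_model=64, n_heads=4,
                 n_layers=2, seq_len=168, horizon=24):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        # Learnable positional encoding: one trainable vector per time step.
        self.pos_embed = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Seq2seq head: flatten the encoded sequence into a horizon-length output.
        self.head = nn.Linear(d_model * seq_len, horizon)

    def forward(self, x):               # x: (batch, seq_len, n_features)
        h = self.input_proj(x) + self.pos_embed
        h = self.encoder(h)             # (batch, seq_len, d_model)
        return self.head(h.flatten(1))  # (batch, horizon)

model = DemandTransformer()
y_hat = model(torch.randn(32, 168, 8))  # -> (32, 24): 24-hour demand forecast
```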
@@ -158,18 +156,28 @@ The complete work can be found in : [XGBoost Jupyter Notebook](xgboost_price.ipy
 - This represents a substantial 22-29% improvement in error compared to the day-ahead benchmark, demonstrating the model's ability to capture complex patterns in the data.
 - The model is saved under `./models/price_xgboost_model.pkl`
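For reference, a sketch of how an improvement figure over a naive day-ahead benchmark can be computed; the data here is synthetic, and the notebook's actual evaluation code may differ:

```python
import numpy as np

def rmse(y, y_hat):
    # Root mean squared error.
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

rng = np.random.default_rng(0)
prices = 50 + 10 * rng.standard_normal(48)     # synthetic hourly prices (EUR/MWh)
benchmark = np.roll(prices, 24)                # naive day-ahead: yesterday's price
model_pred = prices + rng.standard_normal(48)  # stand-in for model output

# Compare only the second day, where the benchmark is defined.
improvement = 1 - rmse(prices[24:], model_pred[24:]) / rmse(prices[24:], benchmark[24:])
print(f"Improvement over day-ahead benchmark: {improvement:.0%}")
```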
 
-### TFT Model
-
-The aim of the TFT model is to ...
+### TFT (Temporal Fusion Transformer) Model
+
+The aim of the TFT model is to leverage deep learning techniques tailored to time-series forecasting, enabling accurate energy price predictions by capturing both short- and long-term dependencies while effectively incorporating known future inputs and exogenous variables.
 
 The complete work can be found in: [TFT Jupyter Notebook](tft_price.ipynb)
 
 #### Summary
 
+- TFT was selected for energy price forecasting because its architecture is designed specifically for time-series data, combining LSTMs, attention mechanisms, and variable selection networks. Its ability to handle mixed inputs, such as historical observations, exogenous variables, and known future inputs, makes it well suited to capturing the complex dynamics of energy markets.
+- `TimeSeriesDataset`, `Deep Learning`, `Model Comparison`, `CUDA`
+- **Model Evaluation**
+  - RMSE = 22.23 EUR/MWh, MAE = 16.52 EUR/MWh
+<img src="./images/tft_performance.png" alt="TFT Performance" width="800"/>
+<img src="./images/tft_residuals.png" alt="TFT Residuals" width="500"/>
+
+- This represents a substantial improvement in error compared to the day-ahead benchmark, and performance comparable to the XGBoost model.
+- The model is saved under `./models/price_tft_model.ckpt`
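A minimal sketch of how such a model can be assembled with the pytorch-forecasting library (an assumption — the notebook may structure this differently, and note the library spells the class `TimeSeriesDataSet`); the DataFrame and column names are hypothetical:

```python
import numpy as np
import pandas as pd
from pytorch_forecasting import TemporalFusionTransformer, TimeSeriesDataSet
from pytorch_forecasting.metrics import QuantileLoss

# Hypothetical hourly price series with a calendar feature.
n = 500
df = pd.DataFrame({
    "time_idx": np.arange(n),
    "price": 50 + 10 * np.random.randn(n),
    "demand": 25_000 + 2_000 * np.random.randn(n),
    "hour": (np.arange(n) % 24).astype(float),
    "series": "ES",                      # single series; group_ids needs a column
})

training = TimeSeriesDataSet(
    df,
    time_idx="time_idx",
    target="price",
    group_ids=["series"],
    max_encoder_length=168,              # one week of history
    max_prediction_length=24,            # day-ahead horizon
    time_varying_known_reals=["hour"],   # inputs known into the future
    time_varying_unknown_reals=["price", "demand"],  # observed/exogenous inputs
)

tft = TemporalFusionTransformer.from_dataset(
    training, hidden_size=32, attention_head_size=2,
    dropout=0.1, loss=QuantileLoss(),
)

# After training with a PyTorch Lightning Trainer (optionally on CUDA), the
# saved checkpoint can be restored with:
# TemporalFusionTransformer.load_from_checkpoint("./models/price_tft_model.ckpt")
```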
 
 ### Energy Price Conclusion
 
+In conclusion, both the XGBoost and TFT models significantly outperformed the SARIMA model and the day-ahead benchmark, reducing forecasting error by ~25%. While XGBoost slightly outperformed TFT in RMSE, the difference was minimal. Notably, the TFT model achieved comparable accuracy despite the limited dataset size, thanks to its architecture designed specifically for time-series forecasting. Given its strong performance and interpretability, the TFT model is a powerful deep learning alternative to XGBoost for energy price forecasting.
 
 ## Data Collection

images/tft_performance.png

51.2 KB

images/tft_residuals.png

17 KB
