An example of a bid stack is shown in the figure above.

Price transparency is ensured, and the final Day-Ahead Price is published on OMIE’s website. The market also includes an Intraday Market, where participants can adjust their positions after the Day-Ahead results. If consumers fail to secure sufficient energy in the DAM or if their energy needs change, they can use the intraday market to purchase additional electricity. Similarly, producers can sell any surplus they may have or adjust their commitments to match their real-time production capabilities. Prices are influenced by supply, demand, and the availability of generation sources, ensuring efficient electricity trading.
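To make the clearing mechanism concrete, the sketch below matches a supply bid stack against a demand bid stack under a simplified uniform-price rule, where the marginal accepted supply offer sets the price. The bids are made-up illustrative numbers rather than OMIE data, and the actual market-clearing algorithm handles many additional constraints.

```python
# Simplified merit-order clearing sketch with illustrative (made-up) bids.
# Each bid is (price in EUR/MWh, quantity in MWh).

supply = sorted([(10, 50), (25, 80), (40, 60), (70, 40)])                # ascending price
demand = sorted([(90, 70), (60, 80), (30, 50), (15, 30)], reverse=True)  # descending price

def clearing_price(supply, demand):
    """Walk both stacks and return (price, volume) where the curves cross."""
    s_i = d_i = 0
    s_left, d_left = supply[0][1], demand[0][1]
    traded, price = 0.0, None
    while s_i < len(supply) and d_i < len(demand):
        if supply[s_i][0] > demand[d_i][0]:  # cheapest remaining offer exceeds best bid
            break
        q = min(s_left, d_left)
        traded += q
        price = supply[s_i][0]               # marginal accepted supply offer sets the price
        s_left -= q
        d_left -= q
        if s_left == 0:
            s_i += 1
            s_left = supply[s_i][1] if s_i < len(supply) else 0
        if d_left == 0:
            d_i += 1
            d_left = demand[d_i][1] if d_i < len(demand) else 0
    return price, traded

price, volume = clearing_price(supply, demand)
print(f"clearing price: {price} EUR/MWh, matched volume: {volume} MWh")
```
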
### Motivation For Accurately Predicting Energy Demand & Price
- **Improved Grid Stability & Reliability** - Accurate demand forecasts help grid operators balance supply and demand, reducing blackouts and inefficiencies.
The complete work can be found in: [XGBoost Jupyter Notebook](xgboost_demand.ipynb)

- This represents a substantial 47-53% improvement in both absolute and relative error over the week-ahead benchmark, demonstrating the model's ability to capture complex patterns in the data.
- The model is saved under `./models/demand_xgboost_model.pkl`; a short loading sketch follows below.
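As a minimal usage sketch, the saved model can be reloaded for inference roughly as follows. The feature names here are placeholders; the actual feature set (lags, calendar variables, and so on) is engineered in the notebook, and the sketch assumes the file was written with the standard `pickle` module.

```python
import pickle

import pandas as pd

# Reload the trained demand model from the repository path given above.
with open("./models/demand_xgboost_model.pkl", "rb") as f:
    model = pickle.load(f)

# Placeholder feature row: these column names are illustrative only --
# the real features are built in xgboost_demand.ipynb.
X_new = pd.DataFrame({
    "demand_lag_24h": [28_500.0],
    "demand_lag_168h": [27_900.0],
    "hour": [18],
    "weekday": [2],
})
print(model.predict(X_new))  # forecast demand for the given hour
```
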
### Transformer Model
The aim of the Transformer model is to leverage self-attention mechanisms to capture both short- and long-range dependencies in energy demand data, allowing for more flexible feature interactions compared to traditional time series models.
The complete work can be found in: [Transformer Jupyter Notebook](transformer_demand.ipynb)

#### Summary
- The Transformer was selected for energy demand forecasting because its self-attention mechanism efficiently captures both short- and long-term dependencies, handles multiple input features simultaneously, and scales well for multi-output tasks, making it more suitable than other neural network-based methods (a minimal sketch of the idea follows below).
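As an illustration of the idea rather than the notebook's exact architecture, a minimal PyTorch encoder-only forecaster could look like this (positional encodings and the training loop are omitted for brevity, and all sizes are assumptions).

```python
import torch
import torch.nn as nn

class DemandTransformer(nn.Module):
    """Minimal sketch: project features, apply self-attention, predict a horizon."""

    def __init__(self, n_features: int, d_model: int = 64, horizon: int = 24):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(d_model, horizon)  # multi-output: one value per hour ahead

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        h = self.encoder(self.input_proj(x))
        return self.head(h[:, -1, :])      # forecast from the last time step

model = DemandTransformer(n_features=8)
dummy = torch.randn(2, 168, 8)             # a week of hourly history, 8 features
print(model(dummy).shape)                  # torch.Size([2, 24])
```
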
The complete work can be found in: [XGBoost Jupyter Notebook](xgboost_price.ipynb)

- This represents a substantial 22-29% improvement in error over the day-ahead benchmark, demonstrating the model's ability to capture complex patterns in the data (see the benchmark sketch after this list).
- The model is saved under `./models/price_xgboost_model.pkl`
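To show how such a comparison is typically computed, here is a self-contained sketch that scores a forecast against a naive day-ahead benchmark (the value 24 hours earlier). All numbers are synthetic stand-ins, and the notebook's exact benchmark definition may differ.

```python
import numpy as np

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

rng = np.random.default_rng(42)
hours = np.arange(24 * 60)  # 60 days of hourly prices
prices = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

actual = prices[24:]
naive = prices[:-24]                                     # day-ahead benchmark: price 24 h earlier
model_preds = actual + rng.normal(0, 3.2, actual.size)   # synthetic stand-in for model output

improvement = 1 - rmse(actual, model_preds) / rmse(actual, naive)
print(f"Error reduction vs. day-ahead benchmark: {improvement:.1%}")
```
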
### TFT (Temporal Fusion Transformer) Model
The aim of the TFT model is to leverage deep learning techniques tailored for time-series forecasting, enabling accurate energy price predictions by capturing both short- and long-term dependencies, while effectively incorporating known future inputs and exogenous variables.
The complete work can be found in: [TFT Jupyter Notebook](tft_price.ipynb)
#### Summary
- TFT was selected for energy price forecasting due to its architecture specifically designed for time-series data, combining LSTMs, attention mechanisms, and variable selection networks. Its ability to handle mixed inputs, such as historical observations, exogenous variables, and known future inputs, makes it well-suited to capturing the complex dynamics involved in energy markets.
- This represents a substantial improvement in error over the day-ahead benchmark, with performance comparable to the XGBoost model.
- The model is saved under `./models/price_tft_model.ckpt`; a checkpoint-loading sketch follows below.
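Because the `.ckpt` file is a PyTorch Lightning checkpoint, it can presumably be restored along these lines, assuming the model was built with the `pytorch-forecasting` library (a common choice for TFT); the dataloader setup lives in the notebook.

```python
from pytorch_forecasting import TemporalFusionTransformer

# Restore the trained TFT from the Lightning checkpoint saved above.
tft = TemporalFusionTransformer.load_from_checkpoint("./models/price_tft_model.ckpt")
tft.eval()

# Inference requires a dataloader built from the same TimeSeriesDataSet
# configuration used during training (defined in tft_price.ipynb), e.g.:
# predictions = tft.predict(val_dataloader)
```
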
### Energy Price Conclusion
In conclusion, both the XGBoost and TFT models significantly outperformed the SARIMA model and the day-ahead benchmark, reducing forecasting error by ~25%. While XGBoost slightly outperformed TFT in terms of RMSE, the difference in performance was minimal. Notably, the TFT model was able to achieve comparable accuracy despite the limited dataset size, thanks to its architecture specifically designed for time-series forecasting. Given its strong performance and interpretability, the TFT model represents a powerful deep learning alternative to XGBoost for energy price forecasting.