# DeepTimeSeries

A deep learning library for time series forecasting, built on PyTorch and PyTorch Lightning.
- Overview
- Key Features
- Installation
- Quick Start
- Supported Models
- Project Structure
- Core Concepts
- Documentation
- Citation
- Contributing
- License
## Overview

DeepTimeSeries provides a logical framework for designing and implementing deep learning architectures tailored for time series forecasting.

The library targets intermediate-level users who need to develop their own deep learning models for time series prediction. It addresses challenges specific to time series data: variable-length sequences, encoding/decoding windows, multi-feature handling, and probabilistic forecasting. At the same time, its high-level API lets beginners use pre-implemented models with minimal configuration.
## Key Features

- Modular Architecture — Clean separation between encoding, decoding, and prediction components
- Multiple Model Support — Pre-implemented models including MLP, Dilated CNN, RNN variants (LSTM, GRU), and Transformer
- Flexible Data Handling — Pandas DataFrame-based data processing with chunk-based extraction
- Data Preprocessing — Built-in `ColumnTransformer` for feature scaling and transformation
- Probabilistic Forecasting — Support for both deterministic and probabilistic predictions
- PyTorch Lightning Integration — Seamless training, validation, and testing workflows
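To make the deterministic/probabilistic distinction concrete: a probabilistic model predicts distribution parameters rather than a single point per step. The numpy sketch below is illustrative only (it does not use the library's heads) and shows the Gaussian negative log-likelihood commonly used to train such an output.

```python
import numpy as np

# A deterministic head outputs one value per step; a probabilistic head
# outputs distribution parameters, e.g. a Gaussian mean and std per step.
y_true = np.array([1.0, 2.0, 3.0])
mu = np.array([1.1, 1.9, 3.2])      # predicted means
sigma = np.array([0.5, 0.4, 0.6])   # predicted stds (must be positive)

# Gaussian negative log-likelihood: the usual training loss for such a head.
nll = 0.5 * np.log(2 * np.pi * sigma**2) + (y_true - mu)**2 / (2 * sigma**2)
print(nll.mean())
```

Minimizing this loss rewards predictions that are both accurate (small `(y_true - mu)**2`) and appropriately confident (`sigma` neither inflated nor overconfident).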
## Installation

```bash
# Using pip
pip install .

# Using uv (recommended)
uv sync

# For development with dev dependencies
uv sync --all-groups
```

### Requirements
| Package | Version |
|---|---|
| Python | ≥ 3.10, < 3.12 |
| PyTorch | ≥ 2.0.0 |
| PyTorch Lightning | ≥ 2.0.0 |
| NumPy | ≥ 1.24.2 |
| Pandas | ≥ 1.5.3 |
| XArray | ≥ 2023.2.0 |
See `pyproject.toml` for the complete list of dependencies.
## Quick Start

```python
import numpy as np
import pandas as pd
import pytorch_lightning as pl
from sklearn.preprocessing import StandardScaler
from torch.utils.data import DataLoader

import deep_time_series as dts
from deep_time_series.model import MLP

# Prepare data
data = pd.DataFrame({
    'target': np.sin(np.arange(100)),
    'feature': np.cos(np.arange(100)),
})

# Preprocess data
transformer = dts.ColumnTransformer(
    transformer_tuples=[
        (StandardScaler(), ['target', 'feature']),
    ]
)
data = transformer.fit_transform(data)

# Create model
model = MLP(
    hidden_size=64,
    encoding_length=10,
    decoding_length=5,
    target_names=['target'],
    nontarget_names=['feature'],
    n_hidden_layers=2,
)

# Create dataset and dataloader
dataset = dts.TimeSeriesDataset(
    data_frames=data,
    chunk_specs=model.make_chunk_specs(),
)
dataloader = DataLoader(dataset, batch_size=32)

# Train model
trainer = pl.Trainer(max_epochs=10)
trainer.fit(model, train_dataloaders=dataloader, val_dataloaders=dataloader)
```

## Supported Models

| Model | Target Features | Non-target Features | Deterministic | Probabilistic |
|---|---|---|---|---|
| MLP | ✓ | ✓ | ✓ | ✓ |
| Dilated CNN | ✓ | ✓ | ✓ | ✓ |
| Vanilla RNN | ✓ | ✓ | ✓ | ✓ |
| LSTM | ✓ | ✓ | ✓ | ✓ |
| GRU | ✓ | ✓ | ✓ | ✓ |
| Transformer | ✓ | ✓ | ✓ | ✓ |
## Project Structure

```text
deep_time_series/
├── core.py       # ForecastingModule, Head, BaseHead, etc.
├── chunk.py      # Chunk specification and extraction
├── dataset.py    # TimeSeriesDataset
├── transform.py  # ColumnTransformer for preprocessing
├── plotting.py   # Visualization utilities
├── layer.py      # Custom neural network layers
├── util.py       # Utility functions
└── model/        # Pre-implemented forecasting models
    ├── mlp.py
    ├── dilated_cnn.py
    ├── rnn.py
    └── single_shot_transformer.py
```
## Core Concepts

### Chunk Specifications

The library uses a chunk-based approach for handling time series data:

| Chunk Spec | Purpose |
|---|---|
| `EncodingChunkSpec` | Defines the input window for the encoder |
| `DecodingChunkSpec` | Defines the input window for the decoder |
| `LabelChunkSpec` | Defines the target window for prediction |
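Conceptually, these specs slice each series into sliding encoder/decoder/label windows. The sketch below is library-agnostic (`make_chunks` is a hypothetical helper, not the DeepTimeSeries API) and uses one common convention where the decoder input is the label window shifted back one step:

```python
import numpy as np

def make_chunks(series, encoding_length, decoding_length):
    """Slide over a 1-D series and cut encoder/decoder/label windows.

    Hypothetical helper illustrating chunk-based extraction; not the
    DeepTimeSeries API.
    """
    chunks = []
    total = encoding_length + decoding_length
    for start in range(len(series) - total + 1):
        encoding = series[start:start + encoding_length]
        # Decoder input: label window shifted one step back (teacher forcing).
        decoding = series[start + encoding_length - 1:start + total - 1]
        # Label window: the values the model is trained to predict.
        label = series[start + encoding_length:start + total]
        chunks.append((encoding, decoding, label))
    return chunks

series = np.arange(10)
chunks = make_chunks(series, encoding_length=4, decoding_length=2)
print(len(chunks))   # number of sliding windows: 5
print(chunks[0][0])  # first encoding window: [0 1 2 3]
```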
### ForecastingModule

All models inherit from `ForecastingModule`, which provides:

- Automatic training / validation / test step implementations
- Metric tracking and logging
- Loss calculation with multiple heads
- Chunk specification generation
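The value of such a base class is that subclasses only implement the forward pass while the step logic is shared. A plain-Python sketch of that pattern (no Lightning dependency; all class and method names here are illustrative, not the DeepTimeSeries API):

```python
class BaseForecaster:
    """Illustrative base class: subclasses implement forward(); the
    training/validation steps share one loss computation."""

    def forward(self, inputs):
        raise NotImplementedError

    def _shared_step(self, batch, stage):
        inputs, labels = batch
        preds = self.forward(inputs)
        # Mean squared error as a stand-in for the library's loss heads.
        loss = sum((p - y) ** 2 for p, y in zip(preds, labels)) / len(labels)
        print(f'{stage}_loss: {loss:.3f}')  # stand-in for metric logging
        return loss

    def training_step(self, batch):
        return self._shared_step(batch, 'train')

    def validation_step(self, batch):
        return self._shared_step(batch, 'val')

class NaiveForecaster(BaseForecaster):
    def forward(self, inputs):
        # Persistence forecast: repeat the last observed value.
        return [inputs[-1]] * len(inputs)

model = NaiveForecaster()
loss = model.training_step(([1.0, 2.0, 3.0], [3.0, 3.0, 4.0]))
```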
### Data Flow

```text
DataFrame → ColumnTransformer → TimeSeriesDataset → Lightning Trainer → ChunkInverter → DataFrame
```
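Both ends of that pipeline are a scaling transform and its inverse: data is scaled before training and predictions are mapped back to original units afterwards. A minimal numpy sketch of the round trip, using plain standardization in place of `ColumnTransformer` (which is assumed here to behave analogously):

```python
import numpy as np

# Raw target series (stand-in for a DataFrame column).
raw = np.array([10.0, 12.0, 14.0, 16.0])

# Fit: record the statistics needed to invert the transform later.
mean, std = raw.mean(), raw.std()

# Transform: the model trains in standardized space.
scaled = (raw - mean) / std

# ... the model would produce predictions in scaled space ...
pred_scaled = scaled.copy()

# Inverse transform: map predictions back to the original units.
pred = pred_scaled * std + mean

print(np.allclose(pred, raw))  # True
```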
## Documentation

Full documentation: https://bet-lab.github.io/DeepTimeSeries/

- User Guide — Design concepts and usage patterns
- Tutorials — Step-by-step examples
- API Reference — Complete API documentation
## Citation

If you use DeepTimeSeries in your research, please cite:

Choi, W., & Lee, S. (2023). Performance evaluation of deep learning architectures for load and temperature forecasting under dataset size constraints and seasonality. Energy and Buildings, 288, 113027. https://doi.org/10.1016/j.enbuild.2023.113027

```bibtex
@article{choi2023performance,
  author  = {Choi, W. and Lee, S.},
  title   = {Performance evaluation of deep learning architectures for load
             and temperature forecasting under dataset size constraints
             and seasonality},
  journal = {Energy and Buildings},
  volume  = {288},
  pages   = {113027},
  year    = {2023},
  doi     = {10.1016/j.enbuild.2023.113027},
}
```

## Contributing

Contributions are welcome! Please feel free to submit issues or pull requests.
## License

MIT License