# FluxArchitectures
Complex neural network examples for Flux.jl.
This package contains a loose collection of (slightly) more advanced neural network architectures, mostly centered around time series forecasting.
## Installation

To install FluxArchitectures, type `]` in the Julia REPL to activate the package manager, and type

```julia
add FluxArchitectures
```

for installation. After `using FluxArchitectures`, the following functions are exported:
- `prepare_data`
- `get_data`
- `DARNN`
- `DSANet`
- `LSTnet`
- `TPALSTM`

See their docstrings, the documentation, and the `examples` folder for details.
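For custom datasets, `prepare_data` slices a raw data matrix into the windowed input/target pair the models consume, analogous to what `get_data` does for the bundled example datasets. A minimal sketch, assuming `prepare_data` takes the same `poollength`/`datalength`/`horizon` arguments as `get_data` (the data orientation and keyword options are assumptions; check `?prepare_data` for the exact signature):

```julia
using FluxArchitectures

# Hypothetical raw data: 1000 time steps of 5 features.
rawdata = rand(Float32, 1000, 5)

poollength = 10; horizon = 15; datalength = 500;

# Assumed to mirror the `get_data` argument order; see `?prepare_data`.
input, target = prepare_data(rawdata, poollength, datalength, horizon)
```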
## Models

- `LSTnet`: This "Long- and Short-term Time-series Network" follows the paper by Lai et al.

- `DARNN`: The "Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" is based on the paper by Qin et al.

- `TPALSTM`: The Temporal Pattern Attention LSTM network is based on the paper "Temporal Pattern Attention for Multivariate Time Series Forecasting" by Shih et al.

- `DSANet`: The "Dual Self-Attention Network for Multivariate Time Series Forecasting" is based on the paper by Siteng Huang et al.
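All four models are callable Flux layers operating on the 4-D input produced by `get_data`/`prepare_data`. A minimal sketch using `LSTnet`, whose constructor arguments appear in the Quickstart below (the argument names in the comment are descriptive assumptions; the other models take different constructor arguments, documented in their docstrings):

```julia
using FluxArchitectures

poollength = 10; horizon = 15; datalength = 500;
input, target = get_data(:exchange_rate, poollength, datalength, horizon)

# Assumed meaning of the positional arguments:
# LSTnet(features, conv layer size, recurrent layer size, poollength, skip length)
model = LSTnet(size(input, 1), 2, 3, poollength, 120)

# Calling the model on the 4-D input yields one forecast per sample,
# which can be compared against `target`.
predictions = model(input)
```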
## Quickstart
Activate the package and load some sample data:

```julia
using FluxArchitectures

poollength = 10; horizon = 15; datalength = 1000;
input, target = get_data(:exchange_rate, poollength, datalength, horizon)
```
Define a model and a loss function:

```julia
model = LSTnet(size(input, 1), 2, 3, poollength, 120)
loss(x, y) = Flux.mse(model(x), y')
```
Train the model:

```julia
Flux.train!(loss, Flux.params(model), Iterators.repeated((input, target), 20), Adam(0.01))
```
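For longer runs it is often more convenient to loop over epochs and monitor the loss explicitly. A minimal sketch building on the Quickstart variables above (`model`, `loss`, `input`, `target`); the epoch count and logging are illustrative choices, not part of the package:

```julia
using Flux

opt = Adam(0.01)
ps = Flux.params(model)
data = Iterators.repeated((input, target), 20)

for epoch in 1:5
    Flux.train!(loss, ps, data, opt)
    # Report the training loss after each epoch.
    @info "epoch $epoch" loss = loss(input, target)
end
```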