Augurus is a financial market prediction system that uses a Transformer architecture for next-token prediction with RL fine-tuning. It can also be used as a market-movement indicator in OpenBackTest.

It employs a hybrid, GPT-style architecture: mixed continuous and discrete inputs are quantized and embedded into tokens, which are then processed with multi-head self-attention.
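The quantization step above can be sketched as quantile binning. This is a minimal illustration only; `fit_bins`, `quantize`, and the bin count are assumptions for exposition, not the actual implementation behind `quantizer.pkl`:

```python
import numpy as np

# Hypothetical sketch: continuous features (e.g. returns) are bucketed into
# discrete bins so the Transformer can treat them like vocabulary tokens.
def fit_bins(values: np.ndarray, n_bins: int = 16) -> np.ndarray:
    """Fit quantile bin edges so each token bin holds roughly equal mass."""
    quantiles = np.linspace(0.0, 1.0, n_bins + 1)[1:-1]
    return np.quantile(values, quantiles)

def quantize(values: np.ndarray, edges: np.ndarray) -> np.ndarray:
    """Map continuous values to integer token ids in [0, len(edges)]."""
    return np.searchsorted(edges, values)
```

The resulting integer ids can then be fed through an ordinary embedding layer alongside any genuinely discrete inputs.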
- /training: Python-based training pipeline (PyTorch) with export to ONNX format.
- /inference: C#-based API for real-time market predictions (ONNX).
Converts raw Binance OHLCV CSV data into tokenized features and calculates rolling volatility.
`python process_data.py`

- Input: `data/BTCUSDT_5m_...csv`
- Output: `processed/tokenized_data.csv`, `processed/quantizer.pkl`
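The rolling-volatility feature mentioned above might be computed roughly as follows; the column names and window size here are assumptions, not the actual configuration of `process_data.py`:

```python
import numpy as np
import pandas as pd

def add_rolling_volatility(df: pd.DataFrame, window: int = 20) -> pd.DataFrame:
    """Annotate an OHLCV frame with log returns and their rolling std-dev.

    Assumes a 'close' column; the real script's feature set may differ.
    """
    df = df.copy()
    df["log_ret"] = np.log(df["close"]).diff()
    df["volatility"] = df["log_ret"].rolling(window).std()
    return df
```

The volatility column would then be quantized into tokens like the other continuous features.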
Trains the Transformer to predict the next market "token" (price movement bin). This builds the model's understanding of market dynamics.
`python train.py`

- Config: `SEQ_LEN=64`, `BATCH_SIZE=128`
- Output: `processed/model_epoch_N.pt`
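The next-token objective above can be sketched with a tiny causal Transformer. Model size, vocabulary, and hyperparameters here are illustrative assumptions, not those of `train.py`:

```python
import torch
import torch.nn as nn

VOCAB, SEQ_LEN, D = 32, 64, 64  # assumed sizes for illustration

class TinyGPT(nn.Module):
    """Minimal GPT-style stack: token + position embeddings, causal attention."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, D)
        self.pos = nn.Embedding(SEQ_LEN, D)
        layer = nn.TransformerEncoderLayer(D, nhead=4, dim_feedforward=128,
                                           batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D, VOCAB)

    def forward(self, x):
        t = torch.arange(x.size(1), device=x.device)
        h = self.emb(x) + self.pos(t)
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))
        h = self.blocks(h, mask=mask)  # causal self-attention
        return self.head(h)

def train_step(model, opt, batch):
    """One next-token step: inputs are tokens[:-1], targets are tokens[1:]."""
    logits = model(batch[:, :-1])
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB), batch[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Repeating `train_step` over shuffled windows of the tokenized data is the essence of Phase 1.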
Fine-tunes the pre-trained model with Reinforcement Learning to optimize directly for trading PnL (Profit and Loss). (Optional: not necessary for good results.)
`python rl_train.py`

- Requires: a checkpoint from Phase 1 (e.g., `model_epoch_5.pt`)
- Output: `processed/model_rl_epoch_N.pt`
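The actual algorithm in `rl_train.py` is not documented here; as one illustration of optimizing for PnL, a REINFORCE-style update that weights the log-probability of each taken action by its baseline-subtracted reward could look like this (function name and shapes are assumptions):

```python
import torch

def reinforce_step(logits, actions, pnl, opt):
    """Illustrative policy-gradient update on realized trading PnL.

    logits:  (B, n_actions) action scores from the model
    actions: (B,) integer actions actually taken
    pnl:     (B,) realized profit/loss used as the reward signal
    """
    logp = torch.log_softmax(logits, dim=-1)
    chosen = logp.gather(1, actions.unsqueeze(1)).squeeze(1)
    advantage = pnl - pnl.mean()            # simple mean baseline
    loss = -(advantage.detach() * chosen).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In this framing, profitable actions become more likely and losing ones less likely, shifting the model from pure next-token accuracy toward trading performance.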
Converts the trained PyTorch model into ONNX format for high-performance inference in the C# Augurus.Api.
`python export_onnx.py`

- Output: `../inference/financial_transformer.onnx`, `../inference/quantizer.pkl`
- CUDA: recommended for `train.py` and `rl_train.py`.
- Memory: reducing `SEQ_LEN` to 64 allows a larger `BATCH_SIZE` (e.g., 128) on consumer GPUs.
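The memory note follows from self-attention's quadratic cost in sequence length. A back-of-the-envelope estimate (head count and dtype here are assumptions, not the model's actual configuration):

```python
# Per layer, the attention score matrix holds batch * heads * seq_len^2 floats,
# so halving SEQ_LEN shrinks that term 4x, freeing room for a larger batch.
def attn_score_bytes(batch: int, heads: int, seq_len: int,
                     bytes_per_float: int = 4) -> int:
    return batch * heads * seq_len ** 2 * bytes_per_float
```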