Fast Learning for Quant Finance: The Power of Extreme Learning Machines
https://arxiv.org/abs/2505.09551
In quantitative finance, speed matters. Whether you’re calibrating a model to market data or predicting intraday price moves, every second counts. But traditional deep neural networks (DNNs), while powerful, can be slow to train and run, and that’s a problem when markets move faster than your model.
A new study from researchers at Peking University and Delft University of Technology shows that Extreme Learning Machines (ELMs) can deliver DNN-like accuracy at a fraction of the training and inference time, and in some cases even outperform DNNs.
What’s an Extreme Learning Machine?
An ELM is a single-hidden-layer feedforward network with a twist:
Hidden-layer weights are set randomly and never updated.
Only the output weights are trained, and that’s done in one shot via convex optimization (essentially a regularized least-squares solve), not iterative gradient descent.
The result? Blazing-fast training (often seconds instead of hours) and rapid inference.
Despite its simplicity, ELM rests on solid theory: with enough random hidden units it is a universal function approximator, and the convex output-layer fit is easy to regularize, which helps it avoid some of the usual overfitting pitfalls. A minimal sketch of the idea follows below.
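To make that concrete, here is a minimal numpy sketch of ELM regression. This is not the paper’s implementation; the tanh activation, hidden width, and ridge penalty are illustrative choices.

```python
# Minimal ELM regression sketch (illustrative, not the paper's code):
# random hidden layer, closed-form ridge solve for the output weights.
import numpy as np

def fit_elm(X, y, n_hidden=200, reg=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # input-to-hidden weights, never updated
    b = rng.standard_normal(n_hidden)                # hidden biases, never updated
    H = np.tanh(X @ W + b)                           # hidden-layer feature map
    # Output weights from ridge-regularized least squares: (H'H + reg*I) beta = H'y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

The only “training” step is the single linalg.solve call, which is what turns hours of gradient descent into seconds.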
Why This Matters for Finance
Finance is full of time-sensitive tasks:
Calibrating stochastic volatility models to option prices
Predicting short-term price moves
Completing or smoothing implied volatility surfaces
Solving high-dimensional PDEs for exotic derivatives
In all these cases, speed and adaptability can be as important as accuracy.
Key Findings from the Paper
The researchers tested ELM on both supervised and unsupervised finance problems. Here’s what they found:
1. Option Pricing Functions
For Heston and rough Heston models, ELM learned pricing functions much faster than DNNs or Gaussian Process Regression (GPR).
Inference was up to 15× faster than GPR and 4× faster than DNNs, with similar accuracy.
ELM also adapts quickly when market parameters drift outside the original training range: re-fitting the output weights is just another linear solve, not a slow retraining loop (see the sketch below).
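As a rough illustration of that adaptability (not the paper’s Heston setup), the sketch below learns a pricing map from a closed-form Black–Scholes stand-in, then re-fits when the volatility range widens; the “retraining” is just another linear solve. The parameter ranges and hidden width are made up for the example.

```python
# Hedged sketch: learn a pricing map with an ELM, then re-fit when the
# parameter range shifts. Black-Scholes stands in for the paper's Heston
# pricers because it has a cheap closed form; all ranges are illustrative.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def fit_elm(X, y, n_hidden=300, reg=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return lambda Xn: np.tanh(Xn @ W + b) @ beta

rng = np.random.default_rng(1)
def sample(n, sigma_hi):
    # features: moneyness S/K, maturity T, vol sigma; fixed r and K = 1
    X = np.column_stack([rng.uniform(0.8, 1.2, n),
                         rng.uniform(0.1, 2.0, n),
                         rng.uniform(0.1, sigma_hi, n)])
    y = bs_call(X[:, 0], 1.0, X[:, 1], 0.01, X[:, 2])
    return X, y

X, y = sample(5000, 0.4)
price = fit_elm(X, y)                         # initial fit: one linear solve

# Vol regime moves beyond the training range: augment the data and re-solve.
X2, y2 = sample(2000, 0.8)
price = fit_elm(np.vstack([X, X2]), np.concatenate([y, y2]))
```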
2. Intraday Return Prediction
On high-frequency Ping An Bank data, ELM beat logistic regression in accuracy and was 20× faster to retrain.
This makes it practical for daily re-calibration in a live trading setup (a classification sketch follows below).
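A quick sketch of the classification variant: the same closed-form fit, with up/down labels encoded as ±1 and a sign readout. The features and labels below are synthetic placeholders, since the Ping An Bank tick data is not reproduced here.

```python
# Hedged sketch: ELM as an up/down direction classifier. Labels are +/-1,
# output weights come from one ridge-regularized solve, prediction is the
# sign of the network output. Features and labels are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 8))   # e.g. lagged returns, order imbalance, spread...
y = np.sign(X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(10_000))  # synthetic labels

n_hidden, reg = 200, 1e-3
W = rng.standard_normal((X.shape[1], n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)
beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)

predict = lambda Xn: np.sign(np.tanh(Xn @ W + b) @ beta)
accuracy = (predict(X) == y).mean()    # in-sample sanity check only
```

Daily re-calibration is just re-running the single solve on the latest window of data.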
3. Completing Implied Volatility Surfaces
ELM produced smoother, more robust surfaces with fewer arbitrage violations than GPR.
No hyperparameter tuning required. Just set the architecture and go.
4. Solving Financial PDEs
ELM-based Physics-Informed Neural Networks (PINNs) solved Black–Scholes and more complex PDEs orders of magnitude faster than DNN-based PINNs.
It worked well even in multi-asset settings, showing promise against the curse of dimensionality (a single-asset sketch follows below).
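For intuition on why the PDE case is so fast: with a linear PDE like Black–Scholes, the PDE residual of an ELM ansatz is linear in the output weights, so imposing the equation at collocation points plus the terminal and boundary conditions reduces to one least-squares solve. The sketch below does this for a single-asset European call; the paper’s exact architecture, sampling scheme, and multi-asset setup may differ.

```python
# Hedged sketch of an ELM-based PINN for the one-asset Black-Scholes PDE:
#   V_t + 0.5*sigma^2*S^2*V_SS + r*S*V_S - r*V = 0, V(T, S) = max(S - K, 0).
# Because the PDE is linear, "training" is a single least-squares solve.
import numpy as np

r, sigma, K, T, S_max = 0.01, 0.2, 1.0, 1.0, 3.0
n_hidden = 400
rng = np.random.default_rng(0)
W = rng.uniform(-2.0, 2.0, size=(2, n_hidden))   # random weights for inputs (t, S)
b = rng.uniform(-2.0, 2.0, size=n_hidden)        # random biases

def features(t, S):
    """Hidden activations and their t/S derivatives for phi = tanh(w_t*t + w_S*S + b)."""
    u = np.outer(t, W[0]) + np.outer(S, W[1]) + b
    th = np.tanh(u)
    d = 1.0 - th**2                                # tanh'(u)
    return th, d * W[0], d * W[1], -2.0 * th * d * W[1]**2

# Interior collocation points: the PDE residual should vanish there.
t_i = rng.uniform(0.0, T, 2000)
S_i = rng.uniform(0.0, S_max, 2000)
phi, phi_t, phi_S, phi_SS = features(t_i, S_i)
A_pde = (phi_t
         + 0.5 * sigma**2 * S_i[:, None]**2 * phi_SS
         + r * S_i[:, None] * phi_S
         - r * phi)
c_pde = np.zeros(len(t_i))

# Terminal condition V(T, S) = max(S - K, 0) and boundary conditions in S.
S_T = rng.uniform(0.0, S_max, 400)
A_term, _, _, _ = features(np.full_like(S_T, T), S_T)
c_term = np.maximum(S_T - K, 0.0)
t_b = rng.uniform(0.0, T, 400)
A_lo, _, _, _ = features(t_b, np.zeros_like(t_b))          # V(t, 0) = 0
A_hi, _, _, _ = features(t_b, np.full_like(t_b, S_max))    # V(t, S_max) ~ S_max - K*exp(-r*(T-t))
c_lo = np.zeros_like(t_b)
c_hi = S_max - K * np.exp(-r * (T - t_b))

# One least-squares solve over all conditions gives the output weights.
A = np.vstack([A_pde, A_term, A_lo, A_hi])
c = np.concatenate([c_pde, c_term, c_lo, c_hi])
beta, *_ = np.linalg.lstsq(A, c, rcond=None)

def price(t, S):
    return features(np.atleast_1d(t), np.atleast_1d(S))[0] @ beta
```

All the work here is in the single lstsq call; a DNN-based PINN would instead iterate gradient descent over the same residuals.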
Why ELM Works So Well
No iterative backpropagation → training time drops drastically.
Compact architecture → lower memory use, easier deployment.
Strong generalization → smoother outputs, less overfitting.
Flexibility → works for both regression (pricing) and classification (prediction) tasks.
Where Could This Go Next?
The authors see big potential in:
Early-exercise and path-dependent derivatives such as American options
Real-time market recalibration using online learning with ELM
Multi-asset and high-dimensional pricing problems that break traditional methods
The Bottom Line
Extreme Learning Machines won’t replace deep learning everywhere. But if you need speed, scalability, and solid accuracy in quant finance, they deserve a spot in your toolkit. In fast-moving markets, the ability to train in seconds and adapt in real time could be a decisive edge.