
AI Quantitative Trading
COURSE OBJECTIVE
Intensive course on artificial intelligence techniques applied to trading using two powerful programming languages: Python and R. The course aims to:
Show powerful trading algorithms programmed in Python and R within the Jupyter environment.
Present modern trading strategies, including technical analysis, quantitative and fundamental trading, and directional trading.
Show advanced machine learning and deep learning methodologies to predict price direction for stocks, commodities and cryptocurrencies.
Teach advanced supervised machine learning techniques such as ensemble learning, gradient tree boosting, random forests and support vector machines, as well as unsupervised techniques such as k-means, together with techniques to validate the strategies and algorithms.
Apply advanced deep learning techniques such as convolutional neural networks for trading models, LSTM networks to model volatility and time series, and the hybrid LSTM-CNN model for asset allocation.
Include an innovative module on reinforcement learning to generate trading algorithms.
Present trading and forecasting strategies for stocks and cryptocurrencies using ARIMA, GARCH, VAR and VEC models and recurrent neural networks.
Show the use of machine learning and social media to predict stock prices through sentiment analysis, as well as advanced natural language processing models using bidirectional long short-term memory (BLSTM) networks.
Show portfolio management models, including MPT, CAPM and APT, through powerful exercises in Python and R.
Show recent methodologies for measuring Value at Risk and Expected Shortfall, as well as the use of generative adversarial networks (GANs) as an alternative to Monte Carlo simulation.
Explain in depth the validation of the models and the main issues in backtesting.
Present market microstructure models and structural break tests.
WHO SHOULD ATTEND?
This program is aimed at traders, managers, analysts and trading consultants, as well as anyone interested in machine learning applied to trading.
December 1, 2023

Europe: Mon-Fri, CEST 16-19 h
America: Mon-Fri, CDT 18-21 h
Asia: Mon-Fri, IST 18-21 h

Price: 7,900 €
Duration: 30 h
-
PDF presentations
-
R, Python, JupyterLab and TensorFlow

Banks

Agenda
Module 1: Trading Strategies
-
Introduction to trading strategies
-
Trading with securities
-
Trading with Cryptocurrencies
-
Technical analysis
-
Indicators
-
Oscillators
-
Moving averages, bands and directional movements
-
Speed and acceleration
-
Momentum techniques
-
Momentum divergence
-
Patterns
-
Fibonacci numbers
-
Elliott waves
-
Trend systems
-
Directional Trading
-
MACD
-
RSI
-
Bollinger Bands
-
Pairs Trading
-
-
Quantitative Trading on Fundamentals
-
Data collection
-
Definition of financial ratios
-
Machine Learning Models
-
Clusters
-
Decision trees
-
Backtesting
-
Exercise 1: Analysis of MACD, RSI and Bollinger Bands in R
-
Exercise 2: Correlation and cointegration of trading pairs in R
-
Exercise 3: Fundamental Analysis with Clusters and decision trees in R
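For orientation, the sketch below shows how the indicators analysed in Exercise 1 can be computed with pandas in Python (the exercise itself is delivered in R). The series name close and the window/span parameters are assumptions for the example, not course code.

    # Illustrative sketch: MACD, RSI and Bollinger Bands from a pandas Series of closes.
    import pandas as pd

    def macd(close, fast=12, slow=26, signal=9):
        ema_fast = close.ewm(span=fast, adjust=False).mean()
        ema_slow = close.ewm(span=slow, adjust=False).mean()
        macd_line = ema_fast - ema_slow
        signal_line = macd_line.ewm(span=signal, adjust=False).mean()
        return macd_line, signal_line

    def rsi(close, window=14):
        delta = close.diff()
        gain = delta.clip(lower=0).rolling(window).mean()
        loss = -delta.clip(upper=0).rolling(window).mean()
        rs = gain / loss
        return 100 - 100 / (1 + rs)

    def bollinger(close, window=20, k=2):
        mid = close.rolling(window).mean()
        sd = close.rolling(window).std()
        return mid - k * sd, mid, mid + k * sd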
TRADING STRATEGIES

-
Module 2: Feature engineering
-
Feature engineering
-
Standard data preprocessing
-
Defining data preprocessing
-
Definition of feature engineering
-
Implementation of cross-sectional momentum
-
Time series features
-
-
Residualization of stock returns
-
Neutralize stock returns
-
Techniques used to residualize stock returns
-
-
Common characteristics of quantitative trading
-
Cross-sectional characteristics versus time series
-
Variables based on price
-
Fundamentals-Based Variables
-
Sentiment-based variables
-
Text-based variables
-
Audio-based variables
-
Image-based variables
-
Video-based variables
-
Network-based variables
-
-
Common feature normalization techniques
-
Min-Max
-
Z Score
-
Logarithmic normalization
-
Quantile Normalization
-
Range Normalization
-
Other normalizations
-
-
Advanced techniques
-
The fixed time horizon method
-
Calculation of dynamic thresholds
-
The triple barrier method
-
Learning side and size
-
Meta labeling
-
How to use meta labeling
-
The quantamental way
-
Exercise 4: Cross-sectional momentum and time series features
-
Exercise 5: Typology of normalizations
-
Exercise 6: The triple barrier method
-
Exercise 7: Binning by side and size
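As a taste of the labeling techniques in this module, here is a deliberately simplified Python sketch of triple-barrier labeling on a price series. It uses fixed symmetric barriers and a fixed horizon; the full method with dynamic, volatility-based thresholds is covered in class, and all names and parameters below are illustrative assumptions.

    # Simplified triple-barrier labeling sketch: label +1/-1/0 depending on which barrier
    # the price path touches first within a fixed horizon. `close` is a pandas Series.
    import pandas as pd

    def triple_barrier_labels(close, horizon=10, upper=0.02, lower=0.02):
        labels = pd.Series(0, index=close.index, dtype=int)
        for i in range(len(close) - horizon):
            path = close.iloc[i + 1 : i + 1 + horizon] / close.iloc[i] - 1.0
            hit_up = path[path >= upper].index.min()
            hit_dn = path[path <= -lower].index.min()
            if pd.notna(hit_up) and (pd.isna(hit_dn) or hit_up < hit_dn):
                labels.iloc[i] = 1      # profit-taking barrier touched first
            elif pd.notna(hit_dn):
                labels.iloc[i] = -1     # stop-loss barrier touched first
            # otherwise the vertical (time) barrier applies and the label stays 0
        return labels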
-
Feature Engineering

-
Module 3: Trading Machine Learning
-
Definition of Machine Learning
-
Machine Learning Methodology
-
Data Storage
-
Abstraction
-
Generalization
-
Assessment
-
-
Supervised and Unsupervised Learning
-
Typology of Machine Learning algorithms
-
Steps to implement an algorithm
-
Information collection
-
Exploratory Analysis
-
Model training
-
Model Evaluation
-
Model improvements
-
-
Machine Learning for Trading
-
Use of RSI, MACD and Bollinger Bands
-
Defining market direction
-
Probability distribution and prediction
-
Using Logistic Regression for prediction
-
Exercise 8: Direction Prediction Using Logistic Regression in R
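A minimal Python sketch of the direction-prediction idea behind Exercise 8 (which is taught in R) follows; the features, the chronological train/test split and the parameters are assumptions chosen only for illustration.

    # Illustrative sketch: predicting next-day direction from simple technical features
    # with scikit-learn's logistic regression. `close` is a pandas Series of closes.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    def direction_model(close):
        feats = pd.DataFrame({
            "ret_1": close.pct_change(),
            "ret_5": close.pct_change(5),
            "ma_gap": close / close.rolling(20).mean() - 1.0,
        })
        target = (close.shift(-1) > close).astype(int)      # 1 if next day is up
        data = pd.concat([feats, target.rename("up")], axis=1).dropna()
        X, y = data.drop(columns="up"), data["up"]
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.3)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        return accuracy_score(y_te, model.predict(X_te))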
TRADING MACHINE LEARNING

-
Module 4: Classification Algorithms
-
Decision Trees
-
Modeling
-
Advantages and disadvantages
-
Recursion and Partitioning Processes
-
Recursive partitioning tree
-
Pruning Decision tree
-
Conditional inference tree
-
Tree visualization
-
Measuring Decision Tree Prediction
-
CHAID model
-
Model C5.0
-
-
K-Nearest Neighbors
-
Modeling
-
Advantages and disadvantages
-
Euclidean Distance
-
Manhattan distance
-
K value selection
-
-
Exercise 9: K-Nearest Neighbors for price prediction in R
-
Exercise 10: Decision trees for price prediction in R
Module 5: Unsupervised Algorithm
-
Characteristics of unsupervised algorithms
-
Clusters
-
K-Means
-
Application in trading
-
Advantages and disadvantages
-
Exercise 11: K-Means for trading strategy in R
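The following is a small illustrative Python sketch of k-means applied to stocks, in the spirit of Exercise 11 (taught in R). The summary features and the number of clusters are arbitrary assumptions.

    # Illustrative sketch: group stocks by the profile of their returns with k-means,
    # e.g. as a starting point for a cluster-based strategy. `returns` is assumed to be
    # a DataFrame of daily returns with one column per ticker.
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    def cluster_stocks(returns, n_clusters=5):
        # Describe each stock by annualized mean return and volatility.
        feats = returns.agg(["mean", "std"]).T * [252, 252 ** 0.5]
        X = StandardScaler().fit_transform(feats)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
        return dict(zip(feats.index, km.labels_))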
Module 6: Replication of indices using Autoencoders
-
Replicating an index
-
Data collection
-
Implementation of standard Autoencoders
-
Data exploration and preparation
-
Model creation and adjustment
-
Model evaluation
-
Replicate an index using autoencoders
-
Explore some variants of autoencoders
-
The denoising autoencoder
-
Understand autoencoders in depth
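To fix ideas, here is a compact Keras sketch of the index-replication idea: train a dense autoencoder on the cross-section of returns and keep the stocks that are easiest to reconstruct. Layer sizes, epochs and the selection rule are illustrative assumptions, not the course implementation.

    # Illustrative sketch: dense autoencoder on daily returns; stocks with the lowest
    # reconstruction error are candidates for replicating the index.
    # `returns` is assumed to be a (days x stocks) NumPy array or DataFrame.
    import numpy as np
    from tensorflow import keras

    def autoencoder_replication(returns, n_pick=10, encoding_dim=5, epochs=50):
        X = np.asarray(returns, dtype="float32")
        n_stocks = X.shape[1]
        inputs = keras.Input(shape=(n_stocks,))
        encoded = keras.layers.Dense(encoding_dim, activation="relu")(inputs)
        decoded = keras.layers.Dense(n_stocks, activation="linear")(encoded)
        model = keras.Model(inputs, decoded)
        model.compile(optimizer="adam", loss="mse")
        model.fit(X, X, epochs=epochs, batch_size=32, verbose=0)
        recon_error = ((model.predict(X, verbose=0) - X) ** 2).mean(axis=0)
        return np.argsort(recon_error)[:n_pick]   # indices of the most "index-like" stocks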
Unsupervised Learning

-
Module 7: Advanced NN and SVM Algorithms
-
Support Vector Machine
-
Support vectors
-
Optimal hyperplane
-
Add costs
-
Advantages and disadvantages
-
SVM visualization
-
SVM Tuning
-
Kernel Trick
-
-
Neural Networks (NN)
-
Perceptron Training
-
Multilayer Perceptron
-
Backpropagation algorithm
-
Training procedures
-
Tuning NN
-
NN visualization
-
Advantages and disadvantages
-
-
Exercise 12: Market data extraction in Python and R
-
Exercise 14: Support Vector Machine for price direction prediction in R
-
Exercise 15: Neural Networks for price direction prediction in R
Module 8: Ensemble Learning
-
Ensemble models
-
Bagging
-
Random Forest
-
Boosting
-
Adaboost
-
Gradient Tree Boosting
-
Boosting and Bagging for regression models
-
Advantages and disadvantages
-
Stock Price Trend Prediction
-
Exercise 16: Price trend prediction using Ensemble models in R
-
Exercise 17: Intraday Price Trend Prediction Using Random Forest in Python and R
-
Exercise 18: Price Trend Prediction Using Adaboost in R
-
Exercise 19: Price trend prediction using gradient tree boosting in Python
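A short illustrative Python sketch of ensemble-based trend prediction follows; it compares gradient tree boosting and a random forest with a time-series-aware cross-validation. The feature matrix X and labels y are assumed inputs.

    # Illustrative sketch: price-trend classification with ensemble learners.
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.model_selection import TimeSeriesSplit, cross_val_score

    def evaluate_ensembles(X, y):
        """X: feature matrix (e.g. lagged returns, indicators); y: 0/1 trend labels."""
        cv = TimeSeriesSplit(n_splits=5)             # respects the temporal order of samples
        models = {
            "gradient_boosting": GradientBoostingClassifier(random_state=0),
            "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
        }
        return {name: cross_val_score(m, X, y, cv=cv, scoring="accuracy").mean()
                for name, m in models.items()}

The same pattern extends to bagging and AdaBoost by swapping the estimator.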
Module 9: Deep Learning
-
Activation function
-
Sigmoid
-
Rectified linear unit (ReLU)
-
Hyperbolic tangent
-
-
Feedforward network
-
Multilayer Perceptron
-
Recurrent Neural Networks
-
Using Tensorflow
-
Using Tensorboard
-
Deep learning in R
-
Deep learning in Python
-
Convolutional neural networks
-
Use of deep learning in image classification
-
Cost function
-
Gradient Descent Optimization
-
Using deep learning for trading
-
Advantages and disadvantages of deep learning
-
Exercise 20: Trend prediction using deep learning with TensorFlow and backtesting in Python
Module 10: Trading Rules with Convolutional Neural Networks (CNN)
-
Trading signals with technical indicators
-
Get data from public sources
-
Configure data
-
Formulation of hypotheses and sample tests
-
Comparison of alternative models
-
Simple classification network
-
Build a convolutional neural network
-
Model investment logic
-
Select network architecture
-
Set the data in the correct format
-
Train and test the model
-
Exercise 20: Trading signals: feedforward neural network versus convolutional neural network (CNN)
Supervised Learning

-
Module 11: Volatility forecasting using LSTM
-
Volatility measurement
-
Types of volatility
-
Historical volatility
-
Implied volatility
-
Volatility index
-
Intraday volatility
-
Realized Volatility
-
-
Implement the LSTM model
-
Data preparation
-
Create and adjust the model
-
Evaluate the model
-
Improve model performance
-
Online learning
-
Stack layers
-
Adjust hyperparameters
-
View results
-
-
Compare LSTM with other models
-
RNN GRU model
-
GARCH model
-
Cumulative squared error display
-
-
Exercise 21: Volatility forecasting using LSTM versus a GARCH model, with backtesting
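Below is a minimal Python sketch of the comparison in Exercise 21: a GARCH(1,1) benchmark fitted with the arch package next to a small LSTM trained on a rolling realized-volatility series. The lookback window, network size and scaling are assumptions for illustration only.

    # Illustrative sketch: one-step-ahead volatility with an LSTM and a GARCH(1,1) benchmark.
    # `returns` is a pandas Series of daily returns.
    import numpy as np
    from arch import arch_model
    from tensorflow import keras

    def garch_vol(returns):
        res = arch_model(100 * returns.dropna(), vol="GARCH", p=1, q=1).fit(disp="off")
        return res.conditional_volatility / 100

    def lstm_vol(returns, lookback=22, epochs=20):
        vol = returns.rolling(lookback).std().dropna().values
        X = np.array([vol[i - lookback:i] for i in range(lookback, len(vol))])
        y = vol[lookback:]
        X = X.reshape(-1, lookback, 1)
        model = keras.Sequential([
            keras.layers.Input(shape=(lookback, 1)),
            keras.layers.LSTM(32),
            keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        model.fit(X, y, epochs=epochs, batch_size=32, verbose=0)
        return model.predict(X, verbose=0).ravel()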
Volatility Forecasting

-
Module 12: Forecasting of financial time series
-
Data treatment
-
Decomposition of time series
-
Using Pandas
-
Moving average
-
Exponential smoothing
-
Holt-Winters exponential smoothing
-
ARIMA models
-
Data treatment
-
Normality test
-
Heavy tail estimation
-
T-test and F-test
-
Autocorrelation tests
-
Non-Stationary Series
-
Dickey-Fuller test
-
Cointegration Tests
-
Durbin-Watson
-
-
Stock Price Predictions
-
Liquidity and illiquidity estimates
-
-
Gold price predictions
-
Bitcoin price predictions
-
High-Frequency Data Treatment
-
Exercise 22: Non-stationary Series and Cointegration Tests in R
-
Exercise 23: Advanced price forecasting using ARIMA in Python
-
Exercise 24: Treatment of High-frequency data in Python
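As a pointer to the tools used in this module, here is a brief Python sketch of a stationarity check followed by an ARIMA forecast with statsmodels; the order (1, 1, 1) and the forecast horizon are arbitrary example choices.

    # Illustrative sketch: ADF stationarity test and ARIMA forecasting.
    # `prices` is assumed to be a pandas Series of closing prices.
    from statsmodels.tsa.stattools import adfuller
    from statsmodels.tsa.arima.model import ARIMA

    def arima_forecast(prices, order=(1, 1, 1), steps=5):
        adf_stat, p_value, *_ = adfuller(prices.dropna())
        print(f"ADF statistic {adf_stat:.2f}, p-value {p_value:.3f}")  # stationarity check
        model = ARIMA(prices, order=order).fit()
        return model.forecast(steps=steps)      # point forecasts for the next `steps` periods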
Module 14: Forecasting Models
-
Trading strategies with forecasting models
-
Multivariate Models
-
VAR Vector Autoregressive Models
-
ARCH Models
-
GARCH models
-
Multivariate GARCH models with copulas
-
Vector Error Correction (VEC) models
-
Johansen method
-
-
Machine Learning Models
-
Support Vector Machine
-
Neural Network
-
Market time series forecasting
-
Data from Yahoo! Finance
-
Data from Google Finance
-
Data from FRED
-
Data from Census Bureau, Treasury and BLS
-
Forecasting market time series returns
-
NN and SVM algorithms for performance forecasting
-
Forecasting volatility: NN vs. GARCH
-
-
Development and validation samples
-
-
Deep Learning
-
Recurrent Neural Networks RNN
-
Elman Neural Network
-
Jordan Neural Network
-
Basic structure of RNN
-
Long short-term memory (LSTM)
-
Time windows
-
Development and validation sample
-
Regression
-
Sequence modeling
-
-
Time series analysis with Prophet from Facebook
-
Exercise 25: Stock price modeling with VAR and error correction vectors in R
-
Exercise 26: Multivariate volatility forecasting GARCH in R
-
Exercise 27: Forecasting Machine Learning using NN in R
-
Exercise 28: Forecasting stock prices using Recurrent Neural Networks in Python
-
Exercise 29: Forecasting stock series with Prophet
-
Exercise 30: Bitcoin forecasting using neural networks in R
-
Exercise 31: GARCH-ARIMA trading algorithm
-
Exercise 32: Cointegration Trading Algorithm
Module 15: Reinforcement Learning
-
Differences between reinforcement learning and supervised and unsupervised learning
-
Basic concepts:
-
Agents
-
Policies
-
Environment
-
Actions
-
Rewards
-
-
Markov decision process
-
Bellman equations
-
Dynamic programming
-
Monte Carlo methods
-
Application to finance
-
Trading Application
-
Exercise 33: Reinforcement Learning applied to trading strategies in Python and Tensorflow
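For intuition, the sketch below implements tabular Q-learning on a deliberately tiny trading problem; the state definition, action set and reward are simplifying assumptions and far coarser than the models built in Exercise 33.

    # Illustrative sketch of tabular Q-learning on a toy trading task: the state is the
    # sign of yesterday's return, actions are {short, flat, long}, and the reward is the
    # position times today's return.
    import numpy as np

    def q_learning_trader(returns, alpha=0.1, gamma=0.95, epsilon=0.1, episodes=50, seed=0):
        rng = np.random.default_rng(seed)
        returns = np.asarray(returns)
        Q = np.zeros((2, 3))                       # states: down/up day; actions: -1, 0, +1
        for _ in range(episodes):
            for t in range(1, len(returns) - 1):
                state = int(returns[t - 1] > 0)
                action = rng.integers(3) if rng.random() < epsilon else int(np.argmax(Q[state]))
                reward = (action - 1) * returns[t]          # position in {-1, 0, +1}
                next_state = int(returns[t] > 0)
                Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        return Q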
Forecasting Time Series

-
Module 16: Sentiment Analysis on Twitter to predict stock prices
-
Sentiment-Based Stock Price Prediction Models
-
Twitter
-
Information sources
-
Twitter API connection
-
Token and Keys
-
-
Information processing
-
Definition of text mining
-
Unstructured Data
-
Exploratory Analysis
-
Treemaps
-
-
Predictive modeling in Text Mining
-
K-Nearest Neighbors
-
-
Text Mining on Social Networks
-
Keyword Search
-
Classification algorithms
-
Clustering Algorithms
-
-
Sentiment in linguistics and psychology
-
Subjectivity
-
Factuality
-
-
Sentiment Analysis on Twitter
-
Polarity Analysis and Score
-
Support Vector Machine
-
Neural Networks
-
-
Exercise 34: Text Mining of a document in R
-
Exercise 35: Analysis of words and tweet associations
-
Exercise 36: Twitter Sentiment Analysis and predictive model using Support Vector Machine in R
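A condensed Python sketch of the pipeline behind Exercise 36 (taught in R in the course) is shown below: TF-IDF features feeding a linear support vector machine. The input lists texts and labels and all hyperparameters are assumptions for the example.

    # Illustrative sketch: tweet polarity classification with TF-IDF and a linear SVM.
    # `texts` is a list of tweets, `labels` their 0/1 sentiment labels.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    def sentiment_svm(texts, labels):
        model = make_pipeline(
            TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2)),
            LinearSVC(C=1.0),
        )
        return cross_val_score(model, texts, labels, cv=5, scoring="f1").mean()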
Module 17: Using NLP with BLSTM for news
-
Natural language processing (NLP)
-
Sentiment analysis for finance
-
Represent text data: words into vectors
-
Frequency-Based Word Vectors
-
Counting vectorization
-
TF-IDF vectorization
-
Word embeddings
-
Word2Vec
-
CBOW
-
Skip-gram
-
FastText
-
GloVe
-
-
Loading and splitting data
-
Implementation of the BLSTM model
-
Data preparation
-
Model creation and adjustment
-
Model evaluation
-
Performance improvement
-
Handling unbalanced classes
-
Applying pre-trained word embeddings
-
Considering separate decisions
-
Exercise 37: NLP Modeling with BLSTM Using Stock Price News
Sentiment Analysis

-
Module 18: Portfolio Management
-
Asset Management
-
Modern Portfolio Theory (MPT)
-
Objectives: minimize risk, VaR and ES
-
Maximize the Sharpe ratio
-
Utility function
-
Efficient Frontier
-
Rebalancing of positions
-
-
Capital Asset Pricing Model CAPM
-
Beta estimation and adjustments
-
-
Arbitrage Pricing Theory (APT)
-
Multifactor Model
-
Exercise 38: Efficient frontier estimation, minimizing ES using Python
-
Exercise 39: Optimization and rebalancing of securities positions in R
-
Exercise 40: Estimating betas and Capital Asset Pricing Model (CAPM) in Python
-
Exercise 41: Arbitrage Pricing Theory (APT) model
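As a small illustration of the CAPM material, the sketch below estimates a beta (and Jensen's alpha) by ordinary least squares with statsmodels; the return series and risk-free rate are assumed inputs.

    # Illustrative sketch: CAPM beta by regressing excess stock returns on excess market
    # returns. `stock` and `market` are pandas Series of returns; `rf` is the per-period
    # risk-free rate (an assumption for the example).
    import statsmodels.api as sm

    def capm_beta(stock, market, rf=0.0):
        y = stock - rf
        X = sm.add_constant(market - rf)
        res = sm.OLS(y, X, missing="drop").fit()
        alpha, beta = res.params          # intercept (Jensen's alpha) and slope (beta)
        return alpha, beta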
Module 19: Asset Allocation LSTM-CNN
-
Tactical Asset Allocation Modeling
-
Joint forecast for an asset class
-
Forecast and individual bets
-
Configure data
-
Build the model
-
Understand the deep learning model
-
Implement a CNN-LSTM model
-
Alternative Bayesian VAR model
-
Test and validate the model
-
Exercise 42: Hybrid model Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) for asset allocation
Asset Management

-
Module 20: Algorithm validation
-
Verification of p-values in regressions
-
R squared, MSE, MAD
-
Residual diagnostics
-
Goodness of Fit Test
-
Deviance
-
Bayesian Information Criterion (BIC)
-
Akaike Information Criterion
-
-
Cross validation
-
Bootstrapping of errors
-
Confusion matrix
-
Kappa
-
Main discriminant power tests:
-
Kolmogorov-Smirnov (KS)
-
ROC curve
-
Lift Curve
-
Gini Index
-
Cumulative Accuracy Profile
-
-
Confidence intervals
-
Jackknifing with discriminant power test
-
Bootstrapping with discriminant power test
-
K-Fold Cross Validation
-
Exercise 43: Goodness of Fit Test Logistic Regression
-
Exercise 44: Gini, CAP, ROC Estimation in R
-
Exercise 45: Bootstrapping of ROC parameters in R
-
Exercise 46: K-Fold Cross Validation in R
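The following minimal Python sketch ties together k-fold cross-validation with the ROC/Gini discriminant-power metrics listed above; the choice of logistic regression as the model being validated is an assumption for the example.

    # Illustrative sketch: k-fold cross-validation with ROC-AUC and the Gini index.
    # `X`, `y` are the features and binary labels of the model being validated.
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.linear_model import LogisticRegression

    def validate(X, y, n_splits=5):
        cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv,
                              scoring="roc_auc")
        gini = 2 * auc - 1                 # Gini index as a linear transform of the AUC
        return auc.mean(), gini.mean()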
Module 21: Cross-validation in finance
-
The goal of cross validation
-
Why K-Fold CV fails in finance
-
Solution: purged K-Fold CV
-
Purging the training set
-
Embargo
-
The purged K-Fold class
Module 22: Errors in cross validation
-
Feature Importance
-
The importance of feature importance
-
Feature importance with substitution effects
-
Mean decrease impurity (MDI)
-
Mean decrease accuracy (MDA)
-
Feature importance without substitution effects
-
Single feature importance (SFI)
-
Orthogonal features
-
Parallelized vs. stacked feature importance
-
Experiments with synthetic data
Module 23: Hyperparameter Tuning with Cross Validation
-
Grid Search Cross Validation
-
Random Search Cross Validation
-
Log-uniform distribution
-
Hyperparameter scoring and tuning
Model Validation

-
Module 24: Advanced Backtesting
-
Backtesting Analysis
-
Bet size
-
Strategy-independent bet sizing approaches
-
Bet sizing from predicted probabilities
-
Averaging active bets
-
Size discretization
-
Dynamic bet sizes and limit prices
-
-
Risks of backtesting
-
Clean and dirty backtesting
-
Backtesting problems
-
Some general recommendations
-
Strategy selection
-
-
Backtesting using cross validation
-
The walk-forward method
-
Walk-Forward Method Errors
-
The cross validation method
-
The combinatorial purged cross-validation (CPCV) method
-
Combinatorial splits
-
Purged combinatorial cross-validation
-
Backtesting algorithm
-
How purged combinatorial cross-validation addresses backtest overfitting
-
-
Synthetic data backtesting
-
Trading rules
-
The problem
-
Our framework
-
Optimal trading rules
-
Algorithms
-
Implementation
-
Experimental results
-
Cases with zero long-run equilibrium
-
Cases with positive long-run equilibrium
-
Cases with negative long-run equilibrium
-
-
Backtesting Statistics
-
Types of backtesting statistics
-
Evaluation metrics
-
Information Coefficient and R2
-
General characteristics
-
Performance
-
Time-weighted rate of return
-
Execution statistics for performance evaluation
-
Implementation shortfall
-
The Sharpe ratio
-
The probabilistic Sharpe ratio
-
The deflated Sharpe ratio
-
Efficiency statistics
-
Classification and attribution scores
-
Exercise 47: Backtesting metrics and estimation of the Sharpe ratio of a trading strategy
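For reference, here is a minimal Python sketch of the basic backtesting statistics discussed above (Sharpe ratio, maximum drawdown, annualized return) from a series of strategy returns; the 252-day annualization factor is an assumption.

    # Illustrative sketch: basic backtest statistics from daily strategy returns.
    import numpy as np

    def backtest_stats(returns, periods_per_year=252):
        returns = np.asarray(returns)
        sharpe = np.sqrt(periods_per_year) * returns.mean() / returns.std(ddof=1)
        equity = np.cumprod(1 + returns)
        drawdown = 1 - equity / np.maximum.accumulate(equity)
        return {"sharpe": sharpe, "max_drawdown": drawdown.max(),
                "annual_return": equity[-1] ** (periods_per_year / len(returns)) - 1}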
-
Backtesting

-
Module 25: Value at Risk (VaR) and Expected Shortfall (ES)
-
Linear and non-linear portfolios
-
Volatility Estimation
-
Parametric Models
-
Normal VaR
-
Student's t distribution
-
Lognormal Distribution
-
Linear Model for Stocks and Bonds
-
-
Quadratic Model for options
-
Expected Shortfall
-
Exercise 48: Expected Shortfall and VaR in Python
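A compact Python sketch of parametric (normal) VaR and Expected Shortfall for a linear portfolio follows; the portfolio value and confidence level are example assumptions.

    # Illustrative sketch: one-day parametric (normal) VaR and ES, expressed as positive
    # loss figures. `returns` is a series of portfolio returns.
    import numpy as np
    from scipy.stats import norm

    def normal_var_es(returns, value=1_000_000, alpha=0.99):
        mu, sigma = np.mean(returns), np.std(returns, ddof=1)
        z = norm.ppf(alpha)
        var = value * (z * sigma - mu)
        es = value * (sigma * norm.pdf(z) / (1 - alpha) - mu)
        return var, es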
Module 26: Historical Simulation and Monte Carlo
-
VaR Historical Simulation
-
Volatility Adjustment
-
Bootstrapping
-
-
VaR Monte Carlo Simulation
-
Simulation with a risk factor
-
Simulation with multiple risk factors
-
Variance Reduction Methods
-
-
VaR Monte Carlo based on Gaussian copula
-
VaR Monte Carlo based on Student's t copula
-
Exercise 49: VaR Estimation: Using Monte Carlo Simulation in Python
-
Exercise 50: Historical Simulation in Excel
-
Exercise 51: Historical Simulation Backtesting in Excel
-
Exercise 52: VaR using Gaussian and Student's t copulas in R
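To illustrate the Monte Carlo approach, here is a short Python sketch that simulates correlated risk factors from a multivariate normal calibrated on historical returns and reads VaR off the simulated P&L distribution; variance-reduction techniques and copula-based dependence are left to the course exercises.

    # Illustrative sketch: Monte Carlo VaR with several correlated risk factors.
    # `factor_returns` is a (days x factors) array, `weights` a NumPy array of weights.
    import numpy as np

    def monte_carlo_var(factor_returns, weights, value=1_000_000, alpha=0.99,
                        n_sims=100_000, seed=0):
        rng = np.random.default_rng(seed)
        mu = factor_returns.mean(axis=0)
        cov = np.cov(factor_returns, rowvar=False)
        sims = rng.multivariate_normal(mu, cov, size=n_sims)   # simulated factor scenarios
        pnl = value * sims @ weights
        return -np.percentile(pnl, 100 * (1 - alpha))           # loss at the alpha quantile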
Module 27: Risk measurement using GAN
-
Estimation of value at risk
-
Computing methods and disadvantages
-
Introduction to generative adversarial networks
-
Generative models
-
Discriminative models
-
GAN inner workings
-
Implementation of a risk model using GAN
-
Definition of our model
-
GAN model implementation
-
Benchmarking results
-
Exercise 53: Estimation of VaR using GAN neural networks and Monte Carlo simulation
Risks: VaR, ES and Machine Learning

-
Module 28: Structural breaks
-
Types of structural break tests
-
Entropy functions
-
Shannon entropy
-
The plug-in (or maximum likelihood) estimator
-
Lempel-Ziv estimators
-
Coding schemes
-
Binary encoding
-
Quantile coding
-
Sigma Coding
-
Entropy of a Gaussian process
-
Entropy and generalized mean
-
Some financial applications of entropy
-
Market efficiency
-
Maximum entropy generation
-
Portfolio concentration
-
Market microstructure
Module 29: Microstructural characteristics
-
First generation: price sequences
-
The tick rule
-
The Roll model
-
CUSUM tests
-
Brown-Durbin-Evans CUSUM test on recursive residuals
-
Chu-Stinchcombe-White CUSUM test on levels
-
Explosivity tests
-
Chow-type Dickey-Fuller test
-
Supremum Augmented Dickey-Fuller (SADF)
-
Submartingale and supermartingale tests
-
High-Low Volatility Estimator
-
Second generation: strategic trade models
-
Kyle's Lambda
-
Amihud Lambda
-
Hasbrouck Lambda
-
Third generation: sequential trade models
-
Probability of informed trading (PIN)
-
Volume-synchronized probability of informed trading (VPIN)
-
Additional features of microstructural data sets
-
Order Size Distribution
-
Cancellation rates, limit orders, market orders
-
Time Weighted Average Price Execution Algorithms
-
Options Markets
-
Serial correlation of signed order flow
-
What is market microstructural information?
-
Exercise 54: Structural break test: The Chow Test
-
Exercise 55: Structural break test: The CUSUM Test
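As a simple illustration of the break tests in Exercises 54 and 55, the sketch below implements a basic Chow test for a known break date in a univariate regression; inputs are assumed to be 1-D NumPy arrays.

    # Illustrative sketch: Chow test for a structural break at a known index in the
    # regression y = a + b*x. The F statistic compares the pooled fit with separate
    # fits before and after the candidate break point.
    import numpy as np
    from scipy import stats

    def chow_test(x, y, break_idx):
        def rss(x_, y_):
            X = np.column_stack([np.ones_like(x_), x_])
            beta, *_ = np.linalg.lstsq(X, y_, rcond=None)
            resid = y_ - X @ beta
            return resid @ resid
        k = 2                                    # number of estimated parameters
        rss_pooled = rss(x, y)
        rss_1, rss_2 = rss(x[:break_idx], y[:break_idx]), rss(x[break_idx:], y[break_idx:])
        n = len(y)
        f = ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))
        p_value = 1 - stats.f.cdf(f, k, n - 2 * k)
        return f, p_value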
Market Microstructure

-
C. Rafael Bergamin Nº 6 28043 Madrid
Tel. Madrid: +(34) 911 238 518
© 2023 by Fermac Risk SL. All rights reserved.






