Identifying the Importance Between Order Book Levels using Neural Network Attention
Lampinen, Samu (2020)
Degree Programme in Computing and Electrical Engineering, BSc (Tech)
Faculty of Information Technology and Communication Sciences
This publication is copyrighted. You may download, display and print it for your own personal use. Commercial use is prohibited.
Date of approval
2020-04-16
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-202004284212
Abstract
Time-series classification is a demanding task involving noisy and complex data. Recent studies on high-frequency financial time series have shown that neural network attention can be used both to improve model performance and to understand and highlight the decisions and the information a neural network has extracted from the underlying data.
Neural networks have been called black boxes because of the challenges they pose to humans trying to understand the inherently tangled nature of their decision making, that is, how the weights are connected to each other and what exactly is prioritized in the given data. The development of attention mechanisms has clarified both what a model concentrates on and which input features are important.
The purpose of this thesis is to identify which order book levels a model selects as important, if any, using the FI-2010 dataset, which consists of high-frequency limit order book data from NASDAQ OMX Nordic, and a Temporal Attention augmented Bilinear Layer (TABL). In this thesis, the TABL architecture was modified to have a corresponding attention mask for each of the features in FI-2010, and the resulting model was used to classify short-term mid-price movement with a prediction horizon of 10 events. Attention scores for all masks were tracked epoch-wise during training, and the analysis was based on the final values of the resulting attention paths.
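For illustration, a minimal sketch of such a layer with per-feature temporal attention and epoch-wise attention tracking is given below. It is written in PyTorch with synthetic data; the shapes, the placement of the attention mask, and the training loop are assumptions made for the example, following the general TABL formulation rather than the exact configuration used in the thesis.

```python
# Minimal sketch, assuming PyTorch and synthetic data: a bilinear layer with one
# temporal attention mask per input feature, and epoch-wise tracking of the masks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalAttentionBilinear(nn.Module):
    """Bilinear layer that produces one temporal attention mask per input feature."""
    def __init__(self, d_in, t_in, d_out, t_out):
        super().__init__()
        self.W1 = nn.Parameter(torch.randn(d_out, d_in) * 0.1)  # feature mapping
        self.W = nn.Parameter(torch.randn(t_in, t_in) * 0.1)    # attention energies
        self.W2 = nn.Parameter(torch.randn(t_in, t_out) * 0.1)  # temporal mapping
        self.bias = nn.Parameter(torch.zeros(d_out, t_out))
        self.lam = nn.Parameter(torch.tensor(0.0))               # mixing coefficient

    def forward(self, x):                                   # x: (batch, d_in, t_in)
        energies = torch.einsum('bdt,ts->bds', x, self.W)   # per-feature energies
        attn = F.softmax(energies, dim=-1)                   # one mask row per feature
        lam = torch.sigmoid(self.lam)                        # keep lambda in (0, 1)
        mixed = lam * (x * attn) + (1.0 - lam) * x           # attended input
        xbar = torch.einsum('od,bdt->bot', self.W1, mixed)   # (batch, d_out, t_in)
        out = torch.einsum('bot,ts->bos', xbar, self.W2) + self.bias
        return out, attn                                     # attn is kept for tracking

# Toy usage: 40 raw LOB features (10 levels x bid/ask price and volume), a window
# of 10 events, and 3 classes (down / stationary / up). Data here is synthetic.
model = TemporalAttentionBilinear(d_in=40, t_in=10, d_out=3, t_out=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 40, 10)                   # synthetic mini-batch
y = torch.randint(0, 3, (64,))                # synthetic labels
attention_path = []                           # epoch-wise attention scores
for epoch in range(5):
    logits, attn = model(x)
    loss = F.cross_entropy(logits.squeeze(-1), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Average the mask over batch and time: one score per order book feature.
    attention_path.append(attn.mean(dim=(0, 2)).detach())
```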
The results from tracking the attention masks show that neural network attention can be used to obtain information about the relative importance of order book levels and about how this particular model behaves with different features.
Collections
- Bachelor's theses [6534]