  • 1
    UID:
    (DE-627)1845006453
    Format: 1 online resource (44 p.)
    Content: Since its introduction in 2017 (Vaswani et al., 2017), the Transformer model has excelled in a wide range of tasks involving natural language processing and computer vision. We investigate the Transformer model to address an important sequence learning problem in finance: time series forecasting. The underlying idea is to use the attention mechanism and the seq2seq architecture in the Transformer model to capture long-range dependencies and interactions across assets and perform multi-step time series forecasting in finance. The first part of this article systematically reviews the Transformer model while highlighting its strengths and limitations. In particular, we focus on the attention mechanism and the seq2seq architecture, which are at the core of the Transformer model. Inspired by the concept of weak learners in ensemble learning, we identify the diversification benefit of generating a collection of low-complexity models with simple structures and fewer features. The second part is dedicated to two financial applications. First, we consider the construction of trend-following strategies. Specifically, we use the encoder part of the Transformer model to construct a binary classification model to predict the sign of an asset’s future returns. The second application is the multi-period portfolio optimization problem, particularly volatility forecasting. In addition, our paper discusses the issues and considerations when using machine learning models in finance.
    Note: According to information from SSRN, the original version of the document was created in February 2023
    Language: English
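
The abstract describes using the encoder part of the Transformer as a binary classifier for the sign of an asset's future returns. The sketch below is only an illustration of what such an encoder-only classifier could look like in PyTorch; it is not the paper's implementation, and all choices (d_model, number of heads, lookback window, feature count, loss) are assumptions made for the example.

# Illustrative sketch (not the paper's model): a Transformer-encoder binary
# classifier that predicts the sign of the next-period return from a window
# of past features. Hyperparameters and feature choices are assumptions.
import torch
import torch.nn as nn

class ReturnSignClassifier(nn.Module):
    def __init__(self, n_features: int, d_model: int = 32, n_heads: int = 4,
                 n_layers: int = 2, dropout: float = 0.1):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # logit for P(next return > 0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback, n_features) window of past returns/features
        h = self.encoder(self.input_proj(x))
        # use the representation of the last time step for classification
        return self.head(h[:, -1, :]).squeeze(-1)

# Usage example: 64 samples, 60-step lookback, 5 features per step
model = ReturnSignClassifier(n_features=5)
logits = model(torch.randn(64, 60, 5))
loss = nn.BCEWithLogitsLoss()(logits, torch.randint(0, 2, (64,)).float())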