    Online Resource
    IOP Publishing; 2022
    In: Journal of Physics: Conference Series, IOP Publishing, Vol. 2205, No. 1 (2022-02-01), p. 012009-
    Abstract: To effectively mine historical data and improve the accuracy of short-term load prediction, this paper addresses the time-series and nonlinear characteristics of power load. Deep learning for load forecasting has received a lot of attention in recent years and has become popular in electricity load forecasting analysis. Long short-term memory (LSTM) and gated recurrent unit (GRU) networks are specifically designed for time-series data. However, due to the vanishing and exploding gradient problem, recurrent neural networks (RNNs) cannot capture long-term dependencies. The Transformer, a self-attention-based sequence model, has produced impressive results in a variety of generative tasks that demand long-range coherence, which suggests that self-attention could be useful in power load forecasting models. In this paper, to model large-scale load forecasting effectively and efficiently, we design a Transformer encoder with relative position encoding, which consists of four main components: a single-layer neural network, a relative positional encoding module, an encoder module, and a feed-forward network. Experimental results on real-world datasets demonstrate that our method outperforms the GRU, LSTM, and original Transformer encoder.
    Type of Medium: Online Resource
    ISSN: 1742-6588, 1742-6596
    Language: Unknown
    Publisher: IOP Publishing
    Publication Date: 2022
    ZDB ID: 2166409-2
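The abstract names a relative positional encoding module but does not spell out its form. The following is a minimal sketch of one common variant, an additive relative-position bias on the attention logits, written in plain NumPy; the single attention head, the clipping distance, and all dimensions are illustrative assumptions, not the paper's actual design.

```python
# Sketch of self-attention with a learned relative-position bias.
# Assumptions (not from the paper): one attention head, a clipped-distance
# bias added to the logits, and toy dimensions for a 24-step hourly load window.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relative_self_attention(x, w_q, w_k, w_v, rel_bias, max_dist):
    """x: (seq_len, d_model); rel_bias: (2*max_dist + 1,) learned scalars."""
    seq_len, d_model = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    logits = q @ k.T / np.sqrt(d_model)            # content-based scores
    # Add a bias that depends only on the clipped relative distance i - j,
    # so the model sees positions relatively rather than absolutely.
    idx = np.arange(seq_len)
    rel = np.clip(idx[:, None] - idx[None, :], -max_dist, max_dist) + max_dist
    logits = logits + rel_bias[rel]
    return softmax(logits, axis=-1) @ v            # (seq_len, d_model)

# Toy usage: 24 hourly load embeddings of dimension 16.
rng = np.random.default_rng(0)
seq_len, d_model, max_dist = 24, 16, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(3))
rel_bias = rng.normal(scale=0.1, size=2 * max_dist + 1)
out = relative_self_attention(x, w_q, w_k, w_v, rel_bias, max_dist)
print(out.shape)  # (24, 16)
```

In a full encoder block this attention output would typically be followed by residual connections, layer normalization, and the feed-forward network the abstract mentions; those pieces are omitted here for brevity.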