  • Publication
    Knowledge-based Deep Learning for Modeling Chaotic Systems
    Elabid, Zakaria
    Deep learning has received increased attention due to its remarkable success in many fields, such as computer vision, natural language processing, recommendation systems, and, most recently, simulating multiphysics problems and predicting nonlinear dynamical systems. However, modeling and forecasting the dynamics of chaotic systems remains an open research problem, since training deep learning models requires large datasets that are not always available. Such deep learners can instead be trained with additional information obtained from simulated results and by enforcing the physical laws of the chaotic systems. This paper considers extreme events and their dynamics and proposes elegant models based on deep neural networks, called knowledge-based deep learning (KDL). Our proposed KDL can learn the complex patterns governing chaotic systems by jointly training on real and simulated data directly from the dynamics and their differential equations. This knowledge is transferred to model and forecast real-world chaotic events exhibiting extreme behavior. We validate the effectiveness of our model by assessing it on three real-world benchmark datasets: El Niño sea surface temperature, San Juan dengue viral infection, and Bjørnøya daily precipitation, all governed by extreme-event dynamics. Using prior knowledge of extreme events and physics-based loss functions to guide the neural network learning, we ensure physically consistent, generalizable, and accurate forecasting, even in a small-data regime.
    Index Terms: Chaotic systems, long short-term memory, deep learning, extreme event modeling.
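    The physics-based loss idea described in the abstract, training against both observed data and the governing differential equations, can be sketched as a two-term objective. The code below is a minimal illustration, not the paper's implementation: the Lorenz-63 system stands in for a generic chaotic system, and the weighting `lam` and the finite-difference form of the physics residual are assumptions made for the sketch.

```python
import numpy as np

def lorenz_rhs(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Right-hand side of the Lorenz-63 system, used here as a
    # stand-in for the known physics of a chaotic system.
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def kdl_loss(pred, target, dt, lam=0.1):
    # Data term: mean squared error against observations.
    data_loss = np.mean((pred - target) ** 2)
    # Physics term: penalize deviation of the predicted trajectory
    # from the ODE, approximating d/dt by a forward finite difference.
    d_pred = (pred[1:] - pred[:-1]) / dt                  # shape (T-1, 3)
    rhs = np.array([lorenz_rhs(s) for s in pred[:-1]])    # shape (T-1, 3)
    phys_loss = np.mean((d_pred - rhs) ** 2)
    return data_loss + lam * phys_loss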
  • Publication
    PARNN: A Probabilistic Autoregressive Neural Network Framework for Accurate Forecasting
    (2022)
    Panja, Madhurima
    Kumar, Uttam
    Forecasting time series data represents an emerging field of research in data science and knowledge discovery, with vast applications ranging from stock price and energy demand prediction to the early prediction of epidemics. Numerous statistical and machine learning methods have been proposed in the last five decades to meet the demand for high-quality and reliable forecasts. However, in real-life prediction problems, situations exist in which a model based on one of the above paradigms is preferable. Therefore, hybrid solutions are needed to bridge the gap between classical forecasting methods and modern neural network models. In this context, we introduce a Probabilistic AutoRegressive Neural Network (PARNN) model that can handle a wide variety of complex time series data (e.g., nonlinearity, non-seasonality, long-range dependence, and non-stationarity). The proposed PARNN model is built by fusing an integrated moving average and an autoregressive neural network to preserve the explainability, scalability, and "white-box-like" prediction behavior of the individual components. Sufficient conditions for asymptotic stationarity and geometric ergodicity are obtained by considering the asymptotic behavior of the associated Markov chain. Unlike many advanced deep learning tools, the PARNN model provides uncertainty quantification based on prediction intervals. In computational experiments, PARNN outperforms standard statistical, machine learning, and deep learning models (e.g., Transformers, NBeats, DeepAR) on a diverse collection of real-world datasets from macroeconomics, tourism, energy, epidemiology, and other domains for short-term, medium-term, and long-term forecasting. Multiple comparisons with the best method are carried out to showcase the superiority of the proposal over state-of-the-art forecasters across different forecast horizons.
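    The hybrid idea described in the abstract, a linear autoregression whose behavior stays interpretable while prediction intervals quantify uncertainty, can be sketched minimally. This is an illustration under stated assumptions, not the PARNN architecture: `fit_ar` is a ridge-regularized AR(p) fit, the interval comes from empirical quantiles of in-sample residuals, and the nonlinear neural component is omitted for brevity.

```python
import numpy as np

def fit_ar(y, p=3, ridge=1e-6):
    # Least-squares AR(p) with intercept: y_t ~ c + a_1 y_{t-1} + ... + a_p y_{t-p}.
    # A tiny ridge penalty keeps the normal equations well conditioned.
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.solve(X.T @ X + ridge * np.eye(p + 1), X.T @ y[p:])

def ar_forecast(y, coef, h=1):
    # Recursive multi-step forecast: feed predictions back as lags.
    p = len(coef) - 1
    hist = list(y)
    for _ in range(h):
        lags = hist[-1:-p - 1:-1]           # most recent p values, newest first
        hist.append(coef[0] + np.dot(coef[1:], lags))
    return np.array(hist[len(y):])

def prediction_interval(y, coef, level=0.9):
    # Empirical interval from one-step in-sample residuals.
    p = len(coef) - 1
    fitted = np.array([coef[0] + np.dot(coef[1:], y[t - p:t][::-1])
                       for t in range(p, len(y))])
    resid = y[p:] - fitted
    lo, hi = np.quantile(resid, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi
```

    A point forecast plus the residual-quantile offsets gives a simple interval forecast; the paper's model additionally feeds such residual information through a neural network stage.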
  • Publication
    W-Transformers: A Wavelet-based Transformer Framework for Univariate Time Series Forecasting
    Deep learning utilizing transformers has recently achieved a lot of success in many vital areas such as natural language processing, computer vision, anomaly detection, and recommendation systems, among many others. Among the several merits of transformers, the ability to capture long-range temporal dependencies and interactions is desirable for time series forecasting, leading to progress in various time series applications. In this paper, we build a transformer model for non-stationary time series, a challenging yet crucially important problem. We present a novel framework for univariate time series representation learning based on a wavelet-based transformer encoder architecture and call it W-Transformer. The proposed W-Transformer applies a maximal overlap discrete wavelet transform (MODWT) to the time series data and builds local transformers on the decomposed components to capture the non-stationarity and long-range nonlinear dependencies in the time series. Evaluating our framework on several publicly available benchmark time series datasets from various domains and with diverse characteristics, we demonstrate that it performs, on average, significantly better than the baseline forecasters for short-term and long-term forecasting, even on datasets that consist of only a few hundred training samples.
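    The decompose-then-forecast pipeline in the abstract, splitting the series into wavelet components, forecasting each, and summing the results, can be sketched as follows. This is an illustrative sketch, not the paper's method: an undecimated Haar transform with circular boundary handling stands in for the MODWT, and a naive last-value forecaster stands in for the local transformers.

```python
import numpy as np

def haar_modwt(y, levels=2):
    # Undecimated (a trous) Haar analysis with circular boundaries:
    # at level j the two filter taps sit 2**(j-1) samples apart.
    smooth = np.asarray(y, dtype=float)
    details = []
    for j in range(1, levels + 1):
        shift = 2 ** (j - 1)
        rolled = np.roll(smooth, shift)
        details.append((smooth - rolled) / 2.0)   # detail at level j
        smooth = (smooth + rolled) / 2.0          # running smooth
    return details, smooth  # details plus smooth sum back to y exactly

def componentwise_forecast(y, levels=2, h=1):
    # Forecast each component separately, then sum the forecasts.
    details, smooth = haar_modwt(y, levels)
    # Placeholder per-component forecaster: repeat the last value.
    parts = [np.full(h, d[-1]) for d in details] + [np.full(h, smooth[-1])]
    return np.sum(parts, axis=0)
```

    The additivity of the decomposition is what makes summing the per-component forecasts a valid reconstruction of a forecast for the original series.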