
A Statistical Investigation of Long Memory in Language and Music

2019-04-08
Alexander Greaves-Tunnell, Zaid Harchaoui

Abstract

Representation and learning of long-range dependencies is a central challenge in modern applications of machine learning to sequence data. Yet despite the prominence of this issue, the basic task of measuring long-range dependence, whether in a given data source or in a trained deep model, is still handled largely with heuristic tools. We contribute a statistical framework for investigating long-range dependence in current applications of sequence modeling, drawing on the statistical theory of long memory stochastic processes. By analogy with their linear predecessors in the time series literature, we identify recurrent neural networks (RNNs) as nonlinear processes that simultaneously attempt to learn both a feature representation of an input sequence and its long-range dependency structure. We derive testable implications concerning the relationship between long memory in real-world data and its learned representation in a deep network architecture, and we explore these implications through a semiparametric framework adapted to the high-dimensional setting. We establish the validity of statistical inference for a simple estimator, which yields a decision rule for long memory in RNNs. Experiments illustrating this statistical framework confirm the presence of long memory in a diverse collection of natural language and music data, but show that a variety of RNN architectures fail to capture this property even after training to benchmark accuracy on a language modeling task.
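The semiparametric analysis described in the abstract builds on classical frequency-domain estimators of the memory parameter d of a long memory process. For background, here is a minimal sketch of the standard Geweke-Porter-Hudak (GPH) log-periodogram estimator for a univariate series; this is an illustration of the general technique, not the authors' high-dimensional procedure, and the function name `gph_estimate` and the bandwidth default m = sqrt(n) are assumptions made for the example.

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long-memory parameter d.

    d > 0 indicates long memory; d = 0 corresponds to short memory.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        # Common bandwidth choice m ~ sqrt(n); this default is an assumption.
        m = int(np.sqrt(n))
    # Periodogram at the first m nonzero Fourier frequencies.
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    periodogram = np.abs(fft[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    # Regress log I(lambda_j) on -2*log(2*sin(lambda_j/2)); the slope estimates d.
    regressor = -2.0 * np.log(2.0 * np.sin(freqs / 2.0))
    design = np.column_stack([np.ones(m), regressor])
    coef, *_ = np.linalg.lstsq(design, np.log(periodogram), rcond=None)
    d_hat = coef[1]
    se = np.pi / np.sqrt(24.0 * m)  # asymptotic standard error of the slope
    return d_hat, se

# White noise has d = 0, so the estimate should be near zero.
rng = np.random.default_rng(0)
d_hat, se = gph_estimate(rng.standard_normal(4096))
print(f"white noise: d_hat = {d_hat:.3f} (se ~ {se:.3f})")
```

A one-sided test of d = 0 against d > 0 using the asymptotic standard error gives a simple decision rule for long memory in a single series, analogous in spirit to the decision rule for RNN states developed in the paper.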

URL

http://arxiv.org/abs/1904.03834

PDF

http://arxiv.org/pdf/1904.03834

