Engineering Recurrent Neural Networks for Low-rank and Noise-robust Computation

Engineering Recurrent Neural Networks for Low-rank and Noise-robust Computation was written by Christopher Hopkins Stock and released in 2021. It is available in PDF, EPUB, and Kindle formats.
Author: Christopher Hopkins Stock
Release: 2021
OCLC: 1242872418

Book Synopsis of Engineering Recurrent Neural Networks for Low-rank and Noise-robust Computation, by Christopher Hopkins Stock

Book excerpt: Making sense of dynamical computation in nonlinear recurrent neural networks is a major goal in neuroscience. The advent of modern machine learning has made it possible, via black-box training methods, to efficiently generate computational models of a network performing a given task; indeed, deep learning has thrived on building large, flexible, and highly non-convex models which can nonetheless be optimized effectively to achieve remarkable out-of-sample generalization. However, the resulting trained network models can be so complex that they defy intuitive understanding. What design principles govern how the connectivity and dynamics of recurrent neural networks (RNNs) endow them with their computational capabilities? There remains a large "explainability gap" between the empirical ability of trained RNNs to capture variance in neural recordings, on one hand, and the theoretical difficulty of deriving constraints on weight space from task-relevant considerations, on the other. This thesis presents new approaches to closing the explainability gap in neural networks, and in RNNs in particular.

First, we present several novel methods for constructing task-performant RNNs directly from a high-level description of the task to be performed. Critically, unlike black-box machine learning methods for training networks, our construction methods rely solely on simple and easily interpreted mathematical operations. In doing so, our approach makes explicit the relationship between network structure and task performance. Harnessing the role of fixed points in recurrent computation, we develop forward-engineering methods that produce exactly solvable nonlinear networks for a variety of context-dependent computations, including those of arbitrary finite state machines.
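The constructions themselves are not given in this synopsis. As a toy illustration of the central idea, that stable fixed points can serve as the discrete states of a machine, the following sketch (Python/NumPy, with all parameter values chosen purely for illustration and not taken from the thesis) implements a one-bit finite state machine as a two-unit ReLU network: each of its two stable fixed points encodes one state, and brief input pulses drive the transitions.

    import numpy as np

    relu = lambda z: np.maximum(0.0, z)  # positively homogeneous nonlinearity

    # Two ReLU units with self-excitation (a < 1) and mutual inhibition (c).
    a, c, beta = 0.5, 1.0, 1.0
    W = np.array([[a, -c],
                  [-c, a]])
    b = np.array([beta, beta])

    # The dynamics x <- relu(W x + b + u) have two stable fixed points,
    # (beta/(1-a), 0) = (2, 0) and (0, 2), which encode the two FSM states.
    x = np.array([2.0, 0.0])        # start in state A
    SET_B = np.array([-3.0, 3.0])   # pulse that switches the machine to state B
    SET_A = np.array([3.0, -3.0])   # pulse that switches it back to state A
    HOLD = np.zeros(2)              # no input: the current state persists

    for u in (HOLD, SET_B, HOLD, HOLD, SET_A, HOLD):
        x = relu(W @ x + b + u)
        print(x)                    # sits exactly on a fixed point between pulses

Between pulses the state sits exactly at a fixed point, so the machine's state can be read off with a fixed threshold; this is the sense in which fixed points make the network's computation exactly solvable.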
Second, we examine tools for discovering low-rank structure, both in trained recurrent network models and in the learning dynamics of gradient descent in deep networks. In many temporal signal-processing tasks in biology, including sequence memory, sequence classification, and natural language processing, neural networks operate in a transient regime far from fixed points. We develop a general approach for capturing transient computations in recurrent networks by dramatically reducing the complexity of networks trained to solve transient processing tasks. Our method, called dynamics-reweighted singular value decomposition (DR-SVD), performs a reweighted dimensionality reduction to obtain a much lower-rank connectivity matrix that preserves the dynamics of the original neural network (a minimal sketch of the unweighted special case appears below). We further show that the learning dynamics of deep feedforward networks exhibit low-rank tensor structure which is discoverable and interpretable through the lens of tensor decomposition.

Finally, through a study of a fundamental symmetry present in RNNs with homogeneous activation functions, we derive a novel way of exploring weight space that improves the noise robustness of a trained RNN without sacrificing task performance, and without requiring any knowledge of the particular task being performed. Our exploration takes the form of a novel, biologically plausible local learning rule that provably increases the robustness of neural dynamics to noise in nonlinear recurrent neural networks with homogeneous nonlinearities, and promotes balance between the incoming and outgoing synaptic weights of each neuron in the network. Our rule, which we refer to as synaptic balancing, is consistent with many known aspects of experimentally observed heterosynaptic plasticity, and moreover makes new, experimentally testable predictions relating plasticity at the incoming and outgoing synapses of individual neurons.
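The reweighting that defines DR-SVD is not spelled out in this synopsis, so the first sketch below shows only the unweighted special case: plain truncated SVD of a recurrent connectivity matrix with hypothetical planted low-rank structure, comparing the trajectories of the full and rank-reduced networks. The matrix construction and all scales are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 60, 4

    # Hypothetical connectivity: planted rank-r structure plus weak noise.
    U, V = rng.normal(size=(n, r)), rng.normal(size=(n, r))
    W = 0.9 * (U @ V.T) / n + 0.005 * rng.normal(size=(n, n)) / np.sqrt(n)

    def run(W, x0, inputs):
        """Simulate x_{t+1} = tanh(W x_t + u_t) and return the trajectory."""
        x, traj = x0.copy(), []
        for u in inputs:
            x = np.tanh(W @ x + u)
            traj.append(x.copy())
        return np.array(traj)

    # Unweighted reduction: keep only the top-r singular modes of W.
    Us, s, Vt = np.linalg.svd(W)
    W_low = (Us[:, :r] * s[:r]) @ Vt[:r, :]

    inputs = 0.5 * rng.normal(size=(200, n))
    gap = np.abs(run(W, np.zeros(n), inputs) - run(W_low, np.zeros(n), inputs))
    print(f"rank {r} of {n}; max state deviation along the trajectory: {gap.max():.4f}")

Similarly, the exact form of the synaptic balancing rule is not given here. The second sketch uses the rescaling symmetry the synopsis describes: for a positively homogeneous nonlinearity such as ReLU, scaling neuron i's incoming weights by a gain d_i and its outgoing weights by 1/d_i (a diagonal similarity transform, with matching bias, input, and readout rescalings) leaves the network's input-output behavior unchanged, so the gains can be chosen to equalize each neuron's incoming and outgoing synaptic norms. The damped multiplicative iteration below is an Osborne-style balancing sweep, an illustrative stand-in rather than the thesis's learning rule.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 40
    W = rng.normal(size=(n, n)) / np.sqrt(n)   # recurrent weights
    b = 0.1 * rng.normal(size=n)               # biases
    w_out = rng.normal(size=n)                 # linear readout
    relu = lambda z: np.maximum(0.0, z)        # relu(d*z) = d*relu(z) for d > 0

    def readout(W, b, w_out, inputs):
        x, ys = np.zeros(n), []
        for u in inputs:
            x = relu(W @ x + b + u)
            ys.append(w_out @ x)
        return np.array(ys)

    # Osborne-style damped multiplicative sweeps: choose per-neuron gains d_i
    # so each neuron's incoming (row) and outgoing (column) norms match.
    d = np.ones(n)
    for _ in range(200):
        Wb = (d[:, None] * W) / d[None, :]     # diagonal similarity D W D^{-1}
        row = np.linalg.norm(Wb, axis=1)       # incoming synaptic norms
        col = np.linalg.norm(Wb, axis=0)       # outgoing synaptic norms
        d *= (col / row) ** 0.25               # damped step toward balance

    W_bal = (d[:, None] * W) / d[None, :]
    b_bal, w_out_bal = d * b, w_out / d        # matching bias/readout rescaling

    inputs = rng.normal(size=(50, n))
    y0 = readout(W, b, w_out, inputs)
    y1 = readout(W_bal, b_bal, w_out_bal, inputs * d)   # inputs rescaled by D
    print("input-output behavior unchanged:", np.allclose(y0, y1))
    imbalance = np.linalg.norm(W_bal, axis=1) - np.linalg.norm(W_bal, axis=0)
    print("max residual norm imbalance:", np.max(np.abs(imbalance)))

Because the transform is an exact symmetry, task performance is untouched by construction; balancing simply redistributes gain more evenly across neurons, which is the property the synopsis links to noise robustness.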


Engineering Recurrent Neural Networks for Low-rank and Noise-robust Computation Related Books

Recurrent Neural Networks
Language: en
Pages: 414
Authors: Larry Medsker
Categories: Computers
Type: BOOK - Published: 1999-12-20 - Publisher: CRC Press

With existing uses ranging from motion detection to music synthesis to financial forecasting, recurrent neural networks have generated widespread attention.
Recurrent Neural Networks
Language: en
Pages: 426
Authors: Amit Kumar Tyagi
Categories: Computers
Type: BOOK - Published: 2022-08-08 - Publisher: CRC Press

The text discusses recurrent neural networks for prediction and offers new insights into the learning algorithms, architectures, and stability of recurrent neural networks.
Recurrent Neural Networks for Short-Term Load Forecasting
Language: en
Pages: 74
Authors: Filippo Maria Bianchi
Categories: Computers
Type: BOOK - Published: 2017-11-09 - Publisher: Springer

The key component in forecasting demand and consumption of resources in a supply network is an accurate prediction of real-valued time series.
Recurrent Neural Networks
Language: en
Authors: Fathi M. Salem
Type: BOOK - Published: 2022

This textbook offers a compact but comprehensive treatment, with analytical and design steps for building recurrent neural networks from scratch.