Welcome to P K Kelkar Library, Online Public Access Catalogue (OPAC)

Space-time computing with temporal neural networks / (Record no. 562266)

000 -LEADER
fixed length control field 10117nam a2200745 i 4500
001 - CONTROL NUMBER
control field 7933302
003 - CONTROL NUMBER IDENTIFIER
control field IEEE
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20200413152924.0
006 - FIXED-LENGTH DATA ELEMENTS--ADDITIONAL MATERIAL CHARACTERISTICS
fixed length control field m eo d
007 - PHYSICAL DESCRIPTION FIXED FIELD--GENERAL INFORMATION
fixed length control field cr cn |||m|||a
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 170626s2017 caua foab 000 0 eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9781627058902
Qualifying information ebook
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
Canceled/invalid ISBN 9781627059480
Qualifying information print
024 7# - OTHER STANDARD IDENTIFIER
Standard number or code 10.2200/S00771ED1V01Y201704CAC039
Source of number or code doi
035 ## - SYSTEM CONTROL NUMBER
System control number (CaBNVSL)swl00407520
035 ## - SYSTEM CONTROL NUMBER
System control number (OCoLC)991869997
040 ## - CATALOGING SOURCE
Original cataloging agency CaBNVSL
Language of cataloging eng
Description conventions rda
Transcribing agency CaBNVSL
Modifying agency CaBNVSL
050 #4 - LIBRARY OF CONGRESS CALL NUMBER
Classification number QA76.87
Item number .S657 2017
082 04 - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 006.32
Edition number 23
100 1# - MAIN ENTRY--PERSONAL NAME
Personal name Smith, James E.
Fuller form of name (James Edward),
Dates associated with a name 1950-,
Relator term author.
245 10 - TITLE STATEMENT
Title Space-time computing with temporal neural networks /
Statement of responsibility, etc. James E. Smith.
264 #1 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
Place of production, publication, distribution, manufacture [San Rafael, California] :
Name of producer, publisher, distributor, manufacturer Morgan & Claypool,
Date of production, publication, distribution, manufacture, or copyright notice 2017.
300 ## - PHYSICAL DESCRIPTION
Extent 1 PDF (xxv, 215 pages) :
Other physical details illustrations.
336 ## - CONTENT TYPE
Content type term text
Source rdacontent
337 ## - MEDIA TYPE
Media type term electronic
Source isbdmedia
338 ## - CARRIER TYPE
Carrier type term online resource
Source rdacarrier
490 1# - SERIES STATEMENT
Series statement Synthesis lectures on computer architecture,
International Standard Serial Number 1935-3243 ;
Volume/sequential designation # 39
538 ## - SYSTEM DETAILS NOTE
System details note Mode of access: World Wide Web.
538 ## - SYSTEM DETAILS NOTE
System details note System requirements: Adobe Acrobat Reader.
500 ## - GENERAL NOTE
General note Part of: Synthesis digital library of engineering and computer science.
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc. note Includes bibliographical references (pages 205-214).
505 0# - FORMATTED CONTENTS NOTE
Formatted contents note Part I. Introduction to space-time computing and temporal neural networks -- 1. Introduction -- 1.1 Basics of neuron operation -- 1.2 Space-time communication and computation -- 1.2.1 Communication -- 1.2.2 Computation -- 1.2.3 Discussion -- 1.3 Background: neural network models -- 1.3.1 Rate coding -- 1.3.2 Temporal coding -- 1.3.3 Rate processing -- 1.3.4 Spike processing -- 1.3.5 Summary and taxonomy -- 1.4 Background: machine learning -- 1.5 Approach: interaction of computer engineering and neuroscience -- 1.6 Bottom-up analysis: a guiding analogy -- 1.7 Overview --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 2. Space-time computing -- 2.1 Definition of terms -- 2.2 Feedforward computing networks -- 2.3 General TNN model -- 2.4 Space-time computing systems -- 2.5 Implications of invariance -- 2.6 TNN system architecture -- 2.6.1 Training -- 2.6.2 Computation (evaluation) -- 2.6.3 Encoding -- 2.6.4 Decoding -- 2.7 Summary: meta-architecture -- 2.7.1 Simulation -- 2.7.2 Implied functions -- 2.8 Special case: feedforward McCulloch-Pitts networks -- 2.9 Race logic --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 3. Biological overview -- 3.1 Overall brain structure (very brief) -- 3.2 Neurons -- 3.2.1 Synapses -- 3.2.2 Synaptic plasticity -- 3.2.3 Frequency-current relationship -- 3.2.4 Inhibition -- 3.3 Hierarchy and columnar organization -- 3.3.1 Neurons -- 3.3.2 Columns (micro-columns) -- 3.3.3 Macro-columns -- 3.3.4 Regions -- 3.3.5 Lobes -- 3.3.6 Uniformity -- 3.4 Inter-neuron connections -- 3.4.1 Path distances -- 3.4.2 Propagation velocities -- 3.4.3 Transmission delays -- 3.4.4 Numbers of connections -- 3.4.5 Attenuation of excitatory responses -- 3.4.6 Connections summary -- 3.5 Sensory processing -- 3.5.1 Receptive fields -- 3.5.2 Saccades and whisks -- 3.5.3 Vision pathway -- 3.5.4 Waves of spikes -- 3.5.5 Feedforward processing path -- 3.5.6 Precision -- 3.5.7 Information content -- 3.5.8 Neural processing -- 3.6 Oscillations -- 3.6.1 Theta oscillations -- 3.6.2 Gamma oscillations --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note Part II. Modeling temporal neural networks -- 4. Connecting TNNs with biology -- 4.1 Communication via voltage spikes -- 4.2 Columns and spike bundles -- 4.3 Spike synchronization -- 4.3.1 Aperiodic synchronization: saccades, whisks, and sniffs -- 4.3.2 Periodic synchronization -- 4.4 First spikes carry information -- 4.5 Feedforward processing -- 4.6 Simplifications summary -- 4.7 Plasticity and training -- 4.8 Fault tolerance and temporal stability -- 4.8.1 Interwoven fault tolerance -- 4.8.2 Temporal stability -- 4.8.3 Noise (or lack thereof) -- 4.9 Discussion: reconciling biological complexity with model simplicity -- 4.10 Prototype architecture overview --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 5. Neuron modeling -- 5.1 Basic models -- 5.1.1 Hodgkin Huxley neuron model -- 5.1.2 Derivation of the leaky integrate and fire (LIF) model -- 5.1.3 Spike response model (SRM0) -- 5.2 Modeling synaptic connections -- 5.3 Excitatory neuron implementation -- 5.4 The menagerie of LIF neurons -- 5.4.1 Synaptic conductance model -- 5.4.2 Biexponential SRM0 model -- 5.4.3 Single stage SRM0 -- 5.4.4 Linear leak integrate and fire (LLIF) -- 5.5 Other neuron models -- 5.5.1 Alpha function -- 5.5.2 Quadratic integrate-and-fire -- 5.6 Synaptic plasticity and training --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 6. Computing with excitatory neurons -- 6.1 Single neuron clustering -- 6.1.1 Definitions -- 6.1.2 Excitatory neuron function, approximate description -- 6.1.3 Looking ahead -- 6.2 Spike coding -- 6.2.1 Volleys -- 6.2.2 Nonlinear mappings -- 6.2.3 Distance functions -- 6.3 Prior work: radial basis function (RBF) neurons -- 6.4 Excitatory neuron I: training mode -- 6.4.1 Modeling excitatory response functions -- 6.4.2 Training set -- 6.4.3 STDP update rule -- 6.4.4 Weight stabilization -- 6.5 Excitatory neuron I: compound response functions -- 6.6 Excitatory neuron model II -- 6.6.1 Neuron model derivation -- 6.6.2 Training mode -- 6.6.3 Evaluation mode -- 6.7 Attenuation of excitatory responses -- 6.8 Threshold detection -- 6.9 Excitatory neuron model II summary --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 7. System architecture -- 7.1 Overview -- 7.2 Interconnection structure -- 7.3 Input encoding -- 7.4 Excitatory column operation -- 7.4.1 Evaluation -- 7.4.2 Training -- 7.4.3 Unsupervised synaptic weight training -- 7.4.4 Supervised weight training -- 7.5 Inhibition -- 7.5.1 Feedback inhibition -- 7.5.2 Lateral inhibition -- 7.5.3 Feedforward inhibition -- 7.6 Volley decoding and analysis -- 7.6.1 Temporal flattening -- 7.6.2 Decoding to estimate clustering quality -- 7.6.3 Decoding for classification -- 7.7 Training inhibition -- 7.7.1 FFI: establishing tF and kF -- 7.7.2 LI: establishing tL and kL -- 7.7.3 Excitatory neuron training in the presence of inhibition -- Part III: extended design study: clustering the MNIST dataset --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 8. Simulator implementation -- 8.1 Simulator overview -- 8.2 Inter-unit communication -- 8.3 Simulating time -- 8.4 Synaptic weight training -- 8.5 Evaluation -- 8.5.1 EC block -- 8.5.2 IC block -- 8.5.3 VA block -- 8.6 Design methodology --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 9. Clustering the MNIST dataset -- 9.1 MNIST workload -- 9.2 Prototype clustering architecture -- 9.3 OnOff encoding -- 9.4 Intra-CC network -- 9.5 Excitatory column (EC) -- 9.6 Lateral inhibition -- 9.7 144 RFs -- 9.8 Feedforward inhibition -- 9.9 Layer 1 result summary -- 9.10 Related work -- 9.11 Considering layer 2 --
505 8# - FORMATTED CONTENTS NOTE
Formatted contents note 10. Summary and conclusions -- References -- Author biography.
506 ## - RESTRICTIONS ON ACCESS NOTE
Terms governing access Abstract freely available; full-text restricted to subscribers or individual document purchasers.
510 0# - CITATION/REFERENCES NOTE
Name of source Compendex
510 0# - CITATION/REFERENCES NOTE
Name of source INSPEC
510 0# - CITATION/REFERENCES NOTE
Name of source Google Scholar
510 0# - CITATION/REFERENCES NOTE
Name of source Google Book Search
520 3# - SUMMARY, ETC.
Summary, etc. Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, but its energy efficiency is also truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to provide background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author. As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In this paradigm, time is used as a freely available resource for both communication and computation. Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters.
Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.
530 ## - ADDITIONAL PHYSICAL FORM AVAILABLE NOTE
Additional physical form available note Also available in print.
588 ## - SOURCE OF DESCRIPTION NOTE
Source of description note Title from PDF title page (viewed on June 26, 2017).
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Neural networks (Computer science)
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Computational neuroscience.
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Space and time
General subdivision Data processing.
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Temporal databases.
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term spiking neural networks
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term temporal models
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term unsupervised learning
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term classification
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term neuron models
653 ## - INDEX TERM--UNCONTROLLED
Uncontrolled term computing theory
655 #0 - INDEX TERM--GENRE/FORM
Genre/form data or focus term Electronic books.
776 08 - ADDITIONAL PHYSICAL FORM ENTRY
Relationship information Print version:
International Standard Book Number 9781627059480
830 #0 - SERIES ADDED ENTRY--UNIFORM TITLE
Uniform title Synthesis digital library of engineering and computer science.
830 #0 - SERIES ADDED ENTRY--UNIFORM TITLE
Uniform title Synthesis lectures on computer architecture ;
Volume/sequential designation # 39.
International Standard Serial Number 1935-3243
856 42 - ELECTRONIC LOCATION AND ACCESS
Materials specified Abstract with links to resource
Uniform Resource Identifier http://ieeexplore.ieee.org/servlet/opac?bknumber=7933302
Holdings
Permanent location PK Kelkar Library, IIT Kanpur
Current location PK Kelkar Library, IIT Kanpur
Date acquired 2020-04-13
Barcode EBKE766
Date last seen 2020-04-13
Price effective from 2020-04-13
Koha item type E books