Neuro-inspired Information Processing

Genre: Technical literature. Publisher: John Wiley & Sons Limited. ISBN: 9781119721819.


Book description

With the end of Moore's law and the emergence of new application needs, such as those of the Internet of Things (IoT) and artificial intelligence (AI), neuro-inspired, or neuromorphic, information processing is attracting growing attention from the scientific community. Its principle is to emulate, in simplified form, that formidable information-processing machine, the brain, using networks of artificial neurons and synapses. These networks can be software-based, implemented as computer programs, or hardware-based, built from nanoelectronic circuits. The hardware path allows very low energy consumption, and offers the possibility of either faithfully reproducing the shape and dynamics of the action potentials of living neurons (the biomimetic approach) or running up to a thousand times faster (the high-frequency approach). This path is promising and has been embraced by the major nanoelectronics manufacturers, as circuits can today integrate several million artificial neurons and synapses.
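As a minimal illustration of the software path mentioned above, a spiking neuron such as the leaky integrate-and-fire (LIF) model, one of the phenomenological models the book covers in Chapter 3, can be simulated in a few lines. This is a generic sketch with illustrative parameter values, not code or figures from the book.

```python
# Minimal leaky integrate-and-fire (LIF) neuron.
# All parameters (tau, thresholds, R) are illustrative assumptions.
def simulate_lif(i_input, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Integrate dV/dt = (v_rest - V + R*I) / tau over the input
    current samples; emit a spike time and reset V at threshold."""
    v = v_rest
    spikes = []
    for step, i in enumerate(i_input):
        v += dt * (v_rest - v + r_m * i) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant 2 nA current drives the membrane above threshold,
# producing a regular spike train over 100 ms of simulated time.
spike_times = simulate_lif([2e-9] * 1000)
```

With these values the steady-state potential (−45 mV) lies above the −50 mV threshold, so the neuron fires periodically; with zero input it stays silent at rest.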

Contents

Alain Cappy. Neuro-inspired Information Processing

Table of Contents

List of Tables

List of Illustrations

Guide

Pages

Neuro-inspired Information Processing

Acknowledgments

Introduction

1. Information Processing

1.1. Background

1.1.1. Encoding

1.1.2. Memorization

1.2. Information processing machines

1.2.1. The Turing machine

1.2.2. von Neumann architecture

1.2.3. CMOS technology

1.2.4. Evolution in microprocessor performance

1.3. Information and energy

1.3.1. Power and energy dissipated in CMOS gates and circuits

1.3.1.1. Power and energy dissipated during CMOS gate transitions

1.3.1.2. How to reduce the energy dissipated?

1.3.1.3. Power dissipated in a CMOS circuit

1.4. Technologies of the future

1.4.1. Evolution of the “binary coding/von Neumann/CMOS” system

1.4.1.1. New switches

1.4.1.2. Change in state variable

1.4.2. Revolutionary approaches

1.4.2.1. Quantum computation

1.4.2.1.1. Coding: quantum bits or qubits

1.4.2.1.2. Bloch sphere

1.4.2.1.3. Multi-qubit systems

1.4.2.1.4. Entanglement

1.4.2.1.5. Non-cloning property

1.4.2.1.6. Architecture of the quantum computer

1.4.2.1.7. Quantum parallelism

1.4.2.1.8. Quantum coherence

1.4.2.1.9. Quantum computer technologies

1.4.2.2. Neuro-inspired information processing

1.5. Microprocessors and the brain

1.5.1. Physical parameters

1.5.1.1. Surface area and volume

1.5.1.2. Number and size of elementary devices

1.5.1.3. Interconnections

1.5.1.4. Power dissipated

1.5.2. Information processing

1.5.2.1. Information coding

1.5.2.2. Operating frequency

1.5.2.3. Processing protocols

1.5.2.4. Computation method

1.5.3. Memorization of information

1.6. Conclusion

2. Information Processing in the Living

2.1. The brain at a glance

2.1.1. Brain functions

2.1.2. Brain anatomy

2.2. Cortex

2.2.1. Structure

2.2.2. Hierarchical organization of the cortex

2.2.3. Cortical columns

2.2.4. Intra- and intercolumnar connections

2.2.4.1. Intracolumnar connections

2.2.4.2. Intercolumnar connections

2.2.4.2.1. Direct connections or feedforward (FF)

2.2.4.2.2. Downward connections or feedback (FB)

2.3. An emblematic example: the visual cortex

2.3.1. Eye and retina

2.3.1.1. Photoreceptors

2.3.1.2. Horizontal cells

2.3.1.3. Bipolar cells

2.3.1.4. Amacrine cells

2.3.1.5. Ganglion cells

2.3.2. Optic nerve

2.3.3. Cortex V1

2.3.4. Higher level visual areas V2, V3, V4, V5 and IT

2.3.5. Conclusion

2.4. Conclusion

3. Neurons and Synapses

3.1. Background

3.1.1. Neuron

3.1.2. Synapses

3.2. Cell membrane

3.2.1. Membrane structure

3.2.2. Intra- and extracellular media

3.2.3. Transmembrane proteins

3.2.3.1. Ion channels

3.2.3.2. Ion pumps

3.3. Membrane at equilibrium

3.3.1. Resting potential, Vr

3.3.1.1. Simplified model of resting potential

3.3.1.2. Complete model of Vr

3.3.1.3. Conclusion

3.4. The membrane in dynamic state

3.4.1. The Hodgkin–Huxley model

3.4.1.1. Mathematical foundation and expressions of the model

3.4.1.2. Sodium and potassium currents

3.4.1.3. Parameters of the HH model

3.4.1.4. Dynamic analysis: response to an excitation-current pulse

3.4.1.5. Excitation leading to a burst of action potentials

3.4.2. Beyond the Hodgkin–Huxley model

3.4.3. Simplified HH models

3.4.3.1. The phase plane method

3.4.3.2. The FitzHugh model

3.4.3.3. The Morris–Lecar model

3.4.3.4. Phenomenological models

3.4.3.4.1. LIF model

3.4.3.4.2. The FitzHugh–Nagumo model

3.4.3.4.3. The Izhikevich model

3.4.4. Application of membrane models

3.4.4.1. From the membrane model to the neuron model

3.4.4.2. Spike propagation in the axon

3.4.4.2.1. Physical parameters of the axon

3.4.4.2.2. The propagation equation

3.4.4.2.3. Resolving the propagation equation

3.4.4.2.4. Spike generation and propagation

3.4.4.2.5. Propagation velocity

3.4.4.2.6. Myelinated axons

3.4.4.3. The membrane as a low-pass filter

3.4.4.4. Noise

3.5. Synapses

3.5.1. Biological characteristics

3.5.2. Synaptic plasticity

3.5.2.1. Synaptic weight

3.5.2.2. Learning

3.6. Conclusion

4. Artificial Neural Networks

4.1. Software neural networks

4.1.1. Neuron and synapse models

4.1.1.1. The perceptron

4.1.1.2. Spiking neurons

4.1.2. Artificial Neural Networks

4.1.2.1. The multilayer perceptron or feedforward network

4.1.2.2. Convolutional networks

4.1.2.3. Recurrent networks

4.1.2.4. Reservoir computing

4.1.2.5. Conclusion

4.1.3. Learning

4.1.3.1. Supervised learning

4.1.3.1.1. Multilayer-perceptron scenario

4.1.3.1.2. Recurrent-network scenario

4.1.3.2. Unsupervised learning

4.1.4. Conclusion

4.2. Hardware neural networks

4.2.1. Comparison of the physics of biological systems and semiconductors

4.2.2. Circuits simulating the neuron

4.2.2.1. Subthreshold MOS operation

4.2.2.2. Current mode

4.2.2.3. Voltage mode

4.2.2.3.1. Subthreshold inverter

4.2.2.3.2. The axon-hillock circuit

4.2.2.3.3. Energy performance

4.2.2.3.4. The Morris–Lecar neuron

4.2.2.3.5. The simplified ML neuron

4.2.2.3.6. Comparison with experiment

4.2.2.3.7. Conclusion

4.2.3. Circuits simulating the synapse

4.2.3.1. Introduction

4.2.3.2. Electronic memories

4.2.3.2.1. Analog memories

4.2.3.2.2. Digital memories

4.2.3.2.3. Resistive memories

4.2.3.2.4. Memristances

4.2.4. Circuits for learning

4.2.5. Examples of hardware neural networks

4.2.5.1. Simple circuits

4.2.5.1.1. Stochastic resonance

4.2.5.1.2. Reservoir computing

4.2.5.2. Complex ANNs

4.2.5.3. BrainScaleS

4.2.5.4. Loihi

4.2.5.5. TrueNorth

4.3. Conclusion

References

Index

A, B, C

D, E, F

H, I, L

M, N, O

P, Q, R

S, T, V

WILEY END USER LICENSE AGREEMENT

Excerpt from the book

To my mentors, Professors Georges Salmer and Eugène Constant, who passed on to me their passion for research into semiconductor device physics

.....

This remarkable evolution was only made possible by the existence of a universal model of information-processing machines, the Turing machine, and of a technology capable of physically implementing these machines: that of semiconductor devices. More specifically, the "binary coding/von Neumann architecture/CMOS technology" triplet has been the dominant model of information processing systems since the early 1970s.

Yet two limits have now been reached: that of miniaturization, with devices measuring no more than a few nanometers, and that of dissipated power, with a barrier on the order of 100 watts when the processor is working intensively.

.....
