Preface

This text is about neural modeling, i.e., about neurons and biological neural networks (BNNs) and how their dynamic behavior can be quantitatively described. It was written for graduate students in biomedical engineering, but will also be of interest to neurophysiologists, computational neurobiologists, and biophysicists who are concerned with how neural systems process information and how these processes can be modeled.

What sort of academic background does the reader need to get the most out of this text? The author has assumed that readers are familiar with the formulation and solution of ordinary differential equations, with introductory probability theory, and with basic electrical circuit theory, and that they have had an introductory course in neurobiology. This interdisciplinary background is not unusual for graduate students in biomedical engineering and biophysics. For the reader who wants to pursue any topic in greater depth, there are many references, some from the "classic period" in neurobiology (the 1960s and 1970s) and others from contemporary work.

Neural modeling as a discipline (now known as computational neurobiology or computational neuroscience) has a long history, dating back at least to the groundbreaking 1952 kinetic model of Hodgkin and Huxley for the generation of the nerve action potential. The Hodgkin-Huxley model dealt with events at the molecular and ionic levels on a unit area of axon membrane. Other models have examined neural behavior on a more macroscopic level, preserving neural components such as synapses, dendrites, soma, and axon. In another approach, the bulk behavior of large sensory networks such as the vertebrate retina or the arthropod compound eye has been modeled using the linear mathematics of engineering systems analysis. Each approach is valid in the proper context.

This text discusses tools and methods that can describe and predict the dynamic behavior of single neurons, small assemblies of neurons devoted to a single task (e.g., central pattern generators), larger sensory arrays and their underlying neuropile (e.g., arthropod compound eyes, vertebrate retinas, olfactory systems), and, finally, very large assemblies of neurons (e.g., central nervous system structures). Neural modeling is now performed by solving large sets of nonlinear ordinary differential equations (ODEs) on a digital computer. There is considerable art and science in setting up a computational model that can be validated from existing neurophysiological data. Special, free computer programs that allow the user to set up and test neural models are available at Web sites, and "component libraries" on the Internet supply the modeler with the parameters of different types of neurons.

One goal of neural modeling is to examine the validity of putative signal processing algorithms that may be performed by interneurons that enable sensory arrays to perform feature extraction. The results of modeling can point to new experiments with living systems that may validate hypotheses. Experimental neurobiology and computational neuroscience are in fact synergistic; results of one should reinforce the results of the other.

Some of the neural models considered in this text are based on the author's own neurophysiological research and that of his graduate students. That research has focused on small and medium-sized neural networks, with particular attention to visual feature extraction in the compound eye visual system and the vertebrate retina. Other interesting sensory arrays are also treated, and their behavior modeled or described mathematically where applicable. These include the gravity receptor arrays of the cockroach Arenivaga sp., the electroreceptors of fish, the magnetoreceptors of many diverse animals, and the paired angular rate sensors (halteres) of dipteran flies.

Introduction to Dynamic Modeling of Neurosensory Systems is organized into nine chapters plus several appendices containing large computer programs. There is also an extensive Bibliography/References section, which includes relevant Web site URLs.

Chapter 1, Introduction to Neurons, begins by describing the anatomy of the major classes of neurons and their roles in the nervous system. The molecular bases for the resting potential of the neuron and its action potential are considered, along with ion pumps and the many, diverse, gated ion channels. Next, the bulk properties of the passive core conductor (the dendrite) are presented, and an equivalent, lumped-parameter RC transmission line circuit is developed to model how postsynaptic potentials propagate down dendrites to active regions of the neuron membrane. How chemical synapses (excitatory and inhibitory) work is described. Rectifying and nonrectifying electrical synapses are considered, as well.

The generation of the nerve action potential by the active nerve membrane is described; the original Hodgkin-Huxley equations are presented and modeled with Simnon™. Factors affecting the propagation velocity of the action potential are discussed. Models of complete neural circuits are presented (e.g., a spinal reflex arc) and discussed in terms of neural "components." The behavior of the Hodgkin-Huxley model under voltage clamp conditions is simulated.
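As a minimal illustration of the kind of simulation described here (the text itself uses Simnon, not Python), the four Hodgkin-Huxley state equations can be integrated with a simple forward-Euler loop. The sketch below uses the standard squid-axon parameter set in the modern sign convention; it is a didactic aid, not the text's listing.

```python
import math

# Standard squid-axon Hodgkin-Huxley parameters (units: mV, ms, uF/cm^2, mS/cm^2)
C_M = 1.0
G_NA, G_K, G_L = 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.387

def a_m(v):
    u = v + 40.0  # guard the removable singularity at v = -40 mV
    return 1.0 if abs(u) < 1e-7 else 0.1 * u / (1.0 - math.exp(-u / 10.0))

def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))

def a_n(v):
    u = v + 55.0  # guard the removable singularity at v = -55 mV
    return 0.1 if abs(u) < 1e-7 else 0.01 * u / (1.0 - math.exp(-u / 10.0))

def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_stim=10.0, t_end=50.0, dt=0.01):
    """Forward-Euler integration of the four HH state equations."""
    v = -65.0
    m = a_m(v) / (a_m(v) + b_m(v))   # start the gates at steady state
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))
    trace = []
    for _ in range(int(t_end / dt)):
        i_ion = (G_NA * m**3 * h * (v - E_NA)
                 + G_K * n**4 * (v - E_K)
                 + G_L * (v - E_L))
        v += dt * (i_stim - i_ion) / C_M
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        trace.append(v)
    return trace

v_trace = simulate()   # a 10 uA/cm^2 step drives repetitive firing
print("peak Vm = %.1f mV" % max(v_trace))
```

A suprathreshold current step produces the familiar overshooting spikes followed by an afterhyperpolarization; setting `i_stim` to zero leaves the membrane at rest near -65 mV.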

The second chapter is Selected Examples of Sensory Receptors and Small Receptor Arrays. The generalized receptor as a transducer is presented first. The power-law or log relation between stimulus and receptor response (not all receptors generate nerve spikes directly) is described. Factors limiting receptor response, including their membrane dynamics, noise, and dynamic range, are considered.
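The power-law (Stevens) and logarithmic (Weber-Fechner) stimulus-response relations mentioned above can be sketched numerically; the gain constants and exponent below are illustrative choices, not values from the text.

```python
import math

def response_log(i, k=10.0, i0=1.0):
    """Logarithmic (Weber-Fechner) receptor transfer: r = k*ln(1 + I/I0)."""
    return k * math.log(1.0 + i / i0)

def response_power(i, c=10.0, n=0.5):
    """Power-law (Stevens) receptor transfer: r = c*I^n; n < 1 is compressive."""
    return c * i ** n

# Both laws compress a wide stimulus range into a modest response range
for i in (1.0, 10.0, 100.0, 1000.0):
    print(i, round(response_log(i), 2), round(response_power(i), 2))
```

Either form captures the key property of wide-dynamic-range receptors: a tenfold increase in stimulus produces much less than a tenfold increase in response.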

Properties of selected receptors are considered in detail:

• Certain mechanoreceptors (Pacinian corpuscles, muscle spindles, Golgi tendon organs, chordotonal organs in arthropods, sensory hairs, etc.)

• Magnetoreceptors (putative magnetite-based mechanosensors and Lorentz force-based models)

• Electroreceptors (in fish and the platypus)

• Gravity vector sensors (tricholiths in Arenivaga sp., statocysts in crustaceans and mollusks)

• Angular rate sensors used in dipteran flight (halteres)

• Chemoreceptors (amazing threshold sensitivities)

• Basic photoreceptors (the "eye" of Mytilus edulis) that convert photon energy to changes in nerve resting potential

The third chapter, Electronic Models of Neurons: A Historical Perspective, introduces the neuromime, an electronic analog circuit first used to simulate the behavior of single neurons and of small groups of interacting neurons. Neuromime circuits use the phenomenological locus approach to emulate the behavior of single biological neurons and small neural assemblies. Neural locus theory is described; spike generator loci, excitatory and inhibitory synaptic potentials, delay operators, summing points, and low-pass filters to model lossy propagation are introduced and used in examples. The locus architecture is extended to the numerical simulation of neurons.

Chapter 4, Simulation of the Behavior of Small Assemblies of Neurons, continues to examine the building blocks of locus theory in detail. The ODEs required to model ordinary excitatory postsynaptic potential (epsp) and inhibitory postsynaptic potential (ipsp) generation are derived, as well as those for nonlinear synapses, including facilitating and antifacilitating dynamics. Dendrites, over which epsps and ipsps are summed spatiotemporally, are modeled as RC transmission lines; however, computational simplicity accrues when simple delay functions are used along with multipole low-pass filters to model their behavior.

Rather than use the detailed Hodgkin-Huxley model for spike generation, the spike generator locus (SGL) can be modeled with either integral pulse frequency modulation (IPFM) or relaxation pulse frequency modulation (RPFM) (the leaky integrator spike generator). Considerable computational economy results when the RPFM SGL is used. ODEs and nonlinear equations are developed to model adaptation and neural fatigue.
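A minimal sketch of an RPFM (leaky-integrator) spike generator follows; the time constant, threshold, and drive levels are illustrative assumptions, not parameters from the text. The generator state obeys dv/dt = -v/τ + e(t), and a spike is emitted (with reset to zero) whenever v crosses the firing threshold φ.

```python
def rpfm_spike_train(stim, dt=0.1, tau=10.0, phi=1.0):
    """Relaxation pulse frequency modulation (leaky-integrator) spike generator.

    Integrates dv/dt = -v/tau + e(t) by forward Euler; emits a spike and
    resets v to zero each time v reaches the firing threshold phi.
    """
    v = 0.0
    spikes = []
    for k, e in enumerate(stim):
        v += dt * (-v / tau + e)
        if v >= phi:
            spikes.append(k * dt)   # spike time in ms
            v = 0.0                 # relaxation: reset after firing
    return spikes

# Constant drive e: steady-state v would be e*tau, so firing requires e*tau > phi
n_steps = 1000                                   # 100 ms at dt = 0.1 ms
strong = rpfm_spike_train([0.5] * n_steps)       # well above threshold
weak = rpfm_spike_train([0.2] * n_steps)         # above threshold, slower
sub = rpfm_spike_train([0.05] * n_steps)         # subthreshold: silent
print(len(strong), len(weak), len(sub))
```

The leak gives the generator a true firing threshold for constant inputs (unlike pure IPFM, which eventually fires for any nonzero drive), which is one reason the RPFM locus is a convenient and economical stand-in for Hodgkin-Huxley spike generation.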

Basic signal-processing characteristics of small assemblies of neurons are treated. Locus models are shown to be effective in simulating simple two- and three-neuron circuits that exhibit pulse frequency selectivity (tuning) for a specific range of frequencies (the band detector), as well as high-pass and low-pass characteristics. The T-neuron is shown to be a pulse coincidence gate (an AND gate with an input memory).

Finally, locus models are used to investigate the dynamics of reciprocal inhibition and neural circuits capable of behaving like central pattern generators (CPGs). Ring circuits with positive feedback are shown to make poor CPGs. Negative feedback rings with delays in their loop gains are shown to be capable of generating two-phase burst patterns.

Chapter 5, Large Arrays of Interacting Receptors: The Compound Eye, introduces the large neurosensory array. It is shown how arrays possess properties that enhance their effectiveness over single receptors; such enhancements derive from linear and nonlinear signal processing in the underlying neuropile. The arthropod compound eye was chosen as the first array example because the author has spent many years doing research on insect (grasshopper) compound eye vision. Compound eye optics and neuroanatomy are described. The spatial resolution of the compound eye is considered; repeated measurements have shown that certain neurons in the optic lobes respond to visual objects so small that, theoretically, they should not be "seen." This prediction is based on the analysis of the optics of a single dioptric unit (a photoreceptor cluster called an ommatidium). A theoretical model providing a basis for this "anomalous resolution" was developed by the author, based on multiplicative processing between adjacent ommatidia and the generation of an equivalent synthetic-aperture receptor.

Early studies of the simple compound eye of the common horseshoe crab, Limulus, revealed that an underlying neural network had the property of lateral inhibition (LI), in which adjacent photoreceptors inhibited each other when stimulated. LI is shown to produce spatial high-pass filtering for the compound eye system, enhancing the contrast of high spatial frequencies in an object presented to the eye. Perceptual evidence for the presence of some form of LI is observed in the human visual system.
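A one-dimensional sketch in the spirit of the Hartline-Ratliff Limulus work shows why lateral inhibition acts as a spatial high-pass filter: subtracting a fraction of each neighbor's excitation leaves uniform fields attenuated but produces overshoot and undershoot at a contrast edge (the Mach-band effect). The nearest-neighbor-only coupling and the coefficient k = 0.25 are illustrative assumptions, not the text's model.

```python
def lateral_inhibition(excitation, k=0.25):
    """Each unit's response is its own excitation minus k times each
    nearest neighbor's excitation (edges replicate the end values)."""
    n = len(excitation)
    out = []
    for i in range(n):
        left = excitation[i - 1] if i > 0 else excitation[i]
        right = excitation[i + 1] if i < n - 1 else excitation[i]
        out.append(excitation[i] - k * (left + right))
    return out

# A step edge: a dark field abutting a bright field
edge = [0.0] * 5 + [1.0] * 5
print([round(r, 2) for r in lateral_inhibition(edge)])
```

The bright side of the edge responds more strongly than the bright interior, and the dark side is driven below the dark interior: contrast at high spatial frequencies is enhanced while uniform regions are flattened.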

Finally, Chapter 5 considers feature extraction (FE) by the visual systems of grasshoppers, flies, and certain crustacean compound eye systems. FE is shown to be the result of neural preprocessing of the spatiotemporal properties of an object (intensity, motion) to lighten the cognitive burden on the animal's central nervous system. Neural units from the optic lobes of insects respond selectively to either contrasting edge or spot objects moved in a preferred direction. It is argued that these directionally sensitive (DS) units are used by the insect for flight stabilization and guidance, or for the detection of predators or other insects. DS units have been found in all arthropods with compound eyes in which the DS units have been sought.

Large Arrays of Interacting Receptors: The Vertebrate Retina is the subject of Chapter 6. First, the anatomy and neurophysiology of the vertebrate retina are described; cell types and functions are given, including the differences between rods and cones.

FE was first shown to occur in the frog's retina by Lettvin et al. in 1959. Although the vertebrate retina is vastly different in neural architecture from the arthropod compound eye, FE has evolved in the retina along several patterns congruent with those of compound eye systems. FE properties in the frog are contrasted with the FE observed in the retinas of other vertebrates, such as the pigeon and the rabbit. Minimal FE takes place in the retinas of primates, where the burden of FE has been shifted to the well-developed visual cortex.

Chapter 7, Theoretical Models of Information Processing and Feature Extraction in Visual Sensory Arrays, considers various models that have been proposed to describe FE in vertebrates' retinas and the optic lobes of insects. One- and two-dimensional linear Fourier spatial filters are introduced, and it is shown how visual array circuits can be considered to perform spatial filtering, including the edge-enhancement function of the Limulus LI. More complex interconnections are shown to yield edge orientation filters and spot filters. The static spatial filtering models of Fukushima and the spatiotemporal filters of Zorkoczy are described.

The spatial matched filter is introduced. Models for neural matched filters and how they might figure in visual pattern recognition are considered.
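The matched-filter idea can be sketched in one dimension: correlating a known template against a noisy signal yields a sharp correlation peak where the pattern occurs, since the template weights the signal samples in proportion to the expected pattern. The template shape, noise level, and embedding position below are arbitrary illustrations, not examples from the text.

```python
import random

def matched_filter(signal, template):
    """Slide the template over the signal and return the correlation
    (inner product) at each offset."""
    n, m = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

random.seed(1)
template = [1.0, 2.0, 3.0, 2.0, 1.0]             # the pattern to be detected
signal = [random.gauss(0.0, 0.1) for _ in range(100)]
for j, t in enumerate(template):                  # embed the pattern at offset 30
    signal[30 + j] += t

corr = matched_filter(signal, template)
peak = corr.index(max(corr))
print("pattern detected at offset", peak)
```

In a neural context, the template would be realized as a pattern of synaptic weights, so that a cell fires maximally when the stimulus distribution over its receptive field matches the weight pattern.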

Characterization of Neuro-Sensory Systems is treated in Chapter 8. First, the classic means for identifying linear systems (LSs), including cross-correlation techniques to extract an LS weighting function or transfer function, are reviewed. Next, methods of deriving canonical anatomical models for the connection architecture of small arrays of spiking neurons, based on the joint peri-stimulus time (JPST) histogram technique, are considered.
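The cross-correlation identification technique can be sketched as follows: for a white-noise input x(n) with variance σ² driving a linear system, the input-output cross-correlation recovers the system's weighting function, R_xy(k) = σ²·h(k). The three-tap system below is an arbitrary example chosen for illustration, not one from the text.

```python
import random

def cross_correlate_id(x, y, n_taps):
    """Estimate an LTI weighting function h(k) from white-noise input x and
    output y, using h(k) ~= (1/(N*var)) * sum_n x(n) * y(n+k)."""
    n = len(x)
    var = sum(v * v for v in x) / n
    return [sum(x[i] * y[i + k] for i in range(n - k)) / ((n - k) * var)
            for k in range(n_taps)]

random.seed(7)
h_true = [0.5, 0.3, 0.2]                 # the "unknown" weighting function
x = [random.gauss(0.0, 1.0) for _ in range(20000)]
y = [sum(h_true[k] * x[i - k] for k in range(len(h_true)) if i - k >= 0)
     for i in range(len(x))]             # convolve the input with h_true

h_est = cross_correlate_id(x, y, n_taps=3)
print([round(v, 3) for v in h_est])      # should approximate h_true
```

The estimate converges on h_true as the record length grows, which is why long white-noise records are the preferred probe for this class of identification methods.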

The mathematical basis for triggered correlation, developed to characterize the frequency selectivity of cat eighth nerve units by de Boer, is rigorously derived. The works of de Boer and of Wu are described.

Finally, the white noise method of Marmarelis for the characterization of nonlinear physiological systems is reviewed, and its application to the goldfish retina by Naka is described.

Chapter 9, Software for Simulation of Neural Systems, reviews currently available simulation packages with which neurons can be modeled: NEURON, GENESIS, XNBC v8, EONS, SNNAP, SONN, and Nodus 3.2. All these programs are in English and are free to university students. Most run on UNIX-type computers with X Windows and have a graphical user interface; some also run on personal computers under Windows 95, 98, or NT4, and Nodus 3.2 runs on Apple Macintosh systems. The author also makes a strong case for the use of the general, powerful nonlinear ODE solver Simnon for some basic tasks in neural modeling.

Quantitative problems are included with Chapters 1, 2, 4, 5, and 7. In addition to their obvious pedagogical value for student readers using this text, solved problems can be used as teaching examples by instructors. A Solutions Manual is available from the publisher. Also available online at the CRC Press Web site at http://www.crc-press.com/ under the title of this text are all the Simnon programs used in the text and in the problems. When downloaded, a given program can be cut and pasted and saved as a "Text Only," *.t file to be used by Simnon 3.2 or Simnon 3.0/PCW. This procedure saves the modeler the effort of copying files out of the text line by line and saving them in text only, *.t format.

Robert B. Northrop
