Imperial College London

Dr Dan Goodman

Faculty of Engineering, Department of Electrical and Electronic Engineering

Lecturer
 
 
 

Contact

 

+44 (0)20 7594 6264
d.goodman
Website

 
 

Location

 

1001, Electrical Engineering, South Kensington Campus


Publications


22 results found

Goodman DFM, Winter IM, Léger AC, de Cheveigné A, Lorenzi C et al., 2017, Modelling firing regularity in the ventral cochlear nucleus: Mechanisms, and effects of stimulus level and synaptopathy., Hear Res

The auditory system processes temporal information at multiple scales, and disruptions to this temporal processing may lead to deficits in auditory tasks such as detecting and discriminating sounds in a noisy environment. Here, a modelling approach is used to study the temporal regularity of firing by chopper cells in the ventral cochlear nucleus, in both the normal and impaired auditory system. Chopper cells, which have a strikingly regular firing response, divide into two classes, sustained and transient, based on the time course of this regularity. Several hypotheses have been proposed to explain the behaviour of chopper cells, and the difference between sustained and transient cells in particular. However, there is no conclusive evidence so far. Here, a reduced mathematical model is developed and used to compare and test a wide range of hypotheses with a limited number of parameters. Simulation results show a continuum of cell types and behaviours: chopper-like behaviour arises for a wide range of parameters, suggesting that multiple mechanisms may underlie this behaviour. The model accounts for systematic trends in regularity as a function of stimulus level that have previously only been reported anecdotally. Finally, the model is used to predict the effects of a reduction in the number of auditory nerve fibres (deafferentation due to, for example, cochlear synaptopathy). An interactive version of this paper in which all the model parameters can be changed is available online.
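
Firing regularity in this literature is conventionally quantified by the coefficient of variation (CV) of interspike intervals as a function of time during the response. The sketch below computes that statistic across stimulus repeats; the function name and the toy spike trains are our own illustrations, not the paper's reduced model:

    import numpy as np

    def isi_cv_by_ordinal(trials):
        """CV of the k-th interspike interval across stimulus repeats.

        trials: list of 1-D arrays of spike times (seconds), one per repeat.
        Sustained chopper-like responses keep the CV low throughout, whereas
        transient chopper-like responses start low and then rise.
        """
        isis = [np.diff(t) for t in trials]
        n = max(len(d) for d in isis)
        cv = np.full(n, np.nan)
        for k in range(n):
            vals = np.array([d[k] for d in isis if len(d) > k])
            if len(vals) > 1 and vals.mean() > 0:
                cv[k] = vals.std() / vals.mean()
        return cv

    # toy usage: nearly perfectly regular spiking gives a CV close to zero
    trials = [np.arange(0, 0.05, 0.002) + 1e-4 * np.random.randn(25) for _ in range(50)]
    print(isi_cv_by_ordinal(trials)[:5])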

JOURNAL ARTICLE

Goodman DFM, Stimberg M, Brette R, 2016, Brian 2.0 simulator

Brian is a simulator for spiking neural networks. It is written in the Python programming language and is available on almost all platforms. We believe that a simulator should not only save the time of processors, but also the time of scientists. Brian is therefore designed to be easy to learn and use, highly flexible and easily extensible.
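
As a flavour of the intended ease of use, here is a minimal usage sketch in the style of the Brian 2 documentation; the model and parameter values are arbitrary illustrations:

    from brian2 import *

    # 100 leaky integrate-and-fire neurons, each with its own constant drive
    tau = 10*ms
    eqs = '''
    dv/dt = (I - v) / tau : 1
    I : 1 (constant)
    '''
    G = NeuronGroup(100, eqs, threshold='v > 1', reset='v = 0', method='exact')
    G.I = '1.1 + 0.4 * i / N'      # per-neuron drive, given as a string expression
    M = SpikeMonitor(G)
    run(100*ms)
    print('total spikes:', M.num_spikes)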

SOFTWARE

Rossant C, Kadir SN, Goodman DFM, Schulman J, Hunter MLD, Saleem AB, Grosmark A, Belluscio M, Denfield GH, Ecker AS, Tolias AS, Solomon S, Buzsaki G, Carandini M, Harris KD et al., 2016, Spike sorting for large, dense electrode arrays, NATURE NEUROSCIENCE, Vol: 19, Pages: 634-+, ISSN: 1097-6256

JOURNAL ARTICLE

Kadir SN, Goodman DFM, Harris KD, 2014, High-Dimensional Cluster Analysis with the Masked EM Algorithm, NEURAL COMPUTATION, Vol: 26, Pages: 2379-2394, ISSN: 0899-7667

JOURNAL ARTICLE

Stimberg M, Goodman DFM, Benichoux V, Brette R et al., 2014, Equation-oriented specification of neural models for simulations., Front Neuroinform, Vol: 8, ISSN: 1662-5196

Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modeling software is to build network models based on a library of pre-defined components and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator.
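
A sketch of what such a textual, equation-oriented description looks like in Brian 2; the particular conductance-based model and the numbers below are illustrative choices of ours:

    from brian2 import *

    # Neuron and synapse dynamics given as plain mathematical text; the
    # simulator parses these strings and generates the integration code.
    neuron_eqs = '''
    dv/dt = (g_e * (E_e - v) + (E_l - v)) / tau : volt
    dg_e/dt = -g_e / tau_e : 1
    '''
    G = NeuronGroup(50, neuron_eqs, threshold='v > -50*mV', reset='v = E_l',
                    method='euler',
                    namespace=dict(E_e=0*mV, E_l=-70*mV, tau=20*ms, tau_e=5*ms))
    G.v = -70*mV                                            # start at rest
    S = Synapses(G, G, model='w : 1', on_pre='g_e += w')    # synapse model as text too
    S.connect(p=0.1)
    S.w = 0.05
    run(50*ms)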

JOURNAL ARTICLE

Goodman DFM, Benichoux V, Brette R, 2013, Decoding neural responses to temporal cues for sound localization., Elife, Vol: 2, ISSN: 2050-084X

The activity of sensory neural populations carries information about the environment. This may be extracted from neural activity using different strategies. In the auditory brainstem, a recent theory proposes that sound location in the horizontal plane is decoded from the relative summed activity of two populations in each hemisphere, whereas earlier theories hypothesized that the location was decoded from the identity of the most active cells. We tested the performance of various decoders of neural responses in increasingly complex acoustical situations, including spectrum variations, noise, and sound diffraction. We demonstrate that there is insufficient information in the pooled activity of each hemisphere to estimate sound direction in a reliable way consistent with behavior, whereas robust estimates can be obtained from neural activity by taking into account the heterogeneous tuning of cells. These estimates can still be obtained when only contralateral neural responses are used, consistently with unilateral lesion studies. DOI: http://dx.doi.org/10.7554/eLife.01312.001.
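
The contrast between the two read-out strategies can be illustrated with a toy rate-based decoder. This is only a schematic of the idea, with made-up Gaussian tuning curves rather than the spiking models and acoustic conditions used in the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    azimuths = np.linspace(-90, 90, 37)             # candidate source directions (deg)
    n_cells = 200
    best_dir = rng.uniform(-90, 90, n_cells)        # heterogeneous preferred directions
    width = rng.uniform(20, 60, n_cells)            # heterogeneous tuning widths

    def rates(az):
        """Toy tuning: Gaussian rate profile per cell around its preferred direction."""
        return np.exp(-0.5 * ((az - best_dir) / width) ** 2)

    templates = np.stack([rates(az) for az in azimuths])   # n_directions x n_cells

    def decode_pattern(r):
        """Pick the direction whose stored population pattern best matches r."""
        return azimuths[np.argmin(((templates - r) ** 2).sum(axis=1))]

    def decode_hemispheric(r):
        """Summed-activity read-out: map the left/right rate difference to direction."""
        right = r[best_dir > 0].sum()
        left = r[best_dir <= 0].sum()
        diffs = templates[:, best_dir > 0].sum(1) - templates[:, best_dir <= 0].sum(1)
        return azimuths[np.argmin(np.abs(diffs - (right - left)))]

    true_az = 30.0
    noisy = rates(true_az) + 0.2 * rng.standard_normal(n_cells)
    print('pattern match:', decode_pattern(noisy), 'hemispheric:', decode_hemispheric(noisy))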

JOURNAL ARTICLE

Rossant C, Fontaine B, Goodman DFM, 2013, Playdoh: A lightweight Python library for distributed computing and optimisation, Journal of Computational Science, Vol: 4, Pages: 352-359, ISSN: 1877-7503

Parallel computing is now an essential paradigm for high performance scientific computing. Most existing hardware and software solutions are expensive or difficult to use. We developed Playdoh, a Python library for distributing computations across the free computing units available in a small network of multicore computers. Playdoh supports independent and loosely coupled parallel problems such as global optimisations, Monte Carlo simulations and numerical integration of partial differential equations. It is designed to be lightweight and easy to use and should be of interest to scientists wanting to turn their lab computers into a small cluster at no cost. © 2011 Elsevier B.V.
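
For readers unfamiliar with the style of workload this targets, the sketch below uses only the Python standard library (deliberately not Playdoh's own API, which we do not reproduce here) to farm an independent Monte Carlo task out to local cores:

    # Not Playdoh's API: a standard-library sketch of the kind of independent,
    # embarrassingly parallel workload (a toy Monte Carlo estimate of pi)
    # that a lightweight distribution library like Playdoh targets.
    import random
    from multiprocessing import Pool

    def monte_carlo_pi(n_samples):
        inside = sum(random.random() ** 2 + random.random() ** 2 < 1.0
                     for _ in range(n_samples))
        return 4.0 * inside / n_samples

    if __name__ == '__main__':
        with Pool(processes=4) as pool:                   # one task per core
            estimates = pool.map(monte_carlo_pi, [100_000] * 4)
        print(sum(estimates) / len(estimates))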

JOURNAL ARTICLE

Brette R, Goodman DFM, 2012, Simulating spiking neural networks on GPU., Network, Vol: 23, Pages: 167-182

Modern graphics cards contain hundreds of cores that can be programmed for intensive calculations. They are beginning to be used for spiking neural network simulations. The goal is to make parallel simulation of spiking neural networks available to a large audience, without the requirements of a cluster. We review the ongoing efforts towards this goal, and we outline the main difficulties.

JOURNAL ARTICLE

Brette R, Goodman DFM, 2011, Vectorized algorithms for spiking neural network simulation., Neural Comput, Vol: 23, Pages: 1503-1535

High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
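
The core idea can be shown in a few lines of NumPy: advance all neurons with whole-array operations and handle threshold crossings with boolean masks. This is only an illustrative sketch of the vectorisation principle, not Brian's actual implementation:

    import numpy as np

    # One simulated second of N leaky integrate-and-fire neurons, updated with
    # whole-array operations instead of a Python loop over neurons.
    N, dt, tau, v_th, v_reset = 4000, 1e-4, 0.02, 1.0, 0.0
    v = np.zeros(N)
    drive = np.random.uniform(1.05, 1.5, N)          # constant suprathreshold input
    spike_counts = np.zeros(N, dtype=int)

    for _ in range(int(1.0 / dt)):
        v += dt * (drive - v) / tau                  # one Euler step for all neurons
        spiking = v > v_th                           # boolean mask of threshold crossings
        spike_counts += spiking
        v[spiking] = v_reset                         # vectorised reset

    print('mean firing rate (Hz):', spike_counts.mean())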

JOURNAL ARTICLE

Fontaine B, Goodman DFM, Benichoux V, Brette R et al., 2011, Brian hears: online auditory processing using vectorization over channels., Front Neuroinform, Vol: 5

The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in "Brian Hears," a library for the spiking neural network simulator package "Brian." This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations.
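
Stripped of any realistic cochlear filter, the vectorisation-over-channels idea looks like this: the filter state is an array indexed by channel, so each audio sample is pushed through all channels in a single operation. The one-pole low-pass bank below is a stand-in, not one of the cochlear models shipped with Brian Hears:

    import numpy as np

    fs = 44_100
    t = np.arange(0, 0.05, 1 / fs)
    sound = np.sin(2 * np.pi * 1000 * t)                  # 1 kHz tone, 50 ms

    cutoffs = np.logspace(np.log10(20), np.log10(20_000), 3000)   # 3000 channels
    alpha = np.exp(-2 * np.pi * cutoffs / fs)             # per-channel filter coefficient
    state = np.zeros_like(cutoffs)
    output = np.empty((len(sound), len(cutoffs)))

    for n, x in enumerate(sound):                         # loop over time only,
        state = alpha * state + (1 - alpha) * x           # never over channels
        output[n] = state

    print(output.shape)    # (samples, channels)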

JOURNAL ARTICLE

Kremer Y, Léger J-F, Goodman D, Brette R, Bourdieu L et al., 2011, Late emergence of the vibrissa direction selectivity map in the rat barrel cortex., J Neurosci, Vol: 31, Pages: 10689-10700

In the neocortex, neuronal selectivities for multiple sensorimotor modalities are often distributed in topographical maps thought to emerge during a restricted period in early postnatal development. Rodent barrel cortex contains a somatotopic map for vibrissa identity, but the existence of maps representing other tactile features has not been clearly demonstrated. We addressed the issue of the existence in the rat cortex of an intrabarrel map for vibrissa movement direction using in vivo two-photon imaging. We discovered that the emergence of a direction map in rat barrel cortex occurs long after all known critical periods in the somatosensory system. This map is remarkably specific, taking a pinwheel-like form centered near the barrel center and aligned to the barrel cortex somatotopy. We suggest that this map may arise from intracortical mechanisms and demonstrate by simulation that the combination of spike-timing-dependent plasticity at synapses between layer 4 and layer 2/3 and realistic pad stimulation is sufficient to produce such a map. Its late emergence long after other classical maps suggests that experience-dependent map formation and refinement continue throughout adult life.
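
For reference, the pair-based spike-timing-dependent plasticity rule referred to above has the standard textbook form sketched below; the amplitudes and time constants are generic illustrative values rather than those used in the paper's layer 4 to layer 2/3 simulation:

    import numpy as np

    def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=0.02, tau_minus=0.02):
        """Textbook pair-based STDP window (illustrative values only).

        delta_t = t_post - t_pre (seconds). Pre-before-post (delta_t > 0)
        potentiates, post-before-pre depresses, with exponentially
        decaying magnitude.
        """
        delta_t = np.asarray(delta_t, dtype=float)
        return np.where(delta_t >= 0,
                        a_plus * np.exp(-delta_t / tau_plus),
                        -a_minus * np.exp(delta_t / tau_minus))

    print(stdp_dw([0.005, -0.005]))   # potentiation then depression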

JOURNAL ARTICLE

Rossant C, Goodman DFM, Fontaine B, Platkiewicz J, Magnusson AK, Brette R et al., 2011, Fitting neuron models to spike trains., Front Neurosci, Vol: 5

Computational modeling is increasingly used to understand the function of neural circuits in systems neuroscience. These studies require models of individual neurons with realistic input-output properties. Recently, it was found that spiking models can accurately predict the precisely timed spike trains produced by cortical neurons in response to somatically injected currents, if properly fitted. This requires fitting techniques that are efficient and flexible enough to easily test different candidate models. We present a generic solution, based on the Brian simulator (a neural network simulator in Python), which allows the user to define and fit arbitrary neuron models to electrophysiological recordings. It relies on vectorization and parallel computing techniques to achieve efficiency. We demonstrate its use on neural recordings in the barrel cortex and in the auditory brainstem, and confirm that simple adaptive spiking models can accurately predict the response of cortical neurons. Finally, we show how a complex multicompartmental model can be reduced to a simple effective spiking model.
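
The fitting loop ultimately rests on scoring how well a candidate model's spike train matches the recorded one; coincidence-based criteria such as the gamma factor are the usual choice in this setting. The sketch below is a deliberately simplified stand-in for such a score, not the library's own implementation:

    import numpy as np

    def coincidence_fraction(model_spikes, data_spikes, window=0.004):
        """Fraction of recorded spikes matched by a model spike within +/- window (s)."""
        if len(data_spikes) == 0:
            return 0.0
        model_spikes = np.sort(np.asarray(model_spikes))
        hits = 0
        for t in data_spikes:
            j = np.searchsorted(model_spikes, t)
            neighbours = model_spikes[max(j - 1, 0):j + 1]
            if len(neighbours) and np.min(np.abs(neighbours - t)) <= window:
                hits += 1
        return hits / len(data_spikes)

    # a real fitting procedure would maximise such a score over model parameters,
    # evaluating many candidate parameter sets in parallel
    print(coincidence_fraction([0.010, 0.052, 0.090], [0.011, 0.050, 0.120]))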

JOURNAL ARTICLE

Fletcher A, Goodman D, 2010, Quasiregular mappings of polynomial type in R^2, Conformal Geometry and Dynamics, Vol: 14, Pages: 322-336

Complex dynamics deals with the iteration of holomorphic functions. As is well known, the first functions to be studied which gave non-trivial dynamics were quadratic polynomials, which produced beautiful computer generated pictures of Julia sets and the Mandelbrot set. In the same spirit, this article aims to study the dynamics of the simplest non-trivial quasiregular mappings. These are mappings in R^2 which are a composition of a quadratic polynomial and an affine stretch. © 2010 American Mathematical Society.
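
Purely as a numerical illustration of the class of maps described (the polynomial, stretch factor and composition order below are our own arbitrary choices, not necessarily those analysed in the paper), escape times for such a composition can be computed directly:

    import numpy as np

    def escape_time(z0, c=-0.75 + 0.1j, K=1.5, max_iter=100, radius=1e6):
        """Iterate the composition of an affine stretch with z -> z**2 + c."""
        z = z0
        for n in range(max_iter):
            z = complex(K * z.real, z.imag)      # affine stretch of the plane
            z = z * z + c                        # quadratic polynomial
            if abs(z) > radius:
                return n                         # orbit escaped to infinity
        return max_iter                          # treated as a bounded orbit

    xs, ys = np.linspace(-2, 2, 80), np.linspace(-2, 2, 40)
    grid = np.array([[escape_time(complex(x, y)) for x in xs] for y in ys])
    print(grid.min(), grid.max())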

JOURNAL ARTICLE

Goodman DFM, 2010, Code generation: a strategy for neural network simulators., Neuroinformatics, Vol: 8, Pages: 183-196

We demonstrate a technique for the design of neural network simulation software, runtime code generation. This technique can be used to give the user complete flexibility in specifying the mathematical model for their simulation in a high level way, along with the speed of code written in a low level language such as C++. It can also be used to write code only once but target different hardware platforms, including inexpensive high performance graphics processing units (GPUs). Code generation can be naturally combined with computer algebra systems to provide further simplification and optimisation of the generated code. The technique is quite general and could be applied to any simulation package. We demonstrate it with the 'Brian' simulator (http://www.briansimulator.org).
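
A toy version of the strategy, generating and compiling Python at runtime rather than C or GPU code; the helper name and the update rule are illustrative and are not Brian's generated code:

    import numpy as np

    # Minimal sketch of runtime code generation: turn a textual state-update
    # expression into a compiled, vectorised function.
    def make_update(expression, dt):
        src = (
            "def update(v, I):\n"
            f"    dt = {dt!r}\n"
            f"    return v + dt * ({expression})\n"
        )
        namespace = {}
        exec(compile(src, '<generated>', 'exec'), {'np': np}, namespace)
        return namespace['update']

    update = make_update('(I - v) / 0.02', dt=1e-4)   # leaky integration, tau = 20 ms
    v = np.zeros(1000)
    for _ in range(100):
        v = update(v, I=1.2)
    print(v[:3])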

JOURNAL ARTICLE

Goodman DFM, Brette R, 2010, Spike-timing-based computation in sound localization., PLoS Comput Biol, Vol: 6

Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
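
As a much-reduced illustration of using fine timing for localisation, the sketch below estimates an interaural time difference by cross-correlating left and right waveforms; the paper's model instead works with spike synchrony patterns across frequency channels and measured HRTFs, and all numbers below are made up:

    import numpy as np

    fs = 44_100
    rng = np.random.default_rng(1)
    source = rng.standard_normal(fs // 10)           # 100 ms broadband source
    true_itd_samples = 15                            # ~0.34 ms delay at 44.1 kHz
    left = source
    right = np.concatenate([np.zeros(true_itd_samples), source[:-true_itd_samples]])

    # find the internal delay at which left and right are maximally coincident
    max_lag = 40
    lags = np.arange(-max_lag, max_lag + 1)
    score = [np.dot(left[max_lag:-max_lag],
                    right[max_lag + lag:len(right) - max_lag + lag]) for lag in lags]
    print('estimated ITD (samples):', lags[int(np.argmax(score))])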

JOURNAL ARTICLE

Goodman DFM, Brette R, 2010, Learning to localise sounds with spiking neural networks

To localise the source of a sound, we use location-specific properties of the signals received at the two ears caused by the asymmetric filtering of the original sound by our head and pinnae, the head-related transfer functions (HRTFs). These HRTFs change throughout an organism's lifetime, during development for example, and so the required neural circuitry cannot be entirely hardwired. Since HRTFs are not directly accessible from perceptual experience, they can only be inferred from filtered sounds. We present a spiking neural network model of sound localisation based on extracting location-specific synchrony patterns, and a simple supervised algorithm to learn the mapping between synchrony patterns and locations from a set of example sounds, with no previous knowledge of HRTFs. After learning, our model was able to accurately localise new sounds in both azimuth and elevation, including the difficult task of distinguishing sounds coming from the front and back.

CONFERENCE PAPER

Rossant C, Goodman DFM, Platkiewicz J, Brette R et al., 2010, Automatic fitting of spiking neuron models to electrophysiological recordings., Front Neuroinform, Vol: 4

Spiking models can accurately predict the spike trains produced by cortical neurons in response to somatically injected currents. Since the specific characteristics of the model depend on the neuron, a computational method is required to fit models to electrophysiological recordings. The fitting procedure can be very time consuming both in terms of computer simulations and in terms of code writing. We present algorithms to fit spiking models to electrophysiological data (time-varying input and spike trains) that can run in parallel on graphics processing units (GPUs). The model fitting library is interfaced with Brian, a neural network simulator in Python. If a GPU is present it uses just-in-time compilation to translate model equations into optimized code. Arbitrary models can then be defined at script level and run on the graphics card. This tool can be used to obtain empirically validated spiking models of neurons in various systems. We demonstrate its use on public data from the INCF Quantitative Single-Neuron Modeling 2009 competition by comparing the performance of a number of neuron spiking models.

JOURNAL ARTICLE

Goodman DFM, Brette R, 2009, The brian simulator., Front Neurosci, Vol: 3, Pages: 192-197

"Brian" is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.

JOURNAL ARTICLE

Goodman D, Brette R, 2008, Brian: a simulator for spiking neural networks in python., Front Neuroinform, Vol: 2

"Brian" is a new simulator for spiking neural networks, written in Python (http://brian. di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.

JOURNAL ARTICLE

Goodman DFM, Stimberg M, Brette R, 2008, Brian simulator

Brian is a simulator for spiking neural networks available on almost all platforms. The motivation for this project is that a simulator should not only save the time of processors, but also the time of scientists. Brian is easy to learn and use, highly flexible and easily extensible. The Brian package itself and simulations using it are all written in the Python programming language, which is an easy, concise and highly developed language with many advanced features and development tools, excellent documentation and a large community of users providing support and extension packages.

SOFTWARE

Goodman D, 2006, Spirals in the boundary of slices of quasi-fuchsian space, Conformal Geometry and Dynamics, Vol: 10, Pages: 136-158

We prove that the Bers and Maskit slices of the quasi-Fuchsian space of a once-punctured torus have a dense, uncountable set of points in their boundaries about which the boundary spirals infinitely. © 2006 American Mathematical Society.

JOURNAL ARTICLE

Zheng JX, Goodman DFM, Pawar S, Graph Drawing by Weighted Constraint Relaxation

A popular method of force-directed graph drawing is multidimensional scaling using graph-theoretic distances as input. We present an algorithm to minimize its energy function, known as stress, by using a relaxation method that considers a single pair of vertices at a time. Our results show that relaxation can reach lower stress levels faster and more consistently than majorization, without needing help from a good initialization. We then present various real-world applications to show how the unique properties of relaxation make it easier to produce constrained layouts than previous approaches. We also show how relaxation can be directly applied within the sparse stress approximation of Ortmann et al. [1], making the algorithm scalable up to large graphs.
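
A compact sketch of the pairwise relaxation step described above, in Python; the step-size schedule, weighting and iteration count are simplified illustrations rather than the paper's exact algorithm:

    import numpy as np

    def relax_layout(d, n_iter=100, eta0=1.0, seed=0):
        """Stress minimisation by relaxing one vertex pair at a time.

        d: matrix of graph-theoretic distances. Each step moves a random pair
        towards (or away from) each other so their Euclidean distance
        approaches d[i, j].
        """
        rng = np.random.default_rng(seed)
        n = d.shape[0]
        X = rng.standard_normal((n, 2))
        pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
        for it in range(n_iter):
            eta = eta0 * np.exp(-3.0 * it / n_iter)            # decaying step size
            for k in rng.permutation(len(pairs)):
                i, j = pairs[k]
                delta = X[i] - X[j]
                dist = np.linalg.norm(delta) + 1e-12
                w = d[i, j] ** -2                              # standard stress weight
                mu = min(w * eta, 1.0)                         # capped step
                r = (dist - d[i, j]) / (2.0 * dist) * delta    # half the correction each
                X[i] -= mu * r
                X[j] += mu * r
        return X

    # toy usage: a 4-cycle with unit edge lengths (shortest-path distances)
    d = np.array([[0, 1, 2, 1],
                  [1, 0, 1, 2],
                  [2, 1, 0, 1],
                  [1, 2, 1, 0]], dtype=float)
    print(relax_layout(d).round(2))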

JOURNAL ARTICLE

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
