Salnikov V, Cassese D, Lambiotte R, et al., 2018, Co-occurrence simplicial complexes in mathematics: identifying the holes of knowledge, Appl Netw Sci, Vol: 3
In recent years, complex-network tools have provided insights into the structure of research through the study of collaboration, citation and co-occurrence networks. The network approach focuses on pairwise relationships, often compressing multidimensional data structures and inevitably losing information. In this paper we propose for the first time a simplicial complex approach to word co-occurrences, providing a natural framework for the study of higher-order relations in the space of scientific knowledge. Using topological methods we explore the conceptual landscape of mathematical research, focusing on homological holes, regions with low connectivity in the simplicial structure. We find that homological holes are ubiquitous, which suggests that they capture some essential feature of research practice in mathematics. A k-dimensional hole dies when every concept in the hole appears in an article together with k+1 other concepts in the hole, so its death may signal the creation of new knowledge, as we show with some examples. We find a positive relation between the size of a hole and the time it takes to close: larger holes may represent potential for important advances in the field because they separate conceptually distant areas. We further describe the conceptual space by looking for the simplicial analogues of stars, and explore the likelihood that edges in a star are also part of a homological cycle. We also show that authors' conceptual entropy is positively related to their contribution to homological holes, suggesting that polymaths tend to be on the frontier of research.
The plant endoplasmic reticulum forms a network of tubules connected by three-way junctions or sheet-like cisternae. Although the network is three-dimensional, in many plant cells, it is constrained to a thin volume sandwiched between the vacuole and plasma membrane, effectively restricting it to a 2-D planar network. The structure of the network, and the morphology of the tubules and cisternae can be automatically extracted following intensity-independent edge-enhancement and various segmentation techniques to give an initial pixel-based skeleton, which is then converted to a graph representation. Collectively, this approach yields a wealth of quantitative metrics for ER structure and can be used to describe the effects of pharmacological treatments or genetic manipulation. The software is publicly available.
Aryaman J, Johnston IG, Jones NS, 2017, Mitochondrial DNA Density Homeostasis Accounts for a Threshold Effect in a Cybrid Model of a Human Mitochondrial Disease, Biochemical Journal, Vol: 474, Pages: 4019-4034, ISSN: 1470-8728
Mitochondrial dysfunction is involved in a wide array of devastating diseases, but the heterogeneity and complexity of the symptoms of these diseases challenges theoretical understanding of their causation. With the explosion of omics data, we have the unprecedented opportunity to gain deep understanding of the biochemical mechanisms of mitochondrial dysfunction. This goal raises the outstanding need to make these complex datasets interpretable. Quantitative modelling allows us to translate such datasets into intuition and suggest rational biomedical treatments. Taking an interdisciplinary approach, we use a recently published large-scale dataset and develop a descriptive and predictive mathematical model of progressive increase in mutant load of the MELAS 3243A>G mtDNA mutation. The experimentally observed behaviour is surprisingly rich, but we find that our simple, biophysically motivated model intuitively accounts for this heterogeneity and yields a wealth of biological predictions. Our findings suggest that cells attempt to maintain wild-type mtDNA density through cell volume reduction, and thus power demand reduction, until a minimum cell volume is reached. Thereafter, cells toggle from demand reduction to supply increase, up-regulating energy production pathways. Our analysis provides further evidence for the physiological significance of mtDNA density and emphasizes the need for performing single-cell volume measurements jointly with mtDNA quantification. We propose novel experiments to verify the hypotheses made here to further develop our understanding of the threshold effect and connect with rational choices for mtDNA disease therapies.
Fulcher B, Jones NS, 2017, hctsa: A computational framework for automated time-series phenotyping using massive feature extraction, Cell Systems, Vol: 5, Pages: 527-531.e3, ISSN: 2405-4712
Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data.
How smart can a micron-sized bag of chemicals be? How can an artificial or real cell make inferences about its environment? From which kinds of probability distributions can chemical reaction networks sample? We begin tackling these questions by showing four ways in which a stochastic chemical reaction network can implement a Boltzmann machine, a stochastic neural network model that can generate a wide range of probability distributions and compute conditional probabilities. The resulting models, and the associated theorems, provide a road map for constructing chemical reaction networks that exploit their native stochasticity as a computational resource. Finally, to show the potential of our models, we simulate a chemical Boltzmann machine to classify and generate MNIST digits in silico.
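The Boltzmann machine underlying this construction is easy to state computationally. The sketch below samples a two-unit Boltzmann machine by Gibbs sampling, a minimal software analogue of the stochastic dynamics the paper implements chemically; the weights and biases are illustrative choices, not parameters from the paper.

```python
import math
import random

def gibbs_sample_boltzmann(weights, biases, steps, seed=0):
    """Sample binary states of a small Boltzmann machine by Gibbs sampling.

    weights: dict {(i, j): w_ij} with i < j (symmetric coupling);
    biases: list of b_i. Returns the states visited after a burn-in."""
    rng = random.Random(seed)
    n = len(biases)
    state = [rng.randint(0, 1) for _ in range(n)]
    samples = []
    for t in range(steps):
        i = rng.randrange(n)
        # Local field on unit i: bias plus weighted input from the other units.
        field = biases[i]
        for (a, b), w in weights.items():
            if a == i:
                field += w * state[b]
            elif b == i:
                field += w * state[a]
        p_on = 1.0 / (1.0 + math.exp(-field))
        state[i] = 1 if rng.random() < p_on else 0
        if t >= steps // 2:  # keep the second half as (correlated) samples
            samples.append(tuple(state))
    return samples

# Two units with a strong positive coupling: (0,0) and (1,1) dominate.
samples = gibbs_sample_boltzmann({(0, 1): 3.0}, [-1.5, -1.5], steps=20000)
agree = sum(s[0] == s[1] for s in samples) / len(samples)
```

For this choice of parameters the two agreeing states each carry Boltzmann weight 1 and the disagreeing states weight e^-1.5, so the units agree in roughly 82% of samples.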
Deshpande A, Gopalkrishnan M, Ouldridge TE, et al., 2017, Designing the Optimal Bit: Balancing Energetic Cost, Speed and Reliability, Proceedings of the Royal Society A: Mathematical, Physical & Engineering Sciences, Vol: 473, ISSN: 1364-5021
We consider the technologically relevant costs of operating a reliable bit that can be erased rapidly. We find that both erasing and reliability times are non-monotonic in the underlying friction, leading to a trade-off between erasing speed and bit reliability. Fast erasure is possible at the expense of low reliability at moderate friction, and high reliability comes at the expense of slow erasure in the underdamped and overdamped limits. Within a given class of bit parameters and control strategies, we define "optimal" designs of bits that meet the desired reliability and erasing time requirements with the lowest operational work cost. We find that optimal designs always saturate the bound on the erasing time requirement, but can exceed the required reliability time if critically damped. The non-trivial geometry of the reliability and erasing time-scales allows us to exclude large regions of parameter space as sub-optimal. We find that optimal designs are either critically damped or close to critical damping under the erasing procedure.
Aryaman J, Hoitzing H, Burgstaller J, et al., 2017, Mitochondrial heterogeneity, metabolic scaling and cell death, Bioessays, Vol: 39, ISSN: 1521-1878
Heterogeneity in mitochondrial content has been previously suggested as a major contributor to cellular noise, with multiple studies indicating its direct involvement in biomedically important cellular phenomena. A recently published dataset explored the connection between mitochondrial functionality and cell physiology, where a non-linearity between mitochondrial functionality and cell size was found. Using mathematical models, we suggest that a combination of metabolic scaling and a simple model of cell death may account for these observations. However, our findings also suggest the existence of alternative competing hypotheses, such as a non-linearity between cell death and cell size. While we find that the proposed non-linear coupling between mitochondrial functionality and cell size provides a compelling alternative to previous attempts to link mitochondrial heterogeneity and cell physiology, we emphasise the need to account for alternative causal variables, including cell cycle, size, mitochondrial density and death, in future studies of mitochondrial physiology.
Fricker MD, Akita D, Heaton LLM, et al., 2017, Automated analysis of Physarum network structure and dynamics, Journal of Physics D: Applied Physics, Vol: 50, ISSN: 0022-3727
Brittain RA, Jones NS, Ouldridge TE, 2017, What we learn from the learning rate, Journal of Statistical Mechanics-Theory and Experiment, Vol: 2017, ISSN: 1742-5468
The learning rate is an information-theoretical quantity for bipartite Markov chains describing two coupled subsystems. It is defined as the rate at which transitions in the downstream subsystem tend to increase the mutual information between the two subsystems, and is bounded by the dissipation arising from these transitions. Its physical interpretation, however, is unclear, although it has been used as a metric for the sensing performance of the downstream subsystem. In this paper we explore the behaviour of the learning rate for a number of simple model systems, establishing when and how its behaviour is distinct from the instantaneous mutual information between subsystems. In the simplest case, the two are almost equivalent. In more complex steady-state systems, the mutual information and the learning rate behave qualitatively distinctly, with the learning rate clearly now reflecting the rate at which the downstream system must update its information in response to changes in the upstream system. It is not clear whether this quantity is the most natural measure for sensor performance, and, indeed, we provide an example in which optimising the learning rate over a region of parameter space of the downstream system yields an apparently sub-optimal sensor.
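The instantaneous mutual information against which the learning rate is compared can be computed directly from a joint distribution. A minimal sketch, assuming a toy sensor that copies a binary environment with 10% error (the distribution is illustrative, not one of the paper's model systems):

```python
import math

def mutual_information(joint):
    """Mutual information (in nats) of a joint distribution given as
    a dict {(x, y): p}; assumes the probabilities sum to 1."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Environment X and a sensor Y that copies X with 10% error.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
mi = mutual_information(joint)
```

Here mi is about 0.37 nats, comfortably below the ln 2 ≈ 0.69 nats a perfect binary sensor would achieve.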
Johnson S, Jones NS, 2017, Looplessness in networks is linked to trophic coherence, Proceedings of the National Academy of Sciences of the United States of America, Vol: 114, Pages: 5618-5623, ISSN: 1091-6490
Many natural, complex systems are remarkably stable thanks to an absence of feedback acting on their elements. When described as networks, these exhibit few or no cycles, and associated matrices have small leading eigenvalues. It has been suggested that this architecture can confer advantages to the system as a whole, such as 'qualitative stability', but this observation does not in itself explain how a loopless structure might arise. We show here that the number of feedback loops in a network, as well as the eigenvalues of associated matrices, are determined by a structural property called trophic coherence, a measure of how neatly nodes fall into distinct levels. Our theory correctly classifies a variety of networks – including those derived from genes, metabolites, species, neurons, words, computers and trading nations – into two distinct regimes of high and low feedback, and provides a null model to gauge the significance of related magnitudes. Since trophic coherence suppresses feedback, whereas an absence of feedback alone does not lead to coherence, our work suggests that the reasons for 'looplessness' in nature should be sought in coherence-inducing mechanisms.
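Trophic coherence is straightforward to compute for a small directed network. The sketch below assigns trophic levels in the standard way (basal nodes with no in-edges sit at level 1; every other node sits one level above the mean level of its in-neighbours) and measures incoherence as the spread of level differences across edges; the example graph is illustrative and assumes a directed acyclic network.

```python
from statistics import pstdev

def trophic_levels(nodes, edges):
    """Trophic levels for a directed acyclic graph (edges run source -> target):
    basal nodes get level 1; others get 1 + mean level of their in-neighbours."""
    level = {}
    remaining = set(nodes)
    while remaining:
        for n in list(remaining):
            preds = [u for (u, v) in edges if v == n]
            if all(p in level for p in preds):
                level[n] = 1.0 if not preds else 1.0 + sum(level[p] for p in preds) / len(preds)
                remaining.discard(n)
    return level

def trophic_incoherence(nodes, edges):
    """q = population std. dev. of level differences across edges;
    q = 0 for a perfectly layered (maximally coherent) network."""
    level = trophic_levels(nodes, edges)
    return pstdev(level[v] - level[u] for (u, v) in edges)

# A perfectly coherent food chain, then the same chain with a level-skipping edge.
chain = [("a", "b"), ("b", "c"), ("c", "d")]
q_chain = trophic_incoherence("abcd", chain)
q_loopy = trophic_incoherence("abcd", chain + [("a", "d")])
```

The plain chain gives q = 0; adding the edge that jumps from the basal node straight to the top predator makes the level differences uneven and q positive.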
Colijn C, Jones N, Johnston I, et al., 2017, Towards precision healthcare: context and mathematical challenges, Frontiers in Physiology, Vol: 8, ISSN: 1664-042X
Precision medicine refers to the idea of delivering the right treatment to the right patient at the right time, usually with a focus on a data-centred approach to this task. In this perspective piece, we use the term "precision healthcare" to describe the development of precision approaches that bridge from the individual to the population, taking advantage of individual-level data, but also taking the social context into account. These problems give rise to a broad spectrum of technical, scientific, policy, ethical and social challenges, and new mathematical techniques will be required to meet them. To ensure that the science underpinning "precision" is robust, interpretable and well-suited to meet the policy, ethical and social questions that such approaches raise, the mathematical methods for data analysis should be transparent, robust and able to adapt to errors and uncertainties. In particular, precision methodologies should capture the complexity of data, yet produce tractable descriptions at the relevant resolution while preserving intelligibility and traceability, so that they can be used by practitioners to aid decision-making. Through several case studies in this domain of precision healthcare, we argue that this vision requires the development of new mathematical frameworks, both in modelling and in data analysis and interpretation.
McGrath T, Jones NS, ten Wolde PR, et al., 2017, Biochemical Machines for the Interconversion of Mutual Information and Work (vol 118, 028101, 2017), Physical Review Letters, Vol: 118, ISSN: 0031-9007
McGrath T, Jones NS, Wolde PRT, et al., 2017, A biochemical machine for the interconversion of mutual information and work, Physical Review Letters, Vol: 118, ISSN: 1079-7114
We propose a physically-realisable biochemical device that is coupled to a biochemical reservoir of mutual information, fuel molecules and a chemical bath. Mutual information allows work to be done on the bath even when the fuel molecules appear to be in equilibrium; alternatively, mutual information can be created by driving from the fuel or the bath. The system exhibits diverse behaviour, including a regime in which the information, despite increasing during the reaction, enhances the extracted work. We further demonstrate that a modified device can function without the need for external manipulation, eliminating the need for a complex and potentially costly control.
Hoitzing H, Johnston IG, Jones NS, 2017, Stochastic models for evolving cellular populations of mitochondria: Disease, development, and ageing, Stochastic Processes, Multiscale Modeling, and Numerical Methods for Computational Cellular Biology, Pages: 287-314, ISBN: 9783319626260
Mitochondria are essential cellular organelles whose dysfunction is associated with ageing, cancer, mitochondrial diseases, and many other disorders. They contain their own genomes (mtDNA), of which thousands can be present in a single cell. These genomes are repeatedly replicated and degraded over time, and are prone to mutations. If the fraction of mutated genomes (heteroplasmy) exceeds a certain threshold, cellular defects can arise. The dynamics of mtDNAs over time and the accumulation of mutant genomes form a rich and vital stochastic process, the understanding of which provides important insights into disease progression. Numerous mathematical models have been constructed to provide a better understanding of how mitochondrial dysfunctions arise and, importantly, how clinical interventions can alleviate disease symptoms. For a given mean heteroplasmy, an increased variance – and thus a wider cell-to-cell heteroplasmy distribution – implies a higher probability of exceeding a given threshold value, meaning that stochastic models are essential to describe mtDNA disease. Mitochondria can undergo fusion and fission events with each other, making the mitochondrial population a dynamic network that continuously changes its morphology and allowing for the possibility of exchange of mtDNA molecules: coupled stochastic physical and genetic dynamics thus govern cellular mtDNA populations. Here, an overview is given of the kinds of stochastic mathematical models constructed to describe mitochondria, their implications, and currently existing open problems.
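The point about variance and thresholds can be made concrete with a back-of-envelope calculation. The sketch below uses a normal approximation to the cell-to-cell heteroplasmy distribution (an illustrative simplification, since real heteroplasmies live on [0, 1]) to show that, at fixed mean load, a wider distribution puts far more cells past a pathogenic threshold:

```python
import math

def p_exceeds_threshold(mean_h, sd_h, threshold):
    """P(heteroplasmy > threshold) under a normal approximation to the
    cell-to-cell heteroplasmy distribution (illustrative only)."""
    z = (threshold - mean_h) / sd_h
    return 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail normal probability

# Same mean mutant load, but a wider distribution: many more cells
# cross an illustrative 60% pathogenic threshold.
narrow = p_exceeds_threshold(0.3, 0.05, 0.6)
wide = p_exceeds_threshold(0.3, 0.20, 0.6)
```

With a standard deviation of 0.05 essentially no cells exceed the threshold, while a standard deviation of 0.20 puts roughly 7% of cells past it, despite the identical mean.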
Johnston IG, Jones NS, 2016, Evolution of cell-to-cell variability in stochastic, controlled, heteroplasmic mtDNA populations, American Journal of Human Genetics, Vol: 99, Pages: 1150-1162, ISSN: 1537-6605
Populations of physiologically vital mitochondrial DNA (mtDNA) molecules evolve in cells under control from the nucleus. The evolution of populations of mixed mtDNA types is complicated and poorly understood, and variability of these controlled admixtures plays a central role in the inheritance and onset of genetic disease. Here, we develop a mathematical theory describing the evolution of, and variability in, these stochastic populations for any type of cellular control, showing that cell-to-cell variability in mtDNA and mutant load inevitably increases with time, according to rates that we derive and which are notably independent of the mechanistic details of feedback signaling. We show with a set of experimental case studies that this theory explains disparate quantitative results from classical and modern experimental and computational research on heteroplasmy variance in different species. We demonstrate that our general model provides a host of specific insights, including a modification of the often-used but hard-to-interpret Wright formula to correspond directly to biological observables, the ability to quantify selective and mutational pressure in mtDNA populations, and characterization of the pronounced variability inevitably arising from the action of possible mtDNA quality-control mechanisms. Our general theoretical framework, supported by existing experimental results, thus helps us to understand and predict the evolution of stochastic mtDNA populations in cell biology.
Larson HJ, de Figueiredo A, Xiahong Z, et al., 2016, The state of vaccine confidence 2016: global insights through a 67-country survey, EBioMedicine, Vol: 12, Pages: 295-301, ISSN: 2352-3964
Background: Public trust in immunization is an increasingly important global health issue. Losses in confidence in vaccines and immunization programmes can lead to vaccine reluctance and refusal, risking disease outbreaks and challenging immunization goals in high- and low-income settings. National and international immunization stakeholders have called for better monitoring of vaccine confidence to identify emerging concerns before they evolve into vaccine confidence crises. Methods: We perform a large-scale, data-driven study on worldwide attitudes to immunizations. This survey – which we believe represents the largest survey on confidence in immunization to date – examines perceptions of vaccine importance, safety, effectiveness, and religious compatibility among 65,819 individuals across 67 countries. Hierarchical models are employed to probe relationships between individual- and country-level socio-economic factors and vaccine attitudes obtained through the four-question, Likert-scale survey. Findings: Overall sentiment towards vaccinations is positive across all 67 countries; however, there is wide variability between countries and across world regions. Vaccine-safety-related sentiment is particularly negative in the European region, which has seven of the ten least confident countries, with 41% of respondents in France and 36% of respondents in Bosnia & Herzegovina reporting that they disagree that vaccines are safe (compared to a global average of 13%). The oldest age group (65+) and Roman Catholics (amongst all faiths surveyed) are associated with positive views on vaccine sentiment, while the Western Pacific region reported the highest level of religious incompatibility with vaccines. Countries with high levels of schooling and good access to health services are associated with lower rates of positive sentiment, pointing to an emerging inverse relationship between vaccine sentiments and socio-economic status. Conclusions: Regular monitoring of vaccine
de Figueiredo A, Johnston IG, Smith DM, et al., 2016, Forecasted trends in vaccination coverage and correlations with socioeconomic factors: a global time-series analysis over 30 years., Lancet Global Health, Vol: 4, Pages: e726-e735, ISSN: 2214-109X
BACKGROUND: Incomplete immunisation coverage causes preventable illness and death in both developing and developed countries. Identification of factors that might modulate coverage could inform effective immunisation programmes and policies. We constructed a performance indicator that could quantitatively approximate measures of the susceptibility of immunisation programmes to coverage losses, with an aim to identify correlations between trends in vaccine coverage and socioeconomic factors. METHODS: We undertook a data-driven time-series analysis to examine trends in coverage of diphtheria, tetanus, and pertussis (DTP) vaccination across 190 countries over the past 30 years. We grouped countries into six world regions according to WHO classifications. We used Gaussian process regression to forecast future coverage rates and provide a vaccine performance index: a summary measure of the strength of immunisation coverage in a country. FINDINGS: Overall vaccine coverage increased in all six world regions between 1980 and 2010, with variation in volatility and trends. Our vaccine performance index identified that 53 countries had more than a 50% chance of missing the Global Vaccine Action Plan (GVAP) target of 90% worldwide coverage with three doses of DTP (DTP3) by 2015. These countries were mostly in sub-Saharan Africa and south Asia, but Austria and Ukraine also featured. Factors associated with DTP3 immunisation coverage varied by world region: personal income (Spearman's ρ=0·66, p=0·0011) and government health spending (0·66, p<0·0001) were informative of immunisation coverage in the Eastern Mediterranean between 1980 and 2010, whereas primary school completion was informative of coverage in Africa (0·56, p<0·0001) over the same period. The proportion of births attended by skilled health staff correlated significantly with immunisation coverage across many world regions. INTERPRETATION: Our vaccine performance index
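The Spearman rank correlations quoted in the findings are simple to compute. A self-contained sketch, assuming no tied values; the country-level numbers below are hypothetical, not the study's data:

```python
def spearman_rho(x, y):
    """Spearman's rank correlation coefficient (assumes no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical country-level data: income per head vs DTP3 coverage (%).
income = [1200, 3400, 8100, 15000, 30000, 41000]
coverage = [62, 71, 68, 85, 93, 97]
rho = spearman_rho(income, coverage)
```

Because the statistic is rank-based, it captures any monotonic association, which is why it suits noisy socioeconomic covariates; here rho comes out near 0.94.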
de Figueiredo A, Johnston IG, Smith DMD, et al., 2016, Forecasting time-series trends in vaccination coverage and their links with socio-economic factors: A global analysis over 30 years, Lancet Global Health, ISSN: 2214-109X
Background: Incomplete immunisation coverage causes preventable illness and death in both the developing and developed world. Identifying factors that may modulate coverage can inform effective immunisation programmes and policies. Methods: We perform a data-driven analysis of unprecedented scale, examining time-varying trends in diphtheria-tetanus-pertussis coverage across 190 countries over the past three decades. Gaussian process regression is employed to forecast future coverage rates and provide a Vaccine Performance Index: a summary measure of the strength of immunisation coverage in a country. Findings: Overall vaccine coverage has increased in all five world regions between 1980 and 2010, with marked variation in volatility and trends. Our Vaccine Performance Index identifies 53 countries with a more than 50% chance of missing the Global Vaccine Action Plan (GVAP) target of 90% worldwide DTP3 coverage by 2015, in agreement with recent immunisation data. These countries are mostly sub-Saharan and South Asian, but Austria and Ukraine in Europe also feature. Factors associated with DTP3 immunisation coverage vary by world region: personal income (ρ = 0.66, p < 0.001) and government health spending (ρ = 0.66, p < 0.01) are particularly informative in the Eastern Mediterranean between 1980 and 2010, whilst primary school completion is informative in Africa (ρ = 0.56, p < 0.001) over the same time. The fraction of births attended by skilled health staff is significantly informative across many world regions. Interpretation: A Vaccine Performance Index can highlight countries at risk by identifying the strength and resilience of immunisation programmes. Weakening correlations with socio-economic factors indicate a need to tackle vaccine confidence, whereas strengthening correlations point to clear factors to address.
Heaton L, Jones NS, Fricker M, 2015, Energetic constraints on fungal growth, American Naturalist, Vol: 187, ISSN: 1537-5323
Saprotrophic fungi are obliged to spend energy on growth, reproduction and substrate digestion. To understand the trade-offs involved, we developed a model which, for any given growth rate, identifies the strategy that maximises the fraction of energy that could possibly be spent on reproduction. Our model's predictions of growth rates and bioconversion efficiencies are consistent with empirical findings, and it predicts the optimal investment in reproduction, resource acquisition and biomass recycling for a given environment and time scale of reproduction. Thus if the timescale of reproduction is large compared to the time required for the fungus to double in size, the model suggests that the total energy available for reproduction is maximal when a very small fraction of the energy budget is spent on reproduction. The model also suggests that fungi growing on substrates with a high concentration of low molecular weight compounds will not benefit from recycling: they should be able to grow more rapidly and allocate more energy to reproduction without recycling. In contrast, recycling offers considerable benefits to fungi growing on recalcitrant substrates, where the individual hyphae are not crowded, and the time taken to consume resource is significantly longer than the fungus doubling time.
Johnston IG, Jones NS, 2015, Closed-form stochastic solutions for non-equilibrium dynamics and inheritance of cellular components over many cell divisions, Proceedings of the Royal Society A: Mathematical, Physical & Engineering Sciences, Vol: 471, ISSN: 1364-5021
Stochastic dynamics govern many important processes in cellular biology, and an underlying theoretical approach describing these dynamics is desirable to address a wealth of questions in biology and medicine. Mathematical tools exist for treating several important examples of these stochastic processes, most notably gene expression and random partitioning at single-cell divisions or after a steady state has been reached. Comparatively little work exists exploring different and specific ways that repeated cell divisions can lead to stochastic inheritance of unequilibrated cellular populations. Here we introduce a mathematical formalism to describe cellular agents that are subject to random creation, replication and/or degradation, and are inherited according to a range of random dynamics at cell divisions. We obtain closed-form generating functions describing systems at any time after any number of cell divisions for binomial partitioning and divisions provoking a deterministic or random, subtractive or additive change in copy number, and show that these solutions agree exactly with stochastic simulation. We apply this general formalism to several example problems involving the dynamics of mitochondrial DNA during development and organismal lifetimes.
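The binomial-partitioning case treated analytically here is easy to check by simulation: in expectation the copy number is preserved across a double-then-halve division cycle, while partitioning noise accumulates variance each generation. A toy sketch with deterministic doubling between divisions (all parameters illustrative):

```python
import random

def partition_lineage(n0=200, divisions=8, trials=500, seed=2):
    """Follow one daughter lineage: copy number doubles (deterministically,
    for simplicity) between divisions, then is binomially partitioned.
    Returns mean and variance of the final copy number over many lineages."""
    rng = random.Random(seed)
    finals = []
    for _ in range(trials):
        n = n0
        for _ in range(divisions):
            n *= 2                                        # replication phase
            n = sum(rng.random() < 0.5 for _ in range(n))  # binomial split
        finals.append(n)
    mean = sum(finals) / trials
    var = sum((f - mean) ** 2 for f in finals) / trials
    return mean, var

mean_n, var_n = partition_lineage()
```

For this toy model each division adds n0/2 = 100 to the variance while leaving the mean at n0, so after 8 divisions the simulated variance should sit near 800 with the mean still near 200.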
Johnston IG, Burgstaller JP, Havlicek V, et al., 2015, Stochastic modelling, Bayesian inference, and new in vivo measurements elucidate the debated mtDNA bottleneck mechanism, eLife, Vol: 4, ISSN: 2050-084X
Dangerous damage to mitochondrial DNA (mtDNA) can be ameliorated during mammalian development through a highly debated mechanism called the mtDNA bottleneck. Uncertainty surrounding this process limits our ability to address inherited mtDNA diseases. We produce a new, physically motivated, generalisable theoretical model for mtDNA populations during development, allowing the first statistical comparison of proposed bottleneck mechanisms. Using approximate Bayesian computation and mouse data, we find most statistical support for a combination of binomial partitioning of mtDNAs at cell divisions and random mtDNA turnover, meaning that the debated exact magnitude of mtDNA copy number depletion is flexible. New experimental measurements from a wild-derived mtDNA pairing in mice confirm the theoretical predictions of this model. We analytically solve a mathematical description of this mechanism, computing probabilities of mtDNA disease onset, efficacy of clinical sampling strategies, and effects of potential dynamic interventions, thus developing a quantitative and experimentally-supported stochastic theory of the bottleneck.
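Approximate Bayesian computation in its simplest rejection form is a short algorithm. The sketch below applies it to a toy binomial inference problem, a generic illustration of the scheme rather than the paper's bottleneck model or data:

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_draws, seed=3):
    """Rejection-sampling ABC: keep parameter draws whose simulated data
    fall within eps of the observation."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) <= eps:
            accepted.append(theta)
    return accepted

# Toy problem: infer a binomial success probability from one observed count.
n, observed = 100, 37
post = abc_rejection(
    observed,
    simulate=lambda p, rng: sum(rng.random() < p for _ in range(n)),
    prior_sample=lambda rng: rng.random(),   # Uniform(0, 1) prior
    distance=lambda sim, obs: abs(sim - obs),
    eps=2, n_draws=10000)
estimate = sum(post) / len(post)
```

The accepted draws approximate the posterior; their mean lands near the true maximum-likelihood value 0.37. The same structure, with a stochastic developmental simulation in place of the binomial draw, underlies likelihood-free comparison of mechanistic models.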
Hoitzing H, Johnston IG, Jones NS, 2015, What is the function of mitochondrial networks? A theoretical assessment of hypotheses and proposal for future research, Bioessays, Vol: 37, Pages: 687-700, ISSN: 1521-1878
Mitochondria can change their shape from discrete isolated organelles to a large continuous reticulum. The cellular advantages underlying these fused networks are still incompletely understood. In this paper, we describe and compare hypotheses regarding the function of mitochondrial networks. We use mathematical and physical tools both to investigate existing hypotheses and to generate new ones, and we suggest experimental and modelling strategies. Among the novel insights we underline from this work are the possibilities that (i) selective mitophagy is not required for quality control because selective fusion is sufficient; (ii) increased connectivity may have non-linear effects on the diffusion rate of proteins; and (iii) fused networks can act to dampen biochemical fluctuations. We hope to convey to the reader that quantitative approaches can drive advances in the understanding of the physiological advantage of these morphological changes.
de Figueiredo A, Johnston I, Smith DMD, et al., 2015, Changing socioeconomic determinants of childhood vaccines: a global analysis over three decades, Lancet Global Health, Vol: 3, Pages: 20-20, ISSN: 2214-109X
King J, Jones N, King J, 2015, Simulation of information spreading following a crisis, City Evacuations: An Interdisciplinary Approach, Pages: 39-62, ISBN: 9783662438763
In this chapter we consider how information about a crisis spreads. We consider scenarios, and models thereof, which are variants of the susceptible/infected model from epidemiology. The populace is initially unaware that a crisis has occurred. When the crisis begins, awareness that a crisis has occurred spreads throughout the populace via a combination of broadcast media and social feedback; eventually the entire populace becomes aware of the crisis. We investigate transitions in our models from a completely unaware populace to a completely aware populace, focusing particularly on the speed of the process and the relative impact of different media types. Our models' behaviour depends heavily on the input parameters which dictate the strengths of different spreading mechanisms. As much as possible we draw values for these parameters from real data. These parameters vary significantly depending on the time of day. For example, the number of people who become aware almost immediately because they are tuned in to broadcast media when the crisis occurs ranges from about 2% to about 47%. In addition, the timescale on which an alert unfolds means that our models should incorporate dynamic parameters, i.e., parameters that change as the alert unfolds. With regard to the relative impact of different media types, we note that, within our model, social media such as Facebook and Twitter are much less important than traditional media, primarily by virtue of their smaller audience and less frequent use. We also identify a critical timescale: the length of time it takes someone with the TV/Radio on to realize there is a crisis and then to relate it to someone else. This realize-and-relate timescale is likely to have an important role in shaping the early course of events in daytime crisis spreading.
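The broadcast-plus-social-feedback mechanism described above can be written as a one-line rate equation, da/dt = beta*(1 - a) + gamma*a*(1 - a), where a is the aware fraction, beta the broadcast rate and gamma the social contact rate. A minimal Euler-integration sketch with illustrative (not the chapter's fitted) parameter values:

```python
def awareness_curve(a0=0.02, beta=0.05, gamma=0.5, dt=0.1, t_max=240.0):
    """Euler integration of a minimal awareness model in the spirit of the
    chapter's SI-type variants: broadcast media reach unaware people at rate
    beta, and aware people inform unaware contacts at rate gamma * a."""
    a, t, curve = a0, 0.0, []
    while t <= t_max:
        curve.append((t, a))
        a += dt * (beta * (1 - a) + gamma * a * (1 - a))
        t += dt
    return curve

# Start with 2% already tuned in to broadcast media when the crisis begins.
curve = awareness_curve()
final = curve[-1][1]
```

Awareness rises monotonically towards full saturation; raising gamma relative to beta shifts the curve from the slow exponential approach of broadcast-only spreading to the sharp logistic take-off characteristic of word-of-mouth.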
Evacuating a city is a complex problem that involves issues of governance, preparedness education, warning, information sharing, population dynamics, resilience and recovery. As natural and anthropogenic threats to cities grow, it is an increasingly pressing problem for policy makers and practitioners. The book is the result of a unique interdisciplinary collaboration between researchers in the physical and social sciences to consider how an interdisciplinary approach can help plan for large-scale evacuations. It draws on perspectives from physics, mathematics, organisation theory, economics, sociology and education. Importantly, it goes beyond disciplinary boundaries and considers how interdisciplinary methods are necessary to approach a complex problem involving human actors and increasingly complex communications and transportation infrastructures. Using real-world case studies and modelling, the book considers new approaches to evacuation dynamics. It addresses questions of complexity, not only in terms of theory, but also by examining the latest challenges for cities and emergency responders. Factors such as social media, information quality and visualisation techniques are examined to consider the 'new' dynamics of warning and informing, evacuation and recovery.
Johnston IG, Rickett BC, Jones NS, 2014, Explicit Tracking of Uncertainty Increases the Power of Quantitative Rule-of-Thumb Reasoning in Cell Biology, BIOPHYSICAL JOURNAL, Vol: 107, Pages: 2612-2617, ISSN: 0006-3495
Schwarzlaender M, Wagner S, Ermakova YG, et al., 2014, The 'mitoflash' probe cpYFP does not respond to superoxide, NATURE, Vol: 514, Pages: E12-E14, ISSN: 0028-0836
El Zawily AM, Schwarzlaender M, Finkemeier I, et al., 2014, FRIENDLY Regulates Mitochondrial Distribution, Fusion, and Quality Control in Arabidopsis, PLANT PHYSIOLOGY, Vol: 166, Pages: 808-U517, ISSN: 0032-0889
Burgstaller JP, Johnston IG, Jones NS, et al., 2014, mtDNA Segregation in Heteroplasmic Tissues Is Common In Vivo and Modulated by Haplotype Differences and Developmental Stage, CELL REPORTS, Vol: 7, Pages: 2031-2041, ISSN: 2211-1247
Fulcher BD, Little MA, Jones NS, 2013, Highly comparative time-series analysis: the empirical structure of time series and their methods, Journal of the Royal Society Interface, Vol: 10, ISSN: 1742-5689
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, we analyse annotated collections of over 35 000 real-world and model-generated time series and over 9000 time-series analysis algorithms. We introduce reduced representations both of time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
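The core idea — representing each time series by a vector of behavioural features and then organizing or retrieving series by distance in that feature space — can be sketched in a few lines. This is a toy illustration, not the paper's hctsa feature set: the four features chosen here (mean, standard deviation, lag-1 autocorrelation, fraction of upward moves) and the helper names are assumptions for the example.

```python
import statistics as st

def features(ts):
    """Tiny feature vector summarizing a time series: mean, standard
    deviation, lag-1 autocorrelation, and fraction of upward moves."""
    n = len(ts)
    mu = st.fmean(ts)
    sd = st.pstdev(ts)
    if sd == 0:
        ac1 = 0.0
    else:
        ac1 = sum((ts[i] - mu) * (ts[i + 1] - mu)
                  for i in range(n - 1)) / (n * sd * sd)
    up = sum(1 for i in range(n - 1) if ts[i + 1] > ts[i]) / (n - 1)
    return (mu, sd, ac1, up)

def nearest(target, library):
    """Index of the library series whose feature vector lies closest
    (Euclidean distance) to the target's — a toy version of retrieving
    similar data by behaviour rather than by origin or name."""
    ft = features(target)
    def dist(ts):
        fs = features(ts)
        return sum((a - b) ** 2 for a, b in zip(ft, fs)) ** 0.5
    return min(range(len(library)), key=lambda i: dist(library[i]))
```

In this feature space a smooth oscillation sits far from white noise (high versus near-zero lag-1 autocorrelation), so `nearest` retrieves behaviourally similar series regardless of which domain produced them — the same principle, at vastly larger scale, behind the paper's automatic organization of 35 000 series.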