PeerJ Preprints: Mathematical Biology
https://peerj.com/preprints/index.atom?journal=peerj&subject=1900
Mathematical Biology articles published in PeerJ Preprints

Coccolith arrangement follows Eulerian mathematics in the coccolithophore Emiliania huxleyi
https://peerj.com/preprints/3457 (2017-12-10)
Kai Xu, David Hutchins, Kunshan Gao
Background. The globally abundant coccolithophore, Emiliania huxleyi, plays an important ecological role in oceanic carbon biogeochemistry by forming a cellular covering of plate-like CaCO3 crystals (coccoliths) and fixing CO2. It is unknown how the cells arrange different sizes of coccoliths to maintain full coverage as the cell surface area changes due to growth and cell division.
Methods. We used Euler’s polyhedron formula and the simulation software CaGe, validated with the geometries of coccoliths, to analyse the coccolith topology of the coccosphere and the arrangement mechanism.
Results. The cells arrange each of the coccoliths to interlock with 4–6 others to keep pace with cell growth and cell division.
Conclusions. This study represents an example of how natural selection has arrived at a solution based on Euler’s polyhedral formula in response to the challenge of maintaining a CaCO3 covering on coccolithophore cells as cell size changes.
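The topological claim above can be connected to Euler’s polyhedron formula V − E + F = 2. A minimal sketch (not the authors’ CaGe workflow), assuming a trivalent covering in which exactly three plates meet at every vertex:

```python
def euler_characteristic(face_sides):
    """V - E + F for a closed trivalent covering, given the number of
    sides of each face; assumes three faces meet at every vertex."""
    F = len(face_sides)
    E = sum(face_sides) // 2   # every edge is shared by two faces
    V = sum(face_sides) // 3   # every vertex is shared by three faces
    return V - E + F

# Trivalent sphere tilings always give 2, e.g. 12 pentagons + 20 hexagons:
print(euler_characteristic([5] * 12 + [6] * 20))   # -> 2
print(euler_characteristic([4] * 6))               # cube: -> 2
```

One consequence: any trivalent covering built only from 5- and 6-sided contacts must contain exactly twelve 5-sided faces, no matter how large the sphere grows.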
Cancer growth, metastasis and control likewise Go gaming: an Ising model approach
https://peerj.com/preprints/3434 (2017-11-29)
Didier Barradas-Bautista, Matías Alvarado, Germinal Cocho, Mark Agostino
This work aims to model and simulate the metastasis of cancer via an analogy between the cancer process and the board game Go. In the game of Go, black stones play first; their placement could correspond to cancer metastasis. Playing white stones on the second turn would then correspond to the inhibition of cancer invasion. Mathematical modeling and algorithmic simulation of Go may therefore benefit efforts to deploy therapies against cancer by providing insight into cellular growth and expansion over a tissue area. In this paper, we use the Ising Hamiltonian, an energy model describing the energy exchange among interacting particles, to model cancer dynamics. Parameters in the energy function refer to the biochemical elements that induce cancer metastasis, as well as the biochemical processes of the immune response.
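As a concrete illustration of the Ising energy model invoked here, a minimal Metropolis sketch on a small periodic lattice (generic textbook Ising; the parameter values and any mapping to Go stones or cell states are invented, not taken from the paper):

```python
import math
import random

random.seed(1)

J, h, T = 1.0, 0.0, 2.0   # coupling, external field, temperature (invented)
N = 16                    # lattice side length

def energy_delta(s, i, j):
    """Energy change from flipping spin (i, j) on a periodic lattice,
    for the Hamiltonian H = -J * sum(s_i * s_j) - h * sum(s_i)."""
    nb = (s[(i + 1) % N][j] + s[(i - 1) % N][j] +
          s[i][(j + 1) % N] + s[i][(j - 1) % N])
    return 2 * s[i][j] * (J * nb + h)

def metropolis(steps=20000):
    """Random lattice evolved with single-spin Metropolis updates."""
    s = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(N)]
    for _ in range(steps):
        i, j = random.randrange(N), random.randrange(N)
        dE = energy_delta(s, i, j)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    return s

lattice = metropolis()
magnetisation = abs(sum(sum(row) for row in lattice)) / N ** 2
```

In the paper’s analogy, a spin flip would correspond to a site being claimed by invading or inhibiting cells; the Hamiltonian’s parameters carry the biochemical interpretation.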
The mathematics of extinction across scales: from populations to the biosphere
https://peerj.com/preprints/3367 (2017-10-24)
Colin Carlson, Kevin Burgio, Tad Dallas, Wayne Getz
The sixth mass extinction poses an unparalleled quantitative challenge to conservation biologists. Mathematicians and ecologists alike face the problem of developing models that can scale predictions of extinction rates from populations to the level of a species, or even to an entire ecosystem. We review some of the most basic stochastic and analytical methods of calculating extinction risk at different scales, including population viability analysis, stochastic metapopulation occupancy models, and the species-area relationship. We also consider two major extensions of theory: the possibility of evolutionary rescue from extinction in a changing environment, and the posthumous assignment of an extinction date from sighting records. In the case of the latter, we provide a new example using data on Spix's macaw (Cyanopsitta spixii), the "rarest bird in the world," to demonstrate the challenges associated with extinction date research.
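One of the simplest sighting-record tools in this literature treats sightings as a stationary Poisson process: given n sightings on (0, T] with the last at time t_n, the probability that a persisting species would show no sighting after t_n is (t_n/T)^n (Solow’s classic test). A sketch with invented sighting years, not the Spix's macaw record:

```python
def solow_p(sighting_times, T):
    """Solow's stationary-Poisson test: p-value for persistence to
    time T, given sightings on (0, T] with the last at t_n."""
    n = len(sighting_times)
    t_n = max(sighting_times)
    return (t_n / T) ** n

# Invented sighting years, rescaled so observation starts at year 0.
print(solow_p([3, 8, 15, 22, 30, 41], T=70))   # ~0.04, suggesting extinction
```

A small p-value means the long gap since the last sighting is unlikely under continued persistence; more elaborate models relax the constant-rate assumption.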
On the exponent in the von Bertalanffy growth model
https://peerj.com/preprints/3303 (2017-09-29)
Katharina Renner-Martin, Norbert Brunner, Manfred Kühleitner, Georg Nowak, Klaus Scheicher
Bertalanffy proposed the differential equation m′(t) = p·m(t)^a − q·m(t) for the description of the mass growth of animals as a function m(t) of time t. He suggested that the solution using the metabolic scaling exponent a = 2/3 (the von Bertalanffy growth function, VBGF) would be universal for vertebrates. Several authors questioned this universality, as for certain species other models provide a better fit. This paper reconsiders the question. Using the Akaike information criterion, it proposes a testable definition of ‘weak universality’ for a taxonomic group of species (roughly, that the model has an acceptable fit to most data sets of that group). This definition was applied to 60 data sets from the literature (37 on fish and 23 on non-fish species), and for each data set an optimal metabolic scaling exponent 0 ≤ a_opt < 1 was identified at which the model function m(t) achieved the best fit to the data. Although this optimal exponent generally differed widely from a = 2/3, the VBGF was weakly universal for fish, but not for non-fish. This observation supports the conjecture that the growth pattern of fish may be distinct. The paper discusses this conjecture.
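For the exponent a = 2/3 the equation has a closed-form solution, since u = m^{1/3} satisfies the linear equation u′ = (p − q·u)/3. A quick check of this solution against direct numerical integration (parameter values invented for illustration):

```python
import math

def vbgf(t, m0, p, q):
    """Closed-form solution of m'(t) = p*m^(2/3) - q*m:
    u = m^(1/3) follows the linear equation u' = (p - q*u)/3."""
    u0 = m0 ** (1 / 3)
    u = p / q + (u0 - p / q) * math.exp(-q * t / 3)
    return u ** 3

def euler_solve(t_end, m0, p, q, dt=1e-4):
    """Direct Euler integration of the same ODE, for comparison."""
    m, t = m0, 0.0
    while t < t_end:
        m += dt * (p * m ** (2 / 3) - q * m)
        t += dt
    return m

m0, p, q = 1.0, 3.0, 0.5     # invented parameter values
print(vbgf(10, m0, p, q), euler_solve(10, m0, p, q))
```

The asymptotic mass is (p/q)^3, the familiar growth plateau of the VBGF.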
Genome rearrangements and phylogeny reconstruction in Yersinia pestis
https://peerj.com/preprints/3223 (2017-09-05)
Olga O Bochkareva, Natalia O Dranenko, Elena S Ocheredko, German M Kanevsky, Yaroslav N Lozinsky, Vera A Khalaycheva, Irena I Artamonova, Mikhail S Gelfand
Genome rearrangements have played an important role in the evolution of Yersinia pestis from its progenitor Yersinia pseudotuberculosis. Traditional phylogenetic trees for Y. pestis based on sequence comparison have short internal branches and low bootstrap supports as only a small number of nucleotide substitutions have occurred. On the other hand, even a small number of genome rearrangements may resolve topological ambiguities in a phylogenetic tree.
We reconstructed the evolutionary history of genome rearrangements in Y. pestis. We also reconciled phylogenetic trees for each of the three CRISPR loci to obtain an integrated scenario of CRISPR-cassette evolution. We detected numerous parallel inversions and gain/loss events through the analysis of contradictions between the obtained evolutionary trees. We also tested the hypotheses that large within-replichore inversions tend to be balanced by subsequent reversal events and that core genes are less frequently switched between strands by inversions. Neither prediction was confirmed.
Our data indicate that an integrated analysis of sequence-based and inversion-based trees enhances the resolution of phylogenetic reconstruction. In contrast, reconstructions of strain relationships based solely on CRISPR loci may not be reliable, as the history is obscured by large deletions that obliterate the order of spacer gains. Similarly, numerous parallel gene losses preclude reconstruction of the phylogeny based on gene content.
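A toy example of the kind of signal rearrangements carry: the breakpoint distance between two signed gene orders counts adjacencies present in one genome but not the other (a deliberately simplified sketch, not the reconstruction pipeline used in the study):

```python
def adjacencies(order):
    """Adjacent pairs of a signed gene order, with 0 / n+1 end caps."""
    capped = [0] + list(order) + [len(order) + 1]
    return {(a, b) for a, b in zip(capped, capped[1:])}

def breakpoints(a, b):
    """Count adjacencies of `a` absent from `b` in either reading direction."""
    adj_b = adjacencies(b)
    adj_b |= {(-y, -x) for (x, y) in adj_b}   # reversed-orientation copies
    return sum(1 for adj in adjacencies(a) if adj not in adj_b)

# One inversion of the segment (2, 3) creates two breakpoints:
print(breakpoints([1, -3, -2, 4], [1, 2, 3, 4]))   # -> 2
```

Because each inversion can create at most two breakpoints, breakpoint counts give a quick lower bound on rearrangement distance, which is why even a few inversions can resolve short internal branches.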
Integrating ecology and evolution to study hypothetical dynamics of algal blooms and Muller’s ratchet using Evolvix
https://peerj.com/preprints/3218 (2017-09-01)
Sarah Northey, Courtney Hove, Justine Kao, Jon Ide, Janel McKinney, Laurence Loewe
Algal blooms have been the subject of considerable research, as they occur over various spatial and temporal scales and can produce toxins that disrupt their ecosystem. Algal blooms are often governed by nutrient availability; however, other limitations exist. Algae are primary producers and therefore subject to predation, which can keep populations below the levels supported by nutrient availability. If algae as prey mutate to gain the ability to produce toxins deterring predators, they may increase their survival rates and form blooms, unless other factors counter their effective increase in growth rate. Where might such mutations come from? Clearly, large populations of algae will repeatedly experience mutations knocking out DNA repair genes, increasing mutation rates, and with them the chance of acquiring de novo mutations producing a toxin against predators. We investigate this hypothetical scenario by simulation in the Evolvix modeling language. We modeled a sequence of steps that can in principle allow a typical asexual algal population to escape predation pressure and form a bloom with the help of mutators. We then turn our attention to the unavoidable side effect of generally increased mutation rates: many slightly deleterious mutations. If these accumulate at sufficient speed, their combined impact on fitness might place upper limits on the duration of algal blooms. These steps are required: (1) Random mutations result in the loss of DNA repair mechanisms. (2) Increased mutation rates make it more likely to acquire the ability to produce toxins by altering metabolism. (3) Toxins deter predators, providing algae with growth advantages that can mask linked slightly deleterious mutational effects. (4) Reduced predation pressure enables blooms if algae have sufficient nutrients. (5) Lack of recombination results in the accumulation of slightly deleterious mutations, as predicted by Muller’s ratchet.
(6) If fast enough, deleterious mutation accumulation eventually leads to mutational meltdown of toxic blooming algae. (7) Non-mutator algal populations are not affected due to ongoing predation pressure. Our simulation models integrate ecological continuous-time dynamics of predator-prey systems with the population genetics of a simplified Muller’s ratchet model using Evolvix. Evolvix maps these models to Continuous-Time Markov Chain models that can be simulated deterministically or stochastically depending on the question. The current model is incomplete; we plan to investigate many parameter combinations to produce a more robust model ensemble with stable links to reasonable parameter estimates. However, our model already has several intriguing features that may allow for the eventual development of observation methods for monitoring ecosystem health. Our work also highlights a growing need to simulate integrated models combining ecological processes, multi-level population dynamics, and evolutionary genetics in a single computational run.
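The ecological half of such a model can be caricatured in plain Python (not Evolvix) with predator-prey dynamics in which toxin production lowers the predation coefficient; the logistic prey growth, all parameter values, and the factor-of-two toxin effect are invented for illustration:

```python
def predator_prey(beta, steps=300000, dt=1e-3):
    """Euler integration of logistic prey x and predator y:
    x' = alpha*x*(1 - x/K) - beta*x*y,  y' = delta*beta*x*y - gamma*y.
    A smaller beta models toxin-deterred predators."""
    alpha, K, delta, gamma = 1.0, 10.0, 0.5, 0.4
    x, y = 2.0, 1.0
    for _ in range(steps):
        dx = alpha * x * (1 - x / K) - beta * x * y
        dy = delta * beta * x * y - gamma * y
        x += dt * dx
        y += dt * dy
    return x, y

x_plain, _ = predator_prey(beta=0.8)   # non-toxic prey
x_toxic, _ = predator_prey(beta=0.4)   # toxin halves the predation rate
print(x_plain, x_toxic)
```

At the coexistence equilibrium the prey density is gamma/(delta*beta), so halving the predation rate doubles the standing algal population in this caricature; the full model would layer mutator genetics and Muller’s ratchet on top of this.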
An invitation to modeling: building a community with shared explicit practices
https://peerj.com/preprints/3215 (2017-09-01)
Kam D Dahlquist, Melissa L Aikens, Joseph T Dauer, Samuel S Donovan, Carrie Diaz Eaton, Hannah Callender Highlander, Kristin P Jenkins, John R Jungck, M Drew LaMar, Glenn Ledder, Robert L Mayes, Richard C Schugart
Models and the process of modeling are fundamental to the discipline of biology, and therefore should be incorporated into undergraduate biology courses. In this essay, we draw upon the literature and our own teaching experiences to provide practical suggestions for how to introduce models and modeling to introductory biology students. We begin by demonstrating the ubiquity of models in biology, including representations of the process of science itself. We advocate for a model of the process of science that highlights parallel tracks of mathematical and experimental modeling investigations. With this recognition, we suggest ways in which instructors can call students’ attention to biological models more explicitly by using modeling language, facilitating metacognition about the use of models, and employing model-based reasoning. We then provide guidance on how to begin to engage students in the process of modeling, encouraging instructors to scaffold a progression to mathematical modeling. We use the Hardy-Weinberg Equilibrium model to provide specific pedagogical examples that illustrate our suggestions. We propose that by making even a small shift in the way models and modeling are discussed in the classroom, students will gain understanding of key biological concepts, practice realistic scientific inquiry, and build quantitative and communication skills.
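For the Hardy-Weinberg example, the core computation students would implement is tiny (a generic sketch, not material from the essay): allele frequency p maps to expected genotype frequencies p², 2pq, q².

```python
def hardy_weinberg(p):
    """Expected genotype frequencies (AA, Aa, aa) under
    Hardy-Weinberg equilibrium for allele frequency p of A."""
    q = 1.0 - p
    return p ** 2, 2 * p * q, q ** 2

hom_dom, het, hom_rec = hardy_weinberg(0.6)
print(hom_dom, het, hom_rec)   # approximately 0.36, 0.48, 0.16
```

Comparing these expectations against observed genotype counts is a natural first exercise in the experimental-versus-mathematical modeling tracks the essay advocates.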
Sicegar: R package for sigmoidal and double-sigmoidal curve fitting
https://peerj.com/preprints/3116 (2017-07-31)
M. Umut Caglar, Ashley I. Teufel, Claus O Wilke
Sigmoidal and double-sigmoidal dynamics are commonly observed in many areas of biology. Here we present sicegar, an R package for the automated fitting and classification of sigmoidal and double-sigmoidal data. The package categorizes data into one of three categories, "no signal", "sigmoidal", or "double sigmoidal", by rigorously fitting a series of mathematical models to the data. The data are labeled as "ambiguous" if neither the sigmoidal nor the double-sigmoidal model fits the data well. In addition to performing the classification, the package also reports a wealth of metrics, as well as biologically meaningful parameters describing the sigmoidal or double-sigmoidal curves. In extensive simulations, we find that the package performs well, can recover the original dynamics even under fairly high noise levels, and will typically classify curves as "ambiguous" rather than misclassify them. The package is available on CRAN and comes with extensive documentation and usage examples.
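The kind of fit sicegar automates can be sketched in a few lines of Python with scipy (a generic three-parameter logistic, not sicegar's actual model definitions or parameterisation):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoidal(x, ymax, x_mid, slope):
    """Three-parameter logistic curve."""
    return ymax / (1.0 + np.exp(-slope * (x - x_mid)))

# Synthetic noise-free data with a known midpoint at x = 5.
x = np.linspace(0, 10, 50)
y = sigmoidal(x, 2.0, 5.0, 1.5)

params, _ = curve_fit(sigmoidal, x, y, p0=[1.0, 4.0, 1.0])
print(params)   # close to [2.0, 5.0, 1.5]
```

Classification would then compare the residuals of this fit against those of a constant ("no signal") and a double-sigmoidal model, which is essentially what the package does with considerably more rigor.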
Are extinction opinions extinct?
https://peerj.com/preprints/3095 (2017-07-18)
Tamsin E Lee, Clive Bowman, David L Roberts
Extinction models vary in the information they require, the simplest considering only the rate of certain sightings. More complicated methods include uncertain sightings and allow for variation in the reliability of uncertain sightings. Generally, extinction models require expert opinion, either as a prior belief that a species is extinct, or to establish the quality of a sighting record, or both. Is this subjectivity necessary?
We present two models to explore whether the individual quality of sightings, judged by experts, is strongly informative of the probability of extinction: the `quality breakpoint method' and the `quality as variance method'. For the first method we use the Barbary lion as an exemplar. For the second method we use the Barbary lion, Alaotra grebe, Jamaican petrel and Pohnpei starling as exemplars.
The `quality breakpoint method' uses certain and uncertain sighting records, and the quality of uncertain records, to establish whether a change point in the rate of sightings can be identified using a simultaneous Bayesian optimisation with a non-informative prior. For the Barbary lion, there is a change in the subjective quality of sightings around 1930. Unexpectedly, sighting quality increases after this date. This suggests that including quality scores from experts can lead to irregular effects and may not offer reliable results. As an alternative, we use quality as a measure of variance around the sightings, not a change in quality. This leads to predictions with larger standard deviations; however, the results remain consistent across any prior belief of extinction. Nonetheless, replacing actual quality scores with random quality scores made little difference, implying that the quality scores from experts are superfluous.
Therefore, we deem the expensive process of obtaining pooled expert estimates unnecessary, and even when it is used we recommend that sighting data receive minimal input from experts in assessing sighting quality at a fine scale. Rather, sightings should be classed as certain or uncertain, using a framework that is as independent of human bias as possible.
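The breakpoint idea can be sketched without the Bayesian machinery: choose the year that maximises the likelihood of a two-rate Poisson model of annual sighting counts (a deliberately simplified stand-in for the paper's simultaneous Bayesian optimisation; the sighting record is invented):

```python
import math

def poisson_loglik(counts, rate):
    """Poisson log-likelihood, omitting the log(c!) terms, which are
    constant across candidate breakpoints on the same data."""
    rate = max(rate, 1e-9)            # guard against log(0)
    return sum(c * math.log(rate) - rate for c in counts)

def best_breakpoint(counts):
    """Year index that maximises the likelihood of a two-rate
    Poisson model of annual sighting counts."""
    best_k, best_ll = None, -math.inf
    for k in range(1, len(counts)):
        before, after = counts[:k], counts[k:]
        ll = (poisson_loglik(before, sum(before) / len(before)) +
              poisson_loglik(after, sum(after) / len(after)))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# Invented record: roughly two sightings a year, then silence.
counts = [2] * 50 + [0] * 50
print(best_breakpoint(counts))   # -> 50
```

Incorporating per-sighting quality scores would reweight the likelihood contributions; the paper's point is that such reweighting adds little over the certain/uncertain split.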
Comparing enzyme activity modifiers equations through the development of global data fitting templates in Excel
https://peerj.com/preprints/3094 (2017-07-18)
Ryan Walsh
The classical way of defining enzyme inhibition has obscured the distinction between the inhibitory effect and the inhibitor binding constant. This article examines the relationship between the simple binding curve used to define biomolecular interactions and the standard inhibitory term (1 + [I]/Ki). By understanding how this term relates to the binding curves ubiquitously used to describe biological processes, a modifier equation that distinguishes between inhibitor binding and the inhibitory effect is examined. This modifier equation, which can describe both activation and inhibition, is compared to standard inhibitory equations through the development of global data-fitting templates in Excel and via the global fitting of these equations to previously reported enzyme kinetic data. This equation and the template developed in this article should prove to be useful tools in the study of enzyme inhibition and activation.
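The standard term in question enters the competitive-inhibition Michaelis-Menten rate, where it rescales Km. A short numeric check (arbitrary illustrative constants, not the paper's data or its modifier equation):

```python
def mm_competitive(S, I, Vmax=1.0, Km=1.0, Ki=1.0):
    """Michaelis-Menten rate with the classical competitive term
    (1 + [I]/Ki) multiplying Km; all constants are illustrative."""
    return Vmax * S / (Km * (1.0 + I / Ki) + S)

print(mm_competitive(S=1.0, I=0.0))   # 0.5: plain Michaelis-Menten
print(mm_competitive(S=1.0, I=1.0))   # 1/3: [I] = Ki doubles the Km term
```

The paper's argument is that this formulation conflates how tightly the inhibitor binds (Ki) with how much effect bound inhibitor has, which its modifier equation separates.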