PeerJ: Mathematical Biology
https://peerj.com/articles/index.atom?journal=peerj&subject=1900
Mathematical Biology articles published in PeerJ

Use and misuse of temperature normalisation in meta-analyses of thermal responses of biological traits
https://peerj.com/articles/4363
2018-02-09
Dimitrios-Georgios Kontopoulos, Bernardo García-Carreras, Sofía Sal, Thomas P. Smith, Samraat Pawar
There is currently unprecedented interest in quantifying variation in thermal physiology among organisms, especially in order to understand and predict the biological impacts of climate change. A key parameter in this quantification of thermal physiology is the performance or value of a rate, across individuals or species, at a common temperature (temperature normalisation). An increasingly popular model for fitting thermal performance curves to data—the Sharpe-Schoolfield equation—can yield strongly inflated estimates of temperature-normalised rate values. These deviations occur whenever a key thermodynamic assumption of the model is violated, i.e., when the enzyme governing the performance of the rate is not fully functional at the chosen reference temperature. Using data on 1,758 thermal performance curves across a wide range of species, we identify the conditions that exacerbate this inflation. We then demonstrate that these biases can compromise tests to detect metabolic cold adaptation, which requires comparison of fitness or rate performance of different species or genotypes at some fixed low temperature. Finally, we suggest alternative methods for obtaining unbiased estimates of temperature-normalised rate values for meta-analyses of thermal performance across species in climate change impact studies.
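For reference, a widely used four-parameter form of the Sharpe-Schoolfield model (the variant with high-temperature deactivation only; the paper's exact parameterisation may differ) is

$$B(T) = \frac{B_0 \, e^{-\frac{E}{k}\left(\frac{1}{T}-\frac{1}{T_{\mathrm{ref}}}\right)}}{1 + e^{\frac{E_D}{k}\left(\frac{1}{T_D}-\frac{1}{T}\right)}},$$

where $B_0$ is the temperature-normalised rate at the reference temperature $T_{\mathrm{ref}}$, $E$ is the activation energy, $k$ is the Boltzmann constant, and $E_D$ and $T_D$ govern enzyme deactivation at high temperatures. The inflation described above arises because $B_0$ coincides with the observed rate at $T_{\mathrm{ref}}$ only when the denominator is close to 1 there; if deactivation is already appreciable at $T_{\mathrm{ref}}$, the fitted $B_0$ exceeds the rate the organism actually achieves at that temperature.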
Complex versus simple models: ion-channel cardiac toxicity prediction
https://peerj.com/articles/4352
2018-02-05
Hitesh B. Mistry
There is growing interest in applying detailed mathematical models of the heart to ion-channel-related cardiac toxicity prediction. However, it is debated whether such complex models are required. Here, the predictive performance of two established large-scale biophysical cardiac models was compared against that of a simple linear model, Bnet. Three ion-channel data-sets were extracted from the literature. Each compound was assigned a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set, for each classification scheme, was assessed via leave-one-out cross-validation. Overall, the Bnet model performed as well as the leading cardiac models on two of the data-sets and outperformed both cardiac models on the most recent one. These results highlight the importance of benchmarking complex models against simple ones, and encourage the further development of simple models.
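The benchmarking procedure described here is standard leave-one-out cross-validation. Below is a minimal Python sketch of that procedure with entirely synthetic data; the feature construction labelled "Bnet-style" is an illustrative approximation (outward-current block minus inward-current block), not the paper's exact definition.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical data set: rows are compounds, columns are fractional block of
# three ion channels at some test concentration (an outward current first,
# then two inward currents).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 3))

# A Bnet-style net-block feature: outward-current block minus inward-current
# block. (Illustrative approximation; see the paper for the exact definition.)
bnet = X[:, 0] - (X[:, 1] + X[:, 2])

# Toy binary risk labels, constructed so both classes are present.
y = (bnet > np.median(bnet)).astype(int)

# Leave-one-out cross-validated risk probabilities from the simple model.
proba = cross_val_predict(LogisticRegression(), bnet.reshape(-1, 1), y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"LOO-CV AUC of the simple linear model: {roc_auc_score(y, proba):.3f}")
```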
Sicegar: R package for sigmoidal and double-sigmoidal curve fitting
https://peerj.com/articles/4251
2018-01-16
M. Umut Caglar, Ashley I. Teufel, Claus O. Wilke
Sigmoidal and double-sigmoidal dynamics are commonly observed in many areas of biology. Here we present sicegar, an R package for the automated fitting and classification of sigmoidal and double-sigmoidal data. The package categorizes data into one of three categories, “no signal,” “sigmoidal,” or “double-sigmoidal,” by rigorously fitting a series of mathematical models to the data. Data are labeled as “ambiguous” if neither the sigmoidal nor the double-sigmoidal model fits them well. In addition to performing the classification, the package also reports a wealth of metrics as well as biologically meaningful parameters describing the sigmoidal or double-sigmoidal curves. In extensive simulations, we find that the package performs well, can recover the original dynamics even under fairly high noise levels, and will typically classify curves as “ambiguous” rather than misclassifying them. The package is available on CRAN and comes with extensive documentation and usage examples.
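sicegar itself is an R package; purely to illustrate the fit-and-classify logic described above (fit competing models, prefer the better-supported one, return "ambiguous" if neither fits well), here is a language-neutral sketch in Python. The model forms, the R² threshold and the AIC comparison are illustrative assumptions, not sicegar's actual implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, ymax, k, t0):
    return ymax / (1 + np.exp(-k * (t - t0)))

def double_sigmoid(t, ymax, k1, t1, k2, t2):
    # An "on" step followed by an "off" step: rises, then decays.
    return ymax / ((1 + np.exp(-k1 * (t - t1))) * (1 + np.exp(k2 * (t - t2))))

def fit_stats(model, t, y, p0, n_params):
    """Return (AIC, R^2) for a least-squares fit, or (inf, -inf) on failure."""
    try:
        popt, _ = curve_fit(model, t, y, p0=p0, maxfev=20000)
    except RuntimeError:
        return np.inf, -np.inf
    rss = float(np.sum((y - model(t, *popt)) ** 2))
    n = len(t)
    aic = n * np.log(rss / n) + 2 * n_params   # penalises extra parameters
    r2 = 1 - rss / float(np.sum((y - y.mean()) ** 2))
    return aic, r2

def classify(t, y, r2_threshold=0.75):
    aic_s, r2_s = fit_stats(sigmoid, t, y, [y.max(), 1.0, np.median(t)], 3)
    aic_d, r2_d = fit_stats(double_sigmoid, t, y,
                            [y.max(), 1.0, np.median(t) - 1, 1.0, np.median(t) + 1], 5)
    if max(r2_s, r2_d) < r2_threshold:
        return "ambiguous"                     # neither model fits well
    return "sigmoidal" if aic_s <= aic_d else "double_sigmoidal"

t = np.linspace(0, 10, 50)
y = sigmoid(t, 1.0, 2.0, 5.0) + np.random.default_rng(1).normal(0, 0.03, t.size)
print(classify(t, y))   # expected: 'sigmoidal' for this synthetic input
```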
On the exponent in the Von Bertalanffy growth model
https://peerj.com/articles/4205
2018-01-04
Katharina Renner-Martin, Norbert Brunner, Manfred Kühleitner, Werner Georg Nowak, Klaus Scheicher
Von Bertalanffy proposed the differential equation m′(t) = p × m(t)^a − q × m(t) for the description of the mass growth of animals as a function m(t) of time t. He suggested that the solution using the metabolic scaling exponent a = 2/3 (the Von Bertalanffy growth function, VBGF) would be universal for vertebrates. Several authors have questioned this universality, as for certain species other models provide a better fit. This paper reconsiders the question. Based on 60 data sets from the literature (37 on fish and 23 on non-fish species), it optimizes the model parameters, in particular the exponent 0 ≤ a < 1, so that the model curve achieves the best fit to the data. The main observation of the paper is the large variability in the exponent, which can vary over a very wide range without significantly affecting the fit to the data, when the other parameters are also optimized. The paper explains this by differences in data quality: variability is low for data from highly controlled experiments and high for data gathered under natural conditions. Other deficiencies were biologically meaningless optimal parameter values, or optimal values attained on the boundary of the parameter region (indicating the possible need for a different model). Only 11 of the 60 data sets were free of such deficiencies, and for these no universal exponent could be discerned.
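For completeness, the equation admits a closed-form solution for any exponent $0 \le a < 1$: the substitution $u = m^{1-a}$ linearises it, giving

$$m(t) = \left[\frac{p}{q} + \left(m_0^{1-a} - \frac{p}{q}\right)e^{-(1-a)qt}\right]^{\frac{1}{1-a}}, \qquad m_\infty = \left(\frac{p}{q}\right)^{\frac{1}{1-a}},$$

where $m_0 = m(0)$; setting $a = 2/3$ recovers the familiar cubic form of the VBGF. Written this way, the weak identifiability reported above is easy to see: a change in $a$ can be largely absorbed by compensating changes in $p$ and $q$ (and hence in $m_\infty$), so very different exponents can fit the same data almost equally well.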
Neither slim nor fat: estimating the mass of the dodo (Raphus cucullatus, Aves, Columbiformes) based on the largest sample of dodo bones to date
https://peerj.com/articles/4110
2017-12-05
Anneke H. van Heteren, Roland C.H. van Dierendonck, Maria A.N.E. van Egmond, Sjang L. ten Hagen, Jippe Kreuning
The dodo (Raphus cucullatus) might be the most enigmatic bird of all time. It is therefore remarkable that no consensus has yet been reached on its body mass: previous scientific estimates vary by more than 100%. Until now, the vast number of bones stored at the Natural History Museum in Mauritius had not been studied morphometrically or in relation to body mass. Here, a new estimate of the dodo’s mass is presented, based on the largest sample of dodo femora ever measured (n = 174). To do this, we used the regression method and chose our variables based on biological, mathematical and physical arguments. The results indicate that the mean mass of the dodo was circa 12 kg, approximately five times as heavy as the largest living Columbidae (pigeons and doves), the clade to which the dodo belongs.
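For readers unfamiliar with "the regression method", it generally denotes fitting a log-log (allometric) regression of body mass on a skeletal measurement across living relatives and applying it to the fossil. The sketch below uses invented measurements and coefficients; the paper's actual variables and data differ.

```python
import numpy as np

# Hypothetical training data: femur measurement (mm) and body mass (g) for
# living birds, loosely following a power law. The paper's actual variables
# were chosen on biological, mathematical and physical grounds.
femur_mm = np.array([40.0, 55.0, 70.0, 85.0, 100.0, 115.0])
mass_g = np.array([480.0, 1050.0, 2300.0, 3800.0, 6600.0, 9200.0])

# Ordinary least squares in log-log space: log10(mass) = b0 + b1*log10(femur).
b1, b0 = np.polyfit(np.log10(femur_mm), np.log10(mass_g), deg=1)

def predict_mass_g(x_mm):
    # Back-transforming a log-log regression strictly calls for a bias
    # correction (e.g. a smearing estimate); omitted here for brevity.
    return 10.0 ** (b0 + b1 * np.log10(x_mm))

# Apply the fitted allometry to a hypothetical dodo femur measurement.
print(f"predicted mass: {predict_mass_g(120.0) / 1000.0:.1f} kg")
```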
On the relationship between tumour growth rate and survival in non-small cell lung cancer
https://peerj.com/articles/4111
2017-11-29
Hitesh B. Mistry
A recurrent question within oncology drug development is how to predict phase III outcomes for a new treatment using early clinical data. One approach to this problem has been to derive metrics from mathematical models of tumour size dynamics, termed re-growth rate and time to tumour re-growth. These have been shown to be strong predictors of overall survival in numerous studies, but there is debate about how the metrics are derived and whether they are more predictive than empirical end-points. This work explores the issues raised in using model-derived metrics as predictors in survival analyses. Re-growth rate and time to tumour re-growth were calculated for three large clinical studies by forward and reverse alignment. The latter involves re-aligning patients to their time of progression; hence, it accounts for the time taken to estimate re-growth rate and time to tumour re-growth, and also assesses whether these predictors correlate with survival from the time of progression. I found that neither re-growth rate nor time to tumour re-growth correlated with survival under reverse alignment. This suggests that the dynamics of tumours up until disease progression have no relationship to survival post-progression. For prediction of a phase III trial, the metrics performed no better than empirical end-points. These results highlight that care must be taken when relating the dynamics of tumour imaging to survival, and that benchmarking new approaches against existing ones is essential.
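Re-growth rate and time to tumour re-growth are typically obtained by fitting a bi-exponential decay-plus-regrowth model of longitudinal tumour size (of the kind popularised by Stein and colleagues). The sketch below shows one common parameterisation with invented patient data; it is not necessarily the exact model used in this work.

```python
import numpy as np
from scipy.optimize import curve_fit

def tumour_size(t, y0, d, g):
    """Bi-exponential model: exponential shrinkage (rate d) plus
    exponential re-growth (rate g) of a resistant fraction."""
    return y0 * (np.exp(-d * t) + np.exp(g * t) - 1.0)

def time_to_regrowth(d, g):
    """Time of the model nadir, after which size increases (one common
    definition of time to tumour re-growth); requires d > g."""
    return np.log(d / g) / (d + g)

# Hypothetical longitudinal tumour-size measurements for one patient
# (weeks since treatment start, sum of lesion diameters in mm).
t_obs = np.array([0, 6, 12, 18, 24, 36, 48], dtype=float)
y_obs = np.array([80, 62, 55, 54, 57, 68, 85], dtype=float)

popt, _ = curve_fit(tumour_size, t_obs, y_obs, p0=[80.0, 0.05, 0.01])
y0, d, g = popt
print(f"re-growth rate g = {g:.4f} per week; "
      f"time to re-growth = {time_to_regrowth(d, g):.1f} weeks")
```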
The intervals method: a new approach to analyse finite element outputs using multivariate statistics
https://peerj.com/articles/3793
2017-10-13
Jordi Marcé-Nogué, Soledad De Esteban-Trivigno, Thomas A. Püschel, Josep Fortuny
Background
In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology.
Methods
The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards, these newly generated variables can be analysed using multivariate methods; a minimal code sketch of this procedure follows the abstract.
Results
Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species.
Discussion
We show that the proposed approach is a useful methodology that is not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.
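As promised above, a minimal sketch of the intervals' method as described in the Methods, with synthetic per-element stresses and areas standing in for real finite element output; the interval boundaries and the use of PCA are illustrative choices.

```python
import numpy as np
from sklearn.decomposition import PCA

def interval_variables(stress, area, breaks):
    """Percentage of total model area falling in each stress interval.

    stress: per-element stress values from a finite element model
    area:   per-element areas (so results do not depend on mesh density)
    breaks: interval boundaries defining the new variables
    """
    idx = np.digitize(stress, breaks)          # interval index per element
    pct = np.array([area[idx == i].sum() for i in range(1, len(breaks))])
    return 100.0 * pct / area.sum()

rng = np.random.default_rng(0)
breaks = np.append(np.linspace(0.0, 9.0, 10), np.inf)   # 10 stress intervals

# Synthetic "mandibles": each row is one specimen's interval variables,
# ready for multivariate analysis.
X = np.vstack([
    interval_variables(rng.gamma(shape=2.0 + j, scale=1.0, size=500),
                       rng.uniform(0.5, 1.5, size=500), breaks)
    for j in range(6)
])

scores = PCA(n_components=2).fit_transform(X)  # ordinate the specimens
print(scores.round(2))
```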
Multi-scale immunoepidemiological modeling of within-host and between-host HIV dynamics: systematic review of mathematical models
https://peerj.com/articles/3877
2017-09-28
Nargesalsadat Dorratoltaj, Ryan Nikin-Beers, Stanca M. Ciupe, Stephen G. Eubank, Kaja M. Abbas
Objective
The objective of this study is to conduct a systematic review of multi-scale HIV immunoepidemiological models to improve our understanding of the synergy between HIV viral-immune dynamics at the individual level and HIV transmission dynamics at the population level.
Background
While within-host and between-host models of HIV dynamics have been well studied at a single scale, connecting the immunological and epidemiological scales through multi-scale models is an emerging method to infer the synergistic dynamics of HIV at the individual and population levels.
Methods
Using the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) framework, we reviewed nine articles that focused on the synergistic dynamics of HIV immunoepidemiological models at the individual and population levels.
Results
HIV immunoepidemiological models simulate viral immune dynamics at the within-host scale and the epidemiological transmission dynamics at the between-host scale. They account for longitudinal changes in the immune viral dynamics of HIV+ individuals, and their corresponding impact on the transmission dynamics in the population. They are useful to analyze the dynamics of HIV super-infection, co-infection, drug resistance, evolution, and treatment in HIV+ individuals, and their impact on the epidemic pathways in the population. We illustrate the coupling mechanisms of the within-host and between-host scales, their mathematical implementation, and the clinical and public health problems that are appropriate for analysis using HIV immunoepidemiological models.
Conclusion
HIV immunoepidemiological models connect the within-host immune dynamics at the individual level and the epidemiological transmission dynamics at the population level. While multi-scale models add complexity over a single-scale model, they account for the time-varying immune viral response of HIV+ individuals, and the corresponding impact on the time-varying risk of transmission from HIV+ individuals to susceptibles in the population.
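As a deliberately minimal illustration of the coupling mechanism these models share, the sketch below nests a within-host viral-load curve inside a between-host SI model by making the transmission rate a saturating function of viral load, averaged over an assumed infectious period. This is a generic construction for exposition, not any specific reviewed model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Within-host scale: viral load rising logistically to a set point
# (numerically stable logistic form).
def viral_load(tau, r=0.5, v_max=1e5, v0=1e2):
    """Viral load at time-since-infection tau (days)."""
    return v_max / (1.0 + (v_max / v0 - 1.0) * np.exp(-r * tau))

# Coupling function: per-contact transmission rate saturating with load.
def beta(v, beta_max=0.3, v_half=1e4):
    return beta_max * v / (v + v_half)

# The quantity the within-host scale passes up to the population scale:
# infectiousness averaged over an assumed infectious period of D days.
D = 365.0
tau = np.linspace(0.0, D, 1000)
beta_bar = beta(viral_load(tau)).mean()

# Between-host scale: a simple SI epidemic driven by the averaged rate.
def si(t, y):
    s, i = y
    new_infections = beta_bar * s * i
    return [-new_infections, new_infections]

sol = solve_ivp(si, (0.0, 100.0), [0.99, 0.01], max_step=0.5)
print(f"averaged transmission rate: {beta_bar:.3f}/day; "
      f"infected fraction at day 100: {sol.y[1, -1]:.3f}")
```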
Are extinction opinions extinct?
https://peerj.com/articles/3663
2017-08-11
Tamsin E. Lee, Clive Bowman, David L. Roberts
Extinction models vary in the information they require, the simplest considering only the rate of certain sightings. More complicated methods include uncertain sightings and allow for variation in their reliability. Generally, extinction models require expert opinion, either as a prior belief that a species is extinct, or to establish the quality of a sighting record, or both. Is this subjectivity necessary? We present two models to explore whether the individual quality of sightings, as judged by experts, is strongly informative of the probability of extinction: the ‘quality breakpoint method’ and the ‘quality as variance method’. For the first method we use the Barbary lion as an exemplar. For the second method we use the Barbary lion, Alaotra grebe, Jamaican petrel and Pohnpei starling as exemplars. The ‘quality breakpoint method’ uses certain and uncertain sighting records, and the quality of uncertain records, to establish whether a change point in the rate of sightings can be detected, using a simultaneous Bayesian optimisation with a non-informative prior. For the Barbary lion, there is a change in the subjective quality of sightings around 1930; unexpectedly, sighting quality increases after this date. This suggests that including quality scores from experts can lead to irregular effects and may not offer reliable results. As an alternative, we use quality as a measure of variance around the sightings rather than as a change point. This leads to predictions with larger standard deviations; however, the results remain consistent across any prior belief of extinction. Nonetheless, replacing actual quality scores with random quality scores made little difference, implying that the quality scores from experts are superfluous. We therefore deem the expensive process of obtaining pooled expert estimates unnecessary, and even when it is used we recommend that sighting data should have minimal input from experts in terms of assessing sighting quality at a fine scale. Rather, sightings should be classed as certain or uncertain, using a framework that is as independent of human bias as possible.
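For context, the simplest sighting-rate model alluded to above is Solow's (1993) test, which uses certain sightings only; the models compared in this paper extend such records with uncertain sightings and quality scores. A minimal sketch with made-up sighting years:

```python
def solow_p_value(sighting_years, start_year, end_year):
    """Solow (1993): under a constant sighting rate, the p-value for the
    hypothesis that the species is still extant is (t_n / T)^n, where
    t_n is the time of the last sighting and T the observation period,
    both measured from the start of the record."""
    n = len(sighting_years)
    t_n = max(sighting_years) - start_year
    T = end_year - start_year
    return (t_n / T) ** n

# Hypothetical record: certain sightings only.
sightings = [1901, 1905, 1911, 1920, 1925, 1930]
print(f"p = {solow_p_value(sightings, 1900, 2017):.4f}")
```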
Investigating the running abilities of Tyrannosaurus rex using stress-constrained multibody dynamic analysis
https://peerj.com/articles/3420
2017-07-18
William I. Sellers, Stuart B. Pond, Charlotte A. Brassey, Philip L. Manning, Karl T. Bates
The running ability of Tyrannosaurus rex has been intensively studied due to its relevance to interpretations of feeding behaviour and the biomechanics of scaling in giant predatory dinosaurs. Different studies using differing methodologies have produced a very wide range of top speed estimates, and there is therefore a need to develop techniques that can improve these predictions. Here we present a new approach that combines two separate biomechanical techniques (multibody dynamic analysis and skeletal stress analysis) to demonstrate that true running gaits would probably lead to unacceptably high skeletal loads in T. rex. Combining these two approaches reduces the high level of uncertainty in previous predictions associated with unknown soft tissue parameters in dinosaurs, and demonstrates that the relatively long limb segments of T. rex—long argued to indicate competent running ability—would actually have mechanically limited this species to walking gaits. Being limited to walking speeds contradicts arguments of high-speed pursuit predation for the largest bipedal dinosaurs like T. rex, and demonstrates the power of multiphysics approaches for locomotor reconstructions of extinct animals.
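The "stress-constrained" element of the method can be illustrated with the elementary beam-theory check that skeletal stress analyses build on: peak bending stress σ = M·y/I in a bone cross-section, compared against bone strength via a safety factor. The numbers below are invented for illustration; the paper's models are far more detailed.

```python
import math

def peak_bending_stress(moment_nm, radius_m, wall_m):
    """Peak bending stress (sigma = M * y / I) for a hollow circular
    cross-section approximating a limb-bone shaft."""
    r_out, r_in = radius_m, radius_m - wall_m
    second_moment = math.pi * (r_out**4 - r_in**4) / 4.0   # I for a tube
    return moment_nm * r_out / second_moment               # Pa

# Invented mid-shaft values for illustration only.
stress = peak_bending_stress(moment_nm=2.0e4, radius_m=0.08, wall_m=0.03)
bone_strength = 200e6   # roughly 200 MPa for compact bone in bending
print(f"peak stress {stress / 1e6:.0f} MPa; "
      f"safety factor {bone_strength / stress:.1f}")
```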