Inference for time-delay differential equations
Modeling dynamic systems often involves feedback among components, and real systems take time to sense and respond to that feedback. Delay differential equations (DDEs) are commonly used to capture such behavior. Typically, the model equations and noisy observations of the system are available, while the parameters are unknown and must be estimated. We extend manifold-constrained Gaussian process inference (MAGI) to perform parameter inference in DDEs from noisy and sparse observations. Within a Bayesian framework, the method places a Gaussian process model on the time series data, conditional on the manifold constraint that the DDEs must be satisfied. To obtain a computationally efficient algorithm, linear interpolation is applied to approximate the values of the lagged state variables. We also derive error bounds for the derivatives of the state variables, together with supporting simulation results, to justify the validity of this approximation. Finally, we present two simulation examples, the Hutchinson equation and the lac operon system, and a real-world application to Ontario COVID-19 data, to illustrate the effectiveness of our method.
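The linear-interpolation treatment of the lagged state can be illustrated with a minimal forward simulation of the Hutchinson equation, dN/dt = r N(t) (1 - N(t - tau) / K). This is a hedged sketch, not the talk's inference algorithm: the function names, the forward-Euler scheme, and all parameter values below are illustrative assumptions; the talk's method uses the same interpolation idea inside a Bayesian MAGI procedure rather than a plain solver.

```python
import numpy as np

def interp_lag(ts, xs, t_lag, history=0.5):
    """Linearly interpolate the state at the lagged time t_lag.

    Before t = ts[0], return a constant history value (an assumed
    initial history function for the DDE).
    """
    if t_lag <= ts[0]:
        return history
    return np.interp(t_lag, ts, xs)

def simulate_hutchinson(r=0.5, K=1.0, tau=2.0, history=0.5,
                        t_end=30.0, dt=0.01):
    """Forward-Euler simulation of the Hutchinson equation
    dN/dt = r * N(t) * (1 - N(t - tau) / K),
    with the lagged state N(t - tau) approximated by linear
    interpolation over the trajectory computed so far.
    """
    n_steps = round(t_end / dt)
    ts = np.zeros(n_steps + 1)
    xs = np.zeros(n_steps + 1)
    xs[0] = history
    for i in range(n_steps):
        ts[i + 1] = ts[i] + dt
        # Lagged value via linear interpolation of the stored states.
        lagged = interp_lag(ts[:i + 1], xs[:i + 1], ts[i] - tau, history)
        xs[i + 1] = xs[i] + dt * r * xs[i] * (1.0 - lagged / K)
    return ts, xs
```

With r * tau below the Hopf threshold (pi / 2), the trajectory exhibits damped oscillations toward the carrying capacity K, which gives a quick sanity check on the interpolation scheme.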
Date and Time
-
Language of the oral presentation
English
Language of the visual materials
English