arXiv Analytics

arXiv:1904.10566 [math.NA]

Time-Varying Matrix Eigenanalyses via Zhang Neural Networks and Look-Ahead Finite Difference Equations

Frank Uhlig, Yunong Zhang

Published 2019-04-23Version 1

This paper adapts look-ahead and backward finite difference formulas to compute future eigenvectors and eigenvalues of piecewise smooth time-varying symmetric matrix flows $A(t)$. It is based on the Zhang Neural Network (ZNN) model for time-varying problems and uses the associated error function $E(t) = A(t)V(t) - V(t)D(t)$ or $e_i(t) = A(t)v_i(t) - \lambda_i(t)v_i(t)$ with the Zhang design stipulation that $\dot E(t) = -\eta E(t)$ or $\dot e_i(t) = -\eta e_i(t)$ with $\eta > 0$, so that $E(t)$ and $e_i(t)$ decrease exponentially over time. This leads to a discretized differential equation of the form $P(t_k)\dot z(t_k) = q(t_k)$ for the eigendata vector $z(t_k)$ of $A(t_k)$. Convergent look-ahead finite difference formulas of varying error orders then allow us to express $z(t_{k+1})$ in terms of earlier $A$ and $z$ data. Numerical tests, comparisons and open questions complete the paper.
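The single-eigenpair version of the method described in the abstract can be sketched as follows. This is not the authors' code: the flow `A_of_t`, its derivative `Adot_of_t`, the gain `eta`, and the step `tau` are illustrative choices, and the simplest convergent look-ahead step (forward Euler) stands in for the higher-order formulas the paper develops. Differentiating $e_i(t) = A(t)v_i(t) - \lambda_i(t)v_i(t)$ under $\dot e_i = -\eta e_i$, together with the normalization $v_i^T v_i = 1$, yields the bordered linear system $P(t_k)\dot z(t_k) = q(t_k)$ for $z = [v_i; \lambda_i]$:

```python
import numpy as np

def znn_eig_step(A, Adot, v, lam, eta, tau):
    """One ZNN time step for a single eigenpair of a symmetric flow A(t).

    Builds P(t_k) and q(t_k) from the differentiated error equation
    (A - lam*I) v_dot - v * lam_dot = -eta*(A v - lam v) - Adot v,
    bordered by the normalization row v^T v_dot = 0, then advances
    z = [v; lam] with the simplest look-ahead step z_{k+1} = z_k + tau*z_dot.
    """
    n = len(v)
    P = np.zeros((n + 1, n + 1))
    P[:n, :n] = A - lam * np.eye(n)
    P[:n, n] = -v
    P[n, :n] = v                      # enforces v^T v_dot = 0
    q = np.zeros(n + 1)
    q[:n] = -eta * (A @ v - lam * v) - Adot @ v
    z_dot = np.linalg.solve(P, q)     # bordered matrix is invertible near
    v_new = v + tau * z_dot[:n]       # a simple eigenpair even though
    lam_new = lam + tau * z_dot[n]    # A - lam*I itself is nearly singular
    return v_new / np.linalg.norm(v_new), lam_new

# Illustrative symmetric flow (an assumption, not from the paper)
A_of_t = lambda t: np.array([[1.0 + 0.1 * np.sin(t), 0.1 * np.cos(t)],
                             [0.1 * np.cos(t),       2.0]])
Adot_of_t = lambda t: np.array([[0.1 * np.cos(t), -0.1 * np.sin(t)],
                                [-0.1 * np.sin(t), 0.0]])

tau, eta = 1e-3, 10.0
v, lam = np.array([1.0, 0.0]), 1.0    # rough start near one eigenpair
for k in range(5000):
    t = k * tau
    v, lam = znn_eig_step(A_of_t(t), Adot_of_t(t), v, lam, eta, tau)

res = np.linalg.norm(A_of_t(5000 * tau) @ v - lam * v)  # tracking residual
```

Because the Zhang design forces the error to decay like $e^{-\eta t}$, the residual settles at a small steady-state level set by the truncation error of the finite difference formula; the paper's higher-order look-ahead formulas reduce that level further for the same step size.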

Related articles:
arXiv:1904.10568 [math.NA] (Published 2019-04-23)
Zhang Neural Networks for Fast and Accurate Computations of the Field of Values