Jinchao Xu - Selected Publications


[1] Parallel Multilevel Preconditioners, with J. Bramble and J. Pasciak, Math. Comp., 1990, Vol. 55, No. 191, pp. 1-22.

The algorithm presented and analyzed in this paper is now known as the Bramble-Pasciak-Xu (BPX) preconditioner, which is one of the two fundamental multigrid algorithms for solving large-scale discretized partial differential equations.
(1074 Google Scholar citations)

[2] Convergence Estimates for Product Iterative Methods with Applications to Domain Decomposition, with J. Bramble, J. Pasciak and J. Wang, Math. Comp., 1991, Vol. 57, No. 195, pp. 1-21.

This paper gives the first optimal convergence estimate for the overlapping domain decomposition method with an arbitrary number of subdomains.
(395 Google Scholar citations)

[3] Convergence Estimates for Multigrid Algorithms without Regularity Assumption, with J. Bramble, J. Pasciak and J. Wang, Math. Comp., 1991, Vol. 57, No. 195, pp. 23-45.

This paper gives the first optimal convergence estimate for the multigrid method for elliptic problems without regularity assumptions. The technique in the paper made it possible to analyze multigrid methods for a very large class of problems (such as those on adaptive grids) that cannot be analyzed by classical methods.
(410 Google Scholar citations)

[4] Iterative Methods by Space Decomposition and Subspace Correction, SIAM Review, 1992, Vol. 34, No. 4, pp. 581-613.

By using the concepts of space decomposition and subspace correction, this paper gives a unified framework for the design and analysis of most linear iterative methods, including multigrid, domain decomposition, Gauss-Seidel, and Jacobi methods, both as iterations and as preconditioners.
(1594 Google Scholar citations)
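As a minimal illustration of the framework (an illustrative sketch, not code from the paper), the snippet below shows that the classical Gauss-Seidel iteration is exactly the method of successive subspace corrections with the one-dimensional coordinate subspaces V_i = span(e_i); all function names here are hypothetical.

```python
import numpy as np

def ssc_sweep(A, b, x):
    """One sweep of successive subspace correction with V_i = span(e_i)."""
    x = x.copy()
    for i in range(len(b)):
        r = b - A @ x              # residual for the current iterate
        x[i] += r[i] / A[i, i]     # exact solve of the 1D subspace problem
    return x

def gauss_seidel_sweep(A, b, x):
    """One classical Gauss-Seidel sweep for comparison."""
    x = x.copy()
    for i in range(len(b)):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M.T @ M + 6.0 * np.eye(6)      # SPD test matrix
b = rng.standard_normal(6)
x0 = np.zeros(6)

same = np.allclose(ssc_sweep(A, b, x0), gauss_seidel_sweep(A, b, x0))
```

Computing all corrections from the same residual instead (parallel subspace correction) recovers the Jacobi iteration in the same way.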

[5] Two-grid Discretization Techniques for Linear and Nonlinear PDEs, SIAM J. Numer. Anal., 1996, Vol. 33, No. 5, pp. 1759-1777.

The two-grid method proposed in this paper has become a widely studied discretization technique for a variety of problems, such as nonsymmetric, nonlinear, and coupled partial differential equations. The idea is to first use a coarse grid to resolve the nonsymmetry, nonlinearity, or coupling, and then use a fine grid to discretize a reduced system that is, respectively, symmetric, linear, or decoupled.
(733 Google Scholar citations)
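The coarse-then-fine idea can be sketched in a few lines (a hedged illustration under simplifying assumptions, not the paper's algorithm): for a finite-difference discretization of the model problem -u'' + u^3 = f with zero boundary values, solve the full nonlinear system only on a cheap coarse grid, then perform a single linearized (Newton) solve on the fine grid starting from the interpolated coarse solution. All names and grid sizes below are illustrative.

```python
import numpy as np

def residual(u, h, f):
    # residual of -u'' + u^3 - f = 0 at interior nodes (zero Dirichlet BCs)
    U = np.concatenate(([0.0], u, [0.0]))
    lap = (U[:-2] - 2.0 * U[1:-1] + U[2:]) / h**2
    return -lap + u**3 - f

def jacobian(u, h):
    # tridiagonal Jacobian of the residual
    n = len(u)
    J = np.diag(2.0 / h**2 + 3.0 * u**2)
    J += np.diag(-np.ones(n - 1) / h**2, 1) + np.diag(-np.ones(n - 1) / h**2, -1)
    return J

def newton(f, h, u, iters=20):
    for _ in range(iters):
        u = u - np.linalg.solve(jacobian(u, h), residual(u, h, f))
    return u

# manufactured solution u(x) = sin(pi x), so f = pi^2 sin(pi x) + sin(pi x)^3
exact = lambda x: np.sin(np.pi * x)
f = lambda x: np.pi**2 * np.sin(np.pi * x) + np.sin(np.pi * x)**3

NH, Nh = 8, 64                                  # coarse / fine intervals
xH = np.linspace(0.0, 1.0, NH + 1)[1:-1]        # interior coarse nodes
xh = np.linspace(0.0, 1.0, Nh + 1)[1:-1]        # interior fine nodes

# step 1: full nonlinear solve, but only on the cheap coarse grid
uH = newton(f(xH), 1.0 / NH, np.zeros(NH - 1))

# step 2: interpolate and solve ONE linear (Newton) system on the fine grid
u0 = np.interp(xh, np.concatenate(([0.0], xH, [1.0])),
               np.concatenate(([0.0], uH, [0.0])))
uh = u0 - np.linalg.solve(jacobian(u0, 1.0 / Nh), residual(u0, 1.0 / Nh, f(xh)))

err = np.max(np.abs(uh - exact(xh)))
```

The single fine-grid solve is linear, yet the result is accurate to roughly the fine-grid discretization error.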

[6] The Method of Alternating Projections and the Method of Subspace Corrections in Hilbert Space, with L. Zikatanov, J. Amer. Math. Soc., 2002, Vol. 15, pp. 573-597.

This paper gives the sharpest possible estimate for the convergence of the method of subspace corrections by means of what is now known as the Xu-Zikatanov (XZ) identity. Most of the existing estimates (which have been studied in hundreds of papers) can be easily derived from the XZ identity. The paper also gives a sharp generalization of the original convergence theory for the method of alternating projections from the two-subspace case (von Neumann, 1933) to the multiple-subspace case.
(300 Google Scholar citations)

[7] Nodal Auxiliary Space Preconditioning in H(curl) and H(div) Spaces, with R. Hiptmair, SIAM J. Numer. Anal., 2007, Vol. 45, No. 6, pp. 2483-2509.

The algorithm proposed and analyzed in this paper is now known as the Hiptmair-Xu (HX) preconditioner. It was featured in a 2008 DOE report on "Recent Significant Advances (10 breakthroughs) in Computational Science", which shows that the HX preconditioner outperforms existing algorithms for magnetohydrodynamics (MHD) related applications by factors of tens and sometimes hundreds.
(375 Google Scholar citations)

[8] Algebraic Multigrid Methods, with L. Zikatanov, Acta Numerica, 2017, pp. 597-721.

In this invited paper, a unified framework and theory are developed to derive and analyze, in a coherent manner, a large variety of algebraic multigrid (AMG) methods for solving large-scale systems of equations.
(200 Google Scholar citations)

[9] MgNet: A Unified Framework of Multigrid and Convolutional Neural Network, with J. He, Science China Mathematics, 2019, pp. 1-24.

This paper relates two totally different fields, namely multigrid methods for solving partial differential equations and convolutional neural networks for image classification, through a unified algorithm known as MgNet. In particular, it shows that an effective new class of convolutional neural network models can be obtained directly by making minor modifications to a geometric multigrid method.
(101 Google Scholar citations)

[10] ReLU Deep Neural Networks and Linear Finite Elements, with J. He, L. Li and C. Zheng, J. Comput. Math., 2020, Vol. 38, No. 3, pp. 502-527.

This paper discusses the relationship between deep neural networks (DNNs) with the rectified linear unit (ReLU) as the activation function and continuous piecewise linear (CPWL) functions, especially the CPWL functions produced by the simplicial linear finite element method (FEM). It essentially shows that ReLU DNNs and FEM represent the same class of piecewise linear functions, but with totally different structures.
(223 Google Scholar citations)
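One direction of this equivalence is easy to see concretely (a hedged sketch; the function names are illustrative, not from the paper): every 1D linear finite element basis ("hat") function is exactly a one-hidden-layer ReLU network with three neurons, so any CPWL function on a mesh is a weighted sum of such networks.

```python
def relu(t):
    return max(t, 0.0)

def hat(x, xl, xm, xr):
    # nodal FEM basis function at node xm on the patch [xl, xr], written
    # exactly as a ReLU network: three hidden neurons, fixed output weights
    return (relu(x - xl) / (xm - xl)
            - (xr - xl) / ((xm - xl) * (xr - xm)) * relu(x - xm)
            + relu(x - xr) / (xr - xm))

def fem_interpolant(x, nodes, values):
    # CPWL interpolant on the mesh = weighted sum of hats = a ReLU network
    return sum(v * hat(x, nodes[i - 1], nodes[i], nodes[i + 1])
               for i, v in zip(range(1, len(nodes) - 1), values))
```

For example, `hat(x, 0.0, 0.5, 1.0)` equals 1 at x = 0.5, decays linearly to 0 at x = 0 and x = 1, and vanishes outside [0, 1], which is precisely the standard hat function.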

CITATION REPORT:
  • Basic statistics from Google Scholar (as of May 13, 2023):
    • 18,453 citations
    • h-index: 65; i10-index: 194
  • Ranked 5th in the world in number of citations among mathematicians (excluding statisticians) for the period 1991-2001 by the Institute for Scientific Information (see Science Watch, May/June 2002, Vol. 13, No. 3).
  • Three papers are among the top 20 most-cited articles from past volumes of Mathematics of Computation (see https://www.ams.org/publications/journals/journalsframework/AMSMathViewer).

This page (revision-5) was last changed on 24 May 2023, 08:15 by System.