GMRES for Poisson Equation #6367

Open
ajnonaka wants to merge 14 commits into BLAST-WarpX:development from ajnonaka:gmres_for_warpx
Conversation

@ajnonaka
Contributor

@ajnonaka ajnonaka commented Nov 6, 2025

GMRES implementation of poisson solver, enabled with warpx.poisson_solver = gmres.

Instead of pure multigrid via the AMReX MLMG class, this uses an MLMG-preconditioned GMRES algorithm to solve the Poisson equation. More documentation forthcoming.

@dpgrote
Member

dpgrote commented Feb 11, 2026

I've tried this out and it works quite nicely. It's fully functional for my needs, in 2D, with Dirichlet and Neumann boundaries, and embedded boundaries. Comparing it to multigrid, it is consistently faster.

On a Mac with the small grid (80x200), multigrid takes ~0.047 s per solve and GMRES 0.028 s per solve. (By comparison, the multigrid solver in Warp takes 0.0024 s per solve.)

On an AMD GPU with a larger grid (360x800), multigrid takes 0.058 s and GMRES 0.048 s per solve, not a big difference.

On parallel CPU (256 processors) with the larger grid, multigrid takes 0.032 s and GMRES 0.029 s, also not a big difference. (For reference, the multigrid solver in Warp takes 0.01 s per solve.)

@JustinRayAngus
Contributor

What is (if any) the precondition matrix solver?

@dpgrote
Member

dpgrote commented Feb 11, 2026

What is (if any) the precondition matrix solver?

It says that it does 1 MLMG cycle.

@ajnonaka ajnonaka marked this pull request as ready for review April 7, 2026 18:22
@ax3l ax3l added the component: implicit solvers Anything related to implicit solvers label Apr 7, 2026
@ax3l ax3l requested a review from WeiqunZhang April 7, 2026 22:38
@ax3l
Member

ax3l commented Apr 7, 2026

@ajnonaka can you merge development again and check why the 1D test fails? I somehow have no permission to push to this branch.

Comment on lines +464 to +479
#if defined(WARPX_DIM_1D_Z)
amrex::Array<amrex::MultiFab*, 1>{
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{2}, lev)
}
#elif defined(WARPX_DIM_XZ) || defined(WARPX_DIM_RZ)
amrex::Array<amrex::MultiFab*, 2>{
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{0}, lev),
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{2}, lev)
}
#elif defined(WARPX_DIM_3D)
amrex::Array<amrex::MultiFab *, 3>{
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{0}, lev),
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{1}, lev),
warpx.m_fields.get(warpx::fields::FieldType::Efield_fp, Direction{2}, lev)
}
#endif
Contributor

@dpgrote Is electrostatic mode in WarpX not setup for RCYLINDER and RSPHERE geometries?

Member

@dpgrote dpgrote Apr 16, 2026

@JustinRayAngus Those geometries are not supported in electrostatic. Also, this block of code is for EB which is also not supported.

} else {
amrex::GMRESMLMG gmsolve(mlmg);
gmsolve.solve( *phi[lev], *rho[lev], relative_tolerance, absolute_tolerance );
linop->postSolve({phi[lev]});
Contributor

What is the need for linop->postSolve() here ? Why is it not needed when using mlmg?

Contributor Author

MLMG calls postSolve within, which sets the value of phi in covered cells to the Dirichlet value. GMRES initially left phi in covered cells at zero; I added this call so the plotfiles would match up in the covered cells.

Comment on lines 405 to 409
amrex::MLMG mlmg(*linop); // actual solver defined here
mlmg.setVerbose(verbosity);
mlmg.setMaxIter(max_iters);
mlmg.setConvergenceNormType(amrex::MLMGNormType::greater);
mlmg.setNoGpuSync(true);
Contributor

Right now it doesn't look like this PR is set up to change additional parameters used by GMRES. I think it would be a good idea to make the linear solver a member variable defined once at initialization; then you would have control over more options. For example, you may want to manually set the restart length:
m_linear_solver->setRestartLength( m_linsol_restart_length );

See Source/NonlinearSolvers/NewtonSolver.H for an example of what I'm referring to.
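A hedged sketch of the suggested pattern follows. Class and member names below are placeholders, not existing WarpX identifiers; only setRestartLength is taken from the NewtonSolver.H example cited above:

```cpp
// Sketch only: hold the solver as a class member so options parsed once at
// initialization persist across solves, instead of rebuilding it per solve.
class ElectrostaticSolver  // placeholder name
{
public:
    void InitSolver ()
    {
        m_gmres = std::make_unique<amrex::GMRESMLMG>(*m_mlmg);
        // extra GMRES knobs could be exposed here, as NewtonSolver does:
        // m_gmres->setRestartLength(m_restart_length);
    }

    void Solve (amrex::MultiFab& phi, const amrex::MultiFab& rho)
    {
        m_gmres->solve(phi, rho, m_rel_tol, m_abs_tol);
    }

private:
    std::unique_ptr<amrex::MLMG> m_mlmg;
    std::unique_ptr<amrex::GMRESMLMG> m_gmres;
    int m_restart_length = 30;                     // placeholder default
    amrex::Real m_rel_tol = 1e-8, m_abs_tol = 0.0;
};
```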

Contributor Author

Reva wants to discuss this at the next dev meeting; note that the MLMG option doesn't expose these parameters either.

Comment on lines +425 to +426
amrex::GMRESMLMG gmsolve(mlmg);
gmsolve.solve( *phi[lev], *rho[lev], relative_tolerance, absolute_tolerance );
Member

@RemiLehe commented: can we also handle max_iters here? Otherwise it is silently ignored if the user sets it (via warpx.magnetostatic_solver_max_iters).
