Nonlinear solid solver #24

Merged
davidscn merged 49 commits into develop from nonlinear_solid_solver on May 25, 2020
Conversation

@davidscn (Member) commented Apr 4, 2020

This PR adds a new nonlinear solid solver written in deal.II.

Here is a short list of features that are not yet implemented and need further work:

  • implement body forces
  • implement time stepping scheme
  • implement contributions in the nonlinear system due to inertia
  • add parameters needed for dynamic processes
  • port to preCICE
  • add documentation in the code
  • change license
  • change min version

This implementation does not directly follow the theory described in #15, but it resolves the issue anyway.

davidscn added 5 commits April 4, 2020 11:39
- read parameter file according to command line
- remove unnecessary parameter options
- replace deprecated functions
- remove option for automated differentiation
@davidscn davidscn changed the base branch from master to develop April 4, 2020 11:34
@davidscn davidscn requested a review from uekerman April 4, 2020 11:46
@davidscn (Member Author)

@uekerman
The solver part is now ready! You might already want to review it. I removed all previous comments, so they are probably missing in some places.

A few details and some points to discuss, including where we might want to invest some additional work:

  • I implemented an implicit Newmark method, since I was pretty sure that it would work. I think an extension to a generalized-alpha method is not too much work and would give the scheme further flexibility, although we can already select certain properties, e.g. energy conservation, convergence order up to two, numerical dissipation, stability, depending on the parameters. (A sketch of the Newmark update formulas follows after this list.)
  • The nonlinear time steps are solved by a Newton scheme, and the user can specify relative criteria for the residual and the displacement update, both of which need to be satisfied for convergence. (See the convergence-check sketch after this list.)
  • The linear system can be solved either directly, where deal.II uses UMFPACK, or iteratively using a CG method. The direct solver is quite fast for moderate problem sizes. The CG method depends strongly on the preconditioner; we currently use SSOR, but I think a multigrid algorithm could be better, or we could use an ILU decomposition that is reused over several Newton iterations.
    For performance reasons, we could make the CG tolerance depend on the Newton iteration, so that we only solve to high accuracy in the last Newton iteration. I'm thinking about these methods since the iterative solver setup is, in my opinion, too slow at the moment. You could try it yourself, if you like. (A sketch of both solver paths follows after this list.)
  • The solver includes a nonlinear compressible neo-Hookean material. Hence, we are now nonlinear in both the material model and the geometry. (One common form of the strain-energy function is sketched after this list.)
  • I validated the setup with the time-dependent CSM benchmark. Let me know if you want me to visualize the results or upload a comparison.
  • We should think about parallelization, to make the solver applicable to larger systems as well. deal.II either provides its own classes for distributed vectors, or its PETSc wrappers can be used.
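For reference, a minimal sketch of the Newmark update formulas referred to in the first bullet, written in standard textbook form; the parameter handling in the actual code may differ:

```latex
% Newmark updates for displacement u, velocity v, acceleration a
u_{n+1} = u_n + \Delta t \, v_n
          + \Delta t^2 \left[ \left( \tfrac{1}{2} - \beta \right) a_n + \beta \, a_{n+1} \right]
v_{n+1} = v_n + \Delta t \left[ (1 - \gamma) \, a_n + \gamma \, a_{n+1} \right]
```

With gamma = 1/2 and beta = 1/4 (average constant acceleration), the scheme is unconditionally stable, second-order accurate, and free of numerical dissipation; gamma > 1/2 introduces numerical dissipation at the cost of second-order accuracy, which is the parameter dependence mentioned above.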
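A minimal sketch of the combined convergence test described in the second bullet; all names are placeholders and not the identifiers used in this PR:

```cpp
// Both the relative residual and the relative displacement update must drop
// below their tolerances, measured against the first Newton iteration.
bool newton_converged(const double residual_norm, const double residual_norm_0,
                      const double update_norm,   const double update_norm_0,
                      const double tol_residual,  const double tol_update)
{
  return residual_norm <= tol_residual * residual_norm_0 &&
         update_norm   <= tol_update   * update_norm_0;
}
```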
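And a sketch of the two linear-solver paths from the third bullet, using standard deal.II classes (SparseDirectUMFPACK, SolverCG, PreconditionSSOR); the function and variable names are placeholders, and the tolerance choice is only an example:

```cpp
#include <deal.II/lac/precondition.h>
#include <deal.II/lac/solver_cg.h>
#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/sparse_direct.h>
#include <deal.II/lac/sparse_matrix.h>
#include <deal.II/lac/vector.h>

using namespace dealii;

// Solve the linearized system K * du = rhs for the Newton update, either
// directly via UMFPACK or iteratively via CG with an SSOR preconditioner.
void solve_linear_system(const SparseMatrix<double> &tangent_matrix,
                         Vector<double>             &newton_update,
                         const Vector<double>       &system_rhs,
                         const bool                  use_direct_solver)
{
  if (use_direct_solver)
    {
      // Direct path: fast for moderate problem sizes.
      SparseDirectUMFPACK direct_solver;
      direct_solver.initialize(tangent_matrix);
      direct_solver.vmult(newton_update, system_rhs);
    }
  else
    {
      // Iterative path. The relative tolerance could be loosened in early
      // Newton iterations and tightened only in the last one (inexact Newton).
      SolverControl            solver_control(1000, 1e-6 * system_rhs.l2_norm());
      SolverCG<Vector<double>> cg(solver_control);

      PreconditionSSOR<SparseMatrix<double>> preconditioner;
      preconditioner.initialize(tangent_matrix, 1.2); // SSOR relaxation parameter

      cg.solve(tangent_matrix, newton_update, system_rhs, preconditioner);
    }
}
```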
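Finally, one common compressible neo-Hookean strain-energy function, shown only to illustrate the material nonlinearity; the exact form implemented in the solver may differ:

```latex
\Psi(\mathbf{F}) = \frac{\mu}{2}\left(\operatorname{tr}(\mathbf{F}^{T}\mathbf{F}) - 3\right)
                   - \mu \ln J + \frac{\lambda}{2}\,(\ln J)^{2},
\qquad J = \det \mathbf{F}
```

where mu and lambda are the Lamé parameters and F is the deformation gradient.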

@davidscn (Member Author)

and as a last point:

  • I think we should move the (upcoming) preCICE-specific functions to a separate include file. This is not consistent with the linear case adapter, but I think it is good for users to clearly see what is preCICE-specific and which parts belong to the actual solver code. Also, the code is already getting quite long, and it is probably a bad idea to go beyond 2000 lines. (A rough sketch of such a split follows below.)
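To make the suggested split concrete, here is a rough sketch of what such a separate include file could look like; the file name, class name, and member functions are hypothetical and not taken from this PR:

```cpp
// precice_adapter.h -- hypothetical header collecting the preCICE-specific parts
#pragma once

#include <deal.II/lac/vector.h>

#include <precice/SolverInterface.hpp>

#include <string>

namespace adapter
{
  // Thin wrapper around precice::SolverInterface so that the solid solver
  // itself only touches deal.II data structures.
  class Adapter
  {
  public:
    Adapter(const std::string &participant_name, const std::string &config_file)
      : precice_interface(participant_name, config_file, /*rank=*/0, /*size=*/1)
    {}

    // Declarations only: coupling-mesh setup and data exchange would be
    // implemented here, keeping the solver code free of preCICE calls.
    void   initialize_coupling();
    void   read_coupling_data(dealii::Vector<double> &coupling_forces);
    void   write_coupling_data(const dealii::Vector<double> &displacements);
    double advance(const double computed_time_step);

  private:
    precice::SolverInterface precice_interface;
  };
} // namespace adapter
```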

@uekerman (Member)

Looks good 👌

* I implemented an implicit Newmark method, since I was pretty sure that it would work. I think an extension to a generalized-alpha method is not too much work and would give the scheme further flexibility, although we can already select certain properties, e.g. energy conservation, convergence order up to two, numerical dissipation, stability, depending on the parameters.

Yes, it should not be too complicated, but in my opinion it is also "only" nice-to-have for our purpose.

* The nonlinear time steps are solved by a Newton scheme, and the user can specify relative criteria for the residual and the displacement update, both of which need to be satisfied for convergence.

By displacement, do you mean the update in the Newton iteration? If yes, I guess the residual alone should be sufficient.

* The linear system can be solved either directly, where deal.II uses UMFPACK, or iteratively using a CG method. The direct solver is quite fast for moderate problem sizes. The CG method depends strongly on the preconditioner; we currently use SSOR, but I think a multigrid algorithm could be better, or we could use an ILU decomposition that is reused over several Newton iterations.
  For performance reasons, we could make the CG tolerance depend on the Newton iteration, so that we only solve to high accuracy in the last Newton iteration. I'm thinking about these methods since the iterative solver setup is, in my opinion, too slow at the moment. You could try it yourself, if you like.

I guess that for now "only" having a direct solver is sufficient. The purpose of the solver is anyway not to handle very large cases.

* The solver includes a nonlinear compressible neo-Hookean material. Hence, we are now nonlinear in both the material model and the geometry.

Great 👍

* I validated the setup with the time-dependent CSM benchmark. Let me know if you want me to visualize the results or upload a comparison.

Great again 👍 Could you please upload a screenshot and mention the commit you used here? Just for future reference.

* We should think about parallelization, to make the solver applicable to larger systems as well. deal.II either provides its own classes for distributed vectors, or its PETSc wrappers can be used.

Similar argument as above. I don't think we need parallelization for the moment. If we go for it later on, shared-memory parallelization with a direct solver is completely sufficient.

* I think we should move the (upcoming) preCICE-specific functions to a separate include file. This is not consistent with the linear case adapter, but I think it is good for users to clearly see what is preCICE-specific and which parts belong to the actual solver code. Also, the code is already getting quite long, and it is probably a bad idea to go beyond 2000 lines.

I agree. It could even be good to split the solver into several files and hide some of the complexity. We could then also think about modifying the linear solver in a similar way. Maybe they could even use the same "adapter file" then? (The last two points have lower priority.)

@davidscn (Member Author)

Alright, here are the results of CSM3; feel free to compare them against the original paper.
There is a 'large-scale oscillation' in the results, since I don't estimate or initialize the acceleration at the beginning and instead assume it to be zero. This is not well suited to cases with body forces, as here. For coupling it should be sufficient, since the acceleration and motion develop mostly during the simulation.
I will add a TODO in the code and probably implement it later (a sketch of the standard fix is shown below).
Computed with 8f73a02.
Sorry for the ugly stretching, this is the default ParaView export.
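For reference, a consistent initial acceleration could later be obtained by solving the semi-discrete equation of motion at the initial time instead of assuming it to be zero (a sketch of the standard approach, not what the code does yet):

```latex
M \, a_0 = F_{\mathrm{ext}}(t_0) - F_{\mathrm{int}}(u_0)
```

i.e. one mass solve with the initial external load (including body forces) and the internal forces of the initial configuration.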

Grid:
	 Reference volume: 0.0070202
Triangulation:
	 Number of active cells: 50
	 Polynomial degree: 4
	 Number of degrees of freedom: 1818
    Setting up quadrature point data...


+---------------------------------------------+------------+------------+
| Total wallclock time elapsed since start    | 2.179e+02s |            |
|                                             |            |            |
| Section                         | no. calls |  wall time | % of total |
+---------------------------------+-----------+------------+------------+
| Assemble linear system          |      4761 | 1.053e+02s |  4.83e+01% |
| Linear solver                   |      3761 | 1.030e+02s |  4.73e+01% |
| Setup system                    |         1 | 3.659e-02s | 0.000e+00% |
+---------------------------------+-----------+------------+------------+

@davidscn (Member Author) commented May 13, 2020

Here is a first result of the FSI3 benchmark. Unfortunately, the simulation breaks at this point, which is a little bit curious to me. I have not seen any complete graph of the y-displacement, only plots for later times; it might take some time for the amplitude to settle.
Any opinions on that? What do you think?
EDIT: Looking a bit closer at the Fluid participant, it seems the simulation is diverging over time. The pressure grows over time up to about 40 (starting from ~4) and also takes some larger negative values.

@BenjaminRodenberg (Member) commented May 13, 2020

Here is a first result of the FSI3 benchmark. Unfortunately, the simulation breaks at this point, which is a little bit curious to me. I have not seen any complete graph of the y-displacement, only plots for later times; it might take some time for the amplitude to settle.
Any opinions on that? What do you think?
EDIT: Looking a bit closer at the Fluid participant, it seems the simulation is diverging over time. The pressure grows over time up to about 40 (starting from ~4) and also takes some larger negative values.

Hi David,

Just a more or less educated guess: depending on your time-stepping scheme, your coupled simulation can become unstable. We are working on this in precice/precice#133. Richard, for example, made this observation with an FSI case in his thesis (see Fig. 5.10) when using the generalized-alpha method with certain parameters. Modifying the parameters (and thus the way the time stepping works) helped in the end.

Best regards,
Benjamin

@davidscn (Member Author)

Modifying the parameters (and thus the way the time stepping works) helped in the end.

Very interesting, also the fact that his simulation converged for the larger time step of 0.005 s. In my current setup, 0.005 works without any problem using our tutorial setups. For smaller step sizes (as in the reference) I needed to select a different time integration scheme in the Fluid solver to make this stable.

Since these kinds of instabilities might be of interest to many users, it might be worth opening a Discourse topic collecting some lessons learned or best guesses, similar to the FAQs already there.

@davidscn (Member Author) left a comment

I have now added some comments and need to push the applied changes.
I still need to add more documentation throughout the code (WIP).

davidscn added 2 commits May 23, 2020 10:35
Add documentation in the main code
Add output path to keep the preCICE simulation directory clean
Simplify loops to range based loops
@davidscn davidscn linked an issue May 24, 2020 that may be closed by this pull request
Co-authored-by: Benjamin Uekermann <benjamin.uekermann@gmail.com>
@davidscn (Member Author)

Perfect, I will merge here. The linear solver integration will become its own PR.

@davidscn davidscn merged commit 519bd58 into develop May 25, 2020
@davidscn davidscn deleted the nonlinear_solid_solver branch June 3, 2020 11:06
Successfully merging this pull request may close these issues.

Extend to nonlinear solid mechanics