A framework for structural shape optimization based on automatic differentiation (AD) and the adjoint method, enabled by JAX.
Developed by Gaoyuan Wu @ Princeton.
Details of this framework can be found in our publication in Structural and Multidisciplinary Optimization. Please share our project with others and cite us if you find it interesting and helpful. Cite us using:
```bibtex
@article{wu_framework_2023,
  title = {A framework for structural shape optimization based on automatic differentiation, the adjoint method and accelerated linear algebra},
  volume = {66},
  issn = {1615-1488},
  url = {https://doi.org/10.1007/s00158-023-03601-0},
  doi = {10.1007/s00158-023-03601-0},
  language = {en},
  number = {7},
  urldate = {2023-06-21},
  journal = {Structural and Multidisciplinary Optimization},
  author = {Wu, Gaoyuan},
  month = jun,
  year = {2023},
  keywords = {Adjoint method, Automatic differentiation, Bézier surface, Form finding, JAX, Shape optimization, Shell structure},
  pages = {151},
}
```
- Automatic differentiation (AD): an easy and accurate way to evaluate gradients. AD avoids both deriving derivatives manually and the truncation errors of numerical differentiation.
- Accelerated linear algebra (XLA) and just-in-time (JIT) compilation: these JAX features speed up gradient evaluation.
- Hardware acceleration: runs on GPUs and TPUs for faster performance.
- Form finding based on finite element analysis (FEA) and optimization theory
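As a minimal illustration of these features (plain JAX here, not JaxSSO's own API; the function `g` is an arbitrary placeholder), `jax.grad` returns an exact derivative and `jax.jit` compiles it with XLA:

```python
import jax
import jax.numpy as jnp

# An arbitrary smooth scalar function standing in for a structural response.
def g(x):
    return jnp.sum(jnp.sin(x) ** 2)

# AD gives the exact gradient (no manual derivation, no finite-difference
# truncation error); jit compiles it once with XLA for fast repeated calls.
grad_g = jax.jit(jax.grad(g))

x = jnp.linspace(0.0, 1.0, 5)
print(grad_g(x))      # matches the analytic gradient 2*sin(x)*cos(x)
print(jax.devices())  # lists the CPU/GPU/TPU devices JAX can run on
```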
Here is an example of using JaxSSO to form-find a structure inspired by the Mannheim Multihalle with simple gradient descent (first photo credit: Daniel Lukac).
We consider minimizing the strain energy by changing the shape of the structure, which is equivalent to maximizing its stiffness and reducing its bending. With no additional constraints, the problem reads

$$\min_{\mathbf{x}} \; C(\mathbf{x}) = \frac{1}{2}\mathbf{f}^\mathrm{T}\mathbf{u} \quad \text{subject to} \quad \mathbf{K}(\mathbf{x})\,\mathbf{u} = \mathbf{f},$$

where $\mathbf{x}$ collects the shape design variables, $\mathbf{K}$ is the stiffness matrix, $\mathbf{u}$ is the nodal displacement vector and $\mathbf{f}$ is the load vector. Since this problem is self-adjoint, the adjoint method gives the sensitivity

$$\frac{\partial C}{\partial x_i} = -\frac{1}{2}\,\mathbf{u}^\mathrm{T}\,\frac{\partial \mathbf{K}}{\partial x_i}\,\mathbf{u}.$$

To implement gradient-based optimization, one therefore needs to:
- Conduct FEA to get $\mathbf{u}$
- Conduct sensitivity analysis to get $\frac{\partial \mathbf{K}}{\partial x_i}$
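As a rough sketch of these two steps (a toy 2-DOF spring system in plain JAX, not JaxSSO's API; `K`, `f` and the design variables `x` are invented for illustration), the adjoint-method gradient can be cross-checked against differentiating straight through the solve:

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

# Toy 2-DOF system whose stiffness matrix depends on design variables x.
def K(x):
    return jnp.array([[x[0] + x[1], -x[1]],
                      [-x[1],        x[1]]])

f = jnp.array([0.0, 1.0])  # load vector

def strain_energy(x):
    u = jnp.linalg.solve(K(x), f)  # FEA step: solve K u = f
    return 0.5 * f @ u             # C = 1/2 f^T u

x = jnp.array([2.0, 1.0])

# Gradient by AD straight through the linear solve.
g_ad = jax.grad(strain_energy)(x)

# Adjoint-method gradient: dC/dx_i = -1/2 u^T (dK/dx_i) u.
u = jnp.linalg.solve(K(x), f)
dK = jax.jacfwd(K)(x)  # shape (2, 2, n_x): stiffness sensitivities
g_adjoint = -0.5 * jnp.einsum('i,ijk,j->k', u, dK, u)

print(g_ad, g_adjoint)  # the two gradients agree

# A few steps of the simple gradient descent mentioned above.
for _ in range(10):
    x = x - 0.1 * jax.grad(strain_energy)(x)
```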
Install it with pip:

```bash
pip install JaxSSO
```
JaxSSO is written in Python and requires:
- numpy >= 1.22.0
- JAX: "JAX is Autograd and XLA, brought together for high-performance machine learning research." Please refer to the JAX documentation for installation instructions.
- NLopt: a library for nonlinear optimization with a Python interface, which is used here. Refer to the NLopt documentation for installation, or install the Python bindings with `pip install nlopt` (see nlopt-python).
- scipy
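A quick post-install sanity check (just imports and version prints, nothing JaxSSO-specific) might look like:

```python
import numpy, scipy, jax, nlopt

print("numpy", numpy.__version__)
print("scipy", scipy.__version__)
print("jax", jax.__version__, "backend:", jax.default_backend())
print("nlopt imported OK")
```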
The project provides interactive examples on Google Colab for a quick start:
- 2D-arch: form-finding of a 2d-arch
- 3D-arch: form-finding of a 3d-arch
- Mannheim Multihalle: form-finding of Mannheim Multihalle