Abstract
In this work, we propose a machine learning-assisted structural optimization (MLaSO) scheme to accelerate structural optimization. A new machine learning model is integrated into structural optimization on a single mesh and trained globally during the optimization, using selected historic results of a chosen optimization quantity from previous iterations. At selected iterations, the trained neural network predicts the update of the chosen optimization quantity, so the solution can be advanced without performing finite element analysis or sensitivity analysis. The proposed MLaSO scheme can be readily integrated into different structural optimization methods and applied to many design problems without preparing additional training datasets. As a demonstration, MLaSO is integrated within the solid isotropic material with penalization (SIMP) topology optimization algorithm to solve four 2D design problems. The performance and benefits of MLaSO, in terms of prediction accuracy and computational efficiency, are demonstrated by the present numerical results.
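The alternation between routine iterations (full finite element and sensitivity analyses) and prediction iterations (cheap network evaluations) described above can be sketched in miniature. The following toy loop is an illustration only, not the paper's implementation: a quadratic objective stands in for the structural problem, the exact gradient stands in for "FEA plus sensitivity analysis," and a simple average of recent updates stands in for the trained network; all function names are hypothetical.

```python
import numpy as np

def routine_update(x, lr=0.1):
    # Stand-in for one full routine iteration: the exact gradient of a
    # toy quadratic objective f(x) = 0.5 * ||x - 1||^2 replaces the
    # expensive finite element and sensitivity analyses.
    grad = x - 1.0
    return x - lr * grad

def mlaso_like_loop(x0, n_iters=60, history_len=3):
    """Alternate routine iterations with cheap 'prediction' iterations
    that extrapolate the recent update history of the chosen quantity
    instead of recomputing the gradient (a crude stand-in for the
    trained neural network)."""
    x = np.asarray(x0, dtype=float)
    updates = []  # historic updates of the chosen quantity (here, x itself)
    for k in range(n_iters):
        if k % 2 == 0 or len(updates) < history_len:
            x_new = routine_update(x)  # expensive routine step
        else:
            # Prediction step: reuse the average of the recent updates.
            x_new = x + np.mean(updates[-history_len:], axis=0)
        updates.append(x_new - x)
        x = x_new
    return x
```

Even this crude extrapolation converges to the optimum (here, all ones) while skipping roughly half of the gradient evaluations, which is the flavor of saving MLaSO targets at much larger scale.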
Abbreviations
- \({\varvec{x}}\): Design variable vector
- \(\overline{{\varvec{x}}}\): Processed design variable vector
- \({\varvec{K}}\): Global stiffness matrix
- \({\varvec{U}}\): Global displacement vector
- \({\mathbf{k}}_{0}\): Stiffness matrix of a solid element
- \(\mathbf{u}\): Element displacement vector
- \({\varvec{F}}\): Global load vector
- \({\varvec{\Phi}}\): Neural network input vector
- \({\varvec{\Psi}}\): Neural network output vector
- \(\widehat{{\varvec{\Psi}}}\): Neural network prediction output
- \({\widehat{{\varvec{\Psi}}}}_{0}\): Initial guess
- \(\overline{{\varvec{w}}}\): Radial basis weight matrix
- \({\varvec{w}}\): Weight matrix
- \({\varvec{b}}\): Bias vector
- \({\varvec{\omega}}\): Collective representation of \({\varvec{w}}\) and \({\varvec{b}}\)
- \({\varvec{h}}\): Hidden layer
- \({\varvec{d}}\): Search direction vector
- \(\overline{{\varvec{d}}}\): Processed search direction vector
- \(\boldsymbol{\alpha}\): Move step size vector
- \({\varvec{er}}\): Accumulated error vector
- \({\varvec{\varepsilon}}\): Prediction error vector
- \(\sigma\): Activation function
- \({\beta}_{e}\): Dynamic smoothing factor for element e
- \(T\): Machine learning model
- \({f}_{e}\): Error function
- \(g\): Constraint function
- \(f\): Objective function
- \({E}_{e}\): Elastic modulus of element e
- \({x}_{e}\): Density of element e
- \(\gamma\): Dynamic weightage
- \({r}_{l}\): Learning rate
- \({r}_{u}\): Upper bound of the initial weights and biases
- \(\Delta \gamma\): Increment of the dynamic weightage
- \(n\): Sample size
- \({v}_{e}\): Element volume
- \({V}_{f}\): Volume limit
- \(V\): Design domain volume
- \(p\): Material penalty factor
- \(R\): Element centroid distance
- \(k\): Iteration number in MLaSO
- \({k}_{\text{so}}\): Iteration number in structural optimization
- \({k}_{\text{p}}\): Prediction iteration number
- \({k}_{\text{r}}\): Routine iteration number
- \(\Delta k\): Increment of the iteration number
- \({k}_{\text{s}}\): Entry point
- \({k}_{\text{c}}\): Total iteration number at convergence
- \({k}_{\text{rc}}\): Total routine iteration number at convergence
- \(l\): Step number for the convergence criterion
- \({n}_{\text{k}}\): Sample size for the convergence criterion
- \({\varepsilon}_{\text{C}}\): Compliance difference
- \({k}_{\text{pc}}\): Total prediction iteration number at convergence
- \({\varepsilon}_{{k}_{\text{rc}}}\): Difference in the total number of routine iterations at convergence
- \({\epsilon}_{\text{m}}\): Relative difference between a prediction and its exact value
- \({\epsilon}_{\text{c}}\): Relative difference in the objective function
- \(M\): Total number of training samples collected
- \(q\): Penalty in the error function
- \({\tau}_{\text{c}}\): Structural optimization convergence criterion
- \({\tau}_{\text{t}}\): Training convergence criterion
- \(C\): Structural compliance
- \({L}_{2}\): L2-norm distance
- \({\varvec{\delta}}\): Collective representation of \(\boldsymbol{\varphi}\) and \(\widehat{\boldsymbol{\varphi}}\)
- \({N}_{e}\): Number of elements
- \({N}_{m}\): Number of neurons in the mth hidden layer
- \({N}_{\text{h}}\): Number of hidden layers
- \({t}_{\text{train}}\): Training time
- \({t}_{\text{Train}}\): Total training time
- \({t}_{\text{pred}}\): Prediction time
- \({t}_{\text{Pred}}\): Total prediction time
- \({N}_{\text{z}}\): Number of nonzero entries in the global stiffness matrix
- \(\boldsymbol{\varphi}\): Chosen optimization quantity obtained in a routine iteration
- \(\overline{\boldsymbol{\varphi}}\): Exponentially averaged chosen optimization quantity
- \(\widehat{\boldsymbol{\varphi}}\): Chosen optimization quantity obtained by neural network prediction
- \({N}_{\text{in}}\): Number of training data points for the training input
- \({N}_{\text{out}}\): Number of training data points for the training output
- \({t}_{\text{opt}}\): Total computational time spent solving a structural optimization problem
- \({t}_{\text{Total}}\): Total computational time spent by MLaSO
- \({t}_{\text{FEA}}\): Computational time of one finite element analysis
- \({t}_{\text{der}}\): Computational time of one sensitivity analysis
- \({t}_{\text{up}}\): Computational time of one design variable update
- \({t}_{\text{d}}\): Total computational time difference
- \({t}_{\text{s}}\): Total time saving

Subscripts and superscripts
- \(i, j\): Element index numbers i and j
- \(\text{min}\): Minimum value
- \(\text{max}\): Maximum value
- \(\text{ref}\): Results obtained using top88
- \(e\): Element e
- \(m\): mth hidden layer
- \(k\): Iteration number
- \({k}_{\text{n}}\): Epoch number in a training loop
- \({k}_{\text{r}}\): Routine iteration number
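Several of the symbols above relate to exponential averaging of the chosen optimization quantity (\(\overline{\boldsymbol{\varphi}}\), with the smoothing factor \({\beta}_{e}\)). As a minimal sketch, assuming the standard exponentially weighted moving average recursion (the paper's exact per-element form may differ):

```python
import numpy as np

def exponential_average(phi_history, beta=0.7):
    """Exponentially weighted moving average of a per-element quantity.

    phi_history: list of arrays, one per routine iteration (oldest first).
    beta: smoothing factor in (0, 1]; larger values weight recent
          iterations more heavily.
    Assumed standard EWMA form: phi_bar <- beta*phi_new + (1-beta)*phi_bar.
    """
    phi_bar = np.asarray(phi_history[0], dtype=float)
    for phi in phi_history[1:]:
        phi_bar = beta * np.asarray(phi, dtype=float) + (1.0 - beta) * phi_bar
    return phi_bar
```

Averaging of this kind damps iteration-to-iteration noise in the training targets, which is why smoothed rather than raw histories are often fed to the network.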
References
Abueidda DW, Koric S, Sobh NA (2020) Topology optimization of 2D structures with nonlinearities using deep learning. Comput Struct 237:106283
Amir O, Bendsøe MP, Sigmund O (2009) Approximate reanalysis in topology optimization. Int J Numer Meth Eng 78(12):1474–1491
Andreassen E, Clausen A, Schevenels M, Lazarov BS, Sigmund O (2011) Efficient topology optimization in MATLAB using 88 lines of code. Struct Multidisc Optim 43(1):1–16
Banga S, Gehani H, Bhilare S, Patel SJ, Kara LB (2018) 3D topology optimization using convolutional neural networks. arXiv:1808.07440
Bendsøe MP, Sigmund O (1995) Optimization of structural topology, shape, and material. Springer, Berlin
Cang R, Yao H, Ren Y (2019) One-shot generation of near-optimal topology through theory-driven machine learning. Comput Aided Design 109:12–21. https://doi.org/10.1016/j.cad.2018.12.008
Chandrasekhar A, Suresh K (2021) TOuNN: Topology optimization using neural networks. Struct Multidisc Optim 63(3):1135–1149. https://doi.org/10.1007/s00158-020-02748-4
Chen F, Xu W, Zhang H, Wang Y, Cao J, Wang M, Ren H, Zhu J, Zhang Y (2018) Topology optimized design, fabrication, and characterization of a soft cable-driven gripper. IEEE Robot Autom Lett 3(3):2463–2470. https://doi.org/10.1109/lra.2018.2800115
Chi H, Zhang Y, Tang TLE, Mirabella L, Dalloro L, Song L, Paulino GH (2021) Universal machine learning for topology optimization. Comput Methods Appl Mech Eng 375:112739
Eschenauer HA, Olhoff N (2001) Topology optimization of continuum structures: a review. Appl Mech Rev 54(4):331–390. https://doi.org/10.1115/1.1388075
Ferrari F, Sigmund O (2020) A new generation 99 line Matlab code for compliance topology optimization and its extension to 3D. Struct Multidisc Optim 62(4):2211–2228. https://doi.org/10.1007/s00158-020-02629-w
He B, Tang W, Huang S, Hou S, Cai H (2016) Towards low-carbon product architecture using structural optimization for lightweight. Int J Adv Manuf Technol 83(5–8):1419–1429. https://doi.org/10.1007/s00170-015-7676-z
Januszkiewicz K, Banachowicz M (2017) Nonlinear shaping architecture designed with using evolutionary structural optimization tools. IOP Conf Ser Mater Sci Eng 245(8):082042. https://doi.org/10.1088/1757-899X/245/8/082042
Kallioras NA, Lagaros ND (2020) DL-SCALE: A novel deep learning-based model order upscaling scheme for solving topology optimization problems. Neural Comput Appl. https://doi.org/10.1007/s00521-020-05480-8
Kim YY, Yoon GH (2000) Multi-resolution multi-scale topology optimization—a new paradigm. Int J Solids Struct 37(39):5529–5559. https://doi.org/10.1016/s0020-7683(99)00251-6
Kim SY, Kim IY, Mechefske CK (2012) A new efficient convergence criterion for reducing computational expense in topology optimization: reducible design variable method. Int J Numer Meth Eng 90(6):752–783. https://doi.org/10.1002/nme.3343
Kollmann HT, Abueidda DW, Koric S, Guleryuz E, Sobh NA (2020) Deep learning for topology optimization of 2D metamaterials. Mater Des 196:109098. https://doi.org/10.1016/j.matdes.2020.109098
Lei X, Liu C, Du Z, Zhang W, Guo X (2019) Machine learning-driven real-time topology optimization under moving morphable component-based framework. J Appl Mech 86(1):011004. https://doi.org/10.1115/1.4041319
Liao Z, Zhang Y, Wang Y, Li W (2019) A triple acceleration method for topology optimization. Struct Multidisc Optim 60(2):727–744. https://doi.org/10.1007/s00158-019-02234-6
Lin Q, Hong J, Liu Z, Li B, Wang J (2018) Investigation into the topology optimization for conductive heat transfer based on deep learning approach. Int Commun Heat Mass Transfer 97:103–109. https://doi.org/10.1016/j.icheatmasstransfer.2018.07.001
Liu CH, Huang GF, Chiu CH, Pai TY (2018) Topology synthesis and optimal design of an adaptive compliant gripper to maximize output displacement. J Intell Rob Syst 90(3):287–304. https://doi.org/10.1007/s10846-017-0671-x
MacBain K, Spillers W (2009) Structural optimization. Springer, New York
Nie Z, Lin T, Jiang H, Kara LB (2021) TopologyGAN: topology optimization using generative adversarial networks based on physical fields over the initial domain. J Mech Des 143(3):031715
Petković D, Pavlović ND, Shamshirband S, Anuar NB (2013) Development of a new type of passively adaptive compliant gripper. Ind Robot Int J 40(6):610–623. https://doi.org/10.1108/ir-12-2012-452
Sosnovik I, Oseledets I (2019) Neural networks for topology optimization. Russ J Numer Anal Math Model 34(4):215–223. arXiv:1709.09578v1
Trappenberg T (2020) Fundamentals of machine learning, 1st edn. Oxford University Press, Oxford
Vasista S, Tong L (2012) Design and testing of pressurized cellular planar morphing structures. AIAA J 50(6):1328–1338. https://doi.org/10.2514/1.j051427
Vasista S, Tong L (2013) Topology-optimized design and testing of a pressure-driven morphing-aerofoil trailing-edge structure. AIAA J 51(8):1898–1907. https://doi.org/10.2514/1.j052239
Vasista S, Tong L (2014) Topology optimisation via the moving iso-surface threshold method: implementation and application. Aeronaut J 118(1201):315–342. https://doi.org/10.1017/s0001924000009143
Venkayya V (1978) Structural optimization: a review and some recommendations. Int J Numer Meth Eng 13(2):203–228. https://doi.org/10.1002/nme.1620130202
Wold S (1994) Exponentially weighted moving principal components analysis and projections to latent structures. Chemom Intell Lab Syst 23(1):149–161. https://doi.org/10.1016/0169-7439(93)E0075-F
Wu J, Dick C, Westermann R (2015) A system for high-resolution topology optimization. IEEE Trans Visual Comput Graphics 22(3):1195–1208. https://doi.org/10.1109/TVCG.2015.2502588
Yu Y, Hur T, Jung J, Jang IG (2019) Deep learning for determining a near-optimal topological design without any iteration. Struct Multidisc Optim 59(3):787–799. https://doi.org/10.1007/s00158-018-2101-5
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Replication of results
The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.
Additional information
Responsible Editor: Jianbin Du
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix: Algorithms for MLaSO-\({\varvec{d}}\)
About this article
Cite this article
Xing, Y., Tong, L. A machine learning-assisted structural optimization scheme for fast-tracking topology optimization. Struct Multidisc Optim 65, 105 (2022). https://doi.org/10.1007/s00158-022-03181-5