Prediction of Rolling Force Based on a Fusion of Extreme Learning Machine and Self Learning Model of Rolling Force

  • AZiGuLi
  • Can Cui
  • Yonghong Xie (corresponding author)
  • Shuang Ha
  • Xiaochen Wang
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 686)


For the rolling force model of a hot strip rolling mill, a forecasting technique based on the extreme learning machine (ELM) is presented in this paper. First, the variables associated with controlled rolling are identified by analyzing the traditional rolling force formula, in order to ensure the effectiveness of the model; an ELM network is then applied to build the prediction model. Production data are used to train and test the network, and the predictions are compared with the corrected rolling force values obtained from the self-learning model of rolling force. The results show that the thickness can be predicted more rapidly and accurately when this model and the rolling force self-learning model are integrated, which meets the actual demands of production.
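The ELM approach described above fixes the hidden-layer weights at random and solves only for the output weights in closed form, which is what makes training fast. A minimal sketch in NumPy, using an assumed toy regression target in place of real mill production data (all names and the data here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    """Fit an ELM regressor: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random, fixed input weights
    b = rng.normal(size=n_hidden)                # random, fixed hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                 # Moore-Penrose least-squares solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: a noisy 1-D function stands in for rolling-force training data.
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.normal(size=200)
W, b, beta = elm_train(X, y)
pred = elm_predict(X, W, b, beta)
```

Because only `beta` is learned, training reduces to one pseudoinverse computation, with no iterative backpropagation; this is the source of the speed advantage the abstract refers to.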


Keywords: ELM · Rolling force model · Self-learning



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • AZiGuLi (1, 2)
  • Can Cui (1, 2)
  • Yonghong Xie (1, 2) (corresponding author)
  • Shuang Ha (1, 2)
  • Xiaochen Wang (3)
  1. School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China
  2. Beijing Key Laboratory of Knowledge Engineering for Materials Science, Beijing, China
  3. School of Electronic Engineering, Xidian University, Xi’an, China
