Computational intelligence based models for prediction of elemental composition of solid biomass fuels from proximate analysis

  • Suhas B. Ghugare
  • Shishir Tiwary
  • Sanjeev S. Tambe (email author)
Original Article

Abstract

Biomass is a renewable and sustainable source of “green” energy. The elemental composition, comprising carbon (C), hydrogen (H) and oxygen (O) as major components, is an important measure of a biomass fuel’s energy content. Knowledge of this composition is also valuable in: (a) computing material balances in biomass-based processes, (b) designing and operating efficient and clean biomass-utilizing combustors, gasifiers and boilers, (c) fixing the quantity of oxidant required for biomass combustion/gasification, and (d) determining the volume and composition of the combustion/gasification gases. Obtaining the elemental composition of a biomass fuel via ultimate analysis is an expensive and time-consuming task. In comparison, proximate analysis, which determines the fixed carbon, ash, volatile matter and moisture content, is a cruder characterization of the fuel and easier to perform. Thus, there is a need for high-accuracy models that predict the elemental composition of a solid biomass fuel from its proximate-analysis constituents. Accordingly, this study employs three computational intelligence (CI) formalisms, namely genetic programming, artificial neural networks and support vector regression, to develop nonlinear models for predicting the C, H and O fractions of solid biomass fuels. A large database of 830 biomasses has been used in the model development. A comparison of the prediction accuracy and generalization performance of the nine CI-based models (three each for C, H and O) with those of the currently available linear models indicates that the CI-based models consistently and significantly outperform their linear counterparts. The models developed in this study are thus the best currently available for predicting the elemental composition of solid biomass fuels from their proximate analyses.
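To make the modeling task concrete: each model maps the four proximate-analysis constituents to one elemental fraction. The sketch below is a rough Python illustration of one such mapping using support vector regression; it is not the authors' code, and all sample values, the column order and the hyperparameters are hypothetical placeholders.

# Minimal sketch (assumed setup, not the paper's models): SVR mapping
# proximate analysis to carbon content. All numbers are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Inputs (wt%): fixed carbon, volatile matter, ash, moisture
X = np.array([
    [16.5, 79.5, 4.0, 8.2],
    [18.1, 76.9, 5.0, 9.4],
    [14.8, 81.0, 4.2, 7.6],
    [20.3, 74.5, 5.2, 10.1],
    [17.2, 78.0, 4.8, 8.8],
])
# Target (wt%): carbon fraction obtained from ultimate analysis
y = np.array([47.2, 48.9, 46.1, 49.8, 47.9])

# Standardize the inputs, then fit an RBF-kernel SVR; separate models
# would be fitted in the same way for H and O.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)

# Predict the carbon content of a new biomass sample
new_sample = np.array([[17.0, 78.5, 4.5, 8.5]])
print("Predicted C (wt%):", model.predict(new_sample)[0])

In the study, the same input-output structure is realized with three different CI formalisms (GP, ANN and SVR), giving nine models in total.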

Keywords

Biomass fuels · Elemental composition · Ultimate analysis · Proximate analysis · Computational intelligence

Notes

Acknowledgments

This study was partly supported by the Council of Scientific and Industrial Research (CSIR), Government of India, New Delhi, under the Network project TAPCOAL.

Conflict of interest

The authors declare no conflict of interest.

Supplementary material

Supplementary material 1: 13198_2014_324_MOESM1_ESM.doc (DOC, 983 kb)

Copyright information

© The Society for Reliability Engineering, Quality and Operations Management (SREQOM), India and The Division of Operation and Maintenance, Lulea University of Technology, Sweden 2014

Authors and Affiliations

  • Suhas B. Ghugare (1)
  • Shishir Tiwary (1, 2)
  • Sanjeev S. Tambe (1), email author
  1. Chemical Engineering and Process Development Division, CSIR-National Chemical Laboratory, Pune, India
  2. CSIR-Central Institute of Mining and Fuel Research (CIMFR), Dhanbad, India