The optimization of attribute selection in decision tree-based production control systems

Original Article

Abstract

This study develops a learning-based production control system (PCS) to support a manufacturing system in making on-line decisions that are robust in the face of varying production requirements. Because a manufacturing system generates a large amount of shop-floor information, selecting the essential system attributes (or features) for constructing PCS knowledge bases under various production requirements is a critical issue. However, the classical decision tree (DT) learning approach to constructing dynamic dispatching knowledge bases does not consider the optimal subset of system attributes in the problem domain. To resolve this problem, this study develops a hybrid genetic algorithm/decision tree (GA/DT) approach for a DT-based PCS. The hybrid GA/DT approach simultaneously evolves an optimal subset of system attributes and determines the learning parameters of the DT, starting from a large set of candidate manufacturing system attributes, according to various performance measures. For each feature subset and set of DT learning parameters decoded by the GA, a DT is induced to evaluate fitness in the GA process and to generate the PCS knowledge base. The results demonstrate that, across various performance criteria, the proposed GA/DT-based PCS achieves better long-term system performance than both the classical DT-based PCS and individual heuristic dispatching rules.
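The GA/DT loop described above can be sketched in a few lines: each chromosome is a binary mask over the candidate system attributes, and in each generation the fittest masks are kept and recombined. In the paper, fitness comes from inducing a DT on the decoded feature subset and scoring it by simulation; the sketch below replaces that with a synthetic stand-in (`RELEVANT`, the `fitness` function, and all parameter values are illustrative assumptions, not taken from the paper):

```python
import random

random.seed(42)

N_FEATURES = 12
RELEVANT = {0, 2, 5, 7}  # assumed ground truth; stands in for DT evaluation

def fitness(chrom):
    # Stand-in for training a DT on the decoded feature subset and scoring
    # it by simulation: rewards relevant features, penalizes extra ones.
    selected = {i for i, bit in enumerate(chrom) if bit}
    return len(selected & RELEVANT) - 0.25 * len(selected - RELEVANT)

def crossover(a, b):
    # Single-point crossover of two binary attribute masks.
    cut = random.randint(1, N_FEATURES - 1)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in chrom]

def ga(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # keep the best half
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = ga()
```

The paper's approach additionally encodes DT learning parameters (e.g., pruning settings) on the same chromosome; extending the mask with a few parameter genes follows the same pattern.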

Keywords

Decision tree (DT) learning · Dynamic dispatching · Feature selection · Genetic algorithm (GA) · Production control

References

  1. Jones AT, McLean CR (1986) A proposed hierarchical control architecture for automated manufacturing systems. J Manuf Syst 5(1):15–25
  2. Cho H, Wysk RA (1993) A robust adaptive scheduler for an intelligent workstation controller. Int J Prod Res 31(4):771–789
  3. Cho H, Wysk RA (1995) Intelligent workstation controller for computer-integrated manufacturing: problems and models. J Manuf Syst 14(4):252–263
  4. Chen CC, Yih Y (1996) Identifying attributes for knowledge-based development in dynamic scheduling environments. Int J Prod Res 34(6):1739–1755
  5. Chen CC, Yih Y, Wu YC (1999) Auto-bias selection for learning-based scheduling systems. Int J Prod Res 37(9):1987–2002
  6. Park SC, Raman N, Shaw MJ (1997) Adaptive scheduling in dynamic flexible manufacturing systems: a dynamic rule selection approach. IEEE Trans Robot Automat 13(4):486–502
  7. Shiue YR, Su CT (2002) An enhanced knowledge representation for decision-tree based learning adaptive scheduling. Int J Adv Manuf Technol 16(1):48–60
  8. Shiue YR, Su CT (2003) An enhanced knowledge representation for decision-tree based learning adaptive scheduling. Int J Comput Integr Manuf 16(1):48–60
  9. Priore P, De la Fuente D, Gomez A, Puente J (2001) A review of machine learning in dynamic scheduling of flexible manufacturing systems. Artif Intell Eng Des Anal Manuf 15(3):251–263
  10. Goldberg DE (1989) Genetic algorithms in search, optimization, and machine learning. Addison-Wesley, Boston
  11. Rumelhart DE, Hinton GE, Williams RJ (1986) Learning internal representations by error propagation. In: Rumelhart DE, McClelland JL (eds) Parallel distributed processing: explorations in the microstructure of cognition. MIT Press, Cambridge, MA
  12. Sun YL, Yih Y (1996) An intelligent controller for manufacturing cells. Int J Prod Res 34(8):2353–2373
  13. Arzi Y, Iaroslavitz L (1999) Neural network-based adaptive production control system for flexible manufacturing cell under a random environment. IIE Trans 31(3):217–230
  14. Shaw MJ, Park S, Raman N (1992) Intelligent scheduling with machine learning capabilities: the induction of scheduling knowledge. IIE Trans 24(2):156–168
  15. Kim CO, Min HS, Yih Y (1998) Integration of inductive learning and neural networks for multi-objective FMS scheduling. Int J Prod Res 36(9):2497–2509
  16. Arzi Y, Iaroslavitz L (2000) Operating an FMC by a decision-tree-based adaptive production control system. Int J Prod Res 38(3):675–697
  17. Quinlan JR (1986) Induction of decision trees. Mach Learn 1(1):81–106
  18. Quinlan JR (1987) Simplifying decision trees. Int J Man Mach Stud 27(3):221–234
  19. Quinlan JR (1993) C4.5: programs for machine learning. Kaufmann, San Francisco
  20. Siedlecki W, Sklansky J (1988) On automatic feature selection. Int J Pattern Recognit Artif Intell 2(2):197–220
  21. Siedlecki W, Sklansky J (1989) A note on genetic algorithms for large-scale feature selection. Pattern Recognit Lett 10:335–347
  22. Montazeri M, Van Wassenhove LN (1990) Analysis of scheduling rules for an FMS. Int J Prod Res 28(4):785–802
  23. Blackstone JH Jr, Phillips DT, Hogg GL (1982) A state-of-the-art survey of dispatching rules for manufacturing job shop operations. Int J Prod Res 20(1):27–45
  24. Baker KR (1984) Sequencing rules and due-date assignments in a job shop. Manage Sci 30(9):1093–1104
  25. Sabuncuoglu I (1998) A study of scheduling rules of flexible manufacturing systems: a simulation approach. Int J Prod Res 36(5):527–546
  26. Mitchell TM (1997) Machine learning. McGraw-Hill, New York
  27. DeJong K (1975) An analysis of the behavior of a class of genetic adaptive systems. Dissertation, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, MI
  28. SIMPLE++ (2000) Reference manual version 7.0. AESOP, Stuttgart
  29. See5 (2004) See5: an informal tutorial. Cited 02/05. http://www.rulequest.com/see5-win.html
  30. SAS (1998) Data mining and the case for sampling. SAS Institute best practices paper, SAS Institute Inc., Cary, NC

Copyright information

© Springer-Verlag 2005

Authors and Affiliations

  1. Department of Information Management, Huafan University, Taipei Hsien, Taiwan
  2. Department of Industrial Management, National Formosa University, Yunlin, Taiwan
