
Fractal Analogies for General Intelligence

  • Conference paper
Artificial General Intelligence (AGI 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7716)

Abstract

A theory of general intelligence must account for how an intelligent agent can map percepts into actions at the level of human performance. We describe a new approach to this percept-to-action mapping. Our approach is based on four ideas: the world exhibits fractal self-similarity at multiple scales, the design of mind reflects the design of the world, similarity and analogy form the core of intelligence, and fractal representations provide a powerful technique for perceptual similarity and analogy. We divide our argument into two parts. In the first part, we describe a technique of fractal analogies and show how it gives human-level performance on an intelligence test called the Odd One Out. In the second part, we describe how the fractal technique enables the percept-to-action mapping in a simple, simulated world.
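The abstract does not spell out the algorithm, so the sketch below is only a rough illustration of how a fractal-code similarity measure, loosely in the spirit of fractal image compression, might drive an Odd One Out choice. The block size, transform set, feature-overlap score, and all function names are assumptions made for this example, not the authors' method.

```python
# Illustrative sketch (NOT the paper's implementation) of a fractal-encoding
# similarity measure for an Odd One Out style task.
# Assumptions: images are square grayscale NumPy arrays of equal shape, with
# side length divisible by the block size.
import numpy as np

# A small set of affine block transforms (rotations and flips).
TRANSFORMS = [
    lambda b: b,
    lambda b: np.rot90(b, 1),
    lambda b: np.rot90(b, 2),
    lambda b: np.rot90(b, 3),
    lambda b: np.fliplr(b),
    lambda b: np.flipud(b),
]

def blocks(img, size):
    """Yield (row, col, block) tiles of a square image."""
    for r in range(0, img.shape[0], size):
        for c in range(0, img.shape[1], size):
            yield r, c, img[r:r + size, c:c + size]

def fractal_code(src, dst, size=4):
    """Re-describe dst in terms of src: for each dst block, record the
    source block position and transform index that matches it best
    (brute-force sum-of-squared-differences search)."""
    code = set()
    src_blocks = list(blocks(src, size))
    for r, c, d in blocks(dst, size):
        best = None
        for sr, sc, s in src_blocks:
            for ti, t in enumerate(TRANSFORMS):
                err = float(np.sum((t(s).astype(float) - d) ** 2))
                if best is None or err < best[0]:
                    best = (err, (sr, sc, ti, r, c))
        code.add(best[1])
    return code

def similarity(a, b, size=4):
    """Feature-overlap score between the two mutual fractal codes."""
    ca, cb = fractal_code(a, b, size), fractal_code(b, a, size)
    return len(ca & cb) / max(len(ca | cb), 1)

def odd_one_out(images, size=4):
    """Return the index of the image least similar to all the others."""
    n = len(images)
    totals = [sum(similarity(images[i], images[j], size)
                  for j in range(n) if j != i) for i in range(n)]
    return int(np.argmin(totals))
```

The intuition behind this sketch: each candidate image is re-described in terms of every other image's blocks, and the image whose fractal re-descriptions overlap least with the rest is selected as the odd one out.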

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

McGreggor, K., Goel, A. (2012). Fractal Analogies for General Intelligence. In: Bach, J., Goertzel, B., Iklé, M. (eds) Artificial General Intelligence. AGI 2012. Lecture Notes in Computer Science (LNAI), vol 7716. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35506-6_19

  • DOI: https://doi.org/10.1007/978-3-642-35506-6_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-35505-9

  • Online ISBN: 978-3-642-35506-6

  • eBook Packages: Computer Science (R0)
