Execution of Written Tasks by a Biologically-Inspired Artificial Brain

  • Sebastian Narvaez
  • Angel Garcia
  • Raul Ernesto Gutierrez
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10337)

Abstract

Communicating with machines in the same way we do with other people has been a long-standing goal of computer science. Among its many advantages would be the ability to give instructions to our computers without needing to learn specific software or programming languages. Since the problem involves human language, it is natural to use a model of the human brain to build a system with such capabilities. In this work, the Hierarchical Temporal Memory (HTM) algorithms are explored and evaluated as a biologically inspired tool for working with natural language. It is proposed that task execution can be achieved by training the algorithms to map sentences to keywords that correspond to the tasks. Different encoders, which translate words into a representation suitable for the algorithms, are tested. The configuration of algorithms and encoders with the highest success rate correctly maps up to 90% of the sentences from a custom training set. The success rates do not vary greatly between different subsets of the training set, suggesting that the learning system is able to find patterns and make inferences about missing data.
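The pipeline summarized above — encode words into sparse distributed representations (SDRs), then let HTM-style learning associate sentence patterns with task keywords — can be illustrated with a minimal sketch. This is a hypothetical hash-based encoder for illustration only, not the encoders evaluated in the paper; the names `encode_word` and `overlap` and the size/sparsity constants are assumptions:

```python
import hashlib

SDR_SIZE = 2048      # total number of bits in the representation
ACTIVE_BITS = 40     # ~2% sparsity, a typical regime for HTM encoders

def encode_word(word: str) -> set[int]:
    """Map a word to a stable, sparse set of active bit indices.

    A semantic encoder (e.g. one based on semantic folding) would place
    related words on overlapping bits; this random-hash sketch only
    guarantees a deterministic, sparse code per word.
    """
    active: set[int] = set()
    counter = 0
    while len(active) < ACTIVE_BITS:
        digest = hashlib.sha256(f"{word}:{counter}".encode()).digest()
        active.add(int.from_bytes(digest[:4], "big") % SDR_SIZE)
        counter += 1
    return active

def overlap(a: set[int], b: set[int]) -> int:
    """Overlap score: shared active bits, the similarity measure used by HTM."""
    return len(a & b)

sdr_open = encode_word("open")
# The same word always yields the same SDR; unrelated words share bits
# only by chance, so overlap serves as a crude similarity signal.
print(len(sdr_open), overlap(sdr_open, encode_word("open")))
```

In an encoder with semantic structure, semantically related words would score a high overlap, which is what lets the HTM layers generalize from trained sentences to unseen but similar ones.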

Keywords

Hierarchical Temporal Memory · Natural language processing · Neural networks · Task execution · Language understanding

References

  1. Ahmad, S., Hawkins, J.: Properties of sparse distributed representations and their application to hierarchical temporal memory. arXiv preprint arXiv:1503.07469 (2015)
  2. Hawkins, J., Ahmad, S., Dubinsky, D.: Hierarchical temporal memory, including HTM cortical learning algorithms. Technical report, Numenta (2011). http://numenta.com/learn/hierarchical-temporal-memory-white-paper.html
  3. Hunter, J.D.: Matplotlib: a 2D graphics environment. Comput. Sci. Eng. 9(3), 90–95 (2007)
  4. Ling, Z.-H., Kang, S., Zen, H., Senior, A., Schuster, M., Qian, X.-J., Meng, H., Deng, L.: Deep learning for acoustic modeling in parametric speech generation: a systematic review of existing techniques and future trends. IEEE Sig. Process. Mag. 32, 35–52 (2015)
  5. Miller, G.A.: WordNet: a lexical database for English. Commun. ACM 38(11), 39–41 (1995)
  6. Morato, J., Marzal, M.Á., Llórens, J., Moreiro, J.: WordNet applications. In: Global WordNet Conference, vol. 2, pp. 270–278 (2004)
  7. Numenta: NuPIC FAQ. http://numenta.org/faq/ (September 2016)
  8.
  9. Purdy, S.: Encoding data for HTM systems. arXiv preprint arXiv:1602.05925 (2016)
  10. Waskom, M., Botvinnik, O., Drewokane, Hobson, P., Halchenko, Y., Lukauskas, S., Warmenhoven, J., Cole, J.B., Hoyer, S., Vanderplas, J., Gkunter, Villalba, S., Quintero, E., Martin, M., Miles, A., Meyer, K., Augspurger, T., Yarkoni, T., Bachant, P., Evans, C., Fitzgerald, C., Nagy, T., Ziegler, E., Megies, T., Wehner, D., St-Jean, S., Coelho, L.P., Hitz, G., Lee, A., Rocher, L.: seaborn: v0.7.0 (January 2016). Zenodo. http://doi.org/10.5281/zenodo.45133
  11. De Sousa Webber, F.E.: Semantic folding theory. Technical report, Cortical.io (2015). http://www.cortical.io/static/downloads/semantic-folding-theory-white-paper.pdf

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Sebastian Narvaez (1)
  • Angel Garcia (1)
  • Raul Ernesto Gutierrez (1)

  1. Universidad del Valle, Cali, Colombia
