
Learning Two-Input Linear and Nonlinear Analog Functions with a Simple Chemical System

  • Peter Banda
  • Christof Teuscher
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8553)

Abstract

Current biochemical information-processing systems behave in a predetermined manner because all of their features are fixed during the design phase. To make such unconventional computing systems reusable and programmable for biomedical applications, adaptation, learning, and self-modification based on external stimuli would be highly desirable. So far, however, implementing these capabilities in wet chemistries has proven too challenging. In this paper we extend the chemical perceptron, a model previously proposed by the authors, to function as an analog rather than a binary system. The new analog asymmetric signal perceptron learns through feedback and supports Michaelis-Menten kinetics. The results show that our perceptron is able to learn linear and nonlinear (quadratic) functions of two inputs; to the best of our knowledge, it is the first simulated chemical system capable of doing so. The small number of species and reactions, and their simplicity, allow for a mapping to an actual wet implementation using DNA strand displacement or deoxyribozymes. Our results are an important step toward actual biochemical systems that can learn and adapt.
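As a rough illustration of the learning task and of the RNMSE metric listed in the keywords, the sketch below trains an ordinary (formal, non-chemical) two-input analog perceptron with the delta rule and evaluates it on example linear and quadratic targets. It is only a minimal stand-in under stated assumptions: the target coefficients, learning rate, and input range are illustrative, and none of the chemistry of the actual analog asymmetric signal perceptron (species, reactions, Michaelis-Menten kinetics) is modeled here.

```python
import math
import random


def rnmse(targets, outputs):
    """Root normalized mean squared error: RMSE divided by the standard
    deviation of the targets (one common definition of RNMSE)."""
    n = len(targets)
    mean_t = sum(targets) / n
    var_t = sum((t - mean_t) ** 2 for t in targets) / n
    mse = sum((t - o) ** 2 for t, o in zip(targets, outputs)) / n
    return math.sqrt(mse / var_t)


def train(target_fn, steps=5000, rate=0.02, seed=0):
    """Online delta-rule training of a formal two-input analog perceptron
    y = w0 + w1*x1 + w2*x2 on inputs drawn uniformly from [0, 1]."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(3)]  # bias weight, w1, w2
    for _ in range(steps):
        x1, x2 = rng.random(), rng.random()
        y = w[0] + w[1] * x1 + w[2] * x2             # analog output
        err = target_fn(x1, x2) - y                  # feedback: desired - actual
        w[0] += rate * err                           # bias input is 1
        w[1] += rate * err * x1
        w[2] += rate * err * x2
    return w


def evaluate(w, target_fn, samples=1000, seed=1):
    """RNMSE of the trained perceptron on fresh random inputs."""
    rng = random.Random(seed)
    ts, ys = [], []
    for _ in range(samples):
        x1, x2 = rng.random(), rng.random()
        ts.append(target_fn(x1, x2))
        ys.append(w[0] + w[1] * x1 + w[2] * x2)
    return rnmse(ts, ys)


def linear_target(x1, x2):
    return 0.4 * x1 + 0.3 * x2 + 0.2    # illustrative linear coefficients


def quadratic_target(x1, x2):
    return 0.5 * x1 * x1 + 0.3 * x2     # illustrative quadratic target


if __name__ == "__main__":
    for name, fn in [("linear", linear_target), ("quadratic", quadratic_target)]:
        w = train(fn)
        print(f"{name:9s} target: RNMSE = {evaluate(w, fn):.3f}")
```

Note that the purely linear readout used in this sketch can fit the linear target almost exactly but can only approximate the quadratic one; the chemical perceptron's ability to learn nonlinear functions instead relies on the nonlinearity of its reaction kinetics.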

Keywords

Chemical perceptron · Analog perceptron · Supervised learning · Chemical computing · RNMSE · Linear function · Quadratic function



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Department of Computer Science, Portland State University, Portland, USA
  2. Department of Electrical and Computer Engineering, Portland State University, Portland, USA
