
Learning to Generate Ambiguous Sequences

  • Conference paper
  • Neural Information Processing (ICONIP 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11953)


Abstract

In this paper, we experiment with methods for obtaining binary sequences that have a random probability mass function and low autocorrelation, and we use them to generate ambiguous outcomes.

Outputs from a neural network are mixed and shuffled, resulting in binary sequences whose probability mass function is non-convergent, constantly moving and changing.
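
The page gives no details of the network or of the mixing and shuffling procedure, so the following is only a minimal sketch of the general idea, in Python with NumPy: a slowly drifting bias over random logits stands in for the neural-network outputs, and the resulting bits are shuffled within blocks. The function name ambiguous_bits, the block size, and the bounded random walk on the bias are illustrative assumptions, not the authors' method.

    import numpy as np

    rng = np.random.default_rng(0)

    def ambiguous_bits(n_blocks=200, block=100):
        """Toy stand-in: drifting block bias + per-block shuffling -> binary sequence."""
        bias = 0.0
        out = []
        for _ in range(n_blocks):
            # Let the bias wander (bounded random walk) so the empirical
            # frequency of 1s keeps moving instead of settling down.
            # This is an illustrative choice, not the paper's network.
            bias = float(np.clip(bias + rng.normal(scale=0.5), -2.5, 2.5))
            logits = rng.normal(loc=bias, scale=1.0, size=block)
            probs = 1.0 / (1.0 + np.exp(-logits))
            bits = (rng.random(block) < probs).astype(int)
            rng.shuffle(bits)  # shuffle within the block
            out.append(bits)
        return np.concatenate(out)

    seq = ambiguous_bits()
    print(seq[:20], seq.mean())

Because the block bias keeps wandering, the running frequency of 1s keeps shifting rather than settling on a fixed value, which illustrates the non-convergent probability mass function described above.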

Empirical comparison with existing algorithms for generating ambiguity shows that the sequences produced by the proposed method exhibit significantly lower serial dependence. The method is therefore useful in scenarios where observers can see and record the outcome of each draw sequentially, as it hinders their ability to make useful statistical inferences.
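
The exact statistics used in the empirical comparison are not reported on this page; as a hedged illustration, the sketch below computes the plain lag-k sample autocorrelation of a 0/1 sequence, one standard proxy for serial dependence, and contrasts an i.i.d. fair coin (near-zero autocorrelation at all lags) with a deliberately persistent Markov chain (clear positive lag-1 dependence). The function name lag_autocorrelation and both test sequences are assumptions for illustration only.

    import numpy as np

    def lag_autocorrelation(bits, max_lag=10):
        """Sample autocorrelation of a 0/1 sequence at lags 1..max_lag."""
        x = np.asarray(bits, dtype=float) - np.mean(bits)
        denom = float(np.dot(x, x))
        return np.array([np.dot(x[:-k], x[k:]) / denom
                         for k in range(1, max_lag + 1)])

    rng = np.random.default_rng(1)

    # i.i.d. fair coin: autocorrelations should hover near zero.
    iid = rng.integers(0, 2, size=20_000)

    # Persistent Markov chain (P(stay) = 0.8): strong lag-1 dependence.
    markov = np.empty(20_000, dtype=int)
    markov[0] = rng.integers(0, 2)
    for t in range(1, markov.size):
        markov[t] = markov[t - 1] if rng.random() < 0.8 else 1 - markov[t - 1]

    print("iid    :", np.round(lag_autocorrelation(iid), 3))
    print("markov :", np.round(lag_autocorrelation(markov), 3))

Applied to a generated sequence, values near zero at every lag mean that an observer recording draws one by one gains little predictive power from the recorded history.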


Notes

  1. https://github.com/HaskellAmbiguity/AmbiguityGenerator.


Acknowledgments

This research was partially supported by Sapientia Foundation – Institute for Scientific Research (KPI). L. Szilágyi is János Bolyai Fellow of the Hungarian Academy of Sciences.

Author information


Corresponding author

Correspondence to David Iclanzan.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Iclanzan, D., Szilágyi, L. (2019). Learning to Generate Ambiguous Sequences. In: Gedeon, T., Wong, K., Lee, M. (eds) Neural Information Processing. ICONIP 2019. Lecture Notes in Computer Science, vol 11953. Springer, Cham. https://doi.org/10.1007/978-3-030-36708-4_10


  • DOI: https://doi.org/10.1007/978-3-030-36708-4_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36707-7

  • Online ISBN: 978-3-030-36708-4

  • eBook Packages: Computer Science, Computer Science (R0)
