Abstract
Godfrey-Smith advocates for linking deception in sender-receiver games to the existence of undermining signals. I present games in which deceptive signals can be arbitrarily frequent, without this undermining information transfer between sender and receiver.
Notes
There is another option (suggested to me by an anonymous reviewer): denying that M1—which, as we have seen, has a tautologous propositional content—has meaning at all. It would be, perhaps, rather a non-message. This is, I think, a good illustration of the way in which Skyrms' informational contents are more explanatory than propositional contents, at least in the context of these games: if M1 were just a non-message, one would expect the receiver to respond to it by producing A3 alone; after all, non-messages, presumably, are communicatively inert, and A3 is the best response to the mix of two-thirds S1 and one-third S2 that the unconditional probabilities of these states produce. In fact, the receiver responds by mixing four-fifths A1 and one-fifth A3—that is, by correctly adapting its response to the information that M1 carries about world states. This provides evidence that M1 is meaningful.
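The best-response comparison in this note can be sketched numerically. The payoff figures below are hypothetical stand-ins, not the payoffs of the game under discussion; the point is only the shape of the computation: each act's expected payoff under the unconditional state mix, with the best act read off by comparison.

```python
def expected_payoffs(prior, payoffs):
    """prior: dict state -> probability; payoffs: dict act -> dict state -> payoff."""
    return {act: sum(prior[s] * u[s] for s in prior) for act, u in payoffs.items()}

# Hypothetical receiver payoffs in states S1 and S2 (illustrative numbers only).
payoffs = {
    "A1": {"S1": 3.0, "S2": 0.0},
    "A2": {"S1": 0.0, "S2": 3.0},
    "A3": {"S1": 2.5, "S2": 2.5},
}

# Unconditional state mix: two-thirds S1, one-third S2.
prior = {"S1": 2 / 3, "S2": 1 / 3}

ep = expected_payoffs(prior, payoffs)    # {'A1': 2.0, 'A2': 1.0, 'A3': 2.5}
best = max(ep, key=ep.get)               # 'A3' is the best response to the prior
```

Against a posterior that shifts enough weight onto S1, the same computation would favor A1 instead, which is why the receiver's mixed response to M1 is evidence that M1 carries information.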
Both here and in the game discussed in Sect. 6 the receiver resorts to partial pooling. This could be taken to imply that the receiver's responsiveness to the sender is decreasing: in the receiver's final response, A2 carries less information about the message sent by the sender than it did in the original Nash equilibrium. On the other hand, A3 carries more information about the message sent by the sender than it did in the Nash equilibrium: the receiver, one might say, is increasing its responsiveness to the faithful messages that the sender is still sending.
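Talk of an act "carrying information about the message sent" can be made precise in Skyrms' framework: the informational content of an act, for each message m, is the log of the ratio between the posterior P(m | act) and the unconditional P(m). A minimal sketch, with hypothetical message probabilities and a hypothetical receiver strategy (none of these numbers come from the games in the paper):

```python
from math import log2

# Hypothetical unconditional message probabilities and the receiver's
# probability of producing each act given each message.
p_msg = {"M1": 0.5, "M2": 0.5}
p_act_given_msg = {"M1": {"A2": 0.8, "A3": 0.2},
                   "M2": {"A2": 0.2, "A3": 0.8}}

def info_about_messages(act):
    """Skyrms-style content of an act: log2(P(m | act) / P(m)) for each message m.

    By Bayes' theorem this ratio equals P(act | m) / P(act)."""
    p_act = sum(p_msg[m] * p_act_given_msg[m][act] for m in p_msg)
    return {m: log2(p_act_given_msg[m][act] / p_act) for m in p_msg}

content = info_about_messages("A2")
# content["M1"] = log2(0.8 / 0.5), about 0.68 bits in favor of M1;
# content["M2"] is negative: observing A2 lowers the probability of M2.
```

An act that is produced with the same probability after every message has zero content for every message, which is the precise sense in which pooling responses carry less information.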
This conclusion, as I have said, is based on data gathered randomly from the space of all cheap-talk 3 × 3 × 3 games. It remains an open question whether restricting the data-gathering to biologically salient regions of this space (i.e., regions in which games can be used to model actual sender–receiver interactions in nature) would show deceptive, not non-maintaining messages to be less (or more, or equally) prevalent.
There are at least two other respects in which this numerical exploration should be supplemented by further work: first, although casual inspection does not reveal salient common features among the 251 games in the sample that have deceptive, not non-maintaining signals, it is possible (perhaps likely) that more systematic exploration might uncover such similarities in the payoff structure of these games. Second, in this paper I am using Nash equilibria as the target equilibrium concept. This should be complemented with a study of how accessible to evolution, and how dynamically stable, those equilibria are.
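A minimal version of the equilibrium check behind such a numerical exploration can be sketched as follows, for pure strategies only (the actual study uses Gambit and covers mixed equilibria; the uniform prior and the payoffs drawn uniformly from [0, 1] below are illustrative assumptions):

```python
import random

N = 3  # three states, three messages, three acts
random.seed(1)

# Random cheap-talk payoffs: u[s][a] = (sender payoff, receiver payoff).
# Messages are costless, so payoffs depend only on state and act.
u = [[(random.random(), random.random()) for _ in range(N)] for _ in range(N)]
prior = [1 / N] * N  # uniform prior over states (an assumption)

def is_pure_nash(sender, receiver):
    """sender: list, state -> message; receiver: list, message -> act."""
    # Sender: in each state, the sent message must be a best one
    # given the receiver's rule.
    for s in range(N):
        best = max(u[s][receiver[m]][0] for m in range(N))
        if u[s][receiver[sender[s]]][0] < best - 1e-12:
            return False
    # Receiver: for each message actually sent, the chosen act must maximize
    # expected receiver payoff over the states that send it.
    for m in set(sender):
        states = [s for s in range(N) if sender[s] == m]
        ev = lambda a: sum(prior[s] * u[s][a][1] for s in states)
        if ev(receiver[m]) < max(ev(a) for a in range(N)) - 1e-12:
            return False
    return True

# A babbling profile: the sender always sends message 0 and the receiver
# ignores messages, playing the act that is best against the prior.
# Such a profile is a Nash equilibrium in every cheap-talk game.
a_star = max(range(N), key=lambda a: sum(prior[s] * u[s][a][1] for s in range(N)))
babbling = is_pure_nash([0, 0, 0], [a_star] * N)  # True
```

A full replication would enumerate or sample strategy profiles per game (or call Gambit's solvers for mixed equilibria) and then classify the messages at each equilibrium as deceptive, non-maintaining, or neither.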
Although this is not pooling in the strict sense, I will grant that this response counts as pooling for the purposes of the definition of non-maintaining message: the game does not allow for a pooling, always-buy strategy, and this is as close as the receiver can get to such a strategy.
References
Akerlof, G. (1970). The market for 'lemons': Quality uncertainty and the market mechanism. The Quarterly Journal of Economics, 84(3), 488–500.
Crawford, V. P., & Sobel, J. (1982). Strategic information transmission. Econometrica, 50(6), 1431–1451.
Godfrey-Smith, P. (2011). Signals: Evolution, learning & information, by Brian Skyrms. Mind, 120(480), 1288–1297.
Godfrey-Smith, P., & Martínez, M. (2013). Communication and common interest. PLOS Computational Biology, 9(11), e1003282.
Johnstone, R. A. (1997). The evolution of animal signals. In J. R. Krebs & N. B. Davies (Eds.), Behavioural ecology: An evolutionary approach (pp. 155–178). Oxford: Blackwell.
Lemke, C. E. (1965). Bimatrix equilibrium points and mathematical programming. Management Science, 11, 681–689.
Lewis, D. (1969). Convention. Cambridge: Harvard University Press.
McKelvey, R. D., McLennan, A. M., & Turocy, T. L. (2010). Gambit: Software tools for game theory. Version 0.2010.09.01. http://www.gambit-project.org.
Searcy, W., & Nowicki, S. (2005). The evolution of animal communication. Princeton: Princeton University Press.
Skyrms, B. (1996). Evolution of the social contract. Cambridge: Cambridge University Press.
Skyrms, B. (2010). Signals: Evolution, learning & information. New York: Oxford University Press.
Spence, M. (1973). Job market signaling. Quarterly Journal of Economics, 87(3), 355–374.
Wagner, E. (2012). Deterministic chaos and the evolution of meaning. The British Journal for the Philosophy of Science, 63, 547–575.
Zahavi, A. (1975). Mate selection: A selection for a handicap. Journal of Theoretical Biology, 53, 205–214.
Zollman, K. J., Bergstrom, C. T., & Huttegger, S. M. (2013). Between cheap and costly signals: The evolution of partially honest communication. Proceedings of the Royal Society B, 280, 20121878. doi:10.1098/rspb.2012.1878.
Acknowledgments
I would like to thank Peter Godfrey-Smith and two anonymous reviewers for their helpful comments on earlier drafts. Research for this paper was supported by the Spanish government via research grants MCINN FFI2011-26853 and CSD2009-0056 (CONSOLIDER INGENIO).
Martínez, M. Deception in Sender–Receiver Games. Erkenn 80, 215–227 (2015). https://doi.org/10.1007/s10670-014-9623-z