An agent aims to secure its projected needs by building a set of (business) relationships with other agents. A relationship is built by exchanging private information, and is characterised by its intimacy (degree of closeness) and its balance (degree of fairness). Each argumentative interaction between two agents therefore has two goals: to satisfy some immediate need, and to do so in a way that develops the relationship in a desired direction. An agent's desire to develop each relationship in a particular way then constrains its argumentative utterances. The form of negotiation described is argumentative interaction constrained by a desire to develop such relationships.
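The relationship model in the abstract can be sketched in code. This is a minimal illustrative sketch, not the authors' formulation: the class name, the additive update rule, and the tolerance-based acceptability test are all assumptions introduced here to make the intimacy/balance idea concrete.

```python
# Hypothetical sketch of the relationship model described above: each
# relationship tracks intimacy (closeness) and balance (fairness), and is
# updated whenever two agents exchange private information. The additive
# update rule below is an illustrative assumption, not the paper's model.
from dataclasses import dataclass


@dataclass
class Relationship:
    intimacy: float = 0.0  # degree of closeness; grows with information shared
    balance: float = 0.0   # degree of fairness; positive means we have given more

    def exchange(self, value_given: float, value_received: float) -> None:
        # Sharing information in either direction deepens the relationship...
        self.intimacy += value_given + value_received
        # ...while any asymmetry in what each party reveals shifts the balance.
        self.balance += value_given - value_received

    def utterance_ok(self, value_given: float, value_received: float,
                     target_balance: float, tolerance: float = 1.0) -> bool:
        # The desire to develop the relationship in a particular direction
        # constrains utterances: here, reject exchanges that would push the
        # balance too far from a target level of fairness.
        projected = self.balance + value_given - value_received
        return abs(projected - target_balance) <= tolerance
```

For example, after `r = Relationship(); r.exchange(2.0, 1.5)` the intimacy has grown by the total value exchanged while the balance records the 0.5 asymmetry, and `utterance_ok` would then veto any further exchange that pushes the balance outside the tolerated band.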


Keywords: Private Information · Multiagent System · Uncertain Environment · Argumentation Strategy · World Model





Copyright information

© Springer-Verlag London Limited 2009

Authors and Affiliations

  • John Debenham, University of Technology, Sydney, Australia
  • Carles Sierra, Institut d'Investigació en Intel·ligència Artificial (IIIA), Spanish Scientific Research Council (CSIC), Bellaterra, Catalonia, Spain
