FAIR researchers have demonstrated that dialog agents with differing goals can negotiate with other bots or people and reach mutual decisions.  

There were cases where agents initially feigned interest in a valueless item, only to later “compromise” by conceding it — an effective negotiating tactic that people use regularly. This behavior was not programmed by the researchers but was discovered by the bot as a method for trying to achieve its goals.