When two people take part in a discussion, they not only exchange the concepts and ideas under discussion; they also express attitudes, feelings, and commitments towards their partner: they express interpersonal stances. Endowed with a backchannel model, several virtual agents can already react to their partner's behaviour through their own non-verbal behaviour. In this paper, we go beyond this approach, proposing and testing a model that enables agents to express a dyadic stance, a marker of effective communication: agents naturally co-construct a shared dyadic stance if and only if their interpersonal stance is reciprocally positive. We focus on the smile, which conveys interpersonal stance and is a particularly efficient signal for the co-regulation of communication. With this model, a virtual agent, although capable of controlling only its own individual parameters, can in fact modulate and control the dyadic stance that emerges when it interacts with its partner. We evaluated the model through a perceptual user study, which validated that the dyadic stance is perceived as significantly more positive (mutual understanding, attention, agreement, interest, pleasantness) when smile reinforcement is reciprocal.