The paper “Behavior Matching in Multimodal Communication Is Synchronized” (PDF), published in a recent issue of the journal Cognitive Science, explores synchronization phenomena that occur during communication. Specifically, the authors show that synchronization occurs not only in oscillatory motions (e.g., postural sway) but also in nonoscillatory behaviors such as facial expression, language, and gesture. They argue that this emergent synchronization serves to reduce cognitive load and to solve the coordination problem of interaction.
To quote: “In the same sense, verbal interaction across people may profit from an active constraining of the space of possible behaviors by cognitive mechanisms such as priming, mirroring, imitation, and so on. Emergent synchronization within any number of modalities is the general description of a solution to the dangerous degrees of freedom of interaction. When two people meet face to face, perplexity is high: A very large selection of possible linguistic and non-linguistic behaviors could take place. Multimodal synchronization can reduce degrees of freedom markedly, when one person serves as the constraint for another, and they become in an important (but approximate) sense a functional, coordinative unit (sometimes termed ‘coordinative structure,’ Kugler et al., 1980). Synchronizing within many behaviors may relieve the cognitive system of the burden of constantly computing the next behavior in each of classes 1 to some large n during a task. ‘Joint cognitive offloading’ from one person onto another may assist the cognitive system by reducing detailed planning for each behavioral channel during interaction (Garrod & Pickering, 2009).”