Recent theoretical discussion of dyadic coordination has focused on issues of synchronization, entrainment, alignment, and convergence. All of these terms refer to the matching of specific behavioral and linguistic events, such that members of a dyad coordinate by “doing the same thing.” Communicative behaviors, like most human behaviors, are highly variable, yet this variability exhibits statistical regularities such as heavy-tailed distributions and power-law temporal structure. These regularities suggest the possibility of complexity matching: Statistical measures of behavioral complexity may converge in certain types of dyadic interaction. In the present study, the acoustic speech signals of interlocutors were measured in two conversational conditions, one argumentative and the other affiliative. Signal complexity was measured in terms of heavy tails and power laws in the distributional and temporal properties of acoustic event series, respectively. The parameters of fitted statistical functions varied by conversation type, as did the degree to which they matched between interlocutors. Results demonstrate a new way to quantify the coordination of interlocutors in terms of complexity matching.
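As a rough illustration of the kind of analysis the abstract describes, the sketch below estimates a power-law exponent from a series of event magnitudes (e.g., acoustic event durations) using the standard continuous maximum-likelihood estimator of Clauset, Shalizi, and Newman (2009). This is a minimal sketch, not the authors' actual pipeline: the event series is synthetic, and `xmin` is assumed known rather than estimated. Complexity matching between interlocutors could then be quantified as the similarity of their fitted exponents.

```python
import numpy as np

def powerlaw_mle_alpha(x, xmin):
    """Continuous MLE of a power-law exponent (Clauset et al., 2009):
    alpha = 1 + n / sum(ln(x_i / xmin)), over x_i >= xmin."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.log(x / xmin).sum()

# Synthetic "acoustic event" durations drawn from a power law with
# exponent alpha = 2.5, via inverse-CDF sampling (a stand-in for real data).
rng = np.random.default_rng(0)
xmin = 0.05
alpha_true = 2.5
u = rng.random(10_000)
durations = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = powerlaw_mle_alpha(durations, xmin)

# A simple (hypothetical) complexity-matching score for a dyad would be
# the absolute difference between the two interlocutors' fitted exponents.
```

With a large sample, `alpha_hat` recovers the generating exponent closely; on real speech data one would also need to estimate `xmin` and test the power-law fit against alternatives such as the lognormal.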