The involvement of emotion in lexical processing has gained substantial attention, with many studies reporting differential processing of words with negative emotional content. It is unknown, however, whether these effects are restricted to conscious perception or extend to preconscious processing as well. Here we examine the influence of emotional content on preconscious face and word processing, using interocular suppression to render a stimulus invisible for a short duration and an orthogonal spatial task (location discrimination) to identify the time at which the stimulus emerges from visual suppression. Crucially, consistent patterns were observed for emotional content across modalities: negative stimuli (angry faces and negative words) took longer to emerge than positively valenced stimuli (happy faces and positive words). We discuss how differences in task demands can produce apparently incompatible patterns of results, and show how these results can be reconciled within attentional accounts of the negativity bias.