Task influence has long been known to play a major role in the way our eyes scan a scene. Interestingly, how the task modulates attention when interacting with objects has been less investigated. Only a few studies have contrasted the distribution of eye fixations during viewing and grasping of objects. How is attention deployed differently when different actions have to be planned on objects, in contrast to a purely perceptual viewing condition? To investigate these issues, we conducted an eye-tracking experiment showing participants 2D images of real-world objects. In blocks of trials, participants were asked either to assign the displayed objects to one of two classes (classification task), to mimic lifting the object (lifting task), or to mimic opening the object (opening task). Mean fixation locations and attention heatmaps show different modes in gaze distribution around task-relevant locations, in accordance with previous literature. Reaction times, measured by button release in the manual response, suggest that the more demanding the task in terms of motor planning, the longer the latency in movement initiation. Results show that even on simplified, two-dimensional displays the eyes reveal the current intentions of the participants. Moreover, the results suggest elaborate cognitive processes at work and confirm anticipatory behavioral control. We conclude by suggesting that the strongly predictive information contained in eye-movement data may be used for advanced, highly intuitive, user-friendly brain-computer interfaces.