When newborns leave the enclosed spatial environment of the uterus and arrive in the outside world, they are confronted with a new audiovisual environment of dynamic objects, actions and events, both close to themselves and further away. One particular challenge concerns matching and making sense of the visual and auditory cues specifying object motion [1-5]. Previous research shows that adults prioritise the integration of auditory and visual information indicating looming, and that rhesus monkeys can integrate looming, but not receding, audiovisual stimuli. Despite the clear adaptive value of correctly perceiving motion towards or away from the self (for defence against, and physical interaction with, moving objects), such a perceptual ability would clearly be undermined if newborns were unable to match the auditory and visual cues to such motion. This multisensory perceptual skill has scarcely been studied in human ontogeny. Here we report that newborns only a few hours old are sensitive to matches between changes in visual size and changes in auditory intensity. This early multisensory competence demonstrates that, far from being entirely naïve to their new audiovisual environment, newborns can make sense of the multisensory cue combinations specifying motion with respect to themselves.