
Fixing Skype’s “eye contact” problem

Skype and FaceTime don’t let you look directly into your loved ones’ eyes. What if they did?

My extended family lives all over the country—from Vermont to Pennsylvania to Colorado to Oregon to Hawaii. That makes visiting relatives expensive—and, with a nine-month-old, nearly impossible. Video chat services like Skype and FaceTime are a godsend. Our daughter learns that she has a wider family who loves her; aunts, uncles, and grandparents can witness her cuteness first-hand.

Although we’re grateful that Skype exists, it’s still a poor substitute for sitting face-to-face. Physical touch can’t be digitized, and the stuttery, low-res video feed filters out key non-verbal communication.

But Skype feels impersonal for another, less obvious reason: there’s no eye contact. Because the front-facing camera is mounted above the screen, you can’t look at the camera and at your loved ones at the same time. Instead, you look past them, never quite meeting their gaze.

That’s a tricky technological problem to solve. One approach would be to place the camera behind the screen—laying the remote video feed on top of the image sensor. Back in 2007, Apple filed a patent application for a device that worked this way. It’s a clever idea that never became an actual product.

Apple’s recent acquisition of FaceShift got me thinking: could you solve the “look me in the eyes” problem with software instead of hardware? FaceShift enables “markerless” motion capture, in which a user’s facial movements drive a cartoon avatar’s live performance. The demos showcase everyday people transformed into mutant warriors, killer clowns, and pug puppies—all in real-time.

That’s a fun parlor trick, but what if this tech were deployed more subtly, to achieve a more profound goal: meeting your loved one’s gaze? The software would still alter your image in real-time, but instead of dressing you up like Shrek, it would shift your perceived eyeline away from the screen and into the camera. FaceShift would detect your eye color, paint over your “real” eye with white, then redraw your pupil and iris so that they seemed to look at the camera. Imagine a video chat app that included the “fun” options (making you a pirate or a zombie), but whose default mode made just this slight tweak.
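To make the idea concrete, here is a heavily simplified sketch of the “repaint the eye” step. Everything here is hypothetical: the function name, the eye coordinates, and the colors would all come from a face tracker (FaceShift-style) in a real system, and a real implementation would blend edges and preserve eyelids rather than stamping flat circles.

```python
import numpy as np

def redraw_eye(frame, eye_center, eye_radius, iris_color, iris_radius):
    """Illustrative gaze-correction step: paint the eye region white,
    then redraw the iris and pupil centered in the eye, so the gaze
    appears to meet the camera.

    frame      -- HxWx3 uint8 image (one video frame)
    eye_center -- (row, col) of the eye, as reported by a face tracker
    eye_radius -- approximate radius of the visible eye region, in pixels
    iris_color -- (r, g, b) detected from the user's real iris
    iris_radius -- radius to draw the new iris
    """
    h, w = frame.shape[:2]
    yy, xx = np.ogrid[:h, :w]
    dist = np.sqrt((yy - eye_center[0]) ** 2 + (xx - eye_center[1]) ** 2)

    frame[dist <= eye_radius] = (255, 255, 255)   # whiteout: the "sclera"
    frame[dist <= iris_radius] = iris_color       # iris, recentered on the eye
    frame[dist <= iris_radius * 0.4] = (0, 0, 0)  # pupil
    return frame
```

In practice you would run this per-frame on both eyes, with the tracker supplying the coordinates; the hard problems (landmark detection, lighting, blinks) are exactly what software like FaceShift exists to solve.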

If the feature were implemented well,[1] users might not even notice that their own video feed was being altered. They’d simply sense a deeper, more personal connection to the person with whom they’re chatting. Ironically, the image would be artificially manipulated, but the conversation would feel more authentic.[2]

Pair some silly name (“FaceTime Presence” or “Skype Gaze”) with some schmaltzy copy (“Look into each other’s eyes, from anywhere”), and you’ve got a very marketable feature.


  1. Done poorly, this feature could prove hilarious—or terrifying. What would Grandma think if Little Susie suddenly became an eyeless demon child?

  2. This raises all sorts of philosophical questions. What makes a person’s representation “real”? Can an image be less accurate, yet more authentic? Is it problematic to “fix” an avatar that purportedly represents the “real you”? What if video chat software could remove your blemishes or take off ten pounds? Would you enable it?