Imagining the future of AI photography

tech

Portrait mode and corollary features (e.g. portrait lighting) are halting first steps towards a true AI-augmented camera. Here’s a fanciful look at our smartphone future:


It’s April 4, 2027, and Julie is making good progress. For the seventh time that day, she clambers up the squeaky attic ladder and crouch-steps her way to a tall pile of cardboard boxes. She squints at the next box in the stack, just making out her mother’s scrawl: “FAMILY PHOTOS.” It slides easily across the dusty floorboards, and Julie descends the ladder, flopping the box from step to step above her.

With a heavy sigh, she sets the box down on a dining room chair. Her kitchen scissors make quick work of the duct tape; Julie glances inside—and winces. No neat stacks, no carefully-curated photo albums. Instead, the box is full to the brim with loose snapshots, unlabeled and unsorted. Just a few years back, organizing these photos would have taken an entire weekend.

Fortunately for Julie, times have changed. She works quickly, plucking photos from the box and tossing them into a messy grid on the table. Within a few minutes, she has strewn hundreds of memories across the oak panels. They’re arranged in no particular order; Julie spots a baby photo of her grandmother from the 40s, adjacent to a faded Kodak print of Aunt Susie driving in the mid–70s. The very next snapshot in the row is a Polaroid from Christmas 1991; her little brother triumphantly lifts a brand-new video game console package above his head.

With a nostalgic smile, Julie whips out her smartphone and opens the photo enhancement app that makes this spring cleaning possible. The real work begins; she waves the device over the table, lazily panning its viewfinder across the rows and columns of snapshots.

As she does, the camera does its magic. Each individual photograph is extracted, cropped, and saved to its own file. The process is nearly instant; after just a minute or two of haphazard scanning, the app beeps and confirms that it’s captured all the data it needs. Julie sweeps the impromptu collage into a waiting trash can.

It’s almost hard to believe how much she trusts the phone to capture these photos. Once, she would have been horrified to throw away such precious memories. Now, in a single day, she has filled a half-dozen garbage bags with old snapshots.

As she breaks down the empty cardboard box, the phone app (and the cloud service that powers it) does its own tidying up. First, it leverages machine learning to automatically recognize every object in every photo: that’s a baby beneath a maple tree in late fall. That’s a 1976 AMC Hornet. That’s a Sega Genesis.

With that context in hand, the service can clean up the photos. First, the easy stuff: wiping away physical scratches. Removing decades’ worth of discoloration and fade. Filling in missing, damaged details using robust healing algorithms. The AMC logo on the Hornet’s hood, obliterated by a coffee stain on the photo, is now recreated from a library of vintage car logos. A gouge in the leaves gets repaired too; maple leaves have a distinctive shape, and the app generates a few more to fill the hole. The Sega Genesis, motion-blurred in the boy’s excitement, is sharpened using actual product photography.

The restoration isn’t limited to inanimate objects, though. The app knows that it’s Aunt Susie who’s sitting behind the wheel of the Hornet, even though she’s obscured by glare on the windshield and some lens flare. Using Susie’s existing visual profile, the tool sharpens her face and glasses and fills in the blown-out highlights with data from other images.

The service automatically assigns metadata to each image, too. Every calendar, clock, Christmas tree, or party hat in the background helps the service narrow down the image date. Less obvious visual clues can help, too; the app might recognize that ’76 Hornet from previous scans and assign the photo of Susie to the same era. Even Susie’s tight perm could help to date the photo; given enough data, the app might know exactly when she adopted—and abandoned—that distinctive look. In the same way, visual cues could help pin down each photo’s location.

As Julie sets the last of the trash bags by the curb, she feels a twinge of bittersweet guilt. The garbage truck is on its way; soon, the original photos will be gone for good.

But based on experience, she’s confident enough in the restoration to let them go. The digitized shots are far more portable, more flexible, and more interesting than the paper copies ever were. She can make edits that would otherwise have been impossible—like tweaking the exposure level or even the location and brightness of the original scene’s light sources. Or altering the camera’s focal length, decades after the shot was taken; the app’s AI has used the source image to extrapolate exactly where each object sits in three-dimensional space.

Finally, Julie can even perform that “Zoom… enhance” magic promised by science fiction movies for decades. As she steps back into the kitchen, she grabs her tablet and plops down at the counter. Time to take a closer look at Aunt Susie’s unbelievable 70s curls. ■


My favorite Black Friday tradition? Complaining about Black Friday.

culture

It’s Black Friday—the highest of high American holidays.

For years, I puzzled over the traditions that have built up around the start of the holiday shopping season. “Who in their right mind,” I wondered, “would willingly wake before dawn to stand outside in sub-freezing temperatures, on the off chance that they might score a slight discount on a terrible TV set?”

And so I scoffed at the plebeian masses, standing in line for the right to be the first to sprint into their local Wal-Mart. I shook my head sagely when the local news aired videos of shoppers trampling and berating each other to get their hands on the latest disposable toy. I rued the erosion of a family-oriented holiday and derided retailers who opened their doors on Thanksgiving evening.

Looking back, I relished my own Black Friday tradition: complaining about Black Friday.


Alas, I gave up my right to sneer at early holiday shoppers a few years ago—when I became one of them.

Back in 2013, at the height of the iPad’s popularity, Target announced a killer Black Friday deal: $20 off, plus a $100 gift card. I had been hankering to join the “post-PC revolution,” and this seemed like the perfect opportunity.

But to snag the discount, I would need to show up, in person, at the local Target outlet at 6 PM on Thanksgiving itself. With some embarrassment, I explained to my wife that I would be slipping out to shop. (Fortunately for me, she was more amused than annoyed.)

The shopping sojourn went as planned. I bought the tablet without incident; no sprinting or elbows required. I even sort of enjoyed the cultural experience—chatting up other people in the line as we waited for the doors to fling open. Power-walking through the store to the electronics department. Clutching my hard-won prize on the victory walk back to the car. Most of all, I felt a strange kinship with my fellow shoppers, who like me saw fit to celebrate the Day of Gratitude by buying more stuff.

As the Black Friday fever subsided, though, I found that I had lost more than I gained. I never really found a good use for the iPad itself. (For me, tablets have always fallen “into the cracks” between devices: worse for portable usage than a phone and worse for “real work” than a laptop.) In the ensuing months, I couldn’t really justify going to such lengths to secure a device I barely ever used.

The lost money and squandered family time are bad enough. But I have a more poignant regret about my participation in Black Friday mania: I lost any credibility as a couch critic of America’s bizarre shopping celebration. How can I sneer at the “mindless hordes” gathering outside the nearest Best Buy when I’m one of them? ■

The AppleCare gamble

apple

The release of the $1,000 iPhone X has renewed a tech nerd debate: is AppleCare+, Apple’s extended warranty and accident protection service, worth the price?

The iPhone X’s sky-high repair costs change the calculus somewhat. Shatter the screen of your X (sans AppleCare), and you’re out $279. Break anything else, and Apple will charge you $549(!) to replace the phone entirely. That sticker-shock price may scare some consumers into dropping another $199 for AppleCare.

Numbers that high make me nervous, too. Still, I’ve never sprung for AppleCare+. Here are some reasons I’ve been stingy:

  • I’ve never broken a phone. In seven years of iPhone ownership across five different devices (iPhone 4, 5, 6, 7, and X), I’ve never actually damaged a phone (beyond a scratch or two). My screens haven’t shattered, and the phones with glass backs haven’t busted. That perfect track record makes me cocky.
  • I’m obsessive about cocooning my devices. When I order a new phone, I order a sturdy case and a screen protector to go with it. My iPhones literally go straight from their original packaging into a case, and they don’t come out unless I’m cleaning the device or preparing to sell it. Of course, cases can’t prevent all damage, but so far I’ve been lucky. Even some significant drops haven’t noticeably damaged my ensconced handsets.
  • Those AppleCare coverage charges add up over the years. If I had bought AppleCare for each iPhone I’ve owned, I would have spent something like $555 in about seven years—with no actual benefit beyond peace of mind. Given this savings, I feel pretty comfortable “self-insuring” my devices; I’ll keep doing it as long as my out-of-pocket repair costs (currently $0) remain lower than what I would pay for AppleCare.
  • My credit card makes AppleCare somewhat redundant. My Chase Freedom card offers an extra year of extended warranty protection for anything I buy. That perk eliminates one clear benefit of the AppleCare plan and further diminishes its potential value.
  • Annual iPhone upgraders enjoy perpetual warranty coverage, anyway. By the time the iPhone X’s one-year warranty expires, there’s a good chance I will have sold it and upgraded to a newer device (with its own year-long warranty). So AppleCare’s extended warranty offers no real benefit.
  • AppleCare is less attractive for those who live far from an Apple retail outlet. For many Apple customers, it’s reassuring to know that there’s a bleached-wood-and-frosted-glass palace nearby—a place to have your busted phone serviced, often in a single day. But those who live in the sticks (like me) face a week-long back-and-forth with Apple via package couriers. In other words, phone repairs are a pain, whether I buy AppleCare or not.

In the end, I’m betting on my own ability to prevent iPhone catastrophes. As long as I don’t bust my phone more than once every few years, that gamble will continue to pay off. ■

Tracking health data for its own sake

health

For the past few months, I’ve been using my Apple Watch to track my sleep. An app called AutoSleep uses the device’s internal accelerometer to measure both sleep quality and total sleep time.

The app is surprisingly accurate, and it’s fun to peek at my sleep stats each morning. But I don’t actually do much with that data. The numbers don’t factor directly into my bedtime decisions or sleep habits.

But that doesn’t necessarily mean that my sleep tracking efforts are fruitless. There’s something powerful about knowing where I stand—particularly when it comes to health.

Take food tracking as an example. For the past year or so, I’ve logged my food intake in an app called Lose It. I don’t really adhere to a strict daily limit, but knowing how gluttonous I’ve been earlier in the day is often just enough motivation to resist dessert. Conversely, when I don’t record my meals, I tend to overeat.

Weighing myself each morning has proven similarly useful. Again, I don’t have strict weight loss targets; I simply record my poundage day after day using Vekt for the Apple Watch. This habit populates the Apple Health database with a running tally, giving me a general idea of which direction my weight is trending.

That knowledge leads to better food choices—almost automatically. If my weight drifts too high, I find myself gravitating to healthier options—fruit instead of sweets, salads instead of sandwiches, water rather than Coke. Conversely, when I’m hovering near my ideal weight, I’ll reward myself with an extra treat or two.


All that to say, even if my sleep stats seem inconsequential, I’m going to continue wearing my Apple Watch to bed. I’m hoping that a peripheral awareness of my sleep habits may (subliminally) lead to better sleep decisions. Maybe I’ll skip that Netflix binge in favor of an early bedtime. ■

Being okay with being terrible

meta

Lately, in addition to blogging and podcasting every day, I’ve been recording short videos and uploading them to YouTube.

These vlogs are pretty bad. I address the camera from my cramped little home office—a talking head with a weird-looking haircut. My ramshackle light rig casts a yellow, washed-out pall over my face. I deliver this scripted, stilted little speech, often spouting half-baked ideas. Very few viewers ever see these sad little videos; as I record this, yesterday’s episode has a grand total of one view. One.

Making something mediocre, let alone something that’s genuinely bad, is difficult for me. I’m very much a type-A personality; I was the kid who mourned every A-minus and who restarted a piano piece every time he hit a wrong note.

And it’s not hard to see the flaws in what I’m posting, particularly when I compare it to others’ work on the web. Lately I’ve been watching a lot of Casey Neistat, vlogger king. His work makes me feel simultaneously jealous and ashamed. I feel jealous because he’s so damn good at what he does. And I feel ashamed because Casey and I are almost the exact same age (we were literally born just four days apart). Two thirty-six-year-olds, one who does amazing, admired work, and one who… doesn’t.

This self-critical, all-or-nothing mindset has sabotaged my creative impulse before. I have abandoned a half-dozen online projects when I wasn’t satisfied with either the quality of the result or the (nonexistent) audience reaction. My latent perfectionism undermined the daily discipline, grinding the machine to a halt.

The only difference so far this time around is that I’m pushing through that discouragement and trying to ignore the results. In short, I’ve learned to be okay with being terrible. I’ve decided to just keep making stuff, whether it’s mediocre or not. ■

Fixing Portrait mode’s grossness

apple / tech

A few weeks ago, I bought an iPhone X. I love its face-unlock authentication and its gorgeously tall screen, but its dual-lens camera is easily my favorite feature. The iPhone X is the best camera I’ve ever owned, and it’s not even really a competition. I’ve never had an SLR, and my last point-and-shoot was from the early days of digital photography.

In fact, the iPhone X camera is so good (or, rather, “good enough”) that it’s hard to imagine I’ll ever consider buying a standalone camera again.

That’s not to say there isn’t plenty of room for improvement. In particular, I find “portrait mode” (the fake bokeh blur effect) alternately intoxicating and maddening. In many cases, it does a great job isolating my photo’s foreground subject. But when it fails, it fails hard. As many others have pointed out, hair poses a serious challenge to its algorithm, as do non-contiguous background areas (e.g. a piece of the background visible through the crook of your subject’s arm) and ambiguous edges.

Could Apple fix these sorts of hiccups in software? This is my first dance with Portrait mode, so I can’t say whether the feature has improved since its first release last year. But I have at least some hope that edge detection will improve and fill in some of the gaps (pun intended).

Even if the algorithms improve, I’d like to see some way to touch up these problematic Portrait mode depth maps. There are already several interesting apps that let me see the underlying depth channel. Slør paints it as a black-and-white alpha channel; Focos lets me spin the depth layers around like I’m in some sort of sci-fi flick (“Rotate. Zoom. Enhance.”).

But neither of those apps—nor any others that I’ve heard of—let you actually fix the depth-sensing errors that creep into Portrait mode photos. Take those non-contiguous background areas I mentioned earlier. Given a decent channel-editing interface, it would be relatively simple to paint a foreground area back into the background, where it belongs.

It’s possible that Apple’s developer-facing APIs won’t allow depth changes to be written back into a fully editable Portrait mode photo. If not, that’s a shame and ought to be corrected. In the meantime, though, I’d love to see an app handle depth edits “in-house”, then export a flattened photo (with the proper blur baked in) back to the camera roll.
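To make the idea concrete, here’s a toy numpy sketch of that two-step workflow: paint the misclassified depth region back to the background, then bake a depth-dependent blur into a flat export. Every name here (`repaint_and_flatten`, the 0.5 focus threshold, the crude box blur) is an invented illustration, not a real depth API.

```python
import numpy as np

def box_blur(img, r):
    """Naive box blur for a 2-D float image: pad with edge values, average."""
    pad = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += pad[dy : dy + img.shape[0], dx : dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def repaint_and_flatten(image, depth, bad_mask, bg_depth, focus_depth, radius=3):
    """Hypothetical fix-up pass. Step 1: paint a misclassified region's depth
    back to the background. Step 2: blur everything far from the focal plane
    and return a flat image with the corrected bokeh baked in."""
    depth = depth.copy()
    depth[bad_mask] = bg_depth                   # the manual depth-map edit
    blurred = box_blur(image, radius)
    out_of_focus = np.abs(depth - focus_depth) > 0.5
    return np.where(out_of_focus, blurred, image)
```

A pixel the depth sensor wrongly tagged as foreground stays sharp before the edit; after repainting its depth to the background value, it picks up the background blur in the exported image.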

Hopefully that sort of functionality arrives soon. Portrait mode is a blast, but it’s hard to feel too enthusiastic when it produces such glaring flaws. ■

Detecting CTE—before it’s too late

culture / sports

Tom Schad writes about a potential breakthrough in diagnosing chronic traumatic encephalopathy (CTE), the degenerative brain disease caused by contact sports like football:

So far, the existence of CTE, a neurodegenerative brain disease, can only be confirmed through an autopsy. But scientists have been conducting positron emission tomography (PET) scans of the brains of former football players and military members, looking for patterns in living subjects.

That’s where McNeill [i.e., Fred McNeill, former linebacker for the Minnesota Vikings] comes in.

Four years ago, scientists noticed spots in McNeill’s brain that appeared damaged. More recently, an autopsy revealed the presence of the protein associated with CTE in those exact spots.

Up till now, CTE couldn’t be definitively detected in living patients. That meant you could never be sure if a former player’s symptoms—memory loss, mood swings, depression—were the result of football injuries or some other natural cause. You would have to wait until the individual died and their brain tissue could be sliced up and examined.

But if this new scanning method proves reliable, it could erase any doubt about cognitive symptoms. Such a black-and-white test would make it difficult for NFL players to shrug off the dangers of football in favor of a massive payday.

But it’s not just the pros who would be impacted. PET scanning could reshape football as a whole. If this “CTE test” is sensitive enough to detect the disease in its earliest stages, it could accelerate the game’s decline at all age levels. As high school players are tested, they (and their parents) would be confronted with the damage the sport has done to their brains. Families would start to question whether the risks of football justify the rewards.

As young players’ interest declines, a chain reaction could ignite. More and more high schools would shutter their football programs. This means fewer available players at the collegiate level, as well as a diminished quality of play (because there’d be a smaller pool of talented players). Fan interest declines, and Division II and III schools disband their teams. Things continue to accelerate, and in the end, football becomes a niche gladiatorial pursuit, rather than a national pastime—more like MMA than baseball.

Honestly, that’s what I hope will happen. Football is irredeemably brutal. It doesn’t matter how many programs institute tackle-free practices, which concussion protocols are adopted, or what technological wizardry is packed into the helmet. Those adaptations can’t change the fact that the fundamental element of the game—bodies smashing bodies—ruins men’s minds.

This new test might help force us to confront that fact—now, rather than once it’s too late. ■

Never unintentional: my brain on linear TV vs. Netflix

movies / TV

We typically spend the late-year holidays visiting my in-laws’ home in Pennsylvania. It’s a welcome downshift from our usual, frantic pace. On many of these visits, I’ve watched more traditional, linear-programmed cable TV in one long weekend than I have in the rest of the year combined.

There’s a warm, zombified state that settles in after watching so many Property Brothers episodes. My body falls into sleepy hibernation, lying motionless on the couch for hours on end. My metabolism enters ‘slow burn’ mode, expecting a steady stream of pumpkin pie and sugar cookies. And my brain quiets, barely registering when one hour of bad TV bleeds into the next. The day rolls by.

However, in more recent holiday seasons, these cable TV binges have grown less frequent, for at least two reasons. First, we have a daughter now, and she prefers that her parents be play partners, rather than comatose couch potatoes.

Another reason I don’t binge on cable quite as often? The internet has fundamentally changed my relationship to content, and it’s hard to go back. I’ve grown accustomed to programming my own playlists, and I’ve grown resistant to “choice-less” consumption.

This change isn’t just about Netflix versus cable. Terrestrial radio’s bland playlists and brash commercials also turn me off; give me my podcasts instead. The satellite TV feeds offered on my recent cross-country flights didn’t tempt me in the slightest. I turned to my phone instead, which was chock-full of favorite vlogs, TV series, and movies. Even magazines bore me; why read fluffed-up filler, when I can hand-pick the best of the web?

There’s a huge difference in mindset for these two consumption methods. One, the traditional model, makes me passive and powerless. Someone else steers the ship, and I get sucked into its current. Linear TV puts me at the mercy of the least common denominator; I unintentionally wind up watching formulaic, overproduced reality TV.

In contrast, the internet makes content consumption more purposeful. I watch shows that I actually want to watch. I gravitate to shows with great writing and production values: content that delights me, thrills me, or makes me think. And when a show is bad? I’m just engaged enough that I don’t keep watching mindlessly. Instead I’ll switch to something else. Or I’ll turn off the gadgets and (gasp!) actually head outside.

(I still bring along the cookies.) ■

“Smart” homes are dumb.

tech


UPDATE (11/16/2017): Steven Aquino provides a helpful reminder: smart home devices provide accessibility benefits that easily outweigh the finicky quibbles I raise below.


Smart devices are all the rage. You can buy an internet-connected version of nearly every home appliance and gadget: light switches that respond to your digital assistant’s commands. Shades that automatically open or close, depending on the weather forecast. Power outlets that switch on your lamp as you pull into the driveway. Light bulbs that match their color to what’s on your TV screen. Thermostats that lower the temperature when you leave the house. Deadbolts that unlock when your phone draws near.

Gadgets like these are fun; I’d love to play with some smart bulbs or a robot vacuum at some point down the road.

But beyond that? I don’t really understand why anyone would install a semi-permanent smart device in their house.

On the one hand, there’s the “faux-convenience” factor. With many smart home gadgets, you’re trading a device that’s simple but predictable for one that’s “advanced” but finicky. Consider: if a dumb light switch stops working, there’s a very limited number of things that could have gone wrong—basically, either the wiring came loose or the circuit breaker blew.

But with a smart light switch, you have those potential problems, plus many others. Maybe the device’s firmware is buggy. Maybe the manufacturer hasn’t updated their app for your new phone hardware. Maybe the smart home platform itself is half-baked. Maybe the trigger service (e.g. IFTTT) is offline. Perhaps the automation you programmed failed to anticipate the fall time change. The list of potential troubleshooting steps goes on and on.

You may be just nerdy enough to enjoy debugging your house. More power to you. But do the other residents of your home feel the same way? Chances are, your roommates, significant other, guests, or kids would prefer that things just work. What happens when you’re away for the night and your spouse can’t turn on the lights? You’ve basically extended the problem of over-complicated home theater set-ups to your entire house.

And what happens when you try to sell your “smart home”? Most buyers won’t be interested in inheriting your complex network of domestic devices. They may not share your penchant for tinkering, and they may view your smart gadgets as a maintenance nightmare, rather than an automation dream.

Even if you plan to stay in your current house forever, you face the problem of longevity. It’s not unusual for “dumb” devices to last for decades—even generations. When’s the last time a manual light switch or doorknob in your house just stopped working?

A smart device is a ticking time bomb. It’s only a matter of time until a) heat, dust, and age render the unit inoperable or b) the device is deprecated by the manufacturer or the home hub vendor. An innocuous-looking app update could brick your light switches. By installing smart devices, you’ve condemned yourself to upgrading the unit every decade (if not more frequently).


Computer miniaturization has led to remarkable quality-of-life improvements. A smartphone is infinitely more capable than the spiral-corded kitchen handset I grew up with. But that doesn’t mean that every device in my house deserves its own CPU. For basic home operations, rock-solid reliability is the only feature that matters. [Edit: that may be true for me, but not for everyone. See preface above.] Give me basic functionality over whiz-bang capability—at least when it comes to flicking on the lights. ■

Why Apple’s retail stores make me nervous

apple / culture


I grew up in Johnstown, Pennsylvania, a city with an ignominious reputation as a place where the rich abuse the poor. There are two infamous examples: first, a devastating, deadly flood in the 1800s, literally caused by the negligence of wealthy country clubbers. Second, the calamitous collapse of Johnstown’s manufacturing economy, caused by the steel industry’s decline. Tens of thousands of local workers lost their jobs.

As I was growing up in the 80s, Johnstown’s steel mills were shuttering en masse. Robbed of its primary industrial driver, the town imploded in slow motion. Retail decay was everywhere: once-bright storefronts patched with plywood. A deserted downtown. The closest grocery market transforming into a half-empty thrift store. Everywhere you shopped, things felt old and broken. Dingy, cavernous, fluorescent-lit spaces became the norm.

Uncomfortable luxury

Maybe that’s why Apple’s luxurious, meticulously maintained retail spaces make me nervous. Its outlets resemble high-end, big-city fashion boutiques more than the Rust Belt K-Marts of my youth. For lower-middle-class consumers (like me), the Apple Store is the ritziest retail space they’ve ever set foot in—let alone shopped at.

Don’t get me wrong; I can appreciate a carefully-designed space like Apple’s new Chicago store. It’s gorgeous, thanks to its riverfront location, its two-story window wall, and its premium materials (e.g. a carbon-fiber roof and the familiar bleached-wood product tables).

But every time I visit an Apple retail shop, I feel guilty. I can’t help but think, “I’m paying for this experience. Apple’s charging me extra so that they can afford their premium real estate, massive video walls, and all-glass staircases.” That luxury feels like a waste and makes me second-guess my unswerving brand loyalty. “Maybe,” I think, “These products aren’t meant for people like me.”

That’s one reason I prefer buying my Apple devices online. It’s not just about convenience; it’s about willful ignorance. By skipping the manicured Apple retail store, I can overlook the ways that the Apple lifestyle grates against my childhood experience. ■