Categories
culture

Permanent Daylight Saving Time? No, thanks.

As we’re reminded each spring and fall, time changes are disorienting and disruptive. But the renewed movement to make Daylight Saving Time permanent is misguided—maybe even dangerous.

Yes, DST might get us home before sunset in the winter. At what cost, though? The mornings would be brutal. If Standard Time were eliminated, dawn would come ridiculously late to many North American cities:

City | December 21 sunrise time if DST were permanent
Chicago | 8:15 AM
Washington, D.C. | 8:23 AM
Seattle | 8:55 AM
Calgary | 9:37 AM
Anchorage | 11:14 AM

Few of us enjoy waking before dawn. Imagine if your morning commute and arrival at work happened in the dark, too.

That’s not just an annoyance; it could have serious public health implications. Human circadian clocks thrive when we’re exposed to early-morning sunlight—that’s why light therapy for Seasonal Affective Disorder is often administered immediately after waking.

What would happen if the entire continental population got dramatically less morning sun? The public health impact might be epidemic. ■

Categories
culture

Thoughts on ‘Hamilton’ and theatergoing in the age of the smartphone

Last week, my wife and I had the opportunity to see Hamilton live in Pittsburgh.

We had decent enough seats—ten or so rows back in the left-hand orchestra section. While the view was partially obscured (we couldn’t see the elevated balcony at stage right), we were close enough to see the actors’ facial expressions clearly.

We had an amazing time. There’s a reason that Hamilton is a worldwide phenomenon; it’s a remarkable work of art. The show is cleverly self-referential—reprising leitmotifs and coyly paying off its dramatic promises. It’s playfully historical—grounding itself in real events but freely reinterpreting them, too. And it’s strikingly original—showcasing a genre foreign to Broadway, while also paying homage to musical theater’s long history.

However, my biggest takeaway from the show had nothing to do with the onstage performance. I was more fascinated by what happened in the theater during intermission. While Emily ran to the restroom, I sat and watched the crowd.

Here’s the thing: everyone was on their phone. I mean, literally 90% of the audience spent the intermission either staring at their smartphone or cradling it in-hand. There were very few exceptions: the very old (some of whom may prefer not to own a phone) and the very young (i.e., kids who probably can’t wait for their first hand-me-down device).

We’ve gone through an incredible societal transformation in just a decade. Twelve years ago, a Broadway intermission would have felt very different. Sure, a few people might have made a phone call on their flip phone, but nobody could’ve pulled an addictive “everything” device out of their pocket.

What did the 2006 audience do during those twenty-minute breaks? Doubtless, many would’ve buried their noses in the program—perusing the cast bios or the second act’s song list. But many attendees would’ve chatted up a neighbor and reflected together on the show. The hall’s decibel level might’ve been significantly higher—many more voices adding to the cacophony (rather than silenced in rapt attention to their phones). ■

Categories
culture tech

Should we feel bad for loving Apple keynotes?

Today is the “high holiday” in Apple’s liturgical calendar: iPhone keynote day. In a few hours, Tim Cook and his cardinal executives will unveil the new devices designed to drive Apple’s business during the upcoming year. Apple devotees around the world will attend (virtually), eager to heap adoration on the innovations heralded from Cupertino.

That may sound a bit cynical, but the whole Apple scene is a little silly. We’ve spent the past year speculating about today’s event on podcasts, on Twitter, and in blogged think pieces. We’ve chased down a thousand supply-chain rabbit trails. Today, we’ll salivate over devices that are only incremental improvements over the ones already in our pockets and strapped to our wrists. And in the weeks to come, we’ll exhaust ourselves in post-event analysis—then prepare to hand over piles of cash to buy into the hype.

Honestly, we invest too much time and money in these keynotes, considering the serious news unfolding in the “real” world. While we focus on Apple, a hurricane is bearing down on the East Coast. Free speech is under threat throughout the country. Refugees struggle just to survive.

Should we geeks feel guilty about our self-absorption and shallowness? The answer is “Yes, probably.”

But technology enthusiasts aren’t unique in enjoying frivolous distraction from more important things. Others, for example, follow the celebrity fashion scene. They visit TMZ every hour, follow faux-celebrities on Instagram, and plan their TV-watching around which starlets guest-star on which talk shows. This world has its own “high holidays,” too—for example, the red carpet preshow at the Academy Awards. As at the Apple keynote, industry leaders parade for the cameras, sporting fashions that viewers will eagerly buy in the upcoming year.

Or consider the world’s preeminent distraction: sports, into which so many Americans enthusiastically invest free time. Every team, for example, is orbited by a cadre of sports radio hosts, newspaper writers, podcasters, Twitter personalities, team-focused TV shows, and (most of all) fan bases that consume all this media. Hardcore fans gladly plunk down thousands for game tickets, cable TV packages, team jerseys, and memorabilia. And the “high holidays” come fast and frequent: home games tailor-made for tailgating, draft days, playoff runs, bowl games. It’d be hard to argue that sports deserves this level of attention (and consumption) any more than technology does.


Of course, other people’s obsessions don’t justify our own. The existence of fashionistas and sports nuts doesn’t mean that it’s okay that geeks spend so much time and money on tech.

But it helps to know we’re not alone in our penchant for expensive hobbies.  ■

Categories
culture

Playgrounds are deserted

My daughter is a connoisseur of fine playgrounds. Often, when we’re driving through somewhere unfamiliar, we’ll hear an excited voice from the backseat: “Look! Over there!” Sure enough, there’ll be a tell-tale yellow slide or a row of swings on the horizon.

What we typically won’t see? People. Wherever we go, whatever the day or time, America’s playgrounds seem empty. No new parents feeding newborns on benches, no infants swaying in the baby swings, no top-heavy toddlers stumbling up the ramps, and no grade-school kids leaping brazenly from the uppermost parapets. Of course, there are exceptions—well-placed, unique parks that still attract a crowd—but more often than not, we’re the only family at a playground.

Why is this? Was it always this way? If not, what changed?

One explanation I can rule out: kids didn’t abandon our parks because playgrounds somehow got worse. Yes, they’ve removed the jagged metal edges and concrete pads of decades past, but playgrounds have undeniably improved over the years. They now feature double curly-Q tunnel slides, massive subterranean mazes, bouncy bridges, two-person swings, climbing walls, and countless other “play-ventions” that didn’t exist when I was a kid. Even fast food playgrounds have evolved into four-story-tall wonder-worlds.

Our playgrounds are better than they’ve ever been. So what is it? What’s keeping the kids away? Here are some guesses:

  • We’re too busy. Parents are stretched thin and can’t spare the time to prioritize their kids’ outdoor play. For their part, kids have overpacked schedules, too, bouncing from one extracurricular to the next: sports, music, dance, etc., etc.
  • We’re scared. In another era, many parents wouldn’t hesitate to let their children walk a few blocks or ride their bikes to the neighborhood playground and stay there for hours on end. That sort of “leash-free” parenting is pretty rare these days, in an era when cable news amps up our suspicion and anxiety to irrational levels.
  • Blame the screens? As our daughter grows, she’s increasingly obsessed with watching TV and playing simple video games on her tablet. “I just want to watch TV all day,” she pouts, when we take her Kindle away. We’re not alone in this struggle, I know. The kids missing from the playground may well be cooped up inside, staring at a TV screen or poking away at an iPad.

The truth is that all of these explanations probably factor into the exodus of children from the public square.

Of course, on the one hand, we like the fact that playgrounds are uncrowded. Our daughter never has to wait for her plaything of choice, and there’s plenty of room for us to join her, without any worry about stomping someone else’s munchkin.

But on the other hand, town councils and municipal committees are bound to notice that their pristine playsets are nearly always empty. Will they continue to spend precious tax dollars on building and maintaining playgrounds, when so few residents patronize them? ■

Categories
culture TV

Me and Mister Rogers

Neighborhood watch

As a kid, I loved Mister Rogers’ Neighborhood.

I had reasons to like the star; I was a quiet, gentle kid from southwestern Pennsylvania, and Fred Rogers was a quiet, gentle man from the same area. More importantly, I had precious few male role models in my family life, and Rogers modeled a warm-hearted, happy, self-assured masculinity that didn’t rely on mustering bravado or projecting toughness. Instead, he expressed his feelings, smiled and laughed, and freely shared his vulnerabilities. That gave me hope, as an insecure kid.

Of course, I eventually outgrew the show. Mister Rogers was geared for the five-and-under crowd, and I moved on to other series: Square One, Carmen Sandiego, Batman: The Animated Series.

Still, I retained an affection for Mister Rogers, and I would check in on his show from time to time, even as a teenager. It was reassuring to see his program continue, largely unchanged. Oh, his hair was whiter and his posture more stooped, but he was that same happy neighbor, beaming as he stepped into that familiar, dingy little sound stage.

The trolley, but bigger

My reentry into Fred Rogers’ orbit came from an unexpected angle: a summer job.

In spring of 1999, as my high school graduation neared, I needed to earn cash for college, but I dreaded the thought of another summer spent mowing lawns or slinging quarter pounders at McDonald’s. Fortunately, I had another option: the family-friendly amusement park near my house.

At the brief screening interview, I expressed interest in a “character” role—a park job that involved performing a script, rather than pushing a sequence of buttons. I secretly hoped to land a job leading tours at the Wild West illusion house, where I’d get to create an over-the-top, old-timey character. (More importantly, I’d spend the summer working with my then-girlfriend, who was returning to that same role.)

To my dismay, there were no open positions at the illusion house. Instead, I was offered a job as the only male trolley driver on the park’s “Mister Rogers’ Neighborhood of Make-Believe” ride.

Yes, this was a real thing. The thirteen-minute experience piled thirty park guests into a life-size replica of the trolley from Mister Rogers’ show. This electric train trundled through a plywood tunnel and emerged into a humid, sun-dappled patch of forest. The track wound its way through Mister Rogers’ Neighborhood of Make-Believe, stopping at King Friday’s castle, the tree house of X the Owl and Henrietta Pussycat, Lady Elaine Fairchilde’s Museum-Go-Round, and Daniel Striped Tiger’s clock house.

My job was to “drive” the trolley through the Neighborhood and encourage passengers to engage with its animatronic residents. As our trolley neared, each character would emerge from its set and “talk” (i.e., play back a recording of Fred Rogers himself, in character). Pauses in their delivery were my cue to recite a well-memorized script.

The plot wasn’t exactly Shakespearean; at the first station, King Friday commanded us to invite every character we met to attend an imminent “Hug and Song” party. At each stop along the way, I would dutifully lead the passengers in the prescribed mantra: “Come along, come along, to the castle Hug and Song.”

I spent two full summers driving the trolley, and this routine grew very familiar.

For example, by my calculations, I recited that “Hug and Song” line tens of thousands of times. By the end of my second season, I could have performed the script in my sleep and knew precisely where there was room for improvisation.

By sheer repetition, I had also mastered the skill of trolley-driving: I could stop the massive train on a dime and could tell by feel when the tracks had been recently greased. I knew exactly how each scene was likely to malfunction, too: the Museum-Go-Round would fail to spin open, leaving Lady Elaine to squawk at us from inside. X the Owl’s door would get stuck. Daniel Tiger, true to his shy reputation, would stay hidden away inside his clock. I had even invented ways of explaining away these problems, satisfying curious kids and amusing parents with a knowing wink.

It was a good job, as park jobs go, and it kept me entertained far better than working the carousel or the roller coaster ever could have. Still, the work eventually grew tiresome, and as my second summer drew to a close, I was eager to disembark the trolley—permanently.

Meeting the man himself

There was one perk of trolley-driving I haven’t yet mentioned: we were treated to visits from the show’s stars. For example, more than once, Mr. McFeely (the Neighborhood mailman) dropped by. All fine and good, but that paled in comparison to the time that Mr. Rogers himself visited.

We spent the better part of a week sprucing up the ride for Rogers’ arrival. We swept and re-swept the loading deck, scrubbed down the trolleys, and washed the scene platforms along the track. Park maintenance repaired animatronics that hadn’t worked properly in ages. Everything was well-oiled, crisp, and shiny when an elderly Mr. Rogers showed up, slim and hunched but not particularly frail.

There’s not much I can say about Fred Rogers himself that others haven’t written more eloquently. But it’s true what they say: his real-life personality was very similar to the one he projected for the TV audience. I remember that he smiled a lot and that he seemed genuinely interested in each of us college kids working the ride.

We lined up for photos (I still have that snapshot, somewhere) and accompanied Mr. Rogers to a nearby pavilion, where we shared a picnic lunch and said our goodbyes. It was a wonderful way to bookend my summer—and my twenty-year relationship with Mr. Rogers as his “television neighbor.”

Last thoughts

A few years later, I was heartbroken to learn that Fred Rogers had passed away. He had kept his stomach cancer a secret from the public and died soon after his diagnosis, at the age of 74.

Reading through his obituaries, I was astonished to learn that Rogers and I had shared a birthday. That’s a coincidence, of course. But it felt significant to me—one more thread linking me to a remarkable man. ■

Categories
culture

My favorite Black Friday tradition? Complaining about Black Friday.

It’s Black Friday—the highest of high American holidays.

For years, I puzzled about the traditions that have built up around the start of the holiday shopping season. “Who in their right mind,” I wondered, “would willingly wake before dawn to stand outside in sub-freezing temperatures, on the off chance that they might score a slight discount on a terrible TV set?”

And so I scoffed at the plebeian masses, standing in line for the right to be the first to sprint into their local Wal-Mart. I shook my head sagely when the local news aired videos of shoppers trampling and berating each other to get their hands on the latest disposable toy. I rued the erosion of a family-oriented holiday and derided retailers who opened their doors on Thanksgiving evening.

Looking back, I relished my own Black Friday tradition: complaining about Black Friday.


Alas, I gave up my right to sneer at early holiday shoppers a few years ago—when I became one of them.

Back in 2013, at the height of the iPad’s popularity, Target announced a killer Black Friday deal: $20 off, plus a $100 gift card. I had been hankering to join the “post-PC revolution,” and this seemed like the perfect opportunity.

But to snag the discount, I would need to show up, in person, at the local Target outlet at 6 PM on Thanksgiving itself. With some embarrassment, I explained to my wife that I would be slipping out to shop. (Fortunately for me, she was more amused than annoyed.)

The shopping sojourn went as planned. I bought the tablet without incident; no sprinting or elbows required. I even sort of enjoyed the cultural experience—chatting up other people in the line as we waited for the doors to fling open. Power-walking through the store to the electronics department. Clutching my hard-won prize on the victory walk back to the car. Most of all, I felt a strange kinship with my fellow shoppers, who, like me, saw fit to celebrate the Day of Gratitude by buying more stuff.

As the Black Friday fever subsided, though, I found that I had lost more than I gained. I never really found a good use for the iPad itself. (For me, tablets have always fallen “into the cracks” between devices: worse for portable usage than a phone and worse for “real work” than a laptop.) In the ensuing months, I couldn’t really justify going to such lengths to secure a device I barely ever used.

The lost money and squandered family time are bad enough. But I have a more poignant regret about my participation in Black Friday mania: I lost any credibility as a couch critic of America’s bizarre shopping celebration. How can I sneer at the “mindless hordes” gathering outside the nearest Best Buy when I’m one of them? ■

Categories
culture sports

Detecting CTE—before it’s too late

Tom Schad writes about a potential breakthrough in diagnosing chronic traumatic encephalopathy (CTE), the degenerative brain disease caused by contact sports like football:

So far, the existence of CTE, a neurodegenerative brain disease, can only be confirmed through an autopsy. But scientists have been conducting positron emission tomography (PET) scans of the brains of former football players and military members, looking for patterns in living subjects.

That’s where McNeill [i.e., Fred McNeill, former linebacker for the Minnesota Vikings] comes in.

Four years ago, scientists noticed spots in McNeill’s brain that appeared damaged. More recently, an autopsy revealed the presence of the protein associated with CTE in those exact spots.

Up till now, CTE couldn’t be definitively detected in living patients. That meant you could never be sure if a former player’s symptoms—memory loss, mood swings, depression—were the result of football injuries or some other natural cause. You would have to wait until the individual died and their brain tissue could be sliced up and examined.

But if this new scanning method proves reliable, it could erase any doubt about cognitive symptoms. Such a black-and-white test would make it difficult for NFL players to shrug off the dangers of football in favor of a massive payday.

But it’s not just the pros who would be impacted. PET scanning could affect football as a whole. If this “CTE test” is sensitive enough to detect the disease in its earliest stages, it could accelerate the game’s decline at all age levels. As high school players are tested, they (and their parents) would be confronted with the damage the sport has done to their brains. Families would start to question whether the risks of football justify the rewards.

As young players’ interest declines, a chain reaction could ignite. More and more high schools would shutter their football programs. That would mean fewer available players at the collegiate level, as well as a diminished quality of play (because there’d be a smaller pool of talented players). Fan interest would decline, and Division II and III schools would disband their teams. Things would continue to accelerate, and in the end, football would become a niche gladiatorial pursuit, rather than a national pastime—more like MMA than baseball.

Honestly, that’s what I hope will happen. Football is irredeemably brutal. It doesn’t matter how many programs institute tackle-free practices, which concussion protocols are adopted, or what technological wizardry is packed into the helmet. Those adaptations can’t change the fact that the fundamental element of the game—bodies smashing bodies—ruins men’s minds.

This new test might help force us to confront that fact—now, rather than once it’s too late. ■

Categories
apple culture

Why Apple’s retail stores make me nervous

I grew up in Johnstown, Pennsylvania, a city with an ignominious reputation as a place where the rich abuse the poor. There are two infamous examples: first, a devastating, deadly flood in the 1800s, literally caused by the negligence of wealthy country clubbers. Second, the calamitous collapse of Johnstown’s manufacturing economy, caused by the steel industry’s decline. Tens of thousands of local workers lost their jobs.

As I was growing up in the 80s, Johnstown’s steel mills were shuttering en masse. Robbed of its primary industrial driver, the town imploded in slow motion. Retail decay was everywhere: once-bright storefronts patched with plywood. A deserted downtown. The closest grocery market transforming into a half-empty thrift store. Everywhere you shopped, things felt old and broken. Dingy, cavernous, fluorescent-lit spaces became the norm.

Uncomfortable luxury

Maybe that’s why Apple’s luxurious, meticulously-maintained retail spaces make me nervous. Its outlets resemble high-end, big-city fashion boutiques, more than they do the Rust Belt K-Marts of my youth. For lower-middle-class consumers (like me), the Apple Store is the ritziest retail experience they’ve ever encountered—let alone shopped at.

Don’t get me wrong; I can appreciate a carefully-designed space like Apple’s new Chicago store. It’s gorgeous, thanks to its riverfront location, its two-story window wall, and its premium materials (e.g. a carbon-fiber roof and the familiar bleached-wood product tables).

But every time I visit an Apple retail shop, I feel guilty. I can’t help but think, “I’m paying for this experience. Apple’s charging me extra so that they can afford their premium real estate, massive video walls, and all-glass staircases.” That luxury feels like a waste and makes me second-guess my unswerving brand loyalty. “Maybe,” I think, “These products aren’t meant for people like me.”

That’s one reason I prefer buying my Apple devices online. It’s not just about convenience; it’s about willful ignorance. By skipping the manicured Apple retail store, I can overlook the ways that the Apple lifestyle grates against my childhood experience. ■

Categories
apple culture

Conspicuous consumption and the iPhone X

Tech pundits occasionally suggest that some gadget purchases are driven by conspicuous consumption. In this view, a device like the iPhone X serves as a status symbol—a way to assert that you have (and can afford) the best.

This mindset is completely alien to me. Does anyone actually want to broadcast their buying decisions in this way? Both my wife and I hail from lower-middle-class families, for whom frugality is a (perhaps the) prime virtue. We pinch our pennies, drive our cars until the wheels fall off, and fix things ourselves—even when we’d be better off hiring a pro.

This thrifty mindset extends to gadget purchases. Our tribe takes pride in not carrying the latest and greatest devices. From that perspective, the so-called “stagnant” design of the iPhone 6, 6s, 7, and 8 was actually quite appealing. Toss that device in a case, and no one else could tell if you were rocking a three-year-old handset (nice!) or a brand-new device (for shame).

Not so with the iPhone X. Between its bezelless screen, dual camera, and unmistakable notch, Apple’s flagship is easily identifiable. Even someone who’s only casually familiar with Apple’s handset lineup can pick the X out of a crowd of devices.

This “recognizability” was a reason I considered avoiding the iPhone X. A device this expensive serves as a negative status symbol among our friends and family. Most people know by now that the X is the “thousand-dollar phone.” Owning it sends adverse signals about your character; others may think you’re either flaunting your discretionary cash or spending your hard-earned money foolishly.

So, given the choice, I’d prefer inconspicuous consumption over status shopping. Give me gadgets that feel luxurious but don’t look luxurious. I’d rather buy my iPhone in a cavernous, filthy, fluorescent-lit, bargain warehouse than the glass-walled, immaculate boutique of an Apple store. ■

Categories
culture

My media consumption habits vs. the average American’s

According to eMarketer, U.S. adults spend twelve hours each day consuming media; that amount has increased 24 minutes since 2012. These numbers may seem impossible, but consider that consumption sessions can overlap:

For example, an hour spent watching TV while simultaneously using a smartphone counts as an hour of usage for each medium, and therefore as 2 hours of overall media time.

Here’s how those numbers break down for the average American:

My only takeaway from these numbers, on their own: it’s amazing how much time the average American carves out for TV. I’m at least a little bit jealous.

Here’s my attempt to estimate my own daily media consumption:

Some notes:

  • My total consumption hours are almost exactly the same as the average American’s: twelve hours each day. I’m not sure whether to be alarmed or relieved by that.
  • My definition for each medium may not match eMarketer’s. For my purposes, “TV” is anything I watch on our big screen—i.e., it includes streaming media (Netflix, YouTube, and Amazon Prime). In my “radio” bucket, I do listen to a bit of traditional FM broadcast in the car (mostly NPR). But that category also includes Spotify (instrumental music is the soundtrack for most workdays), podcasts, and my voice-guided meditation app. My “radio” hours are high, but is that because I’ve bucketed things differently?
  • Along the same lines, what counts as “consumption”? I use my devices all the time—literally from the moment I wake up to the moment I fall asleep. But for big chunks of the day, I’m making stuff: designing collateral for work, drafting blog posts, recording podcasts, composing emails, etc. I assume that eMarketer is making a similar distinction between active, productive work and passive consumption, but I’m not sure.
  • My media consumption on weekdays differs dramatically from what happens on the weekend. For example, during the week, I rarely watch more than one episode of a TV show before I get sleepy and stumble off to bed. On the weekend, though, I’ll often watch a movie or binge on Netflix for several hours.
  • Similarly, my smartphone use skyrockets on lazy weekend days; I’m far more likely to exhaust my iPhone battery on a Sunday than on the average work day.
  • Looking back, it’s astonishing how much my own habits have shifted in the past few decades. In high school, before I had easy access to the web, I would often spend my study hall flipping through the local newspaper. These days, my print consumption has dwindled to nearly zero—yes, we subscribe to the local newspaper, but I rarely get past the first fold.
  • Another big change since my teenage years? Back then, my media world revolved around the TV. Between traditional linear television and console gaming, I probably spent 5–6 hours each day planted in front of the boob tube. Now, I don’t game at all (I haven’t owned a console since the PS2). And even when the TV is on, it sits on the periphery of my attention. TV mostly serves as background noise for my #1 media consumption activity: browsing Twitter on my smartphone. ■