A lot of people I follow in tech gush over high-end cameras. When I see a piece on cameras, I just zone out, and it all becomes a sea of meaningless technical terms: mirrorless micro four-thirds full-frame DSLR with pancake lens at f/32 aperture… whatever. The camera on my iPhone 5S is more than good enough for the few photos I take of my life. I have to wonder if the geek obsession with high-end cameras is, in part, because point-and-shoot digital and smartphone cameras have become good enough for the average person. When I read articles defending the purchase of fancy cameras, there’s a recurring mantra of “you’ll regret it when your kids grow up and all you have are cell phone pictures.” I don’t know about other people in my age group, but I remember growing up with albums of badly exposed 35mm prints from point-and-shoot film cameras. My parents didn’t mind, and I doubt the parents of most other people my age minded either.
As the baseline quality of various goods increases, there’s always going to be an audience that demands something “better” for whatever reason. The criteria of “better” vary, but among the geeky, tech-savvy population, “better” has a specific meaning. “Better” used to mean higher specs: more memory, more disk space, faster graphics, but specs in consumer tech have become almost meaningless in the last few years. We’re at a point where you can buy a computer that’ll do all the things the average person needs to do for pocket change, comparatively speaking. It’s not good for corporate profit margins, but you can find a machine, even one that isn’t crawling with privacy-invading, performance-destroying crapware, for less than half the price of a base 11-inch MacBook Air. Hell, there’s a perfectly serviceable Windows 8 tablet you can buy for $80. You don’t have to spend four figures on a new computer if you don’t want to, let alone a smartphone or other device.
In response, for geeks “better” is about less quantifiable things, like taste. While I’m probably the last person who would recommend a Samsung phone, I’m not about to bash the taste of a Samsung owner. [1] When certain tech pundits—Jim Dalrymple and John Gruber especially—start calling people who buy certain products idiots, or claiming a covered micro-USB port on the back of a smartwatch is a mark of bad taste, they’re losing the thread. It’s not that Samsung customers are idiots, or the designer of the Sony Smartwatch has no taste, it’s that their priorities are in a different place. Maybe the person on the subway with a Samsung phone just needed a smartphone, and that’s the one the sales guy at the carrier store got a SPIFF for flogging, and they were in no mood to comparison shop? For some people, a phone is a phone. It’s not a question of “taste.”
It’s not hard to extend this to other geeky obsessions with quality: fussy coffee prepared fussily, artisanal notebooks and fountain pens, perfectly clear ice cubes for your cocktails[2], high-end audio equipment, and fancy bags for carrying all your fancy shit around. And I’m not immune to this phenomenon either. Though I don’t drink fussy coffee, or use a fountain pen, I have a fancy, artisanal pocket notebook I keep in an equally artisanally crafted leather cover. I collect records, and listen to them on a fancy turntable, on audiophile headphones with a “flat frequency response” through a tube amp. Why? I like the sound and the experience of listening to music on vinyl. A dollar store notebook, or a stack of index cards, would serve my writing fine. I’m sure I wouldn’t notice any difference if I were running my turntable’s audio through a solid state, digital amplifier, but I went with the tube amp anyway, despite vinyl being an inferior format for replicating audio.
There’s nothing wrong with liking the crazy, fancy stuff we geeks like. We can’t control our obsessions, but we can control how we communicate them to others. Smug superiority gets us nowhere. The elitism that too often creeps into any discussion of our obsessions is maddening to hear, even for some of us who share the obsession. It behooves all of us who talk about technology to get a little more perspective on the people who buy it, people who aren’t obsessive about it for the same reasons we are. Let’s recommend the stuff we like, and be honest as to why. Let’s not assume that our reasons are the only valid ones, and stop impugning those who think different.
April 8th, a date that will live in infamy as the day the initial Apple Watch review embargo ended. There are too many reviews to link here, so I’ll just link to this post linking to all the other reviews. The general consensus seems to be that it’s like most version 1 Apple products: pretty, with a lot of potential, but it still has a way to go. Reading the reviews, particularly those from reviewers new to smartwatches, reminds me a lot of my Pebble experience, if a little more positive overall. And while I like my Pebble, if the opportunity arose to swap my Pebble for an Apple Watch tomorrow, I’d do it in a heartbeat.
One running theme I noticed in the reviews I bothered reading [1] is the problem of notification overload. I was under the assumption, possibly mistaken, that the Apple Watch setup process required users to winnow down the notifications they get. Either that’s the case and those complaining didn’t bother, or they get way more notifications in a day than I get in a week, even after paring down. Maybe both. That said, it wasn’t Apple who promulgated the idea that the Watch would be the panacea for notification overload.
As someone who’s been cautiously bullish on the smartwatch, at least since trying one, it’s hard not to be a little disappointed by the initial reaction. Most of the issues unrelated to an excess of wrist-tapping, like apps not loading and general pokiness, seem to be the sort that can be remedied with software updates on the phone and watch alike. Time will tell, not only whether that’s the case, but whether ordinary people will use Apple Watches the way the reviewers did. Technology journalists don’t live entirely in the same world as everyone else.
Certain folks on the technology commentary beat don’t like the idea of talking about a product based on future potential. But when you’re talking about a new category of device, you have nothing else to judge it on, especially when going by a week of use (or someone’s subjective opinion on their week of use). I’ll repeat the same mantra I offer whenever I write about this topic, at least since trying the Pebble: there is a lot of potential in the smartwatch. We just haven’t figured out how to tap into it yet.
I’m excited by what’s happening in this space. If Google can bring Android Wear to iOS, I’d like to try it, and put my Pebble aside for a while. When Pebble OS 3.0 finally arrives for the older hardware, I’m curious to see if the timeline idea—the one Apple rejected—is a better interface for wrist-mounted technology. There’s incredible potential for this to be more than a shiny, vibrating wrist-bauble.
Wearable technology has been in the Neolithic Age for the past few years. I think Apple Watch is the dividing line that marks the Bronze Age of wearables—or maybe the 18-Karat Rose Gold Age. The only way to know for sure, however, is to wait it out and see. And for those still skeptical of the whole category, if you have $99, just buy a damn Pebble and try it for a couple weeks. You’ve probably spent more on stupider stuff.
John Gruber, The Verge, The New York Times, TechPinions, and The Wall Street Journal… more than I thought I had at first. ↩
The only certainty is that ad block is increasingly an issue. Don’t forget that there are numerous blocking techniques on mobile, too, so the shift from desktop to mobile does not implicitly solve anything.
Publishers will address it with a variety of approaches; I expect to see a rise of ad-block-blocking and of circumvention techniques à la Secret Media. These are both antagonistic methods, but will at least add exposure to the issue. I do think that deliberately circumventing the user’s desire is a poor long-term choice – it is rare that content is truly unique in the vast expanse that is the internet.
An interesting analysis of the problems with web ads and with ad blocking, with a possible solution.
I use ad blockers across all my browsers, my phone, and my tablet. I’m not happy about it, but the aggressive nature of most web ads is infuriating, and the browser is slow enough already. FairBlocker looks like an interesting solution to the problem, but it seems to suffer from the same issue as Readability’s old business model: how does the content creator actually get their share of the money?
“For many of us, hooked into an abusive relationship with tech, with Twitter, with local and global communities, or even corporate entities, we are constantly walking on eggshells. The toxicity (or implicit threat) of sexism, racism, transphobia, ableism, and poverty leave us feeling utterly isolated and without recourse. Faced with organized infiltration, appropriation and psychological abuse in our online communities, we have stopped believing in our own interpretations of what we experience. We don’t believe our innate reactions are valid. If we are extremely lucky, and not utterly isolated, we rely on the trusted counsel of a few close friends, so that we can periodically reassure ourselves that what we are perceiving and experiencing is real.”
The same tools that enable marginalized groups to find each other and unite for their benefit also expose them to ever more anonymous abuse. It doesn’t help that so many reports of harassment are so easily dismissed by those of us in a position of privilege online. Either harassment doesn’t happen to us, or it’s just an inevitable fact of living online. Neither is a valid reaction. The experience of harassment is real, and we owe it to its victims to find a way to end it.
I don’t own a television, but I try not to be one of those jerks who’s self-righteous about how he doesn’t own a television. Considering the amount of time I spend passively consuming streams, both social and pirate, I’m not about to maintain any pretensions about how not owning a television makes me better than anyone. For me, the time most people spend watching TV is just filled with other activities, that’s all. Name any major TV show of the last decade or two: Breaking Bad, Mad Men, The Sopranos, The Wire, The Walking Dead… something else? Odds are, I haven’t seen it.
Growing up, I watched plenty of TV. I was raised on Star Trek: The Next Generation, PBS cooking shows, and various cartoons. Once I got an Internet connection at home in the late ’90s, though, my TV watching time was doomed. Even then, I made it a point to catch The Simpsons on Sunday nights, and MST3K whenever the Sci-Fi Channel—before it was “Syfy”—could be bothered to show it. I went off to college without a TV in tow, but I did get a TV tuner card for my desktop—so I could hook up my Super Nintendo without needing an extra display in my tiny dorm room.
Working evenings at my part-time telemarketing job for the Walnut Street Theatre while in school put the final nail in my TV watching coffin. Sure, we had a TV in my dorm, and later in my shared house, but I was rarely around to watch it. When I moved out to West Philadelphia on my own, I didn’t take a TV. I didn’t even sign up for Netflix—it wasn’t how I learned to spend my leisure hours. Even now, years later, my girlfriend and I still don’t own a TV. It’s just not what we do. If I feel like being mindless for 22 minutes at a stretch, I’ll pull up a classic episode of The Simpsons from [REDACTED] and laugh my fool head off.
And so, I feel somewhat disconnected from the talk that flies around my Twitter circles, largely tech-related, about TV shows new and old. I’ve never seen anything touched by the hand of Joss Whedon, be it Buffy or Firefly. I don’t really desire to, either, not out of dislike, but sheer apathy. Why bother? TV is officially enough of a timesink that I can’t even be bothered to keep up with recent shows I do care about. I stopped watching the final season of Boardwalk Empire, and I’m far enough behind on Doctor Who that they’ll be on to the 15th Doctor by the time I finish Peter Capaldi’s first season.
I’m sure people who follow me are just as confused when I drop the name of some band or artist whose album or show I’m eagerly anticipating. It’s just surprising, after TV has been a minor-to-nonexistent part of my life, how important TV watching is to the people I know and follow. I guess it shouldn’t be, though. I also find myself wondering just what I’ve been missing, but there are only so many hours in the day. I can’t see where I’d squeeze in the time to keep up with TV, let alone all the ancillary recaps, podcasts, and meta-discussion surrounding it. Better to just stick with leaving the metaphorical cable cut. Besides, that’s $9.99 a month I can blow on an album by someone you’ve never heard of instead.