Welcome to the bloggy home of Noah Brier. I'm the co-founder of Percolate and a general internet tinkerer. This site is about media, culture, technology, and randomness. It's been around since 2004 (I'm pretty sure). Feel free to get in touch.

You can subscribe to this site via RSS (the humanity!).

One More Note on Why Brands Aren’t Going Anywhere

Yesterday I posted a few points in response to James Surowiecki’s New Yorker piece on the role of brands in a world of better information. I made a specific distinction between research-heavy products like cars and products like soap or toothpaste, which people generally choose on brand alone. On Twitter someone asked whether that meant car brands are actually less valuable in this new world, and I thought it was worth answering here as well as on Twitter.

First, the answer is no, they’re not less valuable, even though research does level the playing field to some extent. But there are two important points about how people buy cars that need to be addressed. First, people generally choose out of a subset. If you want a “luxury” car, you’re choosing between a BMW, Audi, or Mercedes. You don’t get to that point without brand. If you’re Kia right now, you’re advertising the hell out of your new car because you want to be in that decision set. While your ultimate decision may be purely on product merits (though it’s likely not), you have eliminated 99% of the other car options on brand alone.

Second, I just want to reshare something I wrote a few months ago. This argument about brands is part of a larger anti-brand argument that’s best captured by the quote “advertising is the tax you pay for being unremarkable.” Back in August I explained all the reasons this isn’t true, and they still apply here.

February 11, 2014

Why Brands Aren’t Going Anywhere

This morning Felix Salmon tweeted James Surowiecki’s latest article about brands at me to see what I thought. Basically Surowiecki makes the case that brands are less meaningful in a world of information efficiency thanks to the web. His thesis, roughly:

It’s a truism of business-book thinking that a company’s brand is its “most important asset,” more valuable than technology or patents or manufacturing prowess. But brands have never been more fragile. The reason is simple: consumers are supremely well informed and far more likely to investigate the real value of products than to rely on logos. “Absolute Value,” a new book by Itamar Simonson, a marketing professor at Stanford, and Emanuel Rosen, a former software executive, shows that, historically, the rise of brands was a response to an information-poor environment. When consumers had to rely on advertisements and their past experience with a company, brands served as proxies for quality; if a car was made by G.M., or a ketchup by Heinz, you assumed that it was pretty good. It was hard to figure out if a new product from an unfamiliar company was reliable or not, so brand loyalty was a way of reducing risk. As recently as the nineteen-eighties, nearly four-fifths of American car buyers stayed loyal to a brand.

He then goes on to talk about cars and travel and a lot of other stuff that people spend a lot of time researching. In response I shot off four quick tweets that I thought were worth sharing here:
1. Using a luxury brand (Lululemon) is fundamentally flawed. Luxury is all brand. You can fall as fast as you rise. I understand calling Lululemon luxury is a bit of a stretch, but I think it’s pretty accurate. But the second part is really the point: However fast a company rises, it can fall at the same speed. We’ve seen lots of businesses reach saturation and struggle, and this isn’t necessarily what happened to Lululemon, but it could be a part of the picture. To say it was based solely on consumers’ ability to do research seems odd. Seems like a classic story of a brand growing really quickly and then struggling to maintain its growth. Often those brands come back quickly and continue to grow, though, so the story is far from over.
2. Where is the evidence that suggests buying of non-research products has changed? He touches on this a bit towards the end by writing, “This isn’t true across the board: brands retain value where the brand association is integral to the experience of a product (Coca-Cola, say), or where they confer status, as with luxury goods. But even here the information deluge is transformative; luxury travel, for instance, has been profoundly affected by sites like TripAdvisor.” But again, travel has always been a category that a lot of research goes into. Anywhere people research, that research will move to the web. The fundamental truth of most products is that they’re not research-driven. This is why CPG is always near the top of the largest ad spenders.
3. Research-heavy products (TVs, cars) have become much more price-efficient. Duh. Not sure there’s much to say here, but he specifically hits on how cars have been squeezed to be much more efficient. This is not rocket science and is a good thing for consumers. No debate here, just not a very interesting point.
4. Consumers have always been in control. There is more efficiency in the market now, but fundamentals are the same. This is the big piece. I’ve argued this often. Word of mouth always drove purchasing decisions. Certainly this is much more efficient, but I’m not sure I’d say it’s a sea change.

February 10, 2014

The Accuracy of Strangelove

Dr. Strangelove, my favorite movie ever, turns 50 years old this week. The New Yorker has a great story of just how frighteningly accurate the movie’s portrayal of poor nuclear security actually was:

With great reluctance, Eisenhower agreed to let American officers use their nuclear weapons, in an emergency, if there were no time or no means to contact the President. Air Force pilots were allowed to fire their nuclear anti-aircraft rockets to shoot down Soviet bombers heading toward the United States. And about half a dozen high-level American commanders were allowed to use far more powerful nuclear weapons, without contacting the White House first, when their forces were under attack and “the urgency of time and circumstances clearly does not permit a specific decision by the President, or other person empowered to act in his stead.” Eisenhower worried that providing that sort of authorization in advance could make it possible for someone to do “something foolish down the chain of command” and start an all-out nuclear war. But the alternative—allowing an attack on the United States to go unanswered or NATO forces to be overrun—seemed a lot worse. Aware that his decision might create public unease about who really controlled America’s nuclear arsenal, Eisenhower insisted that his delegation of Presidential authority be kept secret. At a meeting with the Joint Chiefs of Staff, he confessed to being “very fearful of having written papers on this matter.”

January 28, 2014

The drunk driving/gun connection

Over at the New Yorker, Adam Gopnik draws an interesting parallel between drunk driving and guns:

If one needs more hope, one can find it in the history of the parallel fight against drunk driving. When that began, using alcohol and then driving was regarded as a trivial or a forgivable offense. Thanks to the efforts of MADD and the other groups, drunk driving became socially verboten, and then highly regulated, with some states now having strong “ignition interlock” laws that keep drunks from even turning the key. Drunk driving has diminished, we’re told, by as much as ten per cent per year in some recent years. Along with the necessary, and liberty-limiting, changes in seat-belt enforcement and the like, car culture altered. The result? The number of roadway fatalities in 2011 was the lowest since 1949. If we can do with maniacs and guns what we have already done with drunks and cars, we’d be doing fine. These are hard fights, but they can be won.

January 21, 2013

Idiots & Taking the Long View

I’ve been listening to a lot of podcasts lately, and one of them is the New Yorker’s Out Loud. The latest episode featured a great interview with Daniel Mendelsohn, a literary critic. In the podcast he mostly talks about the books that inspired him to become a writer, but then, towards the end, he talks a bit about the job of a cultural critic, and I thought what he had to say was interesting enough to transcribe and share:

We now have these technologies that simulate reality or create different realities in very sophisticated and interesting ways. Having these technologies available to us allows us to walk, say, through midtown Manhattan but actually to be inhabiting our private reality as we do so: We’re on the phone or we’re looking at our smartphone, gazing lovingly into our iPhones. And this is the way the world is going, there’s no point complaining about it. But where my classics come in is I am amused by the fact our word idiot comes from the Greek word idiotes, which means a private person. It’s from the word idios, which means private as opposed to public. So the Athenians, or the Greeks in general, who had such a highly developed sense of the radical distinction between what went on in public and what went on in private, thought that a person who brought his private life into public spaces, who confused public and private, was an idiotes, was an idiot. Of course, now everybody does this. We are in a culture of idiots in the Greek sense. To go back to your original question, what does this look like in the long run? Is it terrible or is it bad? It’s just the way things are. And one of the advantages of being a person who looks at long stretches of the past is you try not to get hysterical, to just see these evolving new ways of being from an imaginary vantage point in the future. Is it the end of the world? No, it’s just the end of a world. It’s the end of the world I grew up in when I was thinking of how you behaved in public. I think your job as a cultural critic is to take a long view.

I obviously thought the idiot stuff was fascinating, but I was also interested in his last line about the job of a cultural critic, which, to me, really echoed something that struck me about McLuhan in the most recent biography of him by Douglas Coupland:

Marshall was also encountering a response that would tail him the rest of his life: the incorrect belief that he liked the new world he was describing. In fact, he didn’t ascribe any moral or value dimensions to it at all–he simply kept on pointing out the effects of new media on the individual. And what makes him fresh and relevant now is the fact that (unlike so much other new thinking of the time) he always did focus on the individual in society, rather than on the mass of society as an entity unto itself.

January 7, 2013

Top Longform of 2012

Last year I listed out my five favorite pieces of longform writing and it seemed to go over pretty well, so I figured I’d do the same again this year. It was harder to compile the list this year, as my reading took me beyond just Instapaper (especially to the fantastic Longform app for iPad), but I’ve done my best to pull these together based on what I most enjoyed, found most interesting, or was most struck by.

One additional note before I start my list: To make this process slightly simpler next year, I’ve decided to start a Twitter feed that pulls from my Instapaper and Readability favorites. You can find it at @HeyItsInstafavs. Okay, onto the list.

  1. The Yankee Comandante (New Yorker): Last year David Grann took my top spot with A Murder Foretold and this year he again takes it with an incredible piece on William Morgan, an American soldier in the Cuban revolution. The article was impressive enough that George Clooney bought up the rights and is apparently planning to direct a film about the story. The thing about David Grann is that beyond being an incredible reporter and storyteller, he’s also just an amazing writer. I’m not really a reader who sits there and examines sentences, I read for story and ideas. But a few sentences, and even paragraphs, in this piece made me take notice. While we’re on David Grann, I also read his excellent book of essays this year (most of which come from the New Yorker), The Devil & Sherlock Holmes. He is, without a doubt, my favorite non-fiction writer working right now.
  2. Raise the Crime Rate (n+1): This article couldn’t be more different from the first. Rather than narrative non-fiction, this is an interesting, and well-presented, argument for abolishing the prison system. The basic thesis of the piece is that we’ve made a terrible ethical decision in the US to offload crime from our cities to our prisons, where we let people get raped and stabbed with little-to-no recourse. The solution presented is to abolish the prison system (while also increasing capital punishment). Rare is the article that you don’t necessarily agree with but walk away talking and thinking about. That’s why this piece made my list. I read it again last week and still don’t know where I stand, but I know it’s worthy of reading and thinking about. (While I was trying to get through my Instapaper backlog I also came across this Atul Gawande piece from 2009 on solitary confinement and its effects on humans.)
  3. Open Your Mouth & You’re Dead (Outside): A look at the totally insane “sport” of freediving, where athletes swim hundreds of feet underwater on a single breath (and often come back to the surface passed out). This is scary and crazy and exciting and that’s reason enough to read something, right?
  4. Jerry Seinfeld Intends to Die Standing Up (New York Times): I’ve been meaning to write about this but haven’t had a chance yet. Last year HBO had this amazing special called Talking Funny in which Ricky Gervais, Chris Rock, Louis CK and Jerry Seinfeld sit around and chat about what it’s like to be the four funniest men in the world. The format was amazing: Take the four people who are at the top of their profession and see what happens. But what was especially interesting, to me at least, was the deference the other three showed to Seinfeld. I knew he was accomplished, but I didn’t know that he commanded the sort of respect amongst his peers that he does. Well, this Times article expands on that special and explains what makes Seinfeld such a unique comedian and such a careful crafter of jokes. (For more Seinfeld stuff make sure to check out his new online video series, Comedians in Cars Getting Coffee, which is just that.)
  5. The Malice at the Palace (Grantland): I would say as a publication Grantland outperformed just about every other site on the web this year, so this pick is part acknowledgement of that and part praise for a pretty amazing piece of reporting (I guess you can call an oral history that, right?). Anyway, this particular oral history is about the giant brawl that broke out in Detroit at a Pacers–Pistons game and spilled into a fight between the Pacers and the Detroit fans. It was an ugly mark for basketball and an incredibly memorable (and insane) TV event. As a sort of aside on this, I’ve been casually reading Bill Simmons’ Book of Basketball, and in it he obviously talks about this game/fight. In fact, he calls it one of his six biggest TV moments, which he judges using the following criteria: “How you know an event qualifies: Will you always remember where you watched it? (Check.) Did you know history was being made? (Check.) Would you have fought anyone who tried to change the channel? (Check.) Did your head start to ache after a while? (Check.) Did your stomach feel funny? (Check.) Did you end up watching about four hours too long? (Check.) Were there a few ‘can you believe this’–type phone calls along the way? (Check.) Did you say ‘I can’t believe this’ at least fifty times?” I agree with that.

And, like last year, there are a few that were great but didn’t make the cut. Here are two more:

  • Snow Fall (New York Times): Everyone is going crazy about this because of the elaborate multimedia experience that went along with it, but I actually bought the Kindle single and read it in plain old black and white, and it was still pretty amazing. Also, John Branch deserves to be on this list because he wrote something that would have made my list last year had it not come out in December: Punched Out is the amazing and sad story of Derek Boogaard and what it’s like to be a hockey enforcer.
  • Marathon Man (New Yorker): A very odd, but intriguing, “expose” on a dentist who liked to cheat at marathons.

That’s it. I’ve made a Readlist with these seven selections, which makes it easy to send them all to your Kindle or Readability. Good reading.

January 4, 2013

Blesh

I was reading this New Yorker piece about the Grateful Dead at my friend Colin’s recommendation and I liked the notion of “blesh”:

“More Than Human” is a sci-fi novel, published in 1953, in which a band of exceptional people “blesh” (that is, blend and mesh) their consciousness to create a kind of super-being. “I turned everyone on to that book in, like, 1965,” Lesh said. “ ‘This is what we can do; this is what we can be.’”

Which reminded me a bit of scenius:

The musician and artist Brian Eno coined the odd but apt word “scenius” to describe the unusual pockets of group creativity and invention that emerge in certain intellectual or artistic scenes: philosophers in 18th-century Scotland; Parisian artists and intellectuals in the 1920s. In Eno’s words, scenius is “the communal form of the concept of the genius.” New York hasn’t yet reached those heights in terms of internet innovation, but clearly something powerful has happened. There is genuine digital-age scenius on its streets. This is good news for my city, of course, but it’s also an important case study for any city that wishes to encourage innovative business. How did New York pull it off?

January 3, 2013

How We Got Here: Second Amendment Edition

The New Yorker has a really interesting blog post about how the Second Amendment came to mean what many now believe it to mean. Turns out we didn’t always see things the way we do now:

Enter the modern National Rifle Association. Before the nineteen-seventies, the N.R.A. had been devoted mostly to non-political issues, like gun safety. But a coup d’état at the group’s annual convention in 1977 brought a group of committed political conservatives to power—as part of the leading edge of the new, more rightward-leaning Republican Party. (Jill Lepore recounted this history in a recent piece for The New Yorker.) The new group pushed for a novel interpretation of the Second Amendment, one that gave individuals, not just militias, the right to bear arms. It was an uphill struggle. At first, their views were widely scorned. Chief Justice Warren E. Burger, who was no liberal, mocked the individual-rights theory of the amendment as “a fraud.”

The article goes on to note the irony that this represents a “living” constitution that adapts with the times, something conservatives generally fight against:

But the N.R.A. kept pushing—and there’s a lesson here. Conservatives often embrace “originalism,” the idea that the meaning of the Constitution was fixed when it was ratified, in 1787. They mock the so-called liberal idea of a “living” constitution, whose meaning changes with the values of the country at large. But there is no better example of the living Constitution than the conservative re-casting of the Second Amendment in the last few decades of the twentieth century. (Reva Siegel, of Yale Law School, elaborates on this point in a brilliant article.)

December 28, 2012