Welcome to the bloggy home of Noah Brier. I'm the co-founder of Percolate and general internet tinkerer. This site is about media, culture, technology, and randomness. It's been around since 2004 (I'm pretty sure). Feel free to get in touch.

You can subscribe to this site via RSS (the humanity!).

Is it or isn’t it?

Differing opinions (sort of) from the New York Times over whether technology is or isn’t what the science-fiction writers imagined. From a November article titled “In Defense of Technology”:

Physical loneliness can still exist, of course, but you’re never friendless online. Don’t tell me the spiritual life is over. In many ways it’s only just begun. Technology is not doing what the sci-fi writers warned it might — it is not turning us into digits or blank consumers, into people who hate community. Instead, there is evidence that the improvements are making us more democratic, more aware of the planet, more interested in the experience of people who aren’t us, more connected to the mysteries of privacy and surveillance. It’s also pressing us to question what it means to have life so easy, when billions do not. I lived through the age of complacency, before information arrived and the outside world liquified its borders. And now it seems as if the real split in the world will not only be between the fed and the unfed, the healthy and the unhealthy, but between those with smartphones and those without.

And now, in response to the Sony hack, Frank Bruni writes, “The specter that science fiction began to raise decades ago has come true, but with a twist. Computers and technology don’t have minds of their own. They have really, really big mouths.” He continues:

“Nothing you say in any form mediated through digital technology — absolutely nothing at all — is guaranteed to stay private,” wrote Farhad Manjoo, a technology columnist for The Times, in a blog post on Thursday. He issued a “reminder to anyone who uses a digital device to say anything to anyone, ever. Don’t do it. Don’t email, don’t text, don’t update, don’t send photos.” He might as well have added, “Don’t live,” because self-expression and sharing aren’t easily abandoned, and other conduits for them — landlines, snail mail — no longer do the trick.

But the New York Times doesn’t stop disagreeing with itself there (by the way, this is totally fine; I have no problem with the contradictions and actually appreciate them). From “As Robots Grow Smarter, American Workers Struggle to Keep Up”:

Yet there is deep uncertainty about how the pattern will play out now, as two trends are interacting. Artificial intelligence has become vastly more sophisticated in a short time, with machines now able to learn, not just follow programmed instructions, and to respond to human language and movement. … At the same time, the American work force has gained skills at a slower rate than in the past — and at a slower rate than in many other countries. Americans between the ages of 55 and 64 are among the most skilled in the world, according to a recent report from the Organization for Economic Cooperation and Development. Younger Americans are closer to average among the residents of rich countries, and below average by some measures.

My opinion falls into the protopian camp: Things are definitely getting better, but new complexities are bound to emerge as things change. It’s not going to be simple, and there are lots of questions we should be asking ourselves about how technology is changing us and the world, but it’s much healthier to start from a place of positivity and a recognition that much of the change is good change.

December 23, 2014

Tech plus Culture

Interesting point of distinction between Google’s and Facebook’s approaches to the world in this long feature on Zuckerberg and Facebook’s effort to wire the world. Specifically, in reference to both companies buying drone companies as a possible way of getting internet to rural areas:

Google also has a drone program—in April it bought one of Ascenta’s competitors, Titan Aerospace—but what’s notable about its approach so far is that it has been almost purely technological and unilateral: we want people to have the Internet, so we’re going to beam it at them from a balloon. Whereas Facebook’s solution is a blended one. It has technological pieces but also a business piece (making money for the cell-phone companies) and a sociocultural one (luring people online with carefully curated content). The app is just one part of a human ecosystem where everybody is incentivized to keep it going and spread it around. “Certainly, one big difference is that we tend to look at the culture around things,” Zuckerberg says. “That’s just a core part of building any social project.” The subtext being, all projects are social.

This is a pretty interesting point of difference in how the two companies view the world: Google sees every problem as a purely technical issue, whereas Facebook sees it as part cultural and part technical. I’m not totally sure I buy it (it seems unfair to call Android a purely technical solution), but it’s an interesting lens to look through when examining two of the world’s most important tech companies.

December 9, 2014

Misinterpretation

I never finished Taleb’s Black Swan, but I vividly remember a point from the opening chapter about what would have happened if a senator had pushed through legislation prior to September 11th forcing all airlines to install locked cockpit doors. That person would never have received attention or recognition for preventing an attack, since we never would have known:

The person who imposed locks on cockpit doors gets no statues in public squares, not so much as a quick mention of his contribution in his obituary. “Joe Smith, who helped avoid the disaster of 9/11, died of complications of liver disease.” Seeing how superfluous his measure was, and how it squandered resources, the public, with great help from airline pilots, might well boot him out of office . . .

I was reminded of this as I was reading Tim Harford’s Adapt and came across this point about how we interpreted the US domination of the first Gulf War:

Another example of history’s uncertain guidance came from the first Gulf War in 1990–1. Desert Storm was an overwhelming defeat for Saddam Hussein’s army: one day it was one of the largest armies in the world; four days later, it wasn’t even the largest army in Iraq. Most American military strategists saw this as a vindication of their main strategic pillars: a technology-driven war with lots of air support and above all, overwhelming force. In reality it was a sign that change was coming: the victory was so crushing that no foe would ever use open battlefield tactics against the US Army again. Was this really so obvious in advance?

January 2, 2014

Engineering Culture

I’m incredibly proud of this blog post by James OB, one of the engineers at Percolate, about why he likes the engineering culture at the company. The whole thing is well worth a read, but I especially liked this bit:

The autonomy to solve a problem with the best technology available is a luxury for programmers. Most organizations I’ve been exposed to are encumbered, in varying degrees, by institutional favorites or “safe” bets without regard for the problem to be solved. Engineering at Percolate has so far been free of that trap, which results in a constantly engaging, productive mode of work.

We’re hiring lots more engineers (and a VP of Engineering). If you want to work somewhere awesome (as described by James), please apply (or email me directly).

August 27, 2013

Curiosity Kills

I’m a sucker for quotes about how one thing or another was going to ruin society. Most of these are about media, but I couldn’t help myself when I saw this one about curiosity, from an article in The American Scholar:

Specific methods aside, critics argued that unregulated curiosity led to an insatiable desire for novelty—not to true knowledge, which required years of immersion in a subject. Today, in an ever-more-distracted world, that argument resonates. In fact, even though many early critics of natural philosophy come off as shrill and small-minded, it’s a testament to Ball that you occasionally find yourself nodding in agreement with people who ended up on the “wrong” side of history.

I actually think it would be pretty great to collect all these in a big book … A paper book, of course.

August 8, 2013

Technology Advances Backwards

I really love this quote, which comes from an article Umberto Eco wrote about Wikileaks, found by way of this very excellent recap of a talk by the head of technology at the Smithsonian Cooper-Hewitt National Design Museum:

I once had occasion to observe that technology now advances crabwise, i.e. backwards. A century after the wireless telegraph revolutionised communications, the Internet has re-established a telegraph that runs on (telephone) wires. (Analog) video cassettes enabled film buffs to peruse a movie frame by frame, by fast-forwarding and rewinding to lay bare all the secrets of the editing process, but (digital) CDs now only allow us quantum leaps from one chapter to another. High-speed trains take us from Rome to Milan in three hours, but flying there, if you include transfers to and from the airports, takes three and a half hours. So it wouldn’t be extraordinary if politics and communications technologies were to revert to the horse-drawn carriage.

August 5, 2013

Global Time

In response to my little post about describing the past and present, Jim, who reads the blog, emailed me to say it could be referred to as an “atemporal present,” which I thought was a good turn of phrase. I googled it and ran across this fascinating Guardian piece explaining their decision to get rid of references to today and yesterday in their articles. Here’s a pretty large snippet:

It used to be quite simple. If you worked for an evening newspaper, you put “today” near the beginning of every story in an attempt to give the impression of being up-to-the-minute – even though many of the stories had been written the day before (as those lovely people who own local newspapers strove to increase their profits by cutting editions and moving deadlines ever earlier in the day). If you worked for a morning newspaper, you put “last night” at the beginning: the assumption was that reading your paper was the first thing that everyone did, the moment they awoke, and you wanted them to think that you had been slaving all night on their behalf to bring them the absolute latest news. A report that might have been written at, say, 3pm the previous day would still start something like this: “The government last night announced …”

All this has changed. As I wrote last year, we now have many millions of readers around the world, for whom the use of yesterday, today and tomorrow must be at best confusing and at times downright misleading. I don’t know how many readers the Guardian has in Hawaii – though I am willing to make a goodwill visit if the managing editor is seeking volunteers – but if I write a story saying something happened “last night”, it will not necessarily be clear which “night” I am referring to. Even in the UK, online readers may visit the website at any time, using a variety of devices, as the old, predictable pattern of newspaper readership has changed for ever. A guardian.co.uk story may be read within seconds of publication, or months later – long after the newspaper has been composted.

So our new policy, adopted last week (wherever you are in the world), is to omit time references such as last night, yesterday, today, tonight and tomorrow from guardian.co.uk stories. If a day is relevant (for example, to say when a meeting is going to happen or happened) we will state the actual day – as in “the government will announce its proposals in a white paper on Wednesday [rather than ‘tomorrow’]” or “the government’s proposals, announced on Wednesday [rather than ‘yesterday’], have been greeted with a storm of protest”.

What’s extra interesting about this to me is that it’s not just about the time you’re reading that story, but also the space the web inhabits. We’ve been talking a lot at Percolate lately about how social is shifting the way we think about audiences, since for the first time there are constant global media opportunities (it used to happen once every four years with the Olympics or World Cup). But, as this articulates so well, being global also has a major impact on time, since you move away from knowing where your audience is in their day when they’re consuming your content.

August 5, 2013

Describing the Past & Present

I don’t know that I have a lot more to add than what Russell wrote here, but I like the way he described the challenge of describing something that is simultaneously happening in the past and present (in this case, a soccer replay):

This is normally dismissed as typical footballer ignorance but it’s better understood when you think of a footballer standing in front of a monitor talking you through the goal they’ve just scored. They’re describing something in the past, which also seems to be happening now, which they’ve never seen before. The past and the present are all mushed up – it’s bound to create an odd tense.

It reminds me a bit of this post from a few years ago where I described the different levels of exposure/knowledge on the web.

August 3, 2013

Borges and Sharknado

I really like this little post on “Borges and the Sharknado Problem.” The gist:

We can apply the Borgesian insight [why write a book when a short story is equally good for getting your point across] to the problem of Sharknado. Why make a two-hour movie called Sharknado when all you need is the idea of a movie called Sharknado? And perhaps, a two-minute trailer? And given that such a movie is not needed to convey the full brilliance of Sharknado – and it is, indeed, brilliant – why spend two hours watching it when it is, wastefully, made?

On Twitter my friend Ryan Catbird responded by pointing out that that’s what makes the Modern Seinfeld Twitter account so magical: They give you the plot in 140 characters and you can easily imagine the episode (and that’s really all you need).

July 30, 2013

Seeing Through Games

Clive Thompson, writing about finding, on Google Maps, the cruise ship that crashed in Italy last year (Maps link here), made a really interesting point about how we interpret strange visuals in the age of digital technology and video games:

I remember, back when the catastrophe first occurred, being struck by how uncanny — how almost CGI-like — the pictures of the ship appeared. It looks so wrong, lying there sideways in the shallow waters, that I had a sort of odd, disassociative moment that occurs to me with uncomfortable regularity these days: The picture looks like something I’ve seen in some dystopic video game, a sort of bleak matte-painting backdrop of the world gone wrong. (In the typically bifurcated moral nature of media, you could regard this either as a condemnation of video games — i.e. they’ve trained me to view real-life tragedy as unreal — or an example of their imaginative force: They’re a place you regularly encounter depictions of the terrible.) At any rate, I think what triggers this is the sheer immensity of the ship; it’s totally out of scale, as in that photo above, taken by Luca Di Ciaccio.

Having grown up playing video games, I definitely know the feeling. I do wonder, though, whether this is actually a new feeling or whether we could have said the same about feeling like something was a movie, back when film was still transforming how we saw the world. When I was in Hong Kong in December, for example, I felt like it was more a reflection of Blade Runner than anything else, for what it’s worth. Either way, though, it’s an interesting notion.

July 23, 2013