Thursday, January 28, 2010

I love the new iPad. If you're one of the four or five people in America who missed the news about Apple's latest device yesterday afternoon, stop reading this blog and visit Apple's website to check it out. As CEO Steve Jobs revealed yesterday, the new iPad is kind of like a scaled-up iPod Touch, or an over-sized iPhone without the phone. It will allow you to surf the web, catch up on e-mail, play music (and music videos), watch movies and television, and read books -- all on a full-color, 9.7-inch touch screen. It's thin and light and, like everything Apple, stylish.
But what caught my eye when I looked at the pictures of the iPad on the Apple site this morning were the images on the device's touch screen: they were pages from the New York Times! The iPad, I thought, could be "it." The Holy Grail. The device that finally allows old media, like my beloved New York Times, to fully embrace the electronic age.
I first encountered the Times when I was a freshman in college. Hard to believe now, but back in the late 1960s, people in Allentown, Pennsylvania were largely unfamiliar with America's finest newspaper. We had The Morning Call and the Evening Chronicle, and I'm sure a few of our neighbors subscribed to the Philadelphia Inquirer to get hometown sports coverage of the Phillies or the Eagles. But the New York Times? It might as well have been the Times of London.
So finding a copy of the Times on a coffee-stained table in Wesleyan's Downey House was a revelation. Front-line stories from Southeast Asia by great reporters like Sy Hersh, Harrison Salisbury, David Halberstam and Johnny Apple, and dispatches from the Civil Rights movement by the likes of Tom Wicker, Tony Lewis, and Roy Reed were eye-openers to me in 1967. The Op-Ed page, which first appeared in 1970 -- at the height of the Vietnam War -- has launched a million arguments and thoughtful conversations.
These many years later, the Times is still arriving on my doorstep every morning. But readers like me are a dying breed, as circulation numbers for virtually every print daily decline and laptops replace newspapers at America's breakfast tables. Like most papers, unfortunately, the Times got hoodwinked by the "information wants to be free" mantra of the early Internet and posted its content online free of charge, only to find that giving away expensively produced journalism was not a sustainable business model. With advertising revenue heading south, can the Times survive?
Recently, publisher Arthur Sulzberger and the Times executive team announced their intention to create a new "pay per view" business model for the paper's website. The details haven't been worked out yet, but it's said that the new plan will allow readers to access a limited amount of content for free, then charge heavy users for access beyond a minimum threshold. Well, good for them. I have no idea how this will work, but I know that if the Times doesn't find a way to generate revenue from the product it produces, it will, one day, cease to exist. (Which, in my view, would kill intelligent life in our country as surely as an asteroid strike).
But monetizing content is only part of the problem. Everyone reading this blog knows that information-consumers aren't waiting for the news to arrive on their doorsteps anymore. Instead, we access information on a regular basis from the web, round the clock, on our computers and our smart phones. (See previous post: "TMI?"). Traditional newspapers, then, need a new delivery system. Laptops are fine, but they aren't practical for reading the news on public transportation, or while sitting in a comfortable easy chair. And smart phones are wonderfully portable, but reading anything more than a short article is tough on the thumbs.
And that's where the new iPad comes in. This is the first device I've seen that looks like a perfect substitute for the traditional newspaper. Based on the photos on Apple's website, the Times will be able to display something similar to a front page on the iPad, giving readers the same expansive menu of possibilities they enjoy when they read the Times in print. But unlike the print edition, the iPad version will allow readers to click on slide shows and interactive features -- things they now enjoy only online. The iPad also has the Goldilocks quality of being "just right." It's not as big and clunky as a laptop, or as small as a smart phone, and it should fit nicely in a shoulder bag, a briefcase or a backpack. As a newspaper reading device, it should work fine at the breakfast table, on the subway, at a desk or in a comfortable chair.
I don't know if the iPad will end up saving The New York Times. But if it does, Arthur Sulzberger should send a case of very fine champagne to Steve Jobs.
Monday, January 25, 2010
TMI?
One of our local football teams, the New York Jets, made it to the AFC championship game this year and I devoted a little over three hours yesterday to watching them compete for the chance to play in the Super Bowl on February 7. The Indianapolis Colts prevailed, but the Jets played well. It was an afternoon well spent.
While watching the game, I also learned that I needed to send some documents to a colleague in Bucharest, Romania, for a project we're working on together; I got an e-mail from a friend in Majorca alerting me to an article of interest in London's Financial Times; and I saw that a former colleague from NBC News, now with PBS in Washington, had accepted my invitation to connect on the social/professional network, LinkedIn. All of this communication came my way via a few finger taps on my iPhone touch screen during time-outs in the Jets-Colts game.
This constant flood of information never stops. I woke up this morning to learn that my daughter in Australia loved the digital photos of her sister I'd e-mailed over the weekend, and that Amazon.com had some great new deals for me in consumer electronics. I checked my bank balance and the weather forecast before I had my first cup of coffee. I'm more connected, to more people and more sources of information, than I've ever been -- and I'm not even on Facebook.
If adults find themselves knee-deep in social media and digital information, you better believe it's a tsunami for kids. In a recent article in the New York Times, Tamar Lewin reported on a new study by the Kaiser Family Foundation that shows that "the average young American now spends practically every waking minute -- except for the time in school -- using a smart phone, computer, television or other electronic device" ("If Your Kids Are Awake, They're Probably Online," 1/20). According to the researchers at Kaiser, "Those ages 8 to 18 spend more than seven and a half hours a day with such devices, compared with less than six and a half hours five years ago, when the study was last conducted. And that does not count the hour and a half that youths spend texting, or the half-hour they talk on their cell phones."
None of this is news to parents who have tweens or teens at home, but it's remarkable data all the same. The new media experience may be forcing adults to adapt to a new way of communicating, but it's probably changing the entire life experience of young people in ways we can't yet measure.
The ways we use and interact with media have social consequences, of course, and not all of them are positive. Text messaging means fewer conversations with friends on the telephone, just as online banking means less chitchat with the local bank teller. As time goes on, it seems most of us find we're spending more and more time looking at an electronic screen instead of socializing with our friends or personally interacting with our colleagues.
In "Understanding Media," (required reading when I was an undergrad), Marshall McLuhan wrote that "the personal and social consequences of any medium -- this is, of any extension of ourselves -- result from the new scale that is introduced into our affairs by each extension of ourselves, or by any new technology." He summed this up neatly in the phrase, "The medium is the message."
McLuhan died in 1980, years before the original Macintosh computer, not to mention cell phones, laptops and e-books. But he wrote as if he saw it all coming. "One of the effects of living with electric information," he said, "is that we live habitually in a state of information overload. There's always more than you can cope with." And that seems to be one of the bigger problems of our time. We're living in an age of too much information ("TMI"), and it's coming at us way too fast.
We can always cut the cord, of course, as the Times reports some Facebook users did recently. Fingers permanently curled over their computer keys, these desperate folks said they'd become so addicted to the social networking site they'd stopped "living life." Similarly, many of us probably know one or two people who refuse to have mobile phones because they don't like the idea of being reachable -- and accountable -- anytime, day or night. Most of us, though, move along with the digital tide, trading in our cell phones for smart phones, and our old MacBooks for the latest MacBook Pros.
For grownups like me, this new electronic age means information and connections at my fingertips, and a chance to catalogue my thoughts, occasionally, on this blog. What it will mean for kids growing up in this new digital century is still anybody's guess.
Saturday, January 23, 2010
Beyond Flipper
There's something special about the relationship between humans and dolphins. Rescue narratives (stories about drowning humans saved by dolphins, or humans endangered by sharks and defended by dolphins) go back to the Greeks, and are still common among swimmers and divers today. And then there is the tantalizing prospect of interspecies communication. Dolphins, like their bigger cousins, whales, are intelligent creatures and have large, complex brains. Are they capable of "talking" to us, or understanding us when we talk to them?
Years ago, my buddy Jim and I decided to go scuba diving off Cozumel, Mexico during spring break. The island was far less developed than it is today, and I remember that the flight there was the most expensive part of the trip. The waters between the island and the mainland were pristine blue-green, and alive with coral formations and marine life of every description -- a scuba diver's paradise.
If you wanted to go diving, you showed up at the dock in the morning and clambered aboard one of a handful of dive boats fueling up in the harbor. You handed a few pesos to the captain and rummaged through the gear bin as he motored out into the channel. After a morning dive, you and your shipmates were treated to lunch on the beach -- fresh fish, cooked by the captain and his mate. And then it was back out to sea for a second dive on the reef.
On our second or third day, we were finishing lunch on the beach when we spotted a pod of bottlenose dolphins traversing the channel, just a couple hundred yards offshore. The captain helped us quickly board the dive boat and we made for a spot ahead of the pack. I didn't have time to put on my tank, so I jumped in with only my mask, snorkel and flippers.
I was immediately surrounded by what appeared to be dozens of dolphins, packed close to each other and to me. I kicked as hard as I could to keep up. Adults and juveniles swam up to me and swooped under me, often just an arm's length away. It was ten minutes of pure magic.
Decades have passed and I've never forgotten that experience. If you're ever lucky enough to make eye contact with a dolphin in the wild, you'll know what I mean. When one looks you in the eye, it's one intelligent being to another (though we probably shouldn't overestimate the diver).
All of which is a long preamble to a movie recommendation. National Geographic photographer Louie Psihoyos has made a documentary about the slaughter of dolphins in Japan that is painful to watch, but important to see. "The Cove" chronicles the efforts of Ric O'Barry and a collection of animal rights activists to force the Japanese to stop capturing and killing dolphins at a national park in Taiji, Japan. Capturing and killing, because some of the more photogenic dolphins are, in fact, captured and then shipped to marine theme parks for water shows. The rest are killed for their meat.
The documentary footage of the slaughter itself, taken surreptitiously with HD cameras hidden in fake rocks, makes for particularly difficult viewing. For seven or eight excruciating minutes, we see dolphins being driven into the killing cove, then stabbed to death with harpoons. One young dolphin, mortally wounded, struggles to take his final breath before disappearing in the bloody surf.
Ric O'Barry, the central character in "The Cove," is passionate about saving dolphins because he once trained them. In fact, he trained the original dolphins who shared star billing on the TV program "Flipper" back in the mid-1960s. When "Kathy," one of his favorite dolphins, died of what he believes was a broken heart, O'Barry decided that freeing captive dolphins would become his life's mission.
As skeptical media consumers, we're conditioned to expect that issues are never as one-sided as they appear in documentaries like this one. "The Cove" anticipates our skepticism by addressing the primary economic argument for the dolphin slaughter: the contribution of dolphin meat to the Japanese diet. The filmmakers sent samples of dolphin meat served to Japanese school children to a lab for analysis, and every sample contained dangerously high levels of mercury. It turns out that a dinner of dolphin meat is almost as bad for humans as for the dolphins themselves.
"The Cove" doesn't have a happy ending. Cetaceans -- dolphins and whales -- are still being slaughtered en masse, and not just by the Japanese. Norway, Iceland, and a number of smaller countries also allow whaling. But the success of activists to raise human consciousness about the killing of these amazing mammals offers hope that we may one day learn to see these animals in a different light... eyeball to eyeball, one intelligent being to another.
"The Cove" has been nominated for an Academy Award.
Wednesday, January 20, 2010
Carpe Diem on Bondi Beach
One of our daughters is enjoying a college semester abroad in Australia. This might seem like a boondoggle choice on her part, but it has deeper meaning in our family. My wife is Australian and both of our daughters have dual Australian and US citizenship. Unfortunately, though, we haven't made as many trips to Australia as we should have over the years, and as a result our girls' life experience isn't in sync with their DNA. Mallory's semester in Sydney, then, represents an effort on her part to make a meaningful connection to her Aussie side.
One of the advantages of being young is that you don't spend a lot of time navel-gazing about how brief and precious life is. You just live it. The downside is that it's easy, at that age, to take things for granted.
When I was in college I was privileged to be invited to spend a summer working on an archaeological dig in Italy. It was a fantastic experience, and I remember thinking that I would certainly want to make time in my adult life to do it again. But, of course, I never did. "Life is what happens to you while you're busy making other plans," John Lennon once observed. It's easy to get lost in the plan-making and overlook the living. So when most of us look back on our youthful experiences from the vantage point of adulthood, we often wish we'd appreciated them more at the time.
Our daughter will likely visit Australia many times in her life. She may even choose to live there. But she doesn't seem to be taking this semester abroad for granted. In a recent e-mail home, she wrote that she was determined to "carpe diem every day, make the most of this trip and say yes to all the amazing things this place has to offer."
Good on ya, Mal.
Monday, January 18, 2010
Thanks, Jack
I had a phone conversation with a fellow Wesleyan alum a couple of weeks back and we were reminiscing about the impact Wesleyan had on our lives. He said, "You know, I don't think I'd ever even met a black person before I arrived at Wesleyan." The same was true for me. Like my friend, I'd grown up in what was then an all-white town (Allentown, Pennsylvania), and gone to an all-white public high school.
Wesleyan was an eye-opener. There were 39 African American students in our freshman class -- almost 11% of the total -- and there was plenty of culture shock on both sides of the racial divide. Inner-city blacks felt socially uncomfortable with suburban whites and the privileged kids from elite prep schools. And we whites were, frankly, clueless. I'd like to be able to report that we all learned to live happily together during our four years at Wesleyan, but there was a lot of history to overcome. There was progress, but no Kumbaya. Still, I think the broadening of social perspective that the Wesleyan experience provided was as educational for me as any course I took there.
Under the leadership of a visionary Dean of Admissions, John C. Hoy, class of '55, Wesleyan was a pioneer in the new admissions practice that came to be called affirmative action. Before college, Jack Hoy attended an integrated high school in New York City and played on the school's football team. And from that experience, he developed a passionate commitment to equal opportunity, and a deep appreciation of racial diversity.
In 1964, the year Hoy took over as Dean of Admissions, there were only two black students enrolled at the University. He had his admissions staff scour America's top inner-city high schools for bright, ambitious African American students, and 14 were recruited for the next year's class. By the fall of 1966, there were 33 new black freshmen on Foss Hill.
In recent decades, most of the discussion of affirmative action has focused on the benefits minority students receive by having an opportunity to attend high-quality colleges and universities (see Obama, Barack, US President). But I think there's been an equal or greater benefit for students from the majority culture.
My older daughter is now a junior at a liberal arts college in Pennsylvania, and she and her fellow classmates -- black and white -- seem entirely comfortable in an integrated America. That's real progress, and many people deserve credit. But on this Martin Luther King holiday, I'll tip my cap to Jack Hoy.
Remembering Supertrain
I hadn't intended to write about my old network again so soon, but an article on the front page of the Times yesterday is worthy of a brief comment ("NBC's Slide From TV's Heights To Troubled Nightly Punch Line," by Tim Arango). As the headline suggests, Arango regards the Leno/Conan issue as a metaphor for a larger story about how NBC has lost its way. And the man responsible for all of this, in Arango's telling, is Jeff Zucker, the CEO of NBC Universal.
It might be worth remembering that Zucker was the young man who was appointed as Executive Producer of the TODAY Show when that franchise was struggling after the wobbly transition from Jane Pauley to Deborah Norville to Katie Couric. Good Morning America was in first place and TODAY needed a new direction. Zucker had the brains and the creativity to turn the show around and, under his leadership, TODAY regained the lead in morning television.
What's especially hard to swallow is that some of the potshots directed at Zucker in the Times come from people like Fred Silverman. Silverman, some of you may recall, once headed NBC programming. In the late '70s, when NBC's prime time shows were struggling, the network hired him away from ABC, where he had been hailed as the "man with the golden gut."
Since Silverman was quoted in Arango's piece as saying that the current situation at NBC is "a corporate embarrassment," it might be worth recalling Silverman's own track record as a programmer. His showcase new offering during his first season at NBC was a drama called "Supertrain." The budget was high, the Nielsen numbers were low, and the program disappeared after the '78-'79 season. That was my first year at NBC News, and I can still recall a joke from Johnny Carson's monologue:
"NBC's prime time shows are so bad," Johnny said, "that our crack network programming team has decided that they're going to start copying the successful shows on the other networks. Like, ABC has that hit show, "That's Incredible!" We're working up our own version. It's going to be called, "That's... kind of interesting."
Times are tough at NBC, but they've been tough before and the network has managed to turn things around. The bottom line here is that Jeff Zucker doesn't do "kind of interesting." He does bold. And I think he has the smarts to turn things around again at NBC. In my book, he's just two hit shows away from being hailed as "the comeback kid."
Saturday, January 16, 2010
All There is to Know
My daughter is a nervous wreck. She's a senior in high school and she just completed her college applications. It will be two or three months until she hears back. She's a bright girl, a gifted writer, a talented violinist and a serious student, but she'll be up against tough competition for admission to the kinds of schools she's targeting. None of them accepts even half of the students who apply, and many accept only one in five, or even one in ten. They are the cream of America's liberal arts colleges and universities.
What's interesting to me is why a liberal arts education continues to be so popular in our era of micro-specialization. No one seems to be hiring classics majors, anthropologists or art historians, yet freshman humanities courses emphasizing classical literature are still required at many liberal arts schools, and subjects like art history and anthropology remain popular majors. There may be diminishing financial rewards for "generalists," but our most prestigious institutions of higher learning continue to graduate them by the tens of thousands every spring. Why?
As a product of a liberal arts education, I'll suggest an answer. I think it's because intellectually curious human beings want to feel like they have a basic understanding of the world they live in, and their place in it. They go to liberal arts colleges because they're seeking encyclopedic knowledge. (Useful piece of trivia: the word encyclopedia comes from the Greek for "instruction in the circle of knowledge," which about sums up the agenda at most liberal arts schools).
OK, that's fine as far as it goes. But how do you make a living? A generation or so ago, just having a college diploma -- no matter the major -- almost guaranteed you a decent, white-collar job. No more. Now you need specific training for most jobs, especially those involving science and technology. So what good is that BA in English in our modern, high-tech world?
Well, for one thing, a liberal arts education is basic training for anyone who intends to do battle in the world of ideas. It helps to know some history if you intend to pursue a career in public service (like the history of North and South Korea, Governor Palin), or to have a basic understanding of world affairs and some facility with a foreign language if you intend to pursue a career in international business. But more than any practical application, a solid liberal arts education provides a foundation for life-long learning.
The sad fact is that I have probably forgotten at least half of the "facts" I learned in college (sorry, Wesleyan). But I think I'm much more knowledgeable today than I was when I graduated because I've pursued so many of the interests I first acquired there. The truth is, even if you never leave the library, you won't learn all you need to know in just four years. What a good liberal arts education does -- or should do -- is whet your appetite for continued learning.
I have a book by Alexander Coleman on my bookshelf called "All There is to Know." It's a selection of excerpts from the Eleventh Edition of the Encyclopedia Britannica, published in 1910. Hans Koning wrote in the New Yorker that Britannica's Eleventh was the last version of that encyclopedia to contain what its authors considered to be the sum of all human knowledge. The Great War and the decline of traditional empires put an end to the hubris that characterized the Age of Reason. But still, the idea that just a century ago you could buy a complete education in just 29 finely bound volumes is something to think about.
I don't know what Britannica's Eleventh cost in 1910, but I'm sure it was a lot less than four years at Brown, Hamilton, Wesleyan or Trinity costs today. And for all that cash, you still don't get all there is to know.
Friday, January 15, 2010
These Are the Good Old Days
TV news veterans love to talk about the good old days when the big three networks spent real money on news coverage and looked to their news divisions to bring glory and prestige to the brand. It was a terrific time to work in network news. OK, it was rare for an exec in New York to actually say, "Damn the expense. Just hire the jet and get there first!" But it did happen. TV news was a very competitive business, and quality and speed -- rather than profits -- were the metrics by which broadcast journalism was judged.
The business has changed dramatically in the past twenty years as cost-conscious conglomerates have gobbled up the big three networks, and audiences for the evening news shows, diminished by growing competition from cable, have shrunk to a fraction of their former size. Nostalgic news veterans can lament the passing of a bygone era, but the fact is that the news business has always been, well, a business.
Simply put, network television news is in the business of delivering an audience to its advertisers, just like the network's prime time shows. Before cable, when prime time ad revenues were reliably strong, the networks could, perhaps, afford to view their news operations as loss leaders. But those days ended long ago. Today, network news has to pay its own way if it's going to survive.
Which makes what we've seen in the past forty-eight hours remarkable. The Haitian earthquake rang a bell in the newsrooms of the old "big three" broadcast networks and they responded like it was 1980, not 2010. I spent fifteen great years with NBC News, and I couldn't have been prouder of my old network than I was last night. For the second day in a row, NBC expanded its show to a full hour and filled it with moving (e.g., Ann Curry) and informative (e.g., Andrea Mitchell) reports that brought this terrible and complex story home. Brian Williams did a terrific job anchoring the coverage from Port-au-Prince, and Lester Holt was a solid and reliable mainstay for the network's coverage in New York. I surfed CBS's and ABC's shows and they, too, appeared to have all hands on deck.
It's hard to imagine that advertising dollars will pay for the expensive coverage we're seeing in Haiti this week. The network news audiences may be larger than usual, but the costs are going to be through the roof. One can only hope that these broadcasts will remind viewers of the value the network news shows provide and buoy audiences during future, quieter times. Who knows, they may even help develop an audience for the evening news shows among a new generation of viewers. (It would be nice to see some ads for Ford Mustangs and Apple computers).
Network news divisions have been reading their obituaries for years. This week, at least, it's clear that news of their death is premature.
Thursday, January 14, 2010
Late Night Madness
Where to begin?
Whatever you think about Conan or Jay, you have to wonder what was going through the heads of NBC's PR team during the planning process for a revised late night schedule.
Let's recap. As everyone knows by now, NBC's affiliates put pressure on the network to "fix" the late night problem. Leno was not delivering the lead-in audience the affiliates needed to support their late news shows (which are typically big money-makers at local stations), and because of NBC's bold 10 p.m. experiment, the affiliates were stuck with Jay's lead-in five nights a week. NBC management, seeing weak numbers from Jay's show and weak numbers from the Tonight Show with Conan O'Brien, decided to reverse course and restore Jay to his old perch at 11:35 and push Conan to 12:05. Conan balked, the blogosphere lit up in protest, and NBC is now wearing a 12-egg omelet on its face.
Jeff Zucker has been pilloried for his decision to move Leno to 10 p.m., but I think that's a bum rap (personal disclosure: I know Zucker from my NBC News days; I like him and I respect him). Zucker's decision didn't turn out well, but it's worth remembering that nothing seemed to be working for NBC at 10 p.m., and betting on a proven ratings winner like Leno probably seemed like a gamble worth taking. (I was less supportive of the decision to turn the Tonight Show over to Conan O'Brien -- I always thought his appeal was way too narrow for Tonight's big-tent audience).
The people whose feet should be held to the fire are the PR department folks who let this story get away from them. Consider: this was a story that should have been carefully planned, yet it unfolded as if it were unexpected, like a natural disaster. NBC could have made the decisions they made, planned a thoughtful, professional roll-out of the news, and emerged -- whether the decisions were good or bad -- with the network's image intact. None of that happened.
What NBC needed was a headline strategy. The network needed to figure out the headline they wanted to read on the New York Times website in the hours following their announcement, then devise a strategy to achieve that goal. They needed to reverse-engineer the desired outcome and plan, step by step, to make it a reality. NBC's PR team would probably say that they would have been happy to do exactly that if the story hadn't leaked and taken on a life of its own. Sorry, but if Apple can keep a lid on a multi-year development project and surprise the world with the iPhone, NBC should be able to keep the lid on its plans for late night comedy, at least for a week or two.
Make no mistake, this is not just an "image" problem for NBC. O'Brien's contract provides him with a very lucrative payoff if he and the network can't come to terms, and the viewer backlash over NBC's decision to push him off his perch at the Tonight Show will certainly work to his advantage in the negotiations ahead. NBC's PR team must be kicking themselves. What might once have been achieved amicably, behind closed doors, is now playing out publicly and to the network's disadvantage. It's ugly, and it's going to be costly.
Wednesday, January 13, 2010
Aftershocks
The tragedy in Haiti is still unfolding as I write this and the death toll is already projected to reach into the thousands. The major newspapers and all of the network anchors are en route, including Diane Sawyer, who is parachuting in from Afghanistan. The earthquake and its aftermath promise to be a major story for days, and possibly weeks to come. What kind of a story the journalists tell will be decided, in large part, in Washington.
The role that the United States Government should play in disaster relief is pretty straightforward. Lives hang in the balance and our government should respond immediately with food, medicine, and, if necessary, security police to keep order. Yes, France should help, too. Haiti was once a French colony. But since the declaration of the Monroe Doctrine almost two centuries ago, the United States has taken a proprietary position on security issues in the Americas, and implicit is the notion that when disaster strikes and urgent assistance is needed, the US should lead the charge.
But then what? What role should the US play in Haiti after the wounds are bandaged and the rubble is cleared?
The instincts of the Obama administration, from the president on down, will probably lean strongly in the direction of nation re-building. It's President Obama's chance to use US power for good, and to demonstrate that he's the un-Bush (see Katrina, Air Force One flyover, "Heck of a job," etc.). But Haiti has long been a problem that's defied easy solution. The fact that it's the poorest nation in the hemisphere is repeated so often you'd expect to see it on Haitian bumper stickers. Living conditions are atrocious. Political corruption is endemic.

Ironically, the tragedy of this massive earthquake could, at last, provide an opportunity for reformers in Port-au-Prince -- with Washington's help -- to finally lift the Haitian people out of poverty and create a new and better country. It would be the longest of long shots, but it could be done. Social entrepreneurs from around the world could be enlisted to come up with new housing and transportation solutions. The US and France could cooperate on restoring basic services like water and electricity. And Dr. Paul Farmer could be put in charge of building a new medical infrastructure. (If you haven't read Tracy Kidder's "Mountains Beyond Mountains," now is the time).

There will be pushback, of course. As the seemingly interminable wars in Iraq and Afghanistan drag on, the American people will be in no mood for more nation building, no matter how needed or how close to home. And the president himself might have a hard time justifying major expenditures for Haiti when so many of our fellow citizens are hurting here at home. (Someone is bound to say that we should finish rebuilding New Orleans before we send American tax dollars to Port-au-Prince).
The timing is terrible. But in the end, there is no good time for an earthquake, and there is no time to respond but now. It will require the American people to sacrifice for others for reasons other than national security or political gain. But that's how great nations define their greatness. And as we like to say in my neighborhood, what are neighbors for?
Tuesday, January 12, 2010
Coming Clean and Blowing Smoke
Mark McGwire's PR team got high marks from the New York Times for the way they handled his revelation that he'd used steroids during his heyday as a home run hitter for the St. Louis Cardinals. Ari Fleischer Sports Communications managed to bundle the distribution of the press release and McGwire's key media interviews into one day -- one afternoon, really -- thereby "taking ownership of the story."
This is Crisis Communications 101, and McGwire will probably get kudos from some people for (finally) coming clean about what he did and when he did it. But in the end, will any of this really help his reputation? Will it rehabilitate him in the eyes of baseball fans? Help him get into the Hall of Fame?
I doubt it. The problem McGwire faces is that his steroid use, in the eyes of most fans, renders his historic achievements null and void. It's no use arguing, as McGwire did to Bob Costas in his centerpiece interview, that his native "hand-eye coordination" and "natural bat speed" would have allowed him to hit all of those home runs even without the juice. The millions of fans who wondered about his suddenly massive, muscular physique aren't buying it.
What McGwire needed to do to salvage his reputation and his self-respect was to publicly renounce his title as the Home Run King and voluntarily erase his name from the record books. By doing that, he would have put the reputation of his sport ahead of his own. He'd be a hero again, at least for a day. No doubt he'd still be denied a place in Cooperstown, but he would have earned respect for being a "stand-up guy."
McGwire's PR team should have told him: you can't come clean and blow smoke.