
Facebook pays for content. Therefore, it’s a media company

Recently, Facebook COO Sheryl Sandberg was asked in an interview whether Facebook is a media company. “At our heart we’re a tech company,” Sandberg said. “We hire engineers. We don’t hire reporters. No one is a journalist. We don’t cover the news.”

Well, that’s not exactly true. In this video, I explain why.

How I leveraged my journalism skills for a career in content marketing


From time to time I’ll be speaking with someone I’ve never met, and the “What do you do?” question comes up. I’ve always struggled to answer it. If I tell the person I’m a journalist, then invariably the next question is, “What publication do you work for?” Simply saying I work in marketing could mean a variety of things depending on how familiar the person is with that industry. Though I still haven’t thought up a more concise way to sum up my career, I now at least have an article I can point people to that discusses it at length. MediaShift asked me to write a first-person essay on how I leveraged my journalism skills for a career in content marketing. Here is the end result:

Why Freelance Journalists are Shifting Their Careers to Content Marketing

Silicon Valley once steered clear of original content. What changed?


By the time news outlets began reporting that Apple is actively negotiating with Hollywood executives to produce exclusive programming for its fledgling television platform, none of us seemed surprised that a major tech company would invest so much money in original content.

If anything, Apple is late to the party. In 2011, Netflix, which until then had been just a tech platform that allowed one to stream already-released movies and old seasons of television shows, plopped hundreds of millions of dollars into the creation of premium shows, greenlighting them before even seeing a pilot. Amazon wasn’t far behind, launching a bevy of shows to mixed reviews. In 2012, YouTube shelled out $100 million to both lure established media companies onto its platform and allow its already-existing stars to up their games. These days, not a week goes by without a major tech company announcing a major content play, whether it’s Yahoo’s resurrection of the show Community or Facebook’s offering of huge advances to YouTube stars in order to entice them onto its native video platform. Twitter recently attempted to purchase the millennial news site Mic, and prominent venture capitalists have bought huge stakes in companies like BuzzFeed, Vice, and Vox, valuing these news outlets in the billions of dollars.

Viewing all this activity, it’s hard to believe that, a mere decade ago, the tech sector considered original content anathema to everything it stood for, a vestigial hangover from the days when the barrier to entry for content production and distribution was relatively high and therefore lucrative.

Circa 2007–2008, creating original content seemed a dying profession. The music industry had been completely eviscerated in the wake of Napster and other file-sharing programs. Newspapers were well into their decline, already kneecapped by Craigslist and facing a print advertising exodus. Magazines weren’t far behind them. The book industry, while not exactly suffering, wasn’t thriving either, with most sales coalescing into a handful of conglomerates who were already bracing themselves to have their asses handed to them by Amazon. The television industry seemed relatively sturdy, but most assumed its day of reckoning would eventually come.

This is when we saw the rise of platforms fueled primarily by user-generated content: first Myspace, and later Facebook, YouTube, and Twitter. Suffering media companies looked at Keyboard Cat and assumed this was the future of content, and Silicon Valley didn’t seem to disagree. Original content was expensive and difficult to scale effectively; why hire 60 journalists to create content when you could spend that money on 30 engineers who would build a platform on which millions of users would generate content for free?

So what changed? Why are we seeing the sudden emphasis on premium programming in a world where everyone with a GoPro seems willing to upload their videos for no payment?

Well, it turns out that original content actually is scalable, particularly when it’s hosted on the right tech platform. Netflix announced in July that it had reached 65 million subscribers, a number that would have been difficult to attain when it was merely licensing reruns, especially as other low-cost streaming services have entered the market. And sure, it’s possible that your amateur cat video could hit the viral stratosphere, but most don’t, whereas YouTube stars can guarantee millions of views for each video posted. The majority of BuzzFeed listicles reach at least a million views, which means your average BuzzFeed staffer can reach an audience that’s similar in size to The Daily Show’s.

And though viewers have flocked to user-generated content, advertisers still prefer premium programming, especially if it attracts hard-to-reach demographics. The critically-acclaimed USA Network show Mr. Robot only attracts about 3 million viewers per episode, a mediocre turnout when compared to the network’s other hit shows, but it’s having to beat away advertisers with a stick. “It’s a hot property right now,” network president Chris McCumber told New York Magazine. “We have more demand than we can handle for Mr. Robot, and it’s bringing in new advertisers.” And with brands increasingly shifting budgets toward native advertising and away from display, it suddenly behooves tech platforms to have in-house content expertise.


Finally, tech companies have discovered that exclusive content is a great way to lock users into a platform. A decade ago, there were only a handful of social networks that had the millions of users needed to effectively scale user-generated content. Now consider the number of platforms today that have at least 50 million active users: Facebook, Twitter, Google+, LinkedIn, Pinterest, Instagram, Snapchat, Tumblr, WhatsApp, Foursquare, YouTube, Flipboard. I’m likely just scratching the surface.

We now have dozens of networks competing for our attention, and our loyalty to any one platform is tenuous at best. Exclusive content, even if it makes up a relatively small percentage of the content posted to a platform, gives us that much more incentive to choose one platform over another. Medium, the blogging platform launched by Twitter co-founder Ev Williams, employed this strategy well when it hired top-tier freelance journalists to write on its network before opening it up to the masses (I call this the “mullet strategy”). Of course, nobody has capitalized on this approach better than Netflix, which is now spending north of $700 million on content you can’t watch without a Netflix subscription.

The question now is how traditional media companies, many of which have been producing original content for decades, will respond. Already we’re starting to see seismic shifts in the media landscape, whether it’s HBO launching a standalone app or magazines like Forbes transforming themselves into platforms. News companies are also inking content distribution deals on platforms like Facebook and Flipboard with promises of revenue sharing.

Perhaps the late David Carr was right when he said, in 2012, that “big news is still the killer app,” by which he meant original content. Given how much we keep hearing about the current “golden age of television” and the rise of millennial-focused news companies that are reaching billion dollar valuations, I can’t help but agree. A new dawn is upon us, and if you’re a content producer like myself, then take a few moments to rejoice.


Like my writing? Then you should hire me to create awesome content for you.

Will the FTC soon rain on native advertising’s parade?


In late March, CUNY journalism professor Jeff Jarvis reached into his own pocket and paid for a Google Consumer Survey, the results of which should concern anyone who works within the journalism or advertising industry. He did so after browsing the viral aggregation site Upworthy and noticing a minuscule “promoted” tag in the right-hand corner of one of its posts. You and I know what that word means — that it’s a form of native advertising, which is paid-for content that appears alongside and resembles editorial content — but does the average news consumer?

No. In fact, a majority of those surveyed, 56 percent, had no idea that any money had changed hands for the post’s existence (most thought it was some sort of recommendation, either by algorithm or from the site’s editors). “Wouldn’t it be a helluvalot simpler just to call it an ad?” Jarvis asked rhetorically. “Why don’t they? Why doesn’t any publisher of such promoted/native/sponsored/brand content just call it an ad? Because busy people don’t want to click on ads; if the web proves nothing else, it proves that. So they—publisher and marketer, united—want to fool the reader into clicking.”

Jarvis isn’t the first to notice this obfuscation. Last year, Augie Ray pointed to several studies that shed light on the opacity of native advertising, including an Interactive Advertising Bureau survey that found only 41 percent of the general news audience was able to identify native advertising and a 2013 study revealing that over 50 percent of respondents “didn’t know what the word ‘sponsored’ actually meant.”

These studies come to us as we continue to contemplate whether native advertising is the news industry’s “savior,” here to rescue news orgs from ever-diminishing display ad rates. Over the past few years nearly every major news company has launched an in-house “creative agency” that works directly with sponsors to craft promotional content it thinks will appeal to the publication’s readership. Yahoo CEO Marissa Mayer has reportedly staked the future of her company on native advertising, and the format now makes up 10 percent of the New York Times’s digital advertising revenue.

Thus far, the industry has galloped into this new frontier, treating it as a sort of Wild West where the chief concern is delivering value to the paying advertiser, even if it’s to the detriment of the consumer. As AdAge reported last year, the New York Times “shrunk the labels that distinguish articles bought by advertisers from articles generated in its newsroom and made the language in the labels less explicit,” all because “several marketers have bristled at all the labeling, suggesting it turned away readers before they had a chance to judge the content based on its quality.”

But just as the Wild West eventually reached a saturation point that required more stringent law and order, native advertising, in its near-universal application, may soon be facing its own reckoning, in this case from the Federal Trade Commission.

Many mistakenly believe that a piece of advertising meets FTC requirements as long as there’s some form of disclosure, but that’s not true. In fact, the burden is much higher: the disclosure must be sufficient for the average consumer to recognize the content as paid, and that recognition must occur before they consume it. As I’ve documented previously, the agency has a long history of stepping in and ruling a disclosure insufficient, and it sometimes offers specific guidelines on how the disclosure should be presented. Barry Cutler, who was director of the FTC’s Bureau of Consumer Protection from 1990 to 1993, recounted to me last year how the agency cracked down on infomercials in the ’80s and ’90s that purported to show man-on-the-street interviews with satisfied customers and scientists in lab coats endorsing products. As Augie Ray explained, the FTC requires infomercials to include the words “‘THE PROGRAM YOU ARE WATCHING IS A PAID ADVERTISEMENT FOR [NAME OF PRODUCT]'” at their start.

Over the last decade, the FTC has slowly waded into internet advertising, issuing several guidelines ranging from how a disclosure should be presented in a sponsored tweet to the requirement that bloggers disclose when they’ve received free products in exchange for reviews.

But so far it has remained reluctant to issue any firm guidelines for native advertising. A workshop conducted in late 2013 with several major news orgs left the agency “with no clear direction about how to police” the format. Considering that one of its earliest cases, in 1917, was against a vacuum cleaner company that placed misleading newspaper ads, the agency certainly has precedent on its side when wading into such issues, but the 2013 meeting merely led an FTC representative to conclude that “this has raised more questions than it answered.”


Well, it may need to answer those questions sooner rather than later, given that in recent months we’ve seen strong evidence that native advertising is not only making it difficult for consumers to differentiate between editorial and sponsorships, it’s also eroding the wall between the editorial and business divisions of news companies.

In October, a former Vice editor published emails sent from higher-ups in which he, the editor, was repeatedly reprimanded for publishing stories critical of Vice’s native ad partners. And then recently, BuzzFeed, considered to have one of the most successful native ad models in the industry, came under fire for removing a post critical of Dove, a BuzzFeed sponsor. Though editors initially argued the post was removed for other reasons, an internal investigation revealed several instances in which editorial staffers were pressured by the business staff into removing posts.

Is it possible that this could have happened had the sponsors simply purchased standard display ads? Sure. But it’s not difficult to see how creating sponsored content that so closely resembles editorial content erodes the differentiation not only in consumers’ eyes, but in the eyes of newspaper executives as well. And with these companies facing increasing pressure to make up for lost print advertising dollars, the erosion of that wall may prove too tempting to resist. While no industry welcomes the oversight and enforcement of the FTC, I can’t help but wonder if many editors and reporters would breathe a sigh of relief if the agency suddenly stepped in and ensured that their journalism would continue to retain integrity in a world where marketers are concerned with anything but.



How Obama forever changed the presidential interview


It’s hard to pinpoint why Bill Clinton’s June 3, 1992 appearance on The Arsenio Hall Show was such a seminal moment in the history of presidential interviews. After all, he wasn’t the first presidential candidate to agree to an interview on a late night show. In 1960, then-Senator John F. Kennedy appeared on Jack Paar’s Tonight Show, and in 1975 soon-to-be presidential candidate Ronald Reagan sat down for an interview on The Tonight Show with Johnny Carson. Yet Clinton’s decision to speak to and later play the saxophone for Hall’s audience marked the beginning of something new. Perhaps it was the generational hand-wringing, led most prominently by the traditional media stalwarts who, even then, felt vaguely threatened by the ease with which they had been bypassed completely. “In the long run, there is no substitute for discussing the tough issues they’re going to have to handle as President,” Tim Russert told the New York Times at the time. “If people really thought you could get elected by playing the saxophone, there would be a lot more musicians running for President.”

That hand-wringing would only increase as this type of presidential interview became more commonplace. Most interesting to me is how White House outreach has mirrored the larger unbundling of the media. It used to be that a small handful of networks dominated the television airwaves, thereby necessitating that any messaging travel through a paucity of channels, but the rise of the cable bundle through the ’80s and ’90s set the stage for then-presidential candidate John Kerry’s 2004 appearance on The Daily Show, a program on a cable network that existed at the very end of the dial. “A lot of television viewers — more, quite frankly, than I’m comfortable with — get their news from the Comedy Channel on a program called The Daily Show,” complained Nightline anchor Ted Koppel at the time.

Before we flash forward to the rise of social media and the current presidency, I want to take a few steps back to the year 2000, when the Clinton White House shot a humorous video entitled “Final Days.” Though it was created for and first shown at the White House Correspondents Dinner, the video was among the very first White House-produced videos to gain traction online. Even more interesting, I think, is how well it captures the media landscape Barack Obama would later confront head-on using new media tools. A lame-duck Clinton, with only eight months left in office, finds he’s already entered an era of obsolescence. He lectures from a podium, only for the camera to zoom out and reveal that the press room is entirely abandoned, save for Helen Thomas. Tim Russert passes on interviewing Clinton for Meet the Press. The White House Press Corps snickered while watching the video during their dinner, but it embodied the long-standing criticism that their coverage amounted to little more than blood sport, with little interest in anything beyond the horse-race politics then gripping the nation.

If June 3, 1992 represented one turning point, then January 30, 2012 epitomized the next. That was when President Obama and the White House collaborated with Google on a live Google Hangout with five pre-selected participants. Though it could easily have succumbed to the inauthenticity of a manufactured town-hall discussion — the lame ones you see as presidential candidates tour the country — I was struck by the genuine substance of the event. These were real Americans from diverse backgrounds, many of them still reeling from the Great Recession and unafraid of holding Obama’s feet to the fire.


Though Obama’s team had already received plenty of praise for its digital outreach efforts up until that point, this Google Hangout unleashed a new era of diversified messaging catering to a fragmented media landscape that has made mass outreach near impossible. “After the midterm elections, the President instructed us to double down our efforts, to try to get more innovative and more aggressive,” Daniel Pfeiffer, Obama’s senior advisor, told Steven Levy recently. This was around the time you saw the president begin appearing for Reddit AMAs and in online-only outlets. If Clinton’s Arsenio Hall appearance was controversial, imagine the outcry when Obama sat down with Zach Galifianakis for Between Two Ferns. “The President shouldn’t be wasting time on a parody interview,” tweeted Republican Congressman Randy Weber. “He should be focusing on the current state of affairs across the globe.”


But, as Pfeiffer explained in his interview, Obama had no choice. “This disaggregation of the media is very challenging for a White House,” he explained. “You can no longer just have a nationally televised address and speak to 150 million people. So you have to work 15, 20, 30 times harder than previous presidents to have the same impact.”

Then came Obama’s January sit-down with three YouTube celebrities, one of whom was a 50-something comedian who had once taken a bath in Froot Loops. If the media had shown any kind of restraint in the past, now it was open season.


In a postmortem written for Medium, Hank Green, one of the aforementioned YouTube stars, pinpointed the genuine fear that stemmed from many of these criticisms:

Legacy media isn’t mocking us because we aren’t a legitimate source of information; they’re mocking us because they’re terrified. Their legitimacy came from the fact that they have access to distribution channels and that they get to be in the White House press pool because of some long-ago established procedures that assumed they would use that power in the public interest. In reality, those things are becoming less and less important and less and less true. Distribution is free to anyone with a cell phone and the legitimacy of cable news sounds to me like an oxymoron. The median-aged CNN viewer is 60. For Fox, it’s 68.

For every Fox News host openly criticizing Obama for shooting a selfie for BuzzFeed or releasing a House of Cards-style video on April Fools, there’s a group of Fox News executives huddled in a conference room reviewing Nielsen charts and strategizing how they can recapture Millennial viewers. The incongruity of these two scenarios shows they’re no closer to answering that question than they were when Obama took office.

The debate over whether these cable stalwarts will ever answer this question is now moot, because the White House has already moved on from the traditional media structures to which it formerly adhered and entered a fray from which it will never return. The question now is whether traditional media outlets follow Obama and future presidents into that fray or stand on the outside, forever looking in. Those who consider themselves “above” such shenanigans shouldn’t be surprised when they’re given less and less access, especially as their core audience begins to die off. The white men in tailored suits have had their moment. Bring on the green lipstick and Froot Loops.



The perils of hiring “name brand” journalists

Ta-Nehisi Coates


In my interview last month with Jake Swearingen, social media editor at The Atlantic, Swearingen’s assessment of Twitter and its role in driving real traffic to news stories could only be considered bearish. “Twitter doesn’t drive a lot of referral traffic, and it doesn’t drive a lot of referral traffic for us,” he told me. “I think if we were just looking at referral traffic, we could stop posting on Twitter all together and we would take a traffic hit, but it wouldn’t be a significant one.” He had one exception to this declaration, however. “Ta-Nehisi Coates, pretty much anything he writes is going to do well, and where it does really well is on Twitter. Because when he says ‘I have a story,’ it’s something people pass around really quickly.”

In his several years as a senior editor and blogger at The Atlantic, Coates has benefited greatly from the institutional credibility the venerable magazine lent him and has successfully built his own devoted audience. As described in a great CJR profile of him, Coates has amassed a small army of intelligent, thoughtful commenters, the kind of community every editor wishes for as he gazes at the racist bile and detritus that floods nearly every article comment section on the internet. This loyal audience has made Coates extremely valuable to The Atlantic; in fact, a case could be made that he’s the magazine’s most valuable employee and the person whose departure would cause the most damage. That’s because the fans who follow Coates and click dutifully on his Twitter links do so not because he writes for The Atlantic, but because the content was written by Coates. At this point he could pick up and leave, bringing the majority of those readers with him.

If there’s one lesson to be learned from the Brian Williams fiasco still plaguing NBC News, it’s that you shouldn’t put all your eggs in a single media basket. What might have been an otherwise routine series of disciplinary actions taken against a TV journalist blew up into a network-wide crisis simply because NBC had spent the better part of a decade building a brand that rested upon a single person: Williams.

In a recent behind-the-scenes tell-all published in Vanity Fair, we learn that this over-reliance on the personal brand was compounded after Comcast’s purchase of NBC in 2010. The new raft of executives Comcast brought in “time and again proved unable to rein in the news division’s high-priced talent.” As one anonymous source put it, “You have kids? Well, if you let them, they’ll have ice cream every night. Same thing in TV. If you let the people on air do what they want, whenever they want, this is what happens.” Once you transfer too much power to a few on-air personalities, then each acts as a keystone, a central foundation, the removal of which causes the entire apparatus to tumble down.

NBC can’t be entirely blamed for this strategy, however. As Frank Rich lays out in a brilliant history of the network news anchor, this personality-driven approach to TV news has its roots in the mid-20th century, when networks eschewed journalistic pedigree and instead focused on hiring square-chinned entertainers with charm and good looks.

The network anchor’s roots are not in journalism but in the native cultural tradition apotheosized by L. Frank Baum. Like the Wizard of Oz (as executive-produced by Professor Marvel), anchors have often been fronts for those pulling the strings behind the curtain: governments and sponsors, not to mention those who actually do the work of reporting the news. With their larger-than-life heads looming into our living rooms, the anchors have been brilliant at selling the conceit that a resonant voice, an avuncular temperament, a glitzy, thronelike set, and the illusion of omniscience could augment the audience’s brains, hearts, and “courage” (at one point, a Dan Rather sign-off) as it tries to navigate a treacherous world. Just don’t look behind the curtain. Many of the charges leveled against Williams for conduct unbecoming an anchorman could be made against his predecessors too.

For most of print journalism’s history, the medium has avoided this personality-based approach. Sure, every reporter has a byline, and she is expected to gain credibility within her beat, especially among sources, but it was somewhat rare for a journalist to gain anything approaching celebrity status unless she published a book.


All that has changed, however, with the rise of the internet and cable news. Beltway reporters now clamor to get invited on as on-air commentators for MSNBC, CNN, and Fox News, and in some cases their employers task PR staff to actively pitch these reporters to cable news producers. At the same time, every aspiring reporter is now expected to be building out a “personal brand” before even graduating college. In addition to the standard byline, most media outlets also include a small bio, along with a photo and link to the journalist’s Twitter account, at the bottom of every story. We’re now seeing the breathless coverage of reporter job hopping — Joe Weisenthal decamping Business Insider to join Bloomberg, Felix Salmon leaving Reuters to head up Fusion — that was typically reserved for celebrities in tabloid mags. A new generation of media-focused publications, from Capital New York to Digiday, has arisen to cover the inside baseball of these rising journo celebrities.

As I wrote last week, we’ve seen an increasing trend of writers and other artists finding ways to dispense with middlemen, and media companies risk becoming nothing but dumb pipes through which readers flow to the real brands they’re seeking out: the journalists themselves. For the most part, news orgs are actively encouraging this personal brand building. One of the few exceptions is The Economist, which not only famously eschews bylines entirely, but also has an active policy of not even linking to outside sources. It’s hoarding its readers, lest any of them decamp to a competing publication. With its circulation at an all-time high, its strategy has seemed especially prescient.

While The Economist’s approach might be a little extreme, it’s worth studying as media companies struggle to define their place in a world where the majority of traffic is now driven through side doors like Facebook and Google. How many mini versions of Brian Williams, upon whom publications’ futures are increasingly reliant, are being unleashed upon the world, and how can investment in them continue to produce long-term value? The Atlantic was forced to confront this problem head-on in 2011 when Andrew Sullivan announced he was leaving the magazine to blog for the Daily Beast. At the time, Sullivan was responsible for up to 75 percent of The Atlantic’s traffic and had been widely credited for placing the magazine on the path to online profitability. Luckily, it was able to adjust to the loss and continue to increase its web dominance. Still, I can’t help but wonder if Atlantic executives, after facing the mass exodus of Sullivan’s audience, view Coates’s own success building a readership with a certain level of wariness. After all, how can they read about the comings and goings announced each week in Capital New York and not recognize that Coates — and his audience — are just one job offer away from departing?



Image via The Lavin Agency

What Andrew Sullivan taught us about paywalls and independent journalism


In a video-recorded interview with the Nieman Foundation in 2013, Andrew Sullivan recounted how, in 2000, he received the “light bulb idea” to start his own blog. He was in England, traveling from London to Oxford, and while at the train station he happened to see a stand with copies of the Evening Standard, which is an afternoon paper. “And I just realized that journalism has always produced material around the clock. Why don’t I just start writing at different times and provide the readers with the kind of journalistic service that the London papers are doing?”

So began an era. Over the next decade and a half, Sullivan would gradually build a massive audience, helped in part by having his blog syndicated on major news websites like Time, The Atlantic, and the Daily Beast. In 2013, he announced that he and a small editorial team would launch a standalone, independent news site, generating revenue by way of subscriptions. His theory was that he had enough dedicated readers willing to pay for his content, and he was correct. The site generated about $1 million a year in revenue, enough to sustain him and his staff.

Then, this week, Sullivan shocked the media world by announcing he would cease daily blogging, citing the enormous pressure of producing round-the-clock content and a desire to write longform essays and books. The move produced no shortage of think pieces not only about Sullivan’s impact on media and journalism, but also about whether his model proved that independent journalism can be a sustainable business. While I agree that there are valuable lessons to be drawn from his experiment, they come with several caveats. Sullivan’s endeavor, while successful, was in many ways unique and could not necessarily be emulated by someone starting out in journalism today.

So here are some lessons we can draw from Sullivan’s two-year stint as an entrepreneurial blogger:

Establishing an online brand takes time

Andrew Sullivan didn’t start his blog from scratch and suddenly begin raking in money. Not only had he been blogging for 14 years, but he had been an established journalist for years before that, at one point serving as an editor at The New Republic. And during those 14 years of blogging, he hosted his blog on the websites of major journalistic institutions, which not only endowed him with more legitimacy but also helped attract readers. Very few writers on the web have had that kind of prolonged exposure, so the launch of any new website is unlikely to be met with an instant audience, much less one willing to pay to read your content.

Cut out as many middlemen as possible

Part of the reason Sullivan’s venture was successful was that it had little overhead, and he knew from the very beginning that he would need to cut out as many middlemen as possible if he wanted it to work. Generating a million dollars a year in revenue sounds impressive, but that number gets chipped away pretty quickly the more people you need to hire. Sullivan wasn’t bogged down by the institutional costs that plague many legacy media outlets — he basically just needed a computer and somewhere to host his website, with the rest of his overhead going to his editorial staff. He didn’t have to hire an ad sales team or pay for a printing press or rent trucks to deliver papers to newsstands. I’m not even sure he needed to rent office space; I’m guessing most of his staff worked out of their own homes.

Paywalls can work, but…

Yes, Sullivan and a handful of other news outlets have made digital paywalls work, but it continues to be a tricky endeavor. It’s now commonly accepted that if you’re going to have a paywall, it needs to be a leaky one, allowing visitors to read a certain amount of content before they have to pay up. And even with this strategy there seems to be a ceiling. The New York Times, which has one of the most successful online paywalls, has begun to plateau at around 900,000 subscribers. That’s impressive, but it means nearly everyone else is likely to see subscription numbers far south of that. Even at his most successful, Sullivan was only able to amass 30,000 or so paying subscribers.

Also, while it’s easy to get people to try out a subscription, getting them to renew is much harder. Sullivan saw a renewal rate of about 83 percent. It seems a paywall carries a certain novelty: people are willing to pay to try it out, but when it comes time to renew, they’re more likely to assess whether they got their money’s worth the prior year.

Leaky paywall or not, it’s still pretty damn hard to get people to pay for text-based news content online.

An advertising model is also hard

Unless you’re willing to create the kind of clickbait content that generates millions of pageviews, achieving the scale needed to make enough money on ads is nearly impossible for a small, independent outlet. Sullivan decided early on that he wasn’t interested in producing that kind of content, so he chose to forgo advertising completely. And even if you do achieve scale in web traffic, there’s no guarantee you’ll have the ads to populate your site, especially if you’re not willing to hire an ad sales team (which creates additional overhead).

The content churn can be draining

Even without the viral aggregation, Sullivan’s output was draining. In his post announcing his decision to quit, he specifically cited the pressure of producing daily content and how it discouraged deeper thinking and longform writing. Independent news outlets typically have small staffs, and there’s a lot of pressure to constantly produce new content so it’s waiting for your readers when they return each day. The fewer staffers you have, the more pressure on each one to carry his or her weight.

Speaking of staffers, I think it’s important to address something about Andrew Sullivan’s success that may not have been immediately obvious to his more casual readers. Because of the way his site is structured — his name in the domain and an illustration of him at the top — it would be reasonable to conclude that any particular post you were reading was written by him. But starting sometime during his days at The Atlantic, Sullivan began using interns to produce unbylined content for his blog, a move that some people (including myself) found a little shady. To be fair, these days he credits his staff in the masthead and will sometimes reference them in posts, but for the most part it can be difficult to discern whether a post was written by him or someone else. The reason I bring this up is that Sullivan has created an exaggerated perception of what one person can accomplish alone, mostly because much of what he accomplished wasn’t done alone.

Which is to say that his experiment would likely have turned out much differently had he been just a lone blogger. Those of you who are perhaps thinking about quitting your journalism jobs to run your own standalone websites should keep this in mind: even one of the most popular bloggers in the world, who benefited from widespread media coverage when he decided to go out on his own, needed a staff in order to build a sustainable business. I’m not writing this to discourage your dreams of journalistic entrepreneurism, but rather to state one obvious but important truth: Not all of us can be Andrew Sullivan, so we should take any “lessons” from his experience with that in mind.

Like my writing? Then you should hire me to create awesome content for you.