The Subtle Power of Branding

Branding is one of the more advanced forms of business voodoo. There are many approaches to it -- emotional, analytical, strategic, aesthetic. All of them matter, and a powerful brand combines these approaches and others in a compelling and memorable manner. Ignoring branding means ignoring a potentially high-ROI business element. Embracing branding doesn't guarantee success, but it improves the odds of long-term success.

Strategic brand initiatives can occur in a couple of ways, either to express a set strategy or to help facilitate the discovery of a latent strategy. All organizations have a strategy working at some level, but often haven't made it explicit. In these unexamined cases, an effective set of strategies can blend with less effective tactics, making it hard to know which is which. A strategic branding approach can help leaders differentiate between strategic and non-strategic activities, and then consolidate success into a powerful new brand expression. Of course, when strategies are already settled and clear, branding moves along more quickly into aesthetic and emotional spaces.

Emotional branding generates some of the most fascinating work in the area. There are a number of approaches, and those invoking archetypes are pretty convincing. One favorite example is how GE, representing the "creator" archetype, had the slogan, "We bring good things to life." This slogan captures the "creator" in spades. When GE recently changed its slogan to, "Imagination at work," the "creator" archetype remained in play, but the creator in question moved from life-giving to idea-creating. A subtle restatement that remains true to the company's archetype, it gives the organization a less religious or patronizing feel, and leans toward personal achievement and creative thinking.

Aesthetics matter, as well. Verizon's recent shift away from its well-established italic logotype, with its huge checkmark looming like an angry eyebrow, to a simpler wordmark with a small checkmark "ding" at the end is well-analyzed in a post written soon after the new logo appeared. What's interesting is how simple the new logo is, especially when viewed in the midst of the other major carriers' logos, giving the Verizon logo a strength and confidence it didn't possess before. Aesthetics are contextual, after all. In the competitive landscape, what is your brand saying compared to others your customers encounter all the time?

Some branding approaches are more analytical/structural -- a "branded house" approach often requires structured branding, with an umbrella brand and subsidiary brands. The strategy is clear in this case -- support the structure of the business, and make the brand extensible.

Publishers live in a different branding space. Journal publishers see their brands mostly resolved into specific products with their own value elements -- unique audiences, impact factors, and editorial approaches. In the books world, the branding of authorship or series continuations can be extremely powerful, and publisher branding is usually small and regal. Company branding usually matters more on the B2B side than the B2C side.

Branding can cut through the clutter, prepare the path to sales, attract and reassure partners and customers, and create a consistency that's vital for long-term success. Is your brand well-managed? Does it reflect your strategy?

Sci-Hub -- Two Sides, Both Shrink

There has been good coverage of the Sci-Hub controversy -- from Silverchair's overview of its technical approach to media coverage and blog posts.

One thing that was recently pointed out in a blog post by Stewart Lyman is that Sci-Hub is entirely dependent on the scientific publishing economic ecosystem remaining largely intact. Because the site leverages credentials from institutional subscriptions, these access points are vital to the site's ongoing relevance (and, with a couple of million new articles being generated each year, this is not a trivial point).

As Lyman writes:

The rebel movement won’t gain much traction unless researchers at Yale, Stanford, Oxford, Pfizer, and Genentech, etc. begin to switch over to Sci-Hub, and that’s not going to happen. These organizations will block this behavior because, though they hate paying for overpriced journals (e.g. Harvard paid $3.5 million in 2012 for these), they will stand firm in support of intellectual property rights. They will not be part of the revolution.

While Lyman is certainly correct, I think there is an important nuance to add. I wrote about this recently, and Lyman's essay provides a good chance to expand on the point in an important manner.

In my earlier post, I talked about how piracy might drive consolidation among publishers, something that is already moving ahead at a steady clip, if not accelerating. Many specialty societies are signing on with the big publishers, as competing on operational and sales excellence in a global, technological economy seems less and less feasible for small organizations with each passing year.

However, what I missed in the first post is that piracy might also consolidate the purchasers in the academic publishing market.

While Lyman notes that big institutions aren't going to violate IP laws they themselves benefit from, many smaller purchasers may have fewer qualms or restrictions when it comes to using sites like Sci-Hub to access content, and individuals will have fewer still.

Already, technology has consolidated purchasers in academic publishing, from widespread individual and uncoordinated departmental purchasing to fewer institutional licenses, coordinated organizational purchasing, and consortial approaches that pool buying power.

Should piracy effects allow smaller purchasers to exit the market, pricing for those remaining will increase to compensate. As large, consolidated sellers meet large, consolidated buyers, pricing battles will become more common, more often public, and messier overall.
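The arithmetic behind that claim is straightforward. As a stylized sketch (ignoring tiered pricing and negotiation), if a seller needs roughly constant revenue R from a shrinking pool of buyers, the per-buyer price must rise in proportion:

```latex
p = \frac{R}{N}, \qquad p' = \frac{R}{N'} = p \cdot \frac{N}{N'}
```

For example, if 20% of buyers exit (N' = 0.8N), prices for those remaining rise by a factor of 1/0.8, or 25%.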

But Sci-Hub and its ilk can't kill paid access -- paid access is what they're leveraging. If that ends, Sci-Hub ends. If they succeed in their end game, they fail. It's a paradox that may ultimately make this a tempest in a teapot when it comes to dire effects and the gutting of an industry.

However, with the extreme eliminated, it's likely that consolidation continues, unless a new technological card is dealt that changes the table, or legal recourse proves successful. But market dynamics are clear -- piracy drives consolidation, for both buyers and sellers.

Responsibility and Unpublished Research Results

We often hear the economic argument, "taxpayers paid for the research, so they deserve to see the results." This argument is usually aimed at publishers.

But who is to blame if publishers never even see a manuscript to consider? Who is to blame if research is funded, patients are put at risk, and no outcomes are even recorded in a required government database?

A new study suggests that there's something important going on, with potentially two-thirds of clinical trial results in the US going unpublished and undocumented more than two years after the trials have concluded.

Speaking to the presumption that funded research results in published papers, the authors give us this quote:

While seemingly axiomatic that the results of clinical trials led by the faculty at leading academic institutions will undergo peer reviewed publication, our study found that 44% of such trials have not been published more than three, and up to seven, years after study completion.

In other words, can you believe that some of the studies from Dr. Prestigious didn't work?

But with databases like ClinicalTrials.gov in existence, publication is now distinct from registration and results reporting. Unfortunately, the authors of the paper fail in their discussion to draw a clear distinction between the two types of failure their study covers.

These two types of failure are different in vital ways. Failing to submit trial results to ClinicalTrials.gov is one type of failure, and it involves a different hurdle and different implications (reflecting different obligations) than failing to publish in a peer-reviewed journal.

It seems less excusable for data not to be submitted to ClinicalTrials.gov. After all, the only hurdle there is the work involved in reporting the results. However, researchers tell me that hurdle is formidable -- the interface and technical implementation of ClinicalTrials.gov make reporting results there a major task. Imagine a manuscript submission system that's twice as cumbersome. The community seems increasingly disenchanted with making the effort, and there is no carrot and no stick to keep them using it.

Yet, compliance here should be 100%. Instead, it's far lower, with some major academic centers reporting less than 10% of their clinical trial results in ClinicalTrials.gov. For example, Stanford's compliance rate for reporting results was 7.6% between 2007 and 2010 -- 10 trials out of 131. Meanwhile, 49.6% (65/131) were published. Overall, publication rates were higher than rates of compliance with depositing results in ClinicalTrials.gov.
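A quick back-of-the-envelope check of those figures, using the counts reported for Stanford in the study:

```python
# Recompute the Stanford rates cited above from the underlying counts.
total_trials = 131       # Stanford trials in the 2007-2010 window, per the study
reported_in_ctgov = 10   # trials with results deposited in ClinicalTrials.gov
published = 65           # trials published in a peer-reviewed journal

print(f"Reported in ClinicalTrials.gov: {reported_in_ctgov / total_trials:.1%}")  # ~7.6%
print(f"Published in a journal:         {published / total_trials:.1%}")          # ~49.6%
```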

Maybe publishers have a better carrot . . .

In covering the study, a story on NPR elicited a comment that supports this hypothesis:

    I work at a contract research organization that has a large contract with NIH DAIT. We are required to report the clinical trial results to clinicaltrials.gov within one year of "last patient last visit." It is a challenging task but we have a process in place to accomplish this requirement. I don't think researchers deliberately try to hide findings. It takes experience to write acceptable endpoint descriptions, generate an xml file to report adverse events, and properly organize and format the results. When planning a clinical trial resources must be committed to publishing the results at the conclusion of the trial.

    This is a usability problem, pure and simple, yet one that is clearly depriving scientific researchers and patients of information they may want or need. Where is the outrage over this poor user-interface design? Its effect may be far graver than any subscription barrier when it comes to taxpayer access to study results.

    Then we have the data from the study under discussion here pertaining to the percentage of trials published in peer-reviewed journals. It's surprisingly low. But is this because researchers are too lazy to write up the results and submit them to journals? Or is it because the results underwhelmed?

    There is a trade-off between publication rates and reproducibility, as I discussed recently in a post here. Publishing more lower-quality studies (poorly powered, not predictive, weak hypotheses, weak generalizability) means a lower rate of reproducibility. Perhaps the problem here isn't that so few of these studies are published -- it may be, instead, that too many unimpressive and unpromising studies are funded, started, and terminated after getting poor results.

    I once participated in a clinical trial that went nowhere. The side-effect of the biological agent was simply intolerable, so most participants dropped out, leaving the researchers with no publishable results. The side-effect was known, but what wasn't known was that patients would stop taking the medication because of it. So why report it in the literature? It added nothing to the knowledge base, except that a silly side-effect hurt compliance. This isn't big news.

    However, the study I participated in was preliminary, and little funding was squandered in learning what it taught. The authors of the paper discussed here counted papers, but did not calculate the amount of funding spent on trials without registered outcomes or published results. That would have been a more interesting number, and perhaps would have given us something better to chew on. After all, if most of the unpublished/undeposited studies were small, preliminary, and involved fewer patients and less funding, we might have a different potential explanation.

    Studies underperform or disappoint for a number of reasons, some bizarre, some pedestrian, some worth pursuing. Not having published results from most of these is probably not doing damage in the larger scheme of things. However, not submitting the data to ClinicalTrials.gov is another issue entirely, and one we need to address. The usability issues with ClinicalTrials.gov may be scuttling a good idea, slowly but surely. Researchers dislike the site, and the benefits of compliance are elusive.

    Whatever the cause, the discrepancy between publication and deposit is certainly worth contemplating.

    The Start-up Shootout

    Earlier this week, I was privileged to serve as one of three judges in the NFAIS session, "Start-up Shoot-out," in which four young or start-up companies presented, were peppered with difficult questions from the judges and the audience, and then waited to hear which would be named the winner, with a free NFAIS webinar as the prize.

    Loosely based on formats like ABC's "Shark Tank" and TechCrunch's "Startup Battlefield," the event was as fun and interesting as I'd hoped. Eric Swenson from Elsevier/SCOPUS did a marvelous job as our drill sergeant/moderator, keeping what could have been an unwieldy session tight and on-time. The other judges -- Chris Wink and James Phimister -- were excellent, bringing a nice style and great questions to the proceedings.

    Going in, we were encouraged to indulge a bit in the theatrical aspects of the motif, which involved being a little edgy, asking hard questions, not letting presenters or one another pontificate, and so forth. This helped make for a session that felt, by scholarly meeting standards, kind of bruising, but in what I thought was a good way. After all, these firms are vying for viability, so tough questions await them, whether we ask them or not.

    Volleying questions to the participants, who deserve eternal thanks for tolerating our "tough guy" approach, was interesting. We were told to keep the questions pointed and short, and that helped -- we rarely got a rambling answer, and we had full permission to cut one off when we did, which happened once or twice.

    There were some puzzling answers, as well. One I will never forget was the answer of "It's confidential" when one participant was asked how his product's business model worked. Given the motif, it was easy to not let that answer suffice. Since there are really only a few business models, this participant was interrogated again and again until we at least could sense a bit more of what was behind the screen. It did not engender confidence.

    Determining a winner was surprisingly difficult. There are many factors to weigh when recommending investments, even mock investments, and the current economic climate, the payback period, and other dimensions all factored in.

    Once we'd winnowed the four entrants down to the two strongest, it became a bit more of a coin toss. Ultimately, we unanimously felt that the earlier-stage company -- one an investor could get a bigger share of at a lower cost, with the strongest network-effect potential and a viable freemium model -- was the better choice, but only if we were willing to accept a 3-5 year investment window rather than a 1-3 year one.

    Sessions like this are a great idea, and one I hope other meetings consider. To me, this format is an improvement on the "flash session" model. However, a great moderator is key to making it work. Kudos to NFAIS and Eric Swenson for pulling it off.

    How Healthy is Your Marketing Program?

    The publishing landscape is becoming increasingly crowded and competitive.  In this environment, a healthy marketing function is essential. Brands battle for authors. Societies strive to attract new members. Management seeks efficient spending and strong returns on investment. New digital and social media opportunities beckon. Are you ready?

    Just as a doctor considers different data points in diagnosing the health of a patient, many variables determine the effectiveness of a publisher’s marketing program. A full and impartial analysis can provide valuable insights to help publishers maximize their marketing spend. 

    To help, Caldera Publishing Solutions is pleased to announce a 30-day Marketing Effectiveness Assessment to assist publishers and their marketing teams.

    While many factors play into what makes one brand or product more successful than another, marketing is the nexus for these factors. To succeed, you need information on customer awareness, engagement, and experience to fully evaluate the power of your brand. You also need to consider what you are measuring and how you are using the information you are gathering. Marketing efficiency and effectiveness -- for site licensing, for individual members and subscribers, for social media marketing, SEO and SEM, and other factors -- also need to be assessed.

    Our Marketing Effectiveness Assessment will quickly, and with one flat fee, evaluate your:

    1. Brand Power—the perceived strength and consistency of your brand; strength of the value proposition presented in all communications
    2. Market Planning—is a formal planning process/cycle in place? Is planning tactically focused, or strategic?
    3. Marketing Structure and Roles—what is marketing accountable for? What skillsets are in-house and/or handled by outside vendors?
    4. Customer Experience—what is it like for a customer to engage with your brand via your website, customer service, social media, and other channels?
    5. Customer Engagement—what is your marketing mix; how are you engaging your audience?
    6. Sales Enablement—what tools, messaging, and communications are you providing?
    7. Metrics and Analytics—what are you measuring and how are you using these metrics?

    As part of this systematic evaluation of the individual components that drive marketing success, we will also make clear recommendations for improving your marketing programs, including efficiency of spend, brand positioning, and market engagement. 

    Improving your marketing program empowers your organization to identify and focus on areas of particular strength or weakness in its efforts to retain and expand its audience. 

    Contact us to learn more about this offering: contact@caldera-publishing.com.

     

    Could Piracy Accelerate Consolidation?

    The recent news that a researcher in Russia running a site called Sci-Hub has downloaded 48 million scholarly articles and is making them available for free -- as a protest against publishers that charge US$32 for a single article -- is a good opportunity to pause and consider exactly who is being hurt in this scenario.

    From a financial and economic standpoint, collateral damage is entirely possible.

    Alexandra Elbakyan, the Russian researcher behind Sci-Hub who is currently defying a US district court injunction, believes she and her ilk are mainly hurting large commercial publishers -- organizations she thinks exploit academic information illegitimately.

    But is her approach likely to have that effect?

    First, how Sci-Hub works:

    The site works in two stages. First of all when you search for a paper, Sci-Hub tries to immediately download it from fellow pirate database LibGen. If that doesn't work, Sci-Hub is able to bypass journal paywalls thanks to a range of access keys that have been donated by anonymous academics (thank you, science spies). This means that Sci-Hub can instantly access any paper published by the big guys, including JSTOR, Springer, Sage, and Elsevier, and deliver it to you for free within seconds. The site then automatically sends a copy of that paper to LibGen, to help share the love.

    The article from Science Alert calls this system "ingenious," but that's flattering and naive. This kind of scheme is not new. Is it as "ingenious" as an employee at a local hardware store who cuts an extra set of house keys, jots down the addresses, and uses those keys to enter homes around town whenever the owners are elsewhere? Or, in the old days, an employee who kept the carbons from credit card transactions and used them to make personal purchases? Or, more recently, an employee who swipes a credit card twice, once for the customer and once for themselves? The list goes on. Misused and misappropriated passwords aren't "ingenious" ideas, either.

    Complaining about the US$32 per-article price isn't new, either. Unfiltered complaints like this from journalists always signal that the reporter hasn't done her or his homework and does not understand how publishing works, which is odd since it literally might pay for them to know. Again, there's nothing new here. Newspapers at the newsstand are much more expensive than home-delivery newspapers, for instance, because the newspaper publisher wants to encourage subscription, which is a better business model. So subscribers pay a lot less per copy than à la carte purchasers. The same goes for academic publishers, but even more so -- that is, subscribers pay only a few cents per article for subscription access (whether that's through an individual subscription or a site license), while the price for single articles is high to discourage à la carte usage. In addition, new solutions like article rentals have lowered pricing on the per-article front, and free access to developing economies is a long-standing practice among academic publishers.
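    To see why the list price and the effective subscription price differ so sharply, a rough calculation helps. The license fee and download volume below are purely hypothetical, chosen only to illustrate the "few cents per article" point:

```python
# Illustrative only: hypothetical figures, not actual license terms.
site_license_cost = 20_000     # assumed annual site license fee, in dollars
annual_downloads = 400_000     # assumed full-text downloads under that license
list_price_per_article = 32    # the oft-cited single-article price

effective_cost = site_license_cost / annual_downloads
print(f"Effective cost per article under the license: ${effective_cost:.2f}")  # $0.05
print(f"A la carte list price per article:            ${list_price_per_article:.2f}")
```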

    As so often seems to be the case, the problem isn't nearly as dire as Sci-Hub wants everyone to believe. Research papers are more available, and available at a lower cost per paper, than ever before, and those trends are likely to continue.

    What's more alarming is that the organizations Sci-Hub's activities will hurt aren't the ones they're after, and the effects are likely to strengthen large commercial publishers. Bizarrely, some individuals working at organizations that would be hurt are apparently helping the pirates.

    Return to the days of Napster, or more recently of early music streaming services, and you begin to realize we've seen this story before. Instead of dozens of record stores and outlets, we now have a handful of digital sales and streaming services. Consolidation was the result of a disrupted marketplace, instigated by piracy.

    But who has that hurt? In the case of music, the people most of us forgot about were the artists, who received no royalties at all for music downloaded illegally. Then there were the record stores, which were undercut by digital music piracy as a first blow, one from which they never recovered. What started as piracy ended up as a music economy that sells more songs than ever, but makes far less money from these sales, forcing artists into a more performance-oriented mode, reducing the number of mid-tier artists with viable careers, and putting technology companies and producers in far greater control of the music industry.

    Sci-Hub believes its actions are mainly humbling publishers like Elsevier, which are obviously the primary target. However, they are also hurting other participants in the academic publishing economy, some of whom are apparently aiding and abetting:

    Authors -- Books are also included in the materials Sci-Hub has purloined through access keys delivered by "science spies." When a publisher sells fewer books, royalties fall. While not often huge, these can be a nice supplement to academic pay. If this persists, not only will royalties be lower, but advances will fall. For journal authors, where rewards for publication are indirect, data about their articles' impact and influence will likely be diminished, especially alt-metrics measures. This is an interesting aspect of the new interlinked impact infrastructure -- piracy undercuts its functioning.

    Libraries that pay for site license access -- Despite the public shaming over US$32 articles and continued claims that institutions like Cornell and Harvard can't afford to purchase site licenses (despite multi-billion-dollar endowments, the fact that Cornell raised more than $11 million to support its $8.8 million library budget in 2015, and Harvard's recent $25 million in savings from restructuring staff and eliminating duplications in its system), the reality is that libraries pay very low per-article usage rates for most titles. In any event, the main problem here is that libraries are paying while Sci-Hub is using their access keys and paid access to purloin articles. Both publishers and libraries have a mutual interest in accurate usage reporting. If Sci-Hub ever becomes a significant factor in access to articles, usage reporting becomes inaccurate, and pricing inequities are more likely to emerge or be suspected, causing both parties to posture or make pricing adjustments in the blind, which could lead to irrational behavior. Meanwhile, Sci-Hub continues to bleed libraries using their own access keys.

    Society publishers and specialty societies -- Focus solely on commercial publishers like Elsevier and SAGE, and you find dozens upon dozens of society journals within each publishing company. Sci-Hub is not ripping off articles from Elsevier or Wiley or SAGE, per se, but from the societies that use these commercial firms as publishing houses. Go beyond this, and you find the self-publishing non-profits with articles caught in Sci-Hub's scheme. In short, the majority of what is in Sci-Hub is most likely coming from non-profit societies, making this less a story of Robin Hood robbing the town's greedy sheriff, and more a story of Robin Hood stealing from the town's hospitals and charities.

    Universities -- Returning to the site licenses and libraries above, academic centers are clearly being hurt by extension, as they have less to show for their expenditures, yet face no decrease in demands from faculty and researchers that they maintain access to key titles. Add to this the institutional repositories universities invested in, which are now at risk of becoming even less viable.

    Funders -- With Gold OA now a decent segment of paid publishing, venues like Sci-Hub could be viewed as just another distribution outlet for articles already paid for. However, usage of these articles isn't documented in the normal fashion given Sci-Hub's spare infrastructure, so funders and Gold OA publishers have a new blind spot around the value they're actually deriving from their funding. Accountability decreases, uncertainty increases, and APCs will likely rise if publishers of all stripes have to adjust to pirates on the waters.

    Sci-Hub, and those "science spies" who are making Sci-Hub's piracy possible, are skewing the academic publishing economy in a way that will hurt not just large commercial publishers, but authors, libraries, charities and societies, universities, and funders as well.

    It's likely that the entities most vulnerable to machinations like those exhibited by Sci-Hub are the non-profit societies -- organizations with long histories of providing training, assistance, and career boosts for people like Alexandra Elbakyan, Sci-Hub's creator. In essence, the only people she's hurting are people like her, who will now have to pay more for articles (to offset the losses from her theft), more for society memberships, more for tuition, and so forth.

    Economies respond to piracy by charging more for those who do and will pay, or by letting entities scuttled by pirates sink into Davy Jones' locker. Piracy eliminates jobs, suppresses economies, and can, at its most extreme, bifurcate an economy as smaller entities are easily sunk while larger ones withstand the assault. With about 95% of academic publishers earning revenues of $25 million or less annually, most need to preserve, if not build, revenues. That's difficult to do in the shadow of piracy.

    Bottom line: In academic publishing, with all the other forces pushing consolidation, you can add piracy to the list.

    Reproducibility Problems Run Deep

    The reproducibility crisis continues to provide intriguing insights into how to get science back on track.

    Though some first flagged it as a potential problem with peer review or an indictment of glamour journals, further explorations have found that the problems run much deeper than publishing and distribution outlets.

    At the recent PSP meeting in Washington, DC, a speaker from the Global Biological Standards Institute (GBSI) explained how 15-36% of cell lines used in biological research are not authenticated. This could be a major contributor, if not the major contributor, to the reproducibility crisis. Other factors he flagged include poor study design, poor data analysis and reporting, and problems with laboratory protocols.

    The existence of Eroom's Law (Moore's Law spelled backwards) is especially vexing, and points to fundamental problems that start well before any papers are written or submitted. Eroom's Law refers to the approximate halving, roughly every nine years between 1950 and 2010, in the number of new drug molecules approved by the FDA per billion dollars of inflation-adjusted R&D investment by the drug industry, despite huge gains in knowledge and brute-force research capacity (e.g., the ability to sequence genes or synthesize chemicals).
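    To put that decline in perspective, take the usual statement of Eroom's Law at face value -- approvals per billion dollars halving every T ≈ 9 years -- and the cumulative effect follows directly:

```latex
N(t) = N_0 \cdot 2^{-t/T}, \qquad \frac{N(60)}{N_0} = 2^{-60/9} \approx \frac{1}{100}
```

    Under that stylized assumption, an inflation-adjusted R&D dollar bought on the order of 100 times fewer new approvals in 2010 than it did in 1950.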

    A recent paper from analysts specializing in this area describes a set of profound and fundamental problems in biomedical and pharmaceutical research:

    • Pursuit of animal models with low predictive value
    • Clinical conditions that aren't yet described specifically enough for targeted therapies to have addressable therapeutic targets, but which are pursued nonetheless (e.g., Alzheimer's disease)
    • Ignoring "field observations" (i.e., physician reports) of what works, and pursuing reductionist predictive models instead
    • Following management's demand for more R&D throughput rather than ensuring predictive values are sufficient to improve the odds of success (quantity over quality)
    • Ignoring "domains of validity" for predictive models, and expanding or elaborating upon them inappropriately in a research project
    • Using terminology without rigor, creating confusion or misinterpretations in cross-discipline teams

    Journals exist to document and maintain the record of scientific achievements. When these achievements are underwhelming or fraught for whatever reason, the record will reflect this. These and other inquiries into the problem reiterate that the reproducibility crisis is a problem within science, which journals only reflect.

    However, as part of the academic and research establishment, journals do have a role in helping to turn things around. More statistical analysis, and more demands for explanations of the predictive value of experiments, the predictive models used, and their domains of validity, can all help. This means spending more time with each paper, and emphasizing quality over quantity.

    As Derek Lowe wrote in his "In the Pipeline" blog at Science Translational Medicine:

    If you want better, more reproducible papers, you’re going to have fewer of them. Shorter publication lists, fewer journals, and especially fewer lower-tier journals. The number of papers that are generated now cannot be maintained under more reproducible conditions . . .

    Or, as the authors wrote in an important Science paper on reproducibility:

    Correlational tests suggest that replication was better predicted by the strength of the original evidence than by characteristics of the original and replication teams.

    In other words, better evidence is better evidence, and is more likely to be reproducible.

    Unfortunately, until the underlying cultural aspects that treasure quantity of publications over quality of publications are fundamentally addressed and changed -- and all the associated economic, financial, reputational, and career issues all players tacitly support -- we will continue to have problems reproducing weak science. Publishers can't solve these problems alone.

    High Yield

    In business, it's called "return on investment" or ROI. But with less jargon, what we mean is high-yield activities -- you put in some effort and get a lot for it.
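    As a rough formulation (the figures here are purely illustrative):

```latex
\mathrm{ROI} = \frac{\text{gain} - \text{cost}}{\text{cost}}, \qquad \text{e.g., } \frac{\$150{,}000 - \$100{,}000}{\$100{,}000} = 50\%
```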

    Business activities that are high-yield are critical to success. Low-yield activities in succession will only exhaust and demoralize an organization. A breakthrough, a breathtaking success, a long-term portfolio play -- all these things have "high yield" written all over them.

    Using a blend of staff and consultants is key to high-yield success. Even in the best organizations, staff can't bring all the skills and perspectives needed for success (and a cross-functional team is not the same as diversified thinking, especially within the same organization).

    Yield is also relative. A small company that generates a new $1 million product may be ecstatic, while this same revenue achievement would be middling for a larger organization. And spending too much money and time to get to the $1 million lowers or erases yield.

    Long-term value also has to be factored in. If the revenue source resembles an annuity, the multiyear value can be significant. Imagine the yield around the journals acquired in distressed condition during the 1920s and 1930s, which now are multi-million-dollar entities. It took 30-40 years for trends to coalesce, but when they did, the societies with those properties were utterly transformed.

    Acquisitions can be high-yield. Product development can be high-yield. What is usually not high-yield is keeping money in investment instruments, especially currently. So look around. There may be a new way to set the table for future success, a new initiative that takes little effort to launch if done right but which could generate tremendous returns. What high-yield plans does your business have today?

    The Re-emergence of "Live," and Why It Matters

    Recently, "Grease: Live" aired on Fox to the highest ratings for a live musical broadcast since the form re-emerged with the "Sound of Music" in 2013. Other live musical broadcasts have also garnered high ratings, including "The Wiz" in 2015 and "Peter Pan" in 2014.

    Numerous competitive pressures and social trends have led to the re-emergence of the live event, including the practice of "live Tweeting" and network television's desire to recapture its ratings dominance by creating event-driven viewing. After all, watching a live musical on DVR-delay isn't nearly as visceral as watching it live, and some viewers are certainly looking for trainwrecks and missteps as much as anything (but to the television executive, trolls = audience). Social media's entanglement power makes live events more fun and interesting, as anyone who has attended a recent conference can attest.

    The appeal of live events also means new revenue potential.

    The shift to digital downloads and streaming for music has decreased music industry and artist revenues for studio recordings, even as the volume of music consumed has increased. Packaged album sales are largely a thing of the past (unless you're Adele or another marquee artist). Album sales used to let artists make money off 10-12 songs at a whack, rather than the 1-2 hits they might earn revenues from in today's single-track era. This has led to artists making more live appearances and giving more concerts to bolster their incomes, which is also feeding the re-emergence of live performance.

    For singers (and perhaps for their fans), there is a potential downside to the greater number of live performances -- an increasing rate of vocal fatigue and injury. Throat surgery is becoming more common for singers, and voices are fading faster than they did when solid record sales could let a band rest for months while making good money. Now, they sing for their dinners more and more, and that's wearing them out, altering the careers of some major talents.

    Reddit is another venue in which "live" has become a hotter commodity, with its Ask Me Anything (AMA) format. Webinars are more popular. Regional and local conferences and meet-ups are increasingly being used to extend the "live" experience. However, the challenges of exhaustion and over-extension also must be managed, as editors, authors, and other prominent ambassadors of brands are pulled into more and more settings and commitments.

    The challenges for purveyors of fixed or recorded media are multi-faceted. The Internet has made fixed media highly discoverable and shifted business models from packaged goods (issues, albums) to à la carte sales (songs, articles). Expectations of "free" run rampant. The "live" approach creates a new package, this one temporally and physically based, allowing for the re-emergence of the packaged price or value exchange. But it is not without risk or responsibilities.

    Zeno's Paradox, "Free" Content, and Retail Publishing

    "Free" is a price. Research has shown that "free" is an especially powerful price or label, changing behavior more dramatically than most other factors. Synonyms currently in vogue for "free" include "open" and "public." It's unclear if these terms have the same market power as "free."

    Providing free content to readers has been a disruptive pricing model on the Internet. If you want evidence of this, ask a newspaper publisher -- assuming you can find time with them as they scramble to put the "free" genie back in the bottle. "Free" drained newspaper coffers, caused newspapers to collapse, led to the layoffs of hundreds of journalists, and shifted the news market toward new funding approaches. 

    The initial hope of "free" newspaper content -- the business gamble -- was that advertising payments would balloon in pursuit of the huge audiences "free" would spawn, thereby surpassing the value of subscriber payments, and leading to a big payoff. That didn't work, and now newspapers are busy building and refining paywalls for the networked economy.

    In any case, because traffic is the lifeblood of any online business model, we now face click-bait headlines everywhere, today's equivalent of the "yellow journalism" of the late 19th and early 20th centuries -- lurid, distasteful, and hard to ignore.

    Other "free" content sources also have business models working behind the veneer of "free," and they are just as dependent on traffic.

    Wikipedia is free to use, yet it has a successful business model -- fundraising. In its 2014-15 fiscal year, the Wikimedia Foundation (the parent) raised $75 million, its best year ever. The majority of these funds -- more than 60% -- came from the United States. Donations were smaller per donor, but more people donated. Of course, fundraising is a legitimate and time-tested business model, as NPR can attest. Because Wikipedia depends on individual donors, traffic is a key variable to its fundraising success.

    "Free" has a few odd qualities. For instance, it is a condition enjoyed by as few as one party in the exchange of value. For NPR listeners who donate, NPR is not free. It could be that everyone who listens to NPR has donated, all save one person. This serves to illustrate that NPR is only free to those listeners who do not donate. The same goes for Wikipedia and its users. I've donated multiple times to both, so NPR and Wikipedia will never again be strictly "free" to me. I'm part of their paid base.

    Similarly, in academic publishing, a Gold OA journal is no longer "free" to the authors of a paper published in that journal. In the same way that NPR is now never free to me because I frequently donate, OA journals are never strictly "free" to any author who has paid an APC. For a mega-journal like PLoS ONE or Scientific Reports, the journal can never be strictly "free" for thousands of researchers.

    Granted, the share of NPR's costs for which I'm responsible is trivial, and constantly eroding, but it only approaches zero and cannot mathematically attain it. It is a Zeno's paradox of payments -- the tortoise and Achilles of cash.
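    One way to formalize the point: if I've donated a fixed amount d > 0 against a cost base C that keeps growing, my share of the whole shrinks toward zero without ever reaching it:

```latex
s = \frac{d}{C}, \qquad \lim_{C \to \infty} \frac{d}{C} = 0, \quad \text{but } s > 0 \text{ for any finite } C
```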

    This paradox sits at the heart of all business models, and it is why acquiring new customers is so critical for long-term success. Without the subscription model, publishers seeking to earn money have to move to the retail model -- asking customers to purchase something occasionally at a point of sale. And traffic is key to this retail model, in which NPR, Wikipedia, online newspapers, and a growing number of journals participate.

    For OA publishers, the main financial challenge as retail publishers is acquiring authors at a sufficient pace to maintain cash flows. They offer author services, which compete on price, convenience, and reputation. Authors shop these outlets, a retail experience. PeerJ attempted to move away from a retail model with a membership model, but it has since added a more customary per-paper APC model, signaling a move back into retail.

    The subscription and membership models solve this retail paradox to a large degree through renewals. NPR strives to solve it by getting donors on monthly payment plans, and Wikipedia tries to solve it by optimizing their donation forms via A/B testing and other approaches.
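    For a sense of what that form optimization looks like in practice, here is a minimal sketch of an A/B comparison between two donation-form variants; the visitor and donor counts are made up for illustration:

```python
import math

# Hypothetical A/B test of two donation-form variants.
visitors_a, donors_a = 100_000, 2_100   # variant A: current form
visitors_b, donors_b = 100_000, 2_350   # variant B: candidate form

rate_a = donors_a / visitors_a
rate_b = donors_b / visitors_b

# Two-proportion z-test using the pooled conversion rate (normal approximation).
pooled = (donors_a + donors_b) / (visitors_a + visitors_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / se

print(f"Variant A conversion: {rate_a:.2%}")
print(f"Variant B conversion: {rate_b:.2%}")
print(f"z-statistic: {z:.2f} (|z| > 1.96 suggests a real difference at the 5% level)")
```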

    New models of publishing, which have more in common with retailing, may want to look at how former retail outlets for information and services -- Amazon, Netflix, iTunes, Sam's Club, Costco, and others -- have solved the payment paradox. Blunting the pressures of full-on retailing may be a major reason the subscription model is more popular, and being utilized by more businesses, than ever.

    Do Your Financials Make Sense?

    Asking a start-up about their business model is a classic "wet blanket" move at conferences. It's an absolutely critical question, but there are better questions within it that deserve asking, questions that often aren't asked by established businesses.

    For instance, new entrants are typically cast in one of two market approaches -- wide or deep.

    In the "wide" case, wide adoption is necessary for the business to succeed. A good example is a general search engine or a browser. Both need lots of customers, and getting those customers tends to tilt the market in the long-term favor of the winning firm. The nature of the model for browsers means end users don't pay; instead, companies pay to reach customers through the major "wide" players. The contracts to be featured as the main search engine are very lucrative, and other bidders pay to be featured services. But wide adoption is a condition of success.

    In the "deep" approach, a few customers deeply engaged can make a business work. Bloomberg Terminals provide a good contrast. Comparatively few people use them, but the value of the product is so high to those few that they are willing to pay a lot. The company has to keep the value high, and lock-in is a powerful element of ongoing success. Those few customers have to be deeply engaged and loyal.

    In either case, cost-per-acquisition is a critical metric to consider, as it impacts growth potential, cash flows, and the likelihood of having enough money to enter the market effectively. If you are pursuing a "wide" strategy, cost-per-acquisition usually has to be as close to zero as possible. For a "deep" strategy, cost-per-acquisition can be much higher.

    I've heard of publishers who have launched "wide" products or services with a marketing budget based solely on direct expenses for promotion and creative. Yet, when the cost-per-acquisition is multiplied by the expected number of customers, the total spend far exceeds what they have set aside. In cases like this, the project is likely doomed, or will at least contain some unpleasant financial surprises. Add to this a rolling expense for retaining current and incoming customers, and the expense level can truly astound the unprepared. With a "deep" strategy, in which customers pay a good amount, can be upgraded at a predictable rate, and stay for a long time, the cost-per-acquisition can be much higher. But it still has to be calculated, and has to fit into the overall financial model.
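    A back-of-the-envelope check makes the gap obvious. Every figure below is hypothetical, for illustration only:

```python
# Hypothetical budget check for a "wide" product launch.
cost_per_acquisition = 40       # assumed blended cost to acquire one customer, in dollars
target_customers = 50_000       # adoption needed for the "wide" play to work
annual_retention_cost = 5       # assumed ongoing cost per retained customer, per year

acquisition_spend = cost_per_acquisition * target_customers
retention_spend = annual_retention_cost * target_customers
budget_set_aside = 250_000      # direct expenses for promotion and creative only

shortfall = acquisition_spend + retention_spend - budget_set_aside
print(f"Acquisition spend needed: ${acquisition_spend:,}")
print(f"Retention spend, year 1:  ${retention_spend:,}")
print(f"Budget set aside:         ${budget_set_aside:,}")
print(f"Shortfall:                ${shortfall:,}")
```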

    The trajectory for cost efficiency is also important to take into account. Will new customers become more expensive to acquire as you move from the core market to peripheral markets? Will technology become cheaper to build and maintain as adoption grows? Or will the opposite occur? In many cases, core customers are the cheapest and easiest to acquire, so have the lowest cost-per-acquisition, and because they are the easiest to identify, technology costs are lower (e.g., fewer databases to integrate, less data cleanup, smaller databases overall). Going into secondary and tertiary markets to pursue the "wide" strategy can increase expenses unexpectedly. Some customers aren't worth acquiring, or keeping. Do you know who they are?

    In scholarly publishing, too often non-financial measures become the currency driving decisions. Will this increase our impact factor? Will we attract more papers? But focusing on non-financial measures, while useful to a degree, should come after the financial costs and benefits are well-understood. What good is a higher impact factor if it breaks the bank?

    Keeping an Eye on Macroeconomics

    In 2008, many academic institutions and non-profits were reeling after the global economy was shaken by the mortgage crisis and other macroeconomic problems. Investment funds and endowments suddenly lost huge swaths of value, customer accounts became uncertain, and future planning stopped in its tracks briefly.

    Now, eight years later, storm clouds are gathering again, with the first rumbles of thunder coming from China. An exaggerated view of the risks of falling oil prices has driven stocks down prematurely, adding to a prevailing skittishness. There are other strains. For instance, a strong US dollar means problems for other global economies as accessing the world's most lucrative market becomes costlier. Productivity gains are also predicted to be lower in the near-term. And if we needed any reminder that we're in a global system, the sad and surprising outbreak of Zika virus serves notice, as it has rapidly consumed attention and shaken economies in the Americas.

    The Internet has made the publishing economy much more immediately global -- and both terms matter, as being "delayed global" offered the buffer of time. Authors, customers, and markets aren't everywhere yet, but more business crosses more borders than before, and ramifications are more quickly felt, even anticipated. Asia, North America, and Europe remain the primary markets for academic publishers, but South America, the Middle East, and India are more important than before. For authors, the world is even more diverse, especially as various collaborative groups form and splinter and reform.

    It's important not to lose touch with macroeconomic opinion. Crises are as much material as mental, and anxieties travel in packs. Markets are going to become more anxious as more evidence of trouble emerges or is highlighted. Political leaders are beset in many markets, adding to worries. And the US is not fielding a set of candidates that reassures world markets, with most candidates voicing protectionist agendas and some appearing deeply unhinged.

    What is a business leader to do in times like this? It depends on market and internal organizational dynamics. It may be a good time to refocus on the US market, pulling back from other markets until the storm passes. Perhaps it's time to reinvest in infrastructure, as providers may be more willing to cut deals to get your business. It may be time to think about personnel, as the job market may soon have slightly more eligible recruits to consider. It may be time to develop new products for the inevitable upswing, as resources, time, and vendors will be more available now than later. Or it may be time to simply hunker down and wait it out.

    No matter your choice, keep an eye on the economic climate. It's changeable, opinions can become self-fulfilling prophecies, and every business is more connected to the global economy. Plan for it.

    Empathy, Convenience, and the Modern Publisher

    Customers don't know what they want. Because of that, if you're looking to develop viable new revenue-growth strategies, you can't just ask them what they want. As Henry Ford is famously (if perhaps apocryphally) quoted as saying, "If I had asked people what they wanted, they would have said faster horses."

    In order to move beyond "faster horses," you have to bring empathy, attention, and imagination to your customer interactions.

    The first step in empathizing with customers is often to think about what's more convenient for them -- or what will be. Some people narrow "convenience" down to "saving time" for the customer, but time savings is only one form of convenience. Portability is another. Reliability is yet another. It can even be convenient to have something really cool (Apple Watch, anyone?). In a competitive world, it can be convenient to have a conversation-starter. There are many ways things can be made convenient. Empathy about the context and intention of the user helps identify the best questions and zero in on the most viable answers.

    Netflix is a perfect example of a company that has worked hard on honing their convenience model. Their DVD rental service was able to crush the life out of Blockbuster because it afforded customers four major forms of convenience -- no trip to the store to rent or return; no need to actively pay except for the first transaction; no humiliating public shaming at the hands of a teenage clerk imposing a late fee; and no fear of coming up empty after a trip to the strip mall. Netflix's DVD rental service was convenient because it let users be more passive and keep their pride. It may not have saved much time, but it was less stressful and less taxing.

    Looking beyond the DVD, Netflix has continued to cater to convenience, making its streaming service available through most devices (Apple TV, most wi-fi-enabled Blu-ray players and TVs, online). It has allowed multiple users per account. It has retained the convenience of its subscription payment model. The Netflix platform was so purpose-built for convenient watching that it has spawned new vocabulary -- binge-watching. Netflix has catered to our love of convenience.

    Convenience has become table stakes for customer acquisition and retention. Netflix has leveraged its reputation for convenience into original programming wins, as have Amazon and Hulu. Google dominates through convenience and reliability on at least two fronts -- search and Google Maps (sorry, Apple Maps, but you blew it). Google continues to bring clever insights to its search interface, including restaurant traffic graphs (so you can see when your favorite place is busiest and plan accordingly) and other empathy-informed techniques.

    Publishers often bemoan the PDF as a technology from yesteryear, and look forward to what's next. "What's next" should be driven by empathy for the information user, not a desire for new tech. The PDF endures because it is portable, reliable, and standard. It's not sexy. It's not particularly cool or exotic. But it works, it's easy to save and search and share, and it's standard. The bar for its substitute is set by these traits, at least. Perhaps there is something better, but we'll only find it by making something more convenient than the PDF. Can we?

    PLoS has been a major innovator because of its convenience. For many authors, PLoS ONE is the second- or third-choice publication after an initial submission elsewhere. It saves time, but also provides a decent brand and impact factor, especially when compared to many specialist journals. And it allows researchers to move on from incremental research findings.

    As an industry, publishers can seem far from empathic. At a recent meeting, an audience member asked, "How do you make customers use something you've developed?" Empathy is the opposite of this -- it is, "How do you develop something customers want to use?"

    Did Henry Ford have empathy? Certainly. He paid his workers well so they could afford his cars. That's empathy and business together. Based on that, let's give Henry today's final word:

    "If there is any one secret of success, it lies in the ability to get the other person’s point of view and see things from that person’s angle as well as from your own."

    The Failure to Fail

    Yahoo!'s recent woes are an interesting business story, one I was reminded of recently after participating in a panel discussion about technology and innovation at the 2016 PSP Annual Conference in Washington, DC.

    During the session, Chris Kenneally from the Copyright Clearance Center asked the panelists (Sarah Tegen from the American Chemical Society and Phil Faust from Gale/Cengage also participated) to name a failure that had been instructive, and how each person moved forward from there, all in the context of the Silicon Valley meme of "fail fast."

    When it was my turn, I recalled the experience early in my career of working for a small publishing company led by a CEO who believed (quite rightly) that, rather than researching a product exhaustively in advance, it was faster and cheaper to put it on the market with clear expectations and measurements, see how it performed, and, if it failed to meet expectations, pull it down. The company became very good at launching, measuring, and either building on success or avoiding costly failures. In short, this organization didn't fail to fail -- that is, it failed a product when it needed to. It didn't keep struggling products around out of sentimentality or false hopes.

    "Failing to fail" may be a key type of failure dogging a number of organizations these days.

    Back in this early career incubator, the comfort with "failing fast" bred a culture of experimentation and clear-eyed measurement that was reassuring and energizing. Most people were involved in a product launch at some point each year, so you didn't get rusty or bored. Market research was done quickly and with an eye toward key metrics focused on a quick market entry and upside exploration. Investments were relatively small, and the spending and results were known to all, because measuring results was key to knowing when to pull the plug or build on a winner.

    The organization succeeded about 30% of the time with product introductions. Because the bets were small on the losers and the plug was pulled quickly, the winners more than offset the expense of small failures. Sometimes, the organization would hit a home run, and have a crown jewel. Other times, the results would be solid, fit into a portfolio we were building, and add punch and vitality while being financially viable. The failures were quickly forgotten. It wasn't an emotional experience. It was business.

    Is your organization "failing to fail" at anything? Is it holding onto projects, products, or services that are "dead men walking" from a financial, market, or strategic perspective? Failing to fail isn't a guarantee against failure. In fact, it may be just the opposite.

    Dealing with the Multi-layered Future

    Predicting the future is difficult, but that doesn't keep people from trying. Get it right, and the rewards can be significant.

    One way to improve your odds is to spread your risks. Another is to think hard about the dynamics of the market, which has multiple levels of related activity. By placing bets of varying sizes at various levels, your odds of success climb.

    What do I mean when I talk about "levels"? One of the most informative graphics about these levels comes from Stewart Brand and Paul Saffo and their work on the Clock of the Long Now (the 1999 book remains a classic, in my opinion). In this layered model, fashion is a superficial, fast-moving layer, while commerce regulates fashion as a moderating layer, which itself is regulated by infrastructure. The layers continue through governance and culture, each of which moderates and regulates the levels above it, until we hit the ultimate rate-limiter, nature itself.

    These layers translate fairly directly into business: investments at the fast-moving levels are more cash-intensive and more prone to failure, while the deeper layers -- commerce, infrastructure, and down to nature (commodities and real property) -- tend toward stability. Of course, ask anybody in the petroleum or housing markets, and they'll not feel that even the more fundamental layers are behaving predictably these days.

    Nevertheless, odds are odds, so playing them well can pay off. Warren Buffett is a classic example of enduring success, and his approach fits this narrative. Buffett spreads his risk and tends to look for investments at deeper levels than the fashionable, generally avoiding the glitz and glamour. He likes to find infrastructure or commerce that binds many fashionable trends, or is itself immune to the vicissitudes of trendiness.

    Take, for instance, Buffett's investments during the dot-com bubble of the late-1990s. While investors were chasing IPOs and Silicon Valley stocks with artificially inflated valuations, Buffett was looking at the infrastructure level supporting the Internet -- namely, energy. With more computers, more servers, and more cell phones, demand for electricity was destined to surge. Buffett bought energy companies instead of tech stocks, and had the last laugh when the bubble burst.

    To Buffett, the significance of Silicon Valley at the time wasn't Pets.com or Pointcast -- it was that a lot of new demands on the electrical grid were emerging, making investments in energy smart.

    Sometimes, it seems as if purveyors of the latest fad try to make it look like it operates at a deeper level than it really does. For example, one hallmark of many flash-and-crash dot-coms (and it still holds true to some extent today) is that they revolve around "a big idea" -- that all commerce will move online, or that all data will matter, or that people will value connection over commerce. This helps these ideas masquerade as more fundamental than they really are. While that may attract investors, it also increases the likelihood of failure. Not everything is fundamental. Most things are ancillary.

    Identifying what kind of work you're doing in building your business helps to prevent misadventures, as well.

    In a fascinating piece covering the failed alliance of Facebook CEO Mark Zuckerberg and New Jersey politicians Chris Christie and Cory Booker to rescue Newark's schools, NPR's "Fresh Air" discussed how Zuckerberg learned that it wasn't "the big idea" that would make the difference, but using experts, activating infrastructure, and engaging stakeholders. That lesson led him to a more modest, local approach in his own backyard, one that is apparently working. The effort failed when the problem was framed as an infrastructure upgrade; it worked when recast as emanating from the governance level.

    As I wrote recently, mistaking the superficial changes for fundamental changes in a market can lead to poor decisions. Root causes and fundamental market forces are important to appreciate, understand, and address. Spreading risk into deeper layers girding the more obvious changes may be an approach to consider in seeking growth and sustainability.

    The Non-profit Revenue Tripod -- Risks and Rewards in Equipoise?

    Last fall, I attended a meeting of non-profit executives at which revenue growth was a major topic. You could tell from the conversations over breakfast that some executives came to the meeting with the idea of growing revenues by growing membership, while others favored expanding their meetings to generate new revenues. Still others were drawn to further developing their ancillary revenues (certification programs, services).

    At the outset of the meeting, no one was really talking about growing their publishing business as a way to add revenues, grow membership, or even in relation to improving the other lines. Publishing was something they seemed to take for granted.

    That all changed after a few presentations focused on risk, reward, reliability, and predictability, all in the context of current non-profit, economic, and demographic trends. Three themes emerged:

    • Unlike conferences and meetings, publishing is very unlikely to see a year's revenue projections dramatically derailed by a severe storm. As conversations and questions emerged, each executive seemed to have a story of a major meeting being disrupted by a snow storm, ice storm, or hurricane, causing their organization to bear heavy losses and the long-term consequences of a meeting gone bad (extra marketing, venue reconsiderations, making amends with board members and major sponsors, and long memories). Publishing's relative immunity to the vicissitudes of Mother Nature shelters organizations from a good amount of risk when it comes to reliable revenues.
    • Membership has been a riskier proposition for more than a decade, with the rewards shrinking as well. Again, these executives all pointed to their own experiences with membership growth and associated revenues, experiences which suggested exquisite price sensitivity (requiring heavy discounting of member dues) as well as difficulty appealing to non-members and struggles retaining current members. They also noted that publications -- journals, trade magazines, and online portals -- continue to serve as the primary membership draw, another reinforcement of the publishing value proposition they became more aware of as the meeting progressed.
    • Ancillary revenues that worked tended to resemble publishing revenues -- reliable, immune to weather, vital to the community or individual user, delivering tangential benefits to other categories.

    Non-profits are stamped with a "mission," and often that mission consists of wording and beliefs that have been inherited from the 1950s or 1980s. The related bylaws structuring a governance body to support this mission require leadership to pursue membership or meetings, but not usually publications. This may be a source of some cognitive dissonance -- the organization's history is based on membership and meetings, but current revenues, membership appeal, and sustainability are more often built on the value of publications. The bylaws and mission have created constraints. It has been assumed that running these lines of business is a vital part of the mission, even if there are new risks now built into executing that mission in that way.

    Stepping back and looking at the modern world and how organizations fit into the current and developing economy is a big step, often requiring frank assessment of the mission and not self-congratulatory, winking appraisals. In the framework of a stronger, more relevant mission, updated so it makes sense today, membership and meetings can still play a vital role, but success won't look the same as it did in 1959 or 1979. The dynamics between the three major legs of the non-profit society tripod -- meetings, membership, and publications -- have changed.

    Some organizations have taken this step back and completed this examination, benefiting tremendously. I know of one large non-profit that recently completed a deep, strategic look -- which was not without some pain -- and is seeing its membership grow for the first time in decades, while also growing its publishing and ancillary activities. It was an act of reinvention and simplification. It was an acknowledgement that it's not 1990 anymore . . .

    "Rinse and repeat" doesn't work in the current distributed, digital, dynamic information and personal services economy. Customers are facing a new reality, and responding as anyone would. New strategies have to be articulated, old habits re-examined, and new ventures used to probe for growth opportunities. Positions in the market, articulated decades ago, may no longer apply. The tripod we assumed would stand forever in equipoise on a flat surface may need to be rebuilt for steeper terrain.

    Who Holds the Cards in the Digital Advertising Game?

    For a number of academic publishers, online advertising is a viable, if peculiar, line of business. For others, online advertising is viewed as an interesting possibility in a market with many constraints on growth.

    Digital advertising sprouted in strange soil, and has developed in ways that continue to make it a difficult revenue line. Back in the late-1990s, the siren song of digital advertising was around the concepts -- still valid and valuable, by the way -- of targeting and personalization. It gained many attributes of direct mail marketing, with response rates represented by click-through rates (CTRs). Advertisers could finally know how many people were clicking on their ads, track advertising through to conversion, and calculate a return on investment (ROI) for their advertising spend. This promised to be a huge improvement over the past, which gave us the memorable thought, "Advertisers know that half of their advertising doesn't work. They just don't know which half."
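    Those direct-mail-style metrics are simple arithmetic. A quick sketch with hypothetical campaign figures (every number below is invented for illustration) shows how the promise was supposed to work:

        # Hypothetical campaign: all figures are invented for illustration.
        impressions = 500_000
        clicks = 1_250
        conversions = 40
        cpm = 4.00                       # assumed dollars per 1,000 impressions
        revenue_per_conversion = 180.00  # assumed value of a converted customer

        ad_spend = impressions / 1_000 * cpm
        ctr = clicks / impressions                # click-through rate
        conversion_rate = conversions / clicks
        revenue = conversions * revenue_per_conversion
        roi = (revenue - ad_spend) / ad_spend

        print(f"Spend ${ad_spend:,.2f} | CTR {ctr:.2%} | "
              f"Conversion {conversion_rate:.2%} | ROI {roi:.0%}")
        # Spend $2,000.00 | CTR 0.25% | Conversion 3.20% | ROI 260%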

    Executing on this promise turned out to be complicated, and the main beneficiaries of this complexity thus far have been the advertising agencies, who intermediate between buyers and sellers.

    Understanding this sheds some light on the overall digital landscape.

    Technology purveyors in the digital advertising space seek what technology purveyors in any space seek -- scale. The larger the number of users on your platform, the more valuable it is, and the more features you can add at marginal cost. The economies of scale multiply, and dominance can follow, leading to long-term, high-return business advantages.

    Riding on the back of these large-scale platforms has led to a fragmentation of the market between high-quality, well-managed sites and lower-quality, loosely managed competitors. The platform providers, however, generally do not see this bifurcation for what it is -- they only see more scale -- while the agencies are torn between quality and quantity, leaving content providers straining to cater to both demands at once.

    Scale is so sought after by the platforms that fraud has become an open secret in the world of digital advertising. This may actually play to the benefit of professional, well-run digital sites like those scholarly and scientific publishers generally run, as a recent essay by John Battelle explains. Venues free from fraud are going to become more valuable as quality becomes a greater consideration. Scale only works to a point, and then the pendulum swings back to quality. However, scale often leads, while quality consolidates its wins.

    The scale of digital advertising is also more constrained than many people think, even some advertising professionals and platform developers. As Battelle writes:

    Over and over, I hear that the reason CPMs (the amount of money a marketer is willing to pay for one thousand advertising impressions) are so low is because "there's infinite inventory." Hogwash. There's only so much time in the day, and only so many pages where actual human beings are really paying attention, and the web (including mobile) is growing at a finite pace. There are even fewer places where marketers can be assured of quality, engagement, and appropriate context.

    Advertising agencies are generally not as consolidated as the platforms, and they hold a different position in the market -- negotiating between distribution outlets and advertisers. In the online negotiating game that has emerged, the squeeze is on the distribution outlets. The complicated promise of digital advertising -- metrics, targeting, testing, and personalization -- falls generally on the publisher and its sales/support staff, while agencies simply dictate and negotiate the terms. As the costs of catering to more and more agency and industry demands increase, CPMs will also need to rise to absorb them. More cards may be shifting to the distribution outlets and publishers.

    Trends from the consumer side may tilt the table further toward distributors and publishers, if they can take advantage of them. A recent op-ed from Richard Reeves, the Managing Director of the Association of Online Publishers, cites the rise of ad blockers (a topic I touched on recently, as well) as an opportunity for publishers to re-engage with their audiences and find a mutually agreeable solution. Some publishers have been brusque with their users -- the Washington Post memorably put up a content barrier when an ad blocker was detected, forcing users who wouldn't allow ads to register with the site instead. As Reeves writes:

    Accepting that ad blockers are now a part of the advertising ecosystem is vital to moving forwards. Publishers, advertisers, creative and buying agencies, and technology providers need to join forces - not necessarily to fight ad blockers - but to investigate how to harness the technology and ensure it eventually becomes a positive contributor to the industry.

    For the time being, as digital advertising continues to scale, it remains a "buy side" game, and consumers' only recourse is to block ads entirely. But the sands are shifting, putting more points of leverage into the hands of digital advertising venues with quality inventory, strong site management practices, reliable deliverability, and strong reputations. These players -- and consumers in general -- could feel the benefits of a shift to the "sell side" sooner than many expect. But the issues need to be discussed, and the story needs to be told. And it may be time for publishers to look anew at the cards they hold.

    Are We in a Digital Transformation? Or a Customer Transformation?

    We often hear about the digital transformation of professional societies and their publishing, membership, and meetings programs. Beneath this digital veneer, the real action is not "digital" but resides far closer to the customer -- the people you and your organization are trying to reach and retain.

    Take, for example, the professional conference or annual meeting of a professional society. Over a period of decades, meetings moved from regional to national, from social to scientific to commercial, swept along by changes that we realize, with hindsight, were dependent on multiple demographic, technological, and commercial realities rolling across the meetings landscape, sometimes in convergence, sometimes not. Relatively cheap, fast, and reliable air travel was certainly an enabling technology for national society meetings, but discretionary time, discretionary income, and institutions that reinforced professional membership participation certainly contributed.

    Today, discretionary time has diminished a great deal, discretionary income levels are far lower than in the past, and fewer institutions offer incentives that reinforce membership or affiliation in a meaningful way.

    But if you go long enough without attending a professional meeting, you begin to feel a disconnection. A recent New Yorker article by Nathan Heller, entitled "Air Head," outlines some sources of this unease -- namely, the difference between being fed information and what he calls "encounter thinking":

    The battle between jet planes and smartphones isn't about speed and glamour. It's about ways of knowing. . . . When physical travel cedes to digital exploration, a certain style of discovery falls away. . . . The stories you uncover through your smartphone are stories, basically, asking to be found. . . . There is a moment in "The Big Short," the new film based on Michael Lewis's reporting on the mortgage crisis, in which Steve Carell's character, a querulous hedge-fund manager, wants to figure out whether to bet against the market. It is 2007. The wisest analysts in New York and Washington are sanguine. He gets on a plane, flies to Florida, and meets some people on whose mortgages the market rides. He finds abandoned homes, loans for the unemployed, and a pole dancer who owns many homes. Encounter thinking, our response to the exceptional, saves us from the errors of consensus and the expectations of smooth process . . .

    If improving meetings attendance involves improving the yield of "encounter thinking," then meeting strategies start to clarify. Imagine a meeting consisting of real surprises (keynotes unannounced but trusted to be worth seeing, with no option to watch after the event), heavy networking sessions structured for maximum results, and more community interactions geared to helping people make or sustain connections. In the society space, stagnant meeting formats are not a trivial part of the meetings problem.

    But there are other forces at work causing professionals to disassociate from membership societies and their meetings.

    One nearly invisible factor has robbed professionals at all levels of discretionary time. This thief does not emanate from digital technology. It's more prosaic. It's your local transportation system, and the long daily commutes that come from its inadequacies. In the US and other developed countries, professionals at all levels -- doctors, lawyers, scientists, administrators, professors, researchers -- can be stuck in traffic for 1-3 hours per day in major metropolitan areas. Commutes are also more sedentary -- more driving and sitting, rather than walking or cycling to work -- so many professionals compensate by spending additional hours at the gym. The net result is that discretionary time to spend with publications and membership organizations has dropped off, an unseen factor we are all grappling with. It affects marriages, child rearing, and community involvement as well.

    How is this broadly felt change affecting social connections and professional meetings? Many successful new approaches involve creating opportunities for social and professional gatherings immediately after work, so that professionals can avoid rush hour and spend time with colleagues or friends. From social game clubs to "pub nights," this timeshift seems slight -- away from dinner meetings to happy hours -- but it's important. It also carries a more casual atmosphere promising more fun and excitement, something that increases the appeal when compared to sitting in a car or on a train for an hour or more.

    Other demographic shifts are also changing society fundamentally -- higher tuitions have diverted significant household income toward paying for college. In 1960, families spent 2% of household income on child care and education. By 2013, that number had swelled to 18%. With both parents working and caught in long commutes, daycare costs sap funds during the preschool years, and then college tuition comes along later to deplete savings when the adults are at peak earning potential.

    How do these pressures manifest themselves? More acute price sensitivity, lower loyalty from year to year, and reluctance to spend money on non-essential items. These trends all bear directly on the success of a professional society. Unfortunately, current economic and social realities diverge strongly from how most professional societies are positioned in the world. Bylaws and structures established 50-75 years ago don't reflect the modern world, its inhabitants, and their needs. The underlying mental model doesn't jibe.

    Publishers can have blind spots about customer needs, some of them pretty glaring. When it comes to digital, it's easy to succumb to seeing customer changes as emanating from the proliferation of technologies, but really the technologies are useful (or not) depending on the customers. Why is ORCID becoming more useful? The technology has certainly evolved, but the main factor is that authors are now grasping the value of identity disambiguation as a way to receive full credit for multiple contributions. And this eagerness for full credit in the scholarly sphere is driven by the heightened competition for scarce funding, which leads to academic and career success. The technology supports the need, and is becoming more valuable because customers are changing.
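    As a toy illustration of what disambiguation buys an author (all names, identifiers, and titles below are placeholders, not real records), grouping contributions by a persistent identifier rather than by a name string pulls scattered credit back together and keeps different people with similar names apart:

        from collections import defaultdict

        # Placeholder records: the same researcher appears under several name variants,
        # but each record carries the same identifier.
        works = [
            {"author": "J. Smith",      "orcid": "0000-0002-1825-0097", "title": "Paper A"},
            {"author": "Jane Smith",    "orcid": "0000-0002-1825-0097", "title": "Paper B"},
            {"author": "Jane Q. Smith", "orcid": "0000-0002-1825-0097", "title": "Dataset C"},
            {"author": "J. Smith",      "orcid": "0000-0003-4757-0001", "title": "Paper D"},  # a different person
        ]

        by_name = defaultdict(list)
        by_orcid = defaultdict(list)
        for w in works:
            by_name[w["author"]].append(w["title"])
            by_orcid[w["orcid"]].append(w["title"])

        print(len(by_name), "distinct name strings")   # 3 -- one person's credit is fragmented
        print(len(by_orcid), "distinct identifiers")   # 2 -- contributions correctly grouped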

    The list goes on. ResearchGate, Mendeley, and Academia.edu don't have any truly proprietary technology, but have arrayed some standard technology in a way that users find valuable.

    Behind any successful technology -- print, online, mobile, broadcast, streaming -- is a satisfied customer. There are many more failures that have used the same technologies yet never satisfied enough customers to survive.

    What is the market positioning today that will lead to long-term differentiation and success? Whatever your answer turns out to be, it's safe to say the successful approaches will be based in understanding today's customer and human realities. If "digital" is involved, its involvement will be incidental.

    Do You Know Your Risk Profile?

    The concept of "risk" has been one of the most useful abstractions for organizations everywhere. It captures uncertainties, changes in fortune or competitive sets, macroeconomics, and organizational preparedness. The concept of risk management originated in the 1960s with a Canadian Rhodes scholar, Doug Barlow, who asserted that "all management is risk management."

    A decade ago, the risks around digital transformation may have been at the top of the list. After the Great Recession of 2008 and the austerity measures that followed, economic growth became the central challenge.

    Today, the risks have morphed yet again.

    No matter what kind of organization you're in or run, risks accumulate from a number of sources -- financial, strategic, personnel, competitive, technology, governance, and product development. Looking across these effectively requires a clear view, because risks that become too familiar can stop registering at all.
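    One simple way to keep familiar risks from fading out of view is to hold them in a structured register and rescore them on a regular cadence. Here is a minimal sketch (the categories, scales, and entries are illustrative assumptions, not a prescribed methodology):

        from dataclasses import dataclass

        @dataclass
        class Risk:
            category: str      # e.g., financial, strategic, technology, governance
            description: str
            likelihood: int    # 1 (rare) to 5 (almost certain) -- assumed scale
            impact: int        # 1 (minor) to 5 (severe) -- assumed scale
            mitigation: str

            @property
            def score(self) -> int:
                return self.likelihood * self.impact

        register = [
            Risk("financial", "Revenue concentrated in one licensing customer", 3, 5,
                 "Diversify licensing deals; model loss scenarios"),
            Risk("technology", "Aging submission platform nearing end of support", 4, 3,
                 "Budget replacement; negotiate extended support"),
            Risk("competitive", "New entrant targeting core member segment", 2, 4,
                 "Monitor quarterly; refresh member value proposition"),
        ]

        # Review the highest-exposure items first, so familiar risks can't quietly fade.
        for r in sorted(register, key=lambda r: r.score, reverse=True):
            print(f"{r.score:>2}  [{r.category}] {r.description} -> {r.mitigation}")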

    Crafting a risk assessment is a high-yield way to understand the risks an organization is facing at a particular point in time. Commissioning an independent assessment that includes mitigation strategies can move an organization forward in important ways, and is often a stepping stone toward a strategic refresh.

    That's why we're pleased to announce our Risk Assessment and Mitigation Plan package.

    Risk assessments can deliver great results for executive leadership, senior management, and line managers. Risk assessments capture known risks while unearthing hidden or poorly articulated risks. Tying the assessment to a mitigation plan moves the assessment into a proactive mode, so that leadership can grapple effectively with risks like customer concentration, new competitors, missed opportunities, technology gaps, management misalignments, product inadequacies, or skills gaps.

    For a fixed price, we'll dig in and deliver within 30 days an insightful and useful risk assessment and mitigation plan. Consisting of brief interviews with senior management, reviews of relevant documents and reports, and comparisons to cognate organizations, our approach can kickstart important strategic and tactical discussions for your organization.

    If you'd like to learn more, email us at: contact@caldera-publishing.com.

    Crossing Route -- Will Cognitive Detraining Make AI Win Sooner?

    I was recently (and belatedly) reading the 2016 Industry Trends report from Outsell. In addition to a lot of great data and takeaways, it included a tantalizing and slightly unsettling thought -- namely, that habituating ourselves to shorter information tidbits and residing within a fragmented, noisy knowledge sphere are leading to a general decline in our cognitive skills and stamina, a detraining of concentration. Beyond the direct effects, this phenomenon may also affect the rate at which artificial intelligence (AI) overtakes human intelligence, because the gap between the two is closing from both ends. We're on the decline, and AI is on the rise.

    There is a lot to unpack here. First, it is a stark reminder that the human brain is not something you simply have, but something you develop. How you train it and feed it matters. There's a great passage in a book called "How Not to Be Wrong," in which the author, Jordan Ellenberg, addresses the common complaint among school kids that they're never going to use what they learn in school in their everyday adult lives. Ellenberg argues that this misses the point.

    Just as a tennis star doesn't use weight-lifting, wind sprints, healthy eating, and stretching exercises during a match, an educated person doesn't directly use quadratic equations or memorized dates and events at work. But the training and conditioning from those preparatory activities make a difference in the end. Having lifted weights and eaten carefully gives a tennis player an edge over a weaker player with a lousy diet, just as a person who has trained her mind to solve quadratics and memorize facts will handle intellectual challenges better than a person who slacked off. You don't use the wind sprints or quadratic drills themselves later, but you do leverage the benefits they deliver.

    Every time we sneak a peek at our smartphones during a movie, we are detraining our brains from being able to sustain moderate-to-intense concentration for two hours. Every time we look away from writing a long document to check social media or email, we are detraining our minds for extended workloads.

    Now, we are human, and we need breaks from work. Research has shown that taking a break helps knowledge workers, because the human brain is complex -- hot showers and long walks deliver better blood supply, which helps thinking; resting the conscious brain can help the limbic brain churn out answers in our multi-tier cranial system, leading to that classic "aha!" moment.

    Another aspect of this thought is that if we're lowering the bar AI has to clear, then we're not going to get AI that's as impressive or helpful as it could be.

    All in all, I found this a sobering reminder that while our intellectual habits are trending toward burst communication, attention fragmentation, and short attention spans, we can benefit from working out with extended mental concentration. It's not easy and the payoffs are indirect, but the benefits will likely come.