Zeno's Paradox, "Free" Content, and Retail Publishing

"Free" is a price. Research has shown that "free" is an especially powerful price or label, changing behavior more dramatically than most other factors. Synonyms currently in vogue for "free" include "open" and "public." It's unclear if these terms have the same market power as "free."

Providing free content to readers has been a disruptive pricing model on the Internet. If you want evidence of this, ask a newspaper publisher -- assuming you can find time with them as they scramble to put the "free" genie back in the bottle. "Free" drained newspaper coffers, caused newspapers to collapse, led to the layoffs of hundreds of journalists, and shifted the news market toward new funding approaches. 

The initial hope of "free" newspaper content -- the business gamble -- was that advertising payments would balloon in pursuit of the huge audiences "free" would spawn, thereby surpassing the value of subscriber payments, and leading to a big payoff. That didn't work, and now newspapers are busy building and refining paywalls for the networked economy.

In any case, because traffic is the lifeblood of any online business model, we now face click-bait headlines everywhere, today's equivalent of the "yellow journalism" of the late 19th and early 20th centuries -- lurid, distasteful, and hard to ignore.

Other "free" content sources also have business models working behind the veneer of "free," and they are just as dependent on traffic.

Wikipedia is free to use, yet it has a successful business model -- fundraising. In its 2014-15 fiscal year, the Wikimedia Foundation (the parent) raised $75 million, its best year ever. The majority of these funds -- more than 60% -- came from the United States. Donations were smaller per donor, but more people donated. Of course, fundraising is a legitimate and time-tested business model, as NPR can attest. Because Wikipedia depends on individual donors, traffic is a key variable to its fundraising success.

"Free" has a few odd qualities. For instance, it is a condition enjoyed by as few as one party in the exchange of value. For NPR listeners who donate, NPR is not free. It could be that everyone who listens to NPR has donated, all save one person. This serves to illustrate that NPR is only free to those listeners who do not donate. The same goes for Wikipedia and its users. I've donated multiple times to both, so NPR and Wikipedia will never again be strictly "free" to me. I'm part of their paid base.

Similarly, in academic publishing, a Gold OA journal is no longer "free" to the authors of a paper published in that journal. In the same way that NPR is never free to me because I donate regularly, an OA journal is never strictly "free" to any author who has paid an APC. A mega-journal like PLoS ONE or Scientific Reports can never be strictly "free" to the thousands of researchers who have paid to publish there.

Granted, the share of NPR's costs for which I'm responsible is trivial, and constantly eroding, but it only approaches zero and cannot mathematically attain zero. It is a Zeno's paradox of payments -- the tortoise and Achilles of cash.
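
To make that arithmetic concrete, here is a minimal sketch of the shrinking-but-never-zero share. All figures are invented for illustration; they are not NPR's or Wikipedia's actual numbers.

```python
# A toy illustration of the "Zeno's paradox of payments": my share of an
# organization's support approaches zero as the donor base grows, but for
# any finite number of donors it never actually reaches zero.
# All figures below are invented for illustration.

my_donation = 100.0  # what I gave, in dollars

for donors in [10, 100, 1_000, 10_000, 100_000, 1_000_000]:
    total_support = donors * 100.0          # assume everyone gives $100
    my_share = my_donation / total_support  # my fraction of total support
    print(f"{donors:>9,} donors -> my share = {my_share:.6%}")

# The share printed on the last line is tiny, but it is still greater
# than zero -- which is the whole point of the paradox.
```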

This is the paradox at the heart of every business model, and it is why acquiring new customers is so critical for long-term success. Without a subscription model, publishers seeking to earn money have to move to the retail model -- asking customers to purchase something occasionally at a point of sale. And traffic is key to this retail model, in which NPR, Wikipedia, online newspapers, and a growing number of journals all participate.

For OA publishers, acquiring authors at a sufficient pace to maintain cash flows is their main financial challenge as retail publishers. They offer author services, which compete on price, convenience, and reputation. Authors shop these outlets, a retail experience. PeerJ attempted to move away from a retail model with a membership model, but it has since added a more customary per-paper APC model, signaling a move back into retail. 

The subscription and membership models solve this retail paradox to a large degree through renewals. NPR strives to solve it by getting donors on monthly payment plans, and Wikipedia tries to solve it by optimizing its donation forms via A/B testing and other approaches.
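
As a hypothetical illustration of the kind of A/B testing a donation form might undergo, the sketch below compares conversion rates for two invented form variants; the variant names and counts are made up, not Wikipedia's data.

```python
# Hypothetical A/B comparison of two donation-form variants.
# Visitor and donation counts are invented for illustration only.

variants = {
    "A (current form)": {"visitors": 50_000, "donations": 1_150},
    "B (shorter form)": {"visitors": 50_000, "donations": 1_340},
}

for name, stats in variants.items():
    rate = stats["donations"] / stats["visitors"]
    print(f"Variant {name}: {rate:.2%} conversion")

# A real test would also check statistical significance before declaring
# a winner; this sketch only shows the basic conversion-rate comparison.
```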

New models of publishing, which have more in common with retailing, may want to look at how retail outlets for information, goods, and services -- Amazon, Netflix, iTunes, Sam's Club, Costco, and others -- have solved the payment paradox. Blunting the pressures of full-on retailing may be a major reason the subscription model is more popular, and used by more businesses, than ever.

Do Your Financials Make Sense?

Asking a start-up about its business model is a classic "wet blanket" move at conferences. It's an absolutely critical question, but within it are better questions that deserve asking -- questions that established businesses often fail to ask.

For instance, new entrants are typically cast in one of two market approaches -- wide or deep.

In the "wide" case, wide adoption is necessary for the business to succeed. A good example is a general search engine or a browser. Both need lots of customers, and getting those customers tends to tilt the market in the long-term favor of the winning firm. The intrinsic nature of the model when purveyed for browsers means end users don't pay, but people pay to access customers through the reach of the major "wide" players. The contracts to be featured as the main search engine are very lucrative, and other bidders pay to be featured services. But wide adoption is a condition of success.

In the "deep" approach, a few customers deeply engaged can make a business work. Bloomberg Terminals provide a good contrast. Comparatively few people use them, but the value of the product is so high to those few that they are willing to pay a lot. The company has to keep the value high, and lock-in is a powerful element of ongoing success. Those few customers have to be deeply engaged and loyal.

In either case, cost-per-acquisition is a critical metric to consider, as it impacts growth potential, cash flows, and the likelihood of having enough money to enter the market effectively. If you are pursuing a "wide" strategy, cost-per-acquisition usually has to be as close to zero as possible. For a "deep" strategy, cost-per-acquisition can be much higher.

I've heard of some publishers who have launched "wide" play products or services with a marketing budget based solely on direct expenses for promotion and creative. Yet, when the cost-per-acquisition is multiplied by the expected number of customers, the required spend far exceeds what has been set aside. In cases like this, the project is likely doomed, or will at least contain some unpleasant financial surprises. Add to this a rolling expense for retaining current and incoming customers, and the expense level can truly astound the unprepared. With a "deep" strategy, in which customers pay a good amount, can be upgraded at a predictable rate, and stay for a long time, the cost-per-acquisition can be much higher. But it still has to be calculated, and it has to fit into the overall financial model.
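
A back-of-the-envelope check like the one below is enough to expose that gap between a promotion-only budget and the true cost of a "wide" customer target. Every number here is hypothetical.

```python
# Back-of-the-envelope check: does the marketing budget actually cover the
# customers a "wide" launch plan assumes? All figures are hypothetical.

target_customers = 50_000       # adoption the business plan assumes
cost_per_acquisition = 12.00    # estimated dollars to acquire one customer
retention_cost_per_year = 3.00  # ongoing dollars to keep each customer
budget = 150_000.00             # what was set aside for promotion and creative

acquisition_spend = target_customers * cost_per_acquisition
first_year_retention = target_customers * retention_cost_per_year
required = acquisition_spend + first_year_retention

print(f"Required in year one: ${required:,.0f}")
print(f"Budgeted:             ${budget:,.0f}")
print(f"Shortfall:            ${required - budget:,.0f}")
# With these numbers, the plan is short by $600,000 -- the kind of
# unpleasant surprise a promotion-only budget hides.
```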

The trajectory for cost efficiency is also important to take into account. Will new customers become more expensive to acquire as you move from the core market to peripheral markets? Will technology become cheaper to build and maintain as adoption grows? Or will the opposite occur? In many cases, core customers are the cheapest and easiest to acquire, so have the lowest cost-per-acquisition, and because they are the easiest to identify, technology costs are lower (e.g., fewer databases to integrate, less data cleanup, smaller databases overall). Going into secondary and tertiary markets to pursue the "wide" strategy can increase expenses unexpectedly. Some customers aren't worth acquiring, or keeping. Do you know who they are?

In scholarly publishing, too often non-financial measures become the currency driving decisions. Will this increase our impact factor? Will we attract more papers? But focusing on non-financial measures, while useful to a degree, should come after the financial costs and benefits are well-understood. What good is a higher impact factor if it breaks the bank?

Keeping an Eye on Macroeconomics

In 2008, many academic institutions and non-profits were reeling after the global economy was shaken by the mortgage crisis and other macroeconomic problems. Investment funds and endowments suddenly lost huge swaths of value, customer accounts became uncertain, and future planning stopped in its tracks briefly.

Now, eight years later, storm clouds are gathering again, with the first rumbles of thunder coming from China. An exaggerated view of the risks of falling oil prices has driven stocks down prematurely, adding to a prevailing skittishness. There are other strains. For instance, a strong US dollar means problems for other global economies as accessing the world's most lucrative market becomes costlier. Productivity gains are also predicted to be lower in the near-term. And if we needed any reminder that we're in a global system, the sad and surprising outbreak of Zika virus serves notice, as it has rapidly consumed attention and shaken economies in the Americas.

The Internet has made the publishing economy much more immediately global -- and both terms matter, as being "delayed global" offered the buffer of time. Authors, customers, and markets aren't everywhere yet, but more business crosses more borders than before, and ramifications are more quickly felt, even anticipated. Asia, North America, and Europe remain the primary markets for academic publishers, but South America, the Middle East, and India are more important than before. For authors, the world is even more diverse, especially as various collaborative groups form and splinter and reform.

It's important not to lose touch with macroeconomic opinion. Crises are as much material as mental, and anxieties travel in packs. Markets will become more anxious as more evidence of trouble emerges or is highlighted. Political leaders are beset in many markets, adding to worries. And the US is not fielding a set of candidates that reassures world markets, with many candidates voicing protectionist agendas and some appearing deeply unhinged.

What is a business leader to do in times like this? It depends on market and internal organizational dynamics. It may be a good time to refocus on the US market, pulling back from other markets until the storm passes. Perhaps it's time to reinvest in infrastructure, as providers may be more willing to cut deals to get your business. It may be time to think about personnel, as the job market may soon have slightly more eligible recruits to consider. It may be time to develop new products for the inevitable upswing, as resources, time, and vendors will be more available now than later. Or it may be time to simply hunker down and wait it out.

No matter your choice, keep an eye on the economic climate. It's changeable, opinions can become self-fulfilling prophecies, and every business is more connected to the global economy. Plan for it.

Empathy, Convenience, and the Modern Publisher

Customers don't know what they want. Because of that, if you're looking to develop viable new revenue-growth strategies, you can't just ask them what they want. As Henry Ford is often credited with saying, "If I had asked people what they wanted, they would have said faster horses."

In order to move beyond "faster horses," you have to bring empathy, attention, and imagination to your customer interactions.

Empathizing with customers often begins with thinking about what's more convenient for them -- or what will be. Some people narrow "convenience" down to "saving time" for the customer, but time savings is only one form of convenience. Portability is another. Reliability is yet another. It can even be convenient to have something really cool (Apple Watch, anyone?). In a competitive world, it can be convenient to have a conversation-starter. In short, there are many ways things can be made convenient. Empathy about the context and intention of the user helps identify the best questions and zero in on the most viable answers.

Netflix is a perfect example of a company that has worked hard on honing their convenience model. Their DVD rental service was able to crush the life out of Blockbuster because it afforded customers four major forms of convenience -- no trip to the store to rent or return; no need to actively pay except for the first transaction; no humiliating public shaming at the hands of a teenage clerk imposing a late fee; and no fear of coming up empty after a trip to the strip mall. Netflix's DVD rental service was convenient because it let users be more passive and keep their pride. It may not have saved much time, but it was less stressful and less taxing.

Looking beyond the DVD, Netflix has continued to cater to convenience, making its streaming service available through most devices (Apple TV, most wi-fi-enabled Blu-ray players and TVs, online). It has allowed multiple users per account. It has retained the convenience of its subscription payment model. The Netflix platform was so purpose-built for convenient watching that it has spawned new vocabulary -- binge-watching. Netflix has catered to our love of convenience.

Convenience has become table stakes for customer acquisition and retention. Netflix has leveraged its reputation for convenience into original programming wins, as have Amazon and Hulu. Google dominates through convenience and reliability on at least two fronts -- search and Google Maps (sorry, Apple Maps, but you blew it). Google continues to bring clever insights to its search interface, including restaurant traffic graphs (so you can see when your favorite place is busiest and plan accordingly) and other empathy-informed techniques.

Publishers often begrudge the PDF as a technology from yesteryear, and look forward to what's next. "What's next" should be driven by empathy for the information user, not a desire for new tech. The PDF endures because it is portable, reliable, and standard. It's not sexy. It's not particularly cool or exotic. But it works, it's easy to save and search and share, and it's standard. The bar for its substitute is informed by these traits, at least. Perhaps there is something better, but we'll only find it by making something more convenient than the PDF. Can we?

PLoS has been a major innovator because of its convenience. For many authors, PLoS ONE is the second- or third-choice venue after an initial submission elsewhere. It saves time, but it also provides a decent brand and impact factor, especially when compared to many specialist journals. And it allows researchers to move on from incremental research findings.

As an industry, publishers can seem far from empathic. At a recent meeting, an audience member asked, "How do you make customers use something you've developed?" Empathy is the opposite of this -- it is, "How do you develop something customers want to use?"

Did Henry Ford have empathy? Certainly. He paid his workers well so they could afford his cars. That's empathy and business together. Based on that, let's give Henry today's final word:

"If there is any one secret of success, it lies in the ability to get the other person’s point of view and see things from that person’s angle as well as from your own."

The Failure to Fail

Yahoo!'s recent woes are an interesting business story, one I was reminded of after participating in a panel discussion about technology and innovation at the 2016 PSP Annual Conference in Washington, DC.

During the session, Chris Kenneally from the Copyright Clearance Center asked the panelists (Sarah Tegen from the American Chemical Society and Phil Faust from Gale/Cengage also participated) to name a failure that had been instructive, and how each person moved forward from there, all in the context of the Silicon Valley meme of "fail fast."

When it was my turn, I recalled the experience early in my career of working for a small publishing company led by a CEO who believed (quite rightly) that it was faster and cheaper to put a product on the market with clear expectations and measurements, see how it performed, and if it failed to meet expectations, pull it down. The company became very good at launching, measuring, and either building on success or avoiding costly failures. In short, this organization didn't fail to fail -- that is, it failed a product when it needed to. It didn't keep struggling products around out of sentimentality or false hopes.

"Failing to fail" may be a key type of failure dogging a number of organizations these days.

Back in this early career incubator, the comfort with "failing fast" bred a culture of experimentation and clear-eyed measurement that was reassuring and energizing. Most people were involved in a product launch at some point each year, so you didn't get rusty or bored. Market research was done quickly and with an eye toward key metrics focused on a quick market entry and upside exploration. Investments were relatively small, and the spending and results were known to all, because measuring results was key to knowing when to pull the plug or build on a winner.

The organization succeeded about 30% of the time with product introductions. Because the bets were small on the losers and the plug was pulled quickly, the winners more than offset the expense of small failures. Sometimes, the organization would hit a home run, and have a crown jewel. Other times, the results would be solid, fit into a portfolio we were building, and add punch and vitality while being financially viable. The failures were quickly forgotten. It wasn't an emotional experience. It was business.
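
A rough portfolio calculation shows why a roughly 30% hit rate can still pay, so long as the losing bets stay small and get killed quickly. The figures below are invented for illustration; they are not the actual numbers from that organization.

```python
# Why a ~30% hit rate can work: small, quickly-killed losses against a few
# larger wins. All figures are invented for illustration.

launches = 10
hit_rate = 0.3
cost_per_launch = 50_000         # small, capped bet per product introduction
avg_return_per_winner = 400_000  # lifetime contribution of a product that works

winners = round(launches * hit_rate)
losers = launches - winners

total_cost = launches * cost_per_launch
total_return = winners * avg_return_per_winner

print(f"{winners} winners, {losers} quickly-killed losers")
print(f"Spent ${total_cost:,}, returned ${total_return:,}")
print(f"Net: ${total_return - total_cost:,}")
# Net is positive because the losers were cheap and the plug was pulled fast.
```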

Is your organization "failing to fail" at anything? Is it holding onto projects, products, or services that are "dead men walking" from a financial, market, or strategic perspective? Failing to fail isn't a guarantee against failure. In fact, it may be just the opposite.

Dealing with the Multi-layered Future

Predicting the future is difficult, but that doesn't keep people from trying. Get it right, and the rewards can be significant.

One way to improve your odds is to spread your risks. Another is to think hard about the dynamics of the market, which has multiple levels of related activity. By placing bets of varying sizes at various levels, your odds of success climb.

What do I mean when I talk about "levels"? One of the most informative graphics about these levels comes from Stewart Brand and Paul Saffo and their work on the Clock of the Long Now (the 1999 book remains a classic, in my opinion). In this layered model, fashion is a superficial, fast-moving layer; commerce regulates fashion as a moderating layer, and is itself regulated by infrastructure. The layers continue through governance and culture, each of which moderates and regulates the levels above it, until we hit the ultimate rate-limiter, nature itself.

These layers translate fairly directly into business: investments at the fast-moving levels are more cash-intensive and more prone to failure, while the deeper layers -- commerce, infrastructure, and down to nature (commodities and real property) -- tend toward stability. Of course, ask anybody in the petroleum or housing markets, and they'll tell you that even the more fundamental layers aren't behaving predictably these days.

Nevertheless, odds are odds, so playing them well can pay off. Warren Buffett is a classic example of enduring success, and his approach fits this narrative. Buffett spreads his risk, and tends to look for investments at deeper levels than the fashionable, generally avoiding the glitz and glamour. He likes to find infrastructure or commerce that binds many fashionable trends together, or is itself immune to the vicissitudes of trendiness.

Take, for instance, Buffett's investments during the dot-com bubble of the late 1990s. While investors were chasing IPOs and Silicon Valley stocks with artificially inflated valuations, Buffett was looking at the infrastructure level supporting the Internet -- namely, energy. With more computers, more servers, and more cell phones, demand for electricity was destined to surge. Buffett bought energy companies instead of tech stocks, and had the last laugh when the bubble burst.

To Buffett, the significance of Silicon Valley at the time wasn't Pets.com or Pointcast -- it was that a lot of new demands on the electrical grid were emerging, making investments in energy smart.

Sometimes, it seems as if purveyors of the latest fad try to make it look like it comes from a deeper level of experience. For example, one hallmark of many flash-and-crash dot-coms (and it still holds true to some extent today) is that they revolve around "a big idea" -- that all commerce will move online, or that all data will matter, or that people will value connection over commerce. This helps these ideas masquerade as being more important than they really are. While that may attract investors, it also increases the likelihood for failure. Not everything is fundamental. Most things are ancillary.

Identifying what kind of work you're doing in building your business helps to prevent misadventures, as well.

In a fascinating piece covering the failed alliance of Facebook CEO Mark Zuckerberg and New Jersey politicians Chris Christie and Cory Booker to rescue Newark's schools, NPR's "Fresh Air" discussed how Zuckerberg learned that it wasn't "the big idea" that would make the difference, but using experts, activating infrastructure, and engaging stakeholders. That lesson led him to a more modest, local approach in his own backyard, one that is apparently working. The effort failed when framed as an infrastructure upgrade. It worked when recast as emanating from the governance level.

As I wrote recently, mistaking the superficial changes for fundamental changes in a market can lead to poor decisions. Root causes and fundamental market forces are important to appreciate, understand, and address. Spreading risk into deeper layers girding the more obvious changes may be an approach to consider in seeking growth and sustainability.

The Non-profit Revenue Tripod -- Risks and Rewards in Equipoise?

Last fall, I attended a meeting of non-profit executives in which revenue growth was a major topic. You could tell from the conversations over breakfast that some executives came to the meeting with the idea of growing revenues by growing membership, while others favored expanding their meetings to generate new revenues. Still others were drawn to further developing their ancillary revenues (certification programs, services).

At the outset of the meeting, no one was really talking about growing their publishing business as a way to add revenues, grow membership, or even in relation to improving the other lines. Publishing was something they seemed to take for granted.

That all changed after a few presentations focusing on the concepts of risk, reward, reliability, and predictability, all in the context of current non-profit, economic, and demographic trends. 

  • Unlike conferences and meetings, publishing is very unlikely to see a year's revenue projections dramatically disrupted by a severe storm. As conversations and questions emerged, each executive seemed to have a story of a major meeting being disrupted by a snow storm, ice storm, or hurricane, causing their organization to bear heavy losses and the long-term consequences of a meeting gone bad (extra marketing, venue reconsiderations, making amends with board members and major sponsors, and long memories). Publishing's lack of vulnerability to the vicissitudes of Mother Nature shelters organizations from a good amount of risk when it comes to reliable revenues.
  • Membership has been a riskier proposition for more than a decade, with the rewards shrinking as well. Again, these executives all pointed to their own experiences with membership growth and associated revenues, experiences which suggested exquisite price sensitivity (requiring heavy discounting of member dues) as well as difficulty appealing to non-members and struggles to retain current members. They also noted that publications -- journals, trade magazines, and online portals -- continue to serve as the primary membership draw, another reinforcement of the publishing value proposition they became more aware of as the meeting progressed.
  • Ancillary revenues that worked tended to resemble publishing revenues -- reliable, immune to weather, vital to the community or individual user, delivering tangential benefits to other categories.

Non-profits are stamped with a "mission," and often that mission consists of wording and beliefs that have been inherited from the 1950s or 1980s. The related bylaws structuring a governance body to support this mission require leadership to pursue membership or meetings, but not usually publications. This may be a source of some cognitive dissonance -- the organization's history is based on membership and meetings, but current revenues, membership appeal, and sustainability are more often built on the value of publications. The bylaws and mission have created constraints. It has been assumed that running these lines of business is a vital part of the mission, even if there are new risks now built into executing that mission in that way.

Stepping back and looking at the modern world and how organizations fit into the current and developing economy is a big step, often requiring frank assessment of the mission and not self-congratulatory, winking appraisals. In the framework of a stronger, more relevant mission, updated so it makes sense today, membership and meetings can still play a vital role, but success won't look the same as it did in 1959 or 1979. The dynamics between the three major legs of the non-profit society tripod -- meetings, membership, and publications -- have changed.

Some organizations have taken this step back and completed this examination, benefiting tremendously. I know of one large non-profit that recently completed a deep, strategic look -- which was not without some pain -- and is seeing its membership grow for the first time in decades, while also growing its publishing and ancillary activities. It was an act of reinvention and simplification. It was an acknowledgement that it's not 1990 anymore . . .

"Rinse and repeat" doesn't work in the current distributed, digital, dynamic information and personal services economy. Customers are facing a new reality, and responding as anyone would. New strategies have to be articulated, old habits re-examined, and new ventures used to probe for growth opportunities. Positions in the market, articulated decades ago, may no longer apply. The tripod we assumed would stand forever in equipoise on a flat surface may need to be rebuilt for steeper terrain.

Who Holds the Cards in the Digital Advertising Game?

For a number of academic publishers, online advertising is a viable, if peculiar, line of business. For others, online advertising is an interesting potential revenue line in a market with many constraints on growth.

Digital advertising sprouted in strange soil, and has developed in ways that continue to make it a difficult revenue line. Back in the late 1990s, the siren song of digital advertising was built around the concepts -- still valid and valuable, by the way -- of targeting and personalization. It took on many attributes of direct mail marketing, with response rates represented by click-through rates (CTRs). Advertisers could finally know how many people were clicking on their ads, track advertising through to conversion, and calculate a return on investment (ROI) for their advertising spend. This promised to be a huge improvement over the past, which gave us the memorable thought, "Advertisers know that half of their advertising doesn't work. They just don't know which half."

Executing on this promise turned out to be complicated, and the main beneficiaries of this complexity thus far have been the advertising agencies, who intermediate between buyers and sellers.

Understanding this sheds some light on the overall digital landscape.

Technology purveyors in the digital advertising space seek what technology purveyors in any space seek -- scale. The larger the number of users on your platform, the more valuable it is, and the more features you can add at marginal cost. The economies of scale multiply, and dominance can follow, leading to long-term, high-return business advantages.

Riding on the back of these large-scale platforms has fragmented the market into high-quality, well-managed sites and lower-quality, loosely managed competitors. The platform providers generally don't see this bifurcation for what it is -- they see only more scale -- while the agencies are torn between quality and quantity, leaving content providers stretched to cater to both demands at once.

Scale is so sought after by the platforms that fraud has become an open secret in the world of digital advertising. This may actually play to the benefit of professional, well-run digital sites like those scholarly and scientific publishers generally run, as a recent essay by John Battelle explains. Quality venues free from fraud are going to become more valuable as quality becomes a greater consideration. Scale only works to a point, and then the pendulum swings back to quality. However, scale often leads, while quality consolidates its wins.

The scale of digital advertising is also more constrained than many people think, even some advertising professionals and platform developers. As Battelle writes:

Over and over, I hear that the reason CPMs (the amount of money a marketer is willing to pay for one thousand advertising impressions) are so low is because “there’s infinite inventory.” Hogwash. There’s only so much time in the day, and only so many pages where actual human beings are really paying attention, and the web (including mobile) is growing at a finite pace. There are even fewer places where marketers can be assured of quality, engagement, and appropriate context.

Advertising agencies are generally not as consolidated as the platforms, and they occupy a different position in the market -- negotiating between distribution outlets and advertisers. In the online negotiating game that has emerged, the squeeze is on the distribution outlets. The complicated promise of digital advertising -- metrics, targeting, testing, and personalization -- falls mostly on the publisher and its sales and support staff. Agencies simply dictate and negotiate the terms. As the costs of catering to ever more agency and industry demands increase, CPMs will also need to rise to absorb them. More cards may be shifting to the distribution outlets and publishers.
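
The CPM arithmetic behind that squeeze is simple. Here is a minimal sketch, with entirely hypothetical traffic and cost figures, of how per-impression service costs translate into the CPM a publisher needs just to break even.

```python
# CPM arithmetic behind the squeeze: as the cost of serving agency and
# industry demands rises, the break-even CPM rises with it.
# All figures are hypothetical.

monthly_impressions = 2_000_000
monthly_ad_ops_cost = 9_000.00  # staff, targeting, reporting, compliance tooling

# CPM is revenue per 1,000 impressions, so break-even CPM is simply
# cost divided by impressions-in-thousands.
break_even_cpm = monthly_ad_ops_cost / (monthly_impressions / 1_000)
print(f"Break-even CPM: ${break_even_cpm:.2f}")

for cpm in (3.00, 4.50, 6.00):
    revenue = (monthly_impressions / 1_000) * cpm
    print(f"At a ${cpm:.2f} CPM: ${revenue:,.0f} revenue, "
          f"${revenue - monthly_ad_ops_cost:,.0f} margin")
```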

Trends from the consumer side may tilt the table further toward distributors and publishers, if they can take advantage of them. A recent op-ed from Richard Reeves, the Managing Director of the Association of Online Publishers, cites the rise of ad blockers (a topic I touched on recently, as well) as an opportunity for publishers to re-engage with their audiences and find a mutually agreeable solution. Some publishers have been brusque with their users, with the Washington Post memorably putting up a content barrier when an ad blocker was detected, forcing users who wouldn't allow ads to register with the site. As Reeves writes:

Accepting that ad blockers are now a part of the advertising ecosystem is vital to moving forwards. Publishers, advertisers, creative and buying agencies, and technology providers need to join forces - not necessarily to fight ad blockers - but to investigate how to harness the technology and ensure it eventually becomes a positive contributor to the industry.

For the time being, as digital advertising continues to scale, it remains a "buy side" game, and consumers' only recourse is to block ads entirely. But the sands are shifting, putting more points of leverage into the hands of digital advertising venues that have quality inventory, strong site management practices, reliable deliverability, and strong reputations. These players -- and consumers in general -- could feel the benefits of the shift to "sell side" sooner. But the issues need to be discussed, and the story needs to be told. And it may be time for publishers to look anew at the cards they hold.

Are We in a Digital Transformation? Or a Customer Transformation?

We often hear about the digital transformation of professional societies and their publishing, membership, and meetings programs. Beneath this digital veneer, the real action is not "digital" but resides far closer to the customer -- the people you and your organization are trying to reach and retain.

Take, for example, the professional conference or annual meeting of a professional society. Over a period of decades, meetings moved from regional to national, from social to scientific to commercial, swept along by changes that we realize, with hindsight, were dependent on multiple demographic, technological, and commercial realities rolling across the meetings landscape, sometimes in convergence, sometimes not. Relatively cheap, fast, and reliable air travel was certainly an enabling technology for national society meetings, but discretionary time, discretionary income, and institutions that reinforced professional membership participation certainly contributed.

Today, discretionary time has diminished a great deal, discretionary income levels are far lower than in the past, and fewer institutions offer incentives that reinforce membership or affiliation in a meaningful way.

But if you go long enough without attending a professional meeting, you begin to feel a disconnection. A recent New Yorker article by Nathan Heller, "Air Head," outlines some sources for this unease -- namely, the difference between being fed information and what he calls "encounter thinking":

The battle between jet planes and smartphones isn't about speed and glamour. It's about ways of knowing. . . . When physical travel cedes to digital exploration, a certain style of discovery falls away. . . . The stories you uncover through your smartphone are stories, basically, asking to be found. . . . There is a moment in "The Big Short," the new film based on Michael Lewis's reporting on the mortgage crisis, in which Steve Carell's character, a querulous hedge-fund manager, wants to figure out whether to bet against the market. It is 2007. The wisest analysts in New York and Washington are sanguine. He gets on a plane, flies to Florida, and meets some people on whose mortgages the market rides. He finds abandoned homes, loans for the unemployed, and a pole dancer who owns many homes. Encounter thinking, our response to the exceptional, saves us from the errors of consensus and the expectations of smooth process . . .

If improving meetings attendance involves improving the yield of "encounter thinking," then meeting strategies start to clarify. Imagine a meeting consisting of real surprises (keynotes unannounced but trusted to be worth seeing, with no option to watch after the event), heavy networking sessions structured for maximum results, and more community interactions geared to helping people make or sustain connections. In the society space, stagnant meeting formats are not a trivial part of the meetings problem.

But there are other forces at work causing professionals to disassociate from membership societies and their meetings.

One nearly invisible factor has robbed professionals at all levels of discretionary time. This thief does not emanate from digital technology. It's more prosaic. It's your local transportation system, and the long daily commutes that come from its inadequacies. In the US and other developed countries, professionals at all levels -- doctors, lawyers, scientists, administrators, professors, researchers -- can be stuck in traffic for 1-3 hours per day in major metropolitan areas. In addition, commutes are more sedentary -- more driving and sitting, rather than walking or cycling to work. Many professionals compensate for these hours of sedentary time by spending time at the gym or exercising. The net effect is that discretionary time to spend with publications and membership organizations has dropped off -- an unseen factor we are all grappling with. It affects marriages, child rearing, and community involvement.

How is this broadly felt change affecting social connections and professional meetings? Many successful new approaches involve creating opportunities for social and professional gatherings immediately after work, so that professionals can avoid rush hour and spend time with colleagues or friends. From social game clubs to "pub nights," this timeshift seems slight -- away from dinner meetings to happy hours -- but it's important. It also carries a more casual atmosphere promising more fun and excitement, something that increases the appeal when compared to sitting in a car or on a train for an hour or more.

Other demographic shifts are also changing society fundamentally -- higher college tuitions have diverted significant household income toward paying for college. In 1960, families spent 2% of household income on child care and education. By 2013, that number had swelled to 18%. With both parents working and caught in long commutes, daycare costs sap funds during the preschool years, and then college tuition comes along later to deplete funds when the adults are at peak earning potential.

How do these pressures manifest themselves? More acute price sensitivity, lower loyalty from year to year, and reluctance to spend money on non-essential items. These trends are all important to the success of a professional society. Unfortunately, current economic and social realities diverge strongly from how most professional societies are positioned in the world. Bylaws and structures established 50-75 years ago don't reflect the modern world, its inhabitants, and their needs. The underlying mental model doesn't jibe.

Publishers can have blind spots about customer needs, some of them pretty glaring. When it comes to digital, it's easy to succumb to seeing customer changes as emanating from the proliferation of technologies, but really the technologies are useful (or not) depending on the customers. Why is ORCID becoming more useful? The technology has certainly evolved, but the main factor is that authors are now grasping the value of identity disambiguation as a way to give them full credit for multiple contributions. And this eagerness for full credit in the scholarly sphere is driven by the heightened competition for scarce funding, which leads to academic and career success. The technology supports the need, and is becoming more valuable because customers are changing.

The list goes on. ResearchGate, Mendeley, and Academia.edu don't have any truly proprietary technology, but have arrayed some standard technology in a way that users find valuable.

Behind any successful technology -- print, online, mobile, broadcast, streaming -- is a satisfied customer. There are many more failures that have used the same technologies yet never satisfied enough customers to survive.

What is the market positioning today that will lead to long-term differentiation and success? Whatever your answer turns out to be, it's safe to say the successful approaches will be based in understanding today's customer and human realities. If "digital" is involved, its involvement will be incidental.

Do You Know Your Risk Profile?

The concept of "risk" has been one of the most useful abstractions for organizations everywhere. It captures uncertainties, changes in fortune or competitive sets, macroeconomics, and organizational preparedness. The concept of risk management originated in the 1960s with a Canadian Rhodes scholar, Doug Barlow, who asserted that "all management is risk management."

A decade ago, the risks around digital transformation may have been at the top of the list. After the Great Recession of 2008 and the austerity measures that followed, economic growth became the overriding challenge.

Today, the risks have morphed yet again.

No matter what kind of organization you're in or run, risks accumulate from a number of sources -- financial, strategic, personnel, competitive, technology, governance, and product development. Looking across these effectively requires a clear view, as some risks can become so familiar otherwise that they no longer register.

Crafting a risk assessment is a high-yield way to understand the risks an organization is facing at a particular point in time. Commissioning an independent assessment that includes mitigation strategies can move an organization forward in important ways, and is often a stepping stone toward a strategic refresh.

That's why we're pleased to announce our Risk Assessment and Mitigation Plan package.

Risk assessments can deliver great results for executive leadership, senior management, and line managers. Risk assessments capture known risks while unearthing hidden or poorly articulated risks. Tying the assessment to a mitigation plan moves the assessment into a proactive mode, so that leadership can grapple effectively with risks like customer concentration, new competitors, missed opportunities, technology gaps, management misalignments, product inadequacies, or skills gaps.

For a fixed price, we'll dig in and deliver within 30 days an insightful and useful risk assessment and mitigation plan. Consisting of brief interviews with senior management, reviews of relevant documents and reports, and comparisons to cognate organizations, our approach can kickstart important strategic and tactical discussions for your organization.

If you'd like to learn more, email us at: contact@caldera-publishing.com.

Crossing Route -- Will Cognitive Detraining Make AI Win Sooner?

I was recently (and belatedly) reading the 2016 Industry Trends report from Outsell. In addition to a lot of great data and takeaways, it included a tantalizing and slightly unsettling thought -- namely, that habituating ourselves to shorter information tidbits and residing within a fragmented and noisy knowledge sphere are leading to a general decline in our cognitive skills and stamina, a detraining of concentration. Beyond the direct effects, the phenomenon may affect the rate at which artificial intelligence (AI) overtakes human intelligence, because the distance between AI today and human intelligence in the future shrinks as the two move toward each other. We're on the decline, and AI is on the rise.

There is a lot to unpack here. First, it is a stark reminder that the human brain is not something that you have, but something you develop. How you train it and feed it matters. There's a great passage in a book called "How Not to Be Wrong." The author, Jordan Ellenberg, addresses the common complaint among school kids that they're never going to use what they learn in school in their everyday adult lives. Ellenberg argues that this misses the point.

Just as a tennis star doesn't use weight-lifting, wind sprints, healthy eating, and stretching exercises during a tennis match, an educated person doesn't use the ability to solve quadratics or a memorized list of dates and events in their work life. However, the training and conditioning from the preparatory activities make a difference in the end. Having lifted weights and kept a careful diet gives a tennis player an edge over a weaker player with a lousy diet, just as a person who has trained her mind to solve quadratics and memorize facts will do better with intellectual challenges than a person who slacked off. You don't use the wind sprints or quadratic drills themselves later, but you do leverage the benefits they deliver.

Every time we sneak a peek at our smartphones during a movie, we are detraining our brains from being able to sustain moderate-to-intense concentration for two hours. Every time we look away from writing a long document to check social media or email, we are detraining our minds for extended workloads.

Now, we are human, and we need breaks from work. Research has shown that taking a break helps knowledge workers, because the human brain is complex -- hot showers and long walks deliver better blood supply, which helps thinking; resting the conscious brain can help the limbic brain churn out answers in our multi-tier cranial system, leading to that classic "aha!" moment.

Another aspect of this thought is that if we're lowering the bar AI has to clear, then we're not going to get AI that's as impressive or helpful as it could be.

All in all, I found this to be a sobering reminder: while our intellectual habits are trending toward burst communication, attention fragmentation, and short attention spans, we can benefit from working out with extended mental concentration. It's not easy and the payoffs are indirect, but the benefits will likely come.

The (Exposed) Internet of Things

The so-called Internet of Things is certainly an important commercial trend, from FitBits to Nest thermostats. Yet, for all the potential virtues and benefits, it's apparently not very secure from a privacy standpoint. A recent examination of a search service (Shodan) devoted to crawling the IoT found that you can see a lot of amazing things -- sleeping children, marijuana grow rooms, classrooms, and so forth -- because security on webcams is either poor or non-existent.

Quoting from the Ars Technica article:

The cameras are vulnerable because they use the Real Time Streaming Protocol (RTSP, port 554) to share video but have no password authentication in place. The image feed is available to paid Shodan members at images.shodan.io. Free Shodan accounts can also search using the filter port:554 has_screenshot:true. Shodan crawls the Internet at random looking for IP addresses with open ports. If an open port lacks authentication and streams a video feed, the new script takes a snap and moves on. While the privacy implications here are obvious, Shodan’s new image feed also highlights the pathetic state of IoT security, and raises questions about what we are going to do to fix the problem.

The root cause of these insecure webcams is the drive for the minimum viable product, along with consumer hesitation to pay much. Manufacturers have scrimped on security to lower prices, and consumers are ignorant of what kind of access this can give to hackers, search engines, or just curious passers-by in the virtual world.
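For anyone curious how exposed their own devices are, here is a minimal sketch that checks only whether the RTSP port the article describes (554) accepts connections; it makes no attempt to view any stream, and the IP address is a placeholder for a camera on your own network.

```python
# Minimal sketch: check whether a device you own leaves its RTSP port
# (554, the port described in the Ars Technica piece) reachable.
# This only tests whether the port accepts a connection; it does not
# attempt to access any video. The IP below is a placeholder for a
# device on your own network.
import socket

def rtsp_port_open(host: str, port: int = 554, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

camera_ip = "192.168.1.50"  # placeholder: a webcam on your own LAN
if rtsp_port_open(camera_ip):
    print(f"{camera_ip}: RTSP port 554 is reachable -- make sure authentication is enabled.")
else:
    print(f"{camera_ip}: RTSP port 554 is not reachable from this machine.")
```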

Groups are working on security rating systems so that consumers can gain insights into vulnerabilities before purchase, and the US Federal Trade Commission is becoming aware of the problem. It has already gone after more than 50 companies that did not reasonably secure their devices or networks, and has more on its docket. The US Department of Defense and DARPA are also concerned and involved. However, this is a worldwide problem, and the US government's involvement is only the start.

One model being discussed is the UL model used to set standards for electrical devices. If something is UL-approved, it meets basic standards for safety and construction. Some critics deride this comparison as imposing a lot of bureaucracy for little benefit -- after all, UL listings assure that manufacturing is sufficient to support the intended use, but are not built or developed to deal with intentional attacks on the electrical grid. The question remains how to build cyber-infrastructure that fends off intentional attacks. As one expert states:

Our dependence on technology is growing faster than our ability to secure it.

Opportunities exist for information providers, and the Ars Technica article does not touch on the number of connected devices being developed for professional and scientific use. What kind of risks exist there? I touched on this in a 2013 essay, which noted that the level of cyberwarfare was already high then. With the growth of tensions around economic and ideological matters since then, the level of hacking has only increased. Publishing, e-commerce, and customer data platforms are under a nearly constant barrage of probing attacks from outside sources with presumably unsavory intentions. As one expert wrote a few years ago:

In 10 years, computers will be everywhere we look, and they’ll all have wireless. Will you be able to compromise someone’s insulin pump through their car? Will you be able to induce seizures by subverting their house lights? Will you be able to run these exploits by cell phone? What’s possible? It’s more like ‘What won’t be possible?’

It's important to see things for what they are. The "Internet of Things" requires placing connected Internet devices all around us. The implications of doing so are not clear, nor is the necessity or value. Technology is not benign, and its benefits are not automatic.

Pushwalls -- A New Phenomenon in Digital Publishing?

The term "paywalls" is commonly used to describe the commercial practice of requiring inbound traffic for a transaction (registration, payment) before granting access to content. However, more and more often, users and publishers alike are encountering elements of the digital ecosystem asking outbound traffic to pass certain standards or meet certain criteria, as well.

I've taken to calling these "pushwalls" -- barriers to getting your content or online business out to customers and users.

It's no longer safe to assume that users or digital publishers have an unfettered, straight path on the open Web to purveying content or even having their content seen or discovered. New elements -- the Great Firewall in China, geoblocking, ad blockers, spam filters, profanity filters -- create new impediments in digital publishing.

In scientific publishing, with tech-savvy users now the norm, ad blocking is particularly concerning for many publishers. Advertising is a way to diversify revenues and lower subscription rates or, for OA publishers, APCs. Whether those ads are in-house promotions -- calls for papers, announcements of meetings, appeals for membership -- or traditional commercial advertising, the percentage of users blocking these ads should be known. In one case I've seen, it's as high as 20%, which means that 1 in 5 users will not see these messages. That is a significant pushwall that can move commercial behavior away from advertising and back to charging users. In effect, users are blocking one commercial option in a way that will simply move publishers back to another commercial option -- asking users to pay directly. Yet this pushwall is growing in size.
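
A quick estimate shows how directly that pushwall translates into lost impressions and revenue. The traffic and CPM figures below are hypothetical; only the 20% block rate comes from the case mentioned above.

```python
# How a 20% ad-block rate translates into lost impressions and revenue.
# Traffic and CPM figures are hypothetical; the 20% block rate is the one
# cited in the text.

monthly_pageviews = 1_500_000
ads_per_page = 2
cpm = 5.00         # dollars per 1,000 impressions
block_rate = 0.20  # 1 in 5 users never sees the ads

potential_impressions = monthly_pageviews * ads_per_page
served_impressions = potential_impressions * (1 - block_rate)

potential_revenue = potential_impressions / 1_000 * cpm
actual_revenue = served_impressions / 1_000 * cpm

print(f"Impressions lost to blocking: {potential_impressions - served_impressions:,.0f}")
print(f"Revenue lost to blocking:     ${potential_revenue - actual_revenue:,.0f}")
```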

The Great Firewall in China also poses barriers to scientific content, which can consist of materials and opinions the Firewall does not initially tolerate. Technology practices can also create barriers, as Google elements -- because of various ongoing and past dustups between Google and China -- often trigger a reaction within the Great Firewall, which blocks APIs and other elements from displaying.

Email continues to be the coin of the realm for driving awareness and the traffic pulse of journal sites. Spam filters have improved immensely in recent years, but some messages still get caught unexpectedly, and changes to email systems and practices have to be made carefully to ensure white-listing and compliance with best practices.

Social media pushwalls aren't as visible, but scientific content can trigger them inadvertently, especially if anatomical parts are described in headlines or included as images. Some systems have a Puritanical streak, and certain articles and headlines crash and burn against their pushwall.

Recognizing where delivery barriers exist, knowing how to overcome or avoid them, and remaining adaptable to a somewhat byzantine online publishing infrastructure are now requirements for successful online business and publishing. It's not as simple as hitting "publish." There's a lot more to it these days.

Academic Publishing in Europe 2016 -- Notes from Day 2

Yesterday, I wrote up some notes and observations from Day 1 of the APE 2016 meeting in Berlin. Naturally, there's a bit more to say after Day 2.

Day 2 consisted of three major pieces -- a discussion of the STM Consultation on Article Sharing; a session devoted to pitches from five start-ups with audience evaluation; and a session on publication and research ethics. 

Richard Padley of Semantico facilitated the session on the STM Consultation on Article Sharing and the Voluntary Principles for Scholarly Communication Networks. Fred Dylla, who led the task force that assembled these, was front and center, while I and Grace Baynes from Springer Nature filled out the representation from that group. Charlie Rapple from Kudos and Hannfried von Hindenburg from Elsevier also joined the panel.

The session revolved around a few main points -- everyone's acceptance that article sharing and scholarship go hand in hand; that online sharing requires some additional thought and considerations, especially if it is organized for scale; the effective and open process used to generate the voluntary principles, and their subsequent adoption by many scholarly communication networks and publishers; the positive side-effects of the effort, including improvements and softening of language on many major publishers' sites; and the ongoing concern that publishers were playing catch-up with the scholarly communication networks in some respects.

A lack of research into what actually occurs on these scholarly communication networks -- the big three being Academia.edu, ResearchGate, and Mendeley -- was noted. However, research results from work being done by Carol Tenopir, Anthony Watkinson, and others are apparently imminent (a preliminary report is here). In the meantime, some data were shared by people with experience with these networks, and some key attributes were discussed. The main message was that we need to continue to understand the users of the literature as well as the authors, and that perhaps our attention slipped during the past few years amid the "author as customer" emphasis.

The session spotlighting five new startups included Overleaf, Bookmetrix, Dryad, Publons, and Zapnito. It was an interesting set, with focus on a number of gaps in the market. In my estimation, four of the five represent "wide" plays -- you need a lot of customers, each paying small fees, to make the business work financially. Only one -- Zapnito -- represented a "deep" play, wherein a few customers using its integrated media platform robustly can represent a lot of revenue and market penetration. Of the five, three had working business models -- Overleaf, Dryad, and Zapnito -- while the other two were in various stages of early development. Having personally tried the "wide" approach a couple of times, I can say that proper positioning within the market is critical. In this regard, Overleaf has a clear advantage, as it is part of Digital Science. Bookmetrix is being cultivated by Wiley, so it probably also can benefit from the heft of its parent. It will be interesting to see the results of one question the moderator posed -- which of these do you think will be around five years from now?

The final session on research and publication ethics was moderated by Bernd Pulverer from EMBO. Covering the concerns emanating from an increasing number of papers, decreasing funding for science, and the competition for short-term reputation this can drive, the session also tackled the place of journals, editors, and publishers in the research ethics system. This is a topic that needs constant updating and attention, so it was good to have a robust set of speakers and an in-depth discussion of these issues. A main theme centered on addressing ethical breakdowns farther up in the process, so that they don't arrive on journal doorsteps or, worse, in published articles, which journals then have to deal with.

Once again, this Berlin meeting seems to be a good kick-off to the year, and the energy level -- whether because everyone is rested from the long holidays, because there is the renewed optimism of a new year, or because budgets are still fresh and untroubled -- is generally better than at meetings later in the year, when people are more travel-weary, overworked, and grappling with projects going sideways.

Until next year, Berlin . . .

Academic Publishing in Europe 2016 -- Notes from Berlin, Day 1

This is my second Academic Publishing in Europe meeting. Last year, I wrote on the Scholarly Kitchen about my first. I'm happy to be back again, as the meeting possesses a scale and energy I find appealing and helpful for discussions.

Tuesday's sessions focused on a number of interesting topics, including data publishing, scholarly publishing infrastructure, and the financing of science and scientific publishing. Sprinkled within these and other broad topics were strong presentations on publishing technologies, a couple of futuristic visions for academic publishing, and some updates on policy discussions in Germany and Europe. There were good questions from the approximately 200 attendees.

Of course, the best part of a meeting is the networking, and the APE 2016 meeting has this in spades, with a relaxed atmosphere, a nice dinner, and a good conference space.

I found the most compelling session Tuesday to be the one on scholarly publishing infrastructure, which we often take for granted (that's what you do with infrastructure that's working -- the lights come on, the faucet works). Geoffrey Bilder offered a great reminder of some of the infrastructure we don't see, putting up a silhouette of a scholarly article with the text blacked out. Even so, everyone could tell it was a scholarly article and could easily identify the zones for title, authors, abstract, full text, and citation information just from the structure of the page -- an illustration that the scholarly format itself is an infrastructure element. This may help to explain why the PDF is so persistently useful -- it captures infrastructure scholars rely upon.

Publishers have been helping to build effective new technology infrastructure for years now, from CrossRef to CHORUS. The way these and other similar infrastructure elements have stabilized and enhanced scholarly communication bears contemplation. From archiving to linking to public access, we are now working on a fundamentally improved infrastructure. What's going to come next? And are we sure what we have will stay and remain viable? Most importantly, will the scientific community own it? Or will commercial entities, entering through public-private partnerships, end up capturing the scholarly infrastructure? This is an issue, as academia tends to fund work in fits and starts, while commercial entities with clear goals can execute consistent strategies for years.

It's clear that we're still in the midst of a series of changes, which may never stop, but which instead speed up and slow down depending on how various trends overlap and push one another forward. It seems wise to have a solid, reliable infrastructure that we can take for granted while we deal with other changes.

The World Is Not Flat -- In Fact, It's Adding Hills

When I was running International Business for the New England Journal of Medicine, Thomas Friedman's book "The World Is Flat" was on the bestseller list. The meme of the title had become a strategic calling card of sorts for people interested in international business. Since the Internet had flattened the commercial world, didn't this mean that we could do X and Y and Z for the first time ever?

Our experiences did not line up with the notion that the world had become flat, however. We had a tradition of holding half-day retreats annually to assess our situation and plan for the year, and that year we discussed the cognitive dissonance we were experiencing around this "world is flat" notion. What we came up with still seems right to me:

The information landscape has flattened to a degree, but the business landscape is extremely hilly.

It's not an earth-shattering insight, but it was more accurate and useful to us than Friedman's simplified arguments that we were in a post-modern business era with fluid, borderless markets.

In the years since then, the hills have become steeper, both for information dissemination and for business practices. China is a particularly interesting example, as the Great Firewall makes it harder to get information in, hacker attacks make hosting more expensive and fraught than before, and Chinese business practices and market volatility continue to present moving targets and intricate puzzles. Geoblocking by companies like YouTube, the BBC, and Netflix is being challenged in Europe. And geotargeting of content and advertising is the norm -- as I write this from Berlin, my Google results still lapse into German occasionally, and my iPhone presents information as if I'm a German speaker.

Other parts of the world have seen their share of distinct local problems. Spain took it on the chin in 2008, and is still reeling. Argentina continues to be unpredictable. The low prices of oil have made Saudi Arabia a more volatile place, with new payment problems and uncertainties. Various local approaches to open access -- the UK, the Netherlands most recently -- also create new regional business-and-information conundrums.

This is disappointing news for those envisioning a seamless approach to global business, with easy growth in highly accessible markets. The upshot is that partnerships at the local level, or with organizations with local expertise, are becoming more important than ever. Local knowledge continues to be a major advantage. Operating as if the world is flat is a perfect way to keep your growth curve flat, too.

New Players, New Costs: Why Is Digital Publishing So Expensive?

Back in 2014, I wrote a post on the Scholarly Kitchen about the confounded complexity of digital and its endless upgrade paths. Since then, it's only become clearer that online businesses -- from banking to publishing to shopping -- are more expensive to run, and require new thinking about how to make them succeed.

The expensive nature of digital publishing runs counter to initial expectations, which seem to persist despite abundant empirical evidence to the contrary. Users expect that online music, online publishing, online banking, and other online ventures will generate savings compared to physical goods -- through scale, lower technology costs, or some other means. There's no paper, no vinyl, no tellers, so of course it must be much cheaper.

But experience trumps expectations, and the experience of the online world is that the new costs -- talented programmers, data facilities, firewalls and security, project management, software licenses, e-commerce systems, SEO, SEM, archiving, data revisions, content revisions, and so forth -- are greater than the costs of the past. In addition, these costs are more often fixed costs, compared with the largely variable costs of the old production environment. This makes the new costs riskier to adopt and more challenging to manage. Multi-year contracts, high-salary employees, and other factors limit management's nimbleness in the digital realm.
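
To make the fixed-versus-variable distinction concrete, here is a minimal sketch in Python with purely hypothetical numbers (not actual publishing costs). Two operations earn the same profit at full volume, but the fixed-cost-heavy one swings to a loss when demand dips:

    def profit(units, price, fixed_cost, variable_cost_per_unit):
        # Simple profit model: revenue minus fixed and variable costs.
        return units * price - fixed_cost - units * variable_cost_per_unit

    PRICE = 10.0

    # Hypothetical cost structures that earn the same profit at 10,000 units.
    digital_costs = dict(fixed_cost=80_000, variable_cost_per_unit=1.0)  # mostly fixed
    print_costs = dict(fixed_cost=20_000, variable_cost_per_unit=7.0)    # mostly variable

    for units in (10_000, 7_000):
        d = profit(units, PRICE, **digital_costs)
        p = profit(units, PRICE, **print_costs)
        print(f"{units:>6} units: digital-style {d:>9,.0f}   print-style {p:>9,.0f}")

    # At 10,000 units both models earn 10,000; at 7,000 units the fixed-cost-heavy
    # model loses 17,000 while the variable-cost-heavy model still earns 1,000.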

Organizations often struggle to come to grips with these realities. Evangelists and the leftover expectations of technophiles can perpetuate the perception that a digital business is cheaper and easier to run than traditional business lines. The same dreamers can also inflate revenue expectations with ruminations about what must surely be possible in this brave new world. Yet without a realistic, actionable plan to rationalize expenses and make projections based on actual costs and revenues, organizations can fall prey to vague strategies that lead to overspending without a rational commercial upside.

An analog worth contemplating is banking. More than 97% of financial transactions occur in the digital world rather than through the exchange of physical money -- from swiping debit cards to e-commerce to wire transfers. Yet interest rates on deposit accounts trend toward 0%, while fees for transactions are rampant. Credit card companies charge high lending rates to cover the risk exposure they face from hackers, defaults, and card thieves. Customer data protections are expensive and vulnerable. ATMs and other customer conveniences have to be maintained in the physical world, to dispense the small amount of physical goods bank customers still rely upon. New entrants -- Apple Pay, Samsung Pay, PayPal, and Square -- extract fees of their own for offering customer convenience, putting their fingers in the economic pie.

The scenario of more middlemen, more outlets, and larger players isn't alien to academic and scholarly publishing. And all the players in the digital game require support and management. New business relationships have some new players holding more cards (e.g., Google, advertising agencies, funders), and publishers seem to have fewer or worse cards than before. Different data distribution tasks, management of risk around commerce and customer data, and the lingering desire for print and print analogs in a customer base largely protected from the actual costs of information, all contribute to what is a more complex and expensive publishing environment. Meanwhile, leverage continues to shift to enterprises with network-effect scale.

Having solid strategies and a clear overview of the financial costs and potential upside is critical during this period. Belief in a more efficient digital business isn't enough. You have to build that efficient business. Every blind spot in this realm can be costly. And every organization can benefit from a long-term strategy and a proactive approach to digital publishing technology and evolving forms of customer engagement.

Why Caldera?

Since I announced Caldera Publishing Solutions, a number of friends and colleagues have asked why I chose the name "caldera."

I was born in Colorado and grew up in the Rocky Mountains, in a family with many generations involved in building and preserving Yellowstone National Park. As part of this tradition, I spent a week or more each year visiting the park, and have maintained the tradition for my own family.

One of the most interesting features of Yellowstone -- in fact, the feature that largely makes the park so distinctive -- is the caldera beneath it. This remnant of an ancient volcano still simmers, with geyser basins, mud pots, and heated prismatic pools as evidence.

So, when I was thinking of a name for my consulting firm, I hearkened back to this image of a long-standing reservoir of power, tapped sporadically and unpredictably, ultimately settling on the image of Old Faithful, the most predictable geyser in the park.

To me, the caldera represents a reserve of potential and power, which most organizations possess. My goal with Caldera Publishing Solutions is to help tap into that reserve for the long-term. Old Faithful has been attracting crowds for well over a century. It is a feature of the caldera. As the landscape shifts, however, new features are emerging, and we have to be ready.

It's not a perfect image, but to me, it's inspiring and personal. It brings my family's story to the present day. Taming the caldera is impossible, but benefiting from its power and features is certainly within our ability.

Decimalization, Money, Science, and Libraries

One of the more interesting perspectives I've read recently on the history of science comes from a book called "The History of Money," by Jack Weatherford. It is one of my favorite books because, like it or not, money makes the world go 'round, and this clearly written and well-constructed book, although a bit dated, still illuminates much about a subject relevant to more than we might imagine, science and technology included.

How we value goods and services has been an eons-long project, and quirky systems proliferated for a long time (some persist to this day). However, a familiar and more rigorous approach emerged with the move toward decimalized currency, in which the major unit is divided into 100 minor units.

The first major step toward decimalization came from Russia in the early 18th century. Other currency systems of the time were based on odd assemblages of values. The Spanish dollar consisted of eight reales. The English system rested on farthings, crowns, sovereigns, pennies, guineas, and pounds. In this system, a shilling consisted of 12 pence and a pound of 20 shillings, making a pound worth 240 pence. A guinea (a pound and a shilling) was worth 252 pence. All of this made it easy to perpetrate fraud and obscure price changes, and generally made it harder to do business.
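
To see why the old system made everyday arithmetic harder, here is a minimal sketch in Python using only the conversion factors above; the price examples are made up for illustration:

    # Pre-decimal English currency: 12 pence to the shilling, 20 shillings to the
    # pound, so 240 pence to the pound.

    def to_pence(pounds=0, shillings=0, pence=0):
        """Convert a pounds/shillings/pence amount into pence."""
        return pounds * 240 + shillings * 12 + pence

    def from_pence(total):
        """Convert pence back into (pounds, shillings, pence)."""
        pounds, rest = divmod(total, 240)
        shillings, pence = divmod(rest, 12)
        return pounds, shillings, pence

    print(to_pence(pounds=1, shillings=1))    # a guinea: 252 pence

    # Adding two prices means carrying across two different bases (12 and 20):
    total = to_pence(0, 7, 9) + to_pence(0, 5, 6)   # 7s 9d + 5s 6d
    print(from_pence(total))                        # (0, 13, 3), i.e., 13s 3d

    # In a decimal currency, the same kind of sum is ordinary base-10 arithmetic:
    print((105 + 55) / 100)                         # 1.6, working in cents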

The Russian system was not adopted in Europe because Russia was viewed as backward. However, the United States, an ocean away, felt no such qualms and became the first country with a wholly decimal-based coinage -- 100 pennies = 1 dollar. France soon followed, with its move to decimal currency coinciding nicely with the development of the metric system.

However, the metric system was not an instant success. While the scientific community saw the value of a standardized system of measures, agreeing on which set to use was a challenge. James Watt was a major proponent of standardized systems, but his ideas were adopted only incompletely, the main survivors being the watt and horsepower (about 745.7 watts).

It was within the bounds of international trade shows -- World Fairs and the like -- that commercial and scientific measurement systems commingled, synthesized, and standardized. As Weatherford writes:

At the urging of industrialists and other supporters of the metric system for coins, weights, and measures, scientists gathered for an international statistical meeting during the . . . world's fair, held in Paris in 1855. . . . The Vienna Coin Treaty of January 24, 1857, helped move forward the adoption of the decimal system for coinage, and it encouraged the adoption of metric weights and measures.

A particularly rich and rueful section is omitted from this simple summary, one of many passages showing Weatherford's keen sense for historical irony.

In another development tying information systems into this commercial and scientific mix, Weatherford follows the observation that Americans have been loath to adopt the decimal system for their weights and measures with this:

Americans, however, did apply the decimal system in an unexpected way through the work of an otherwise little known New York librarian and Columbia University professor, Melvil Dewey. He divided library books into ten classifications, which he further divided and redivided into what became known as the Dewey decimal system.

Money, libraries, and measurements all filtered down from the upper reaches of society to the poorest segments, making simple acts like "counting to 100" and "dividing by 10" second nature for all. Stories reflected this growing awareness of financial matters and of the transformative power of harnessing nature: "Jack and the Beanstalk" turns a silly barter for beans into a way to gain the secret wealth of a giant, and "Rumpelstiltskin" makes spinning straw into gold a matter of life and death.

Weatherford writes convincingly about how the decimalization of value, the natural world, and our information sphere changed the world:

Money forces humans to reduce qualitative differences to quantitative ones. It forces a numbering of things, and this quantification allows things that are very unalike to be compared. This numbering pushed ahead the development of mathematics, and when applied to other fields, it made science possible. This tendency toward numbering became the basis of modern thought from the rise of mathematics and science through to the increased objectification of law, medicine, teaching, and virtually all other professions.

We continue to see these approaches proliferate, from sabermetrics in baseball to movie scores on Rotten Tomatoes to the ratings on shopping and recipe sites -- they all reflect the drive to quantify experiences and activities for the sake of comparison. In scholarly publishing, the impact factor, altmetrics, and a variety of other measures represent similar attempts to make comparison possible.

This kind of empirical approach to the world informs success at many levels -- in how information is organized, how we understand the world (including customers and their values), and how we plan for the future.

The Federal Trade Commission Contemplates Big Data

Earlier this month, the US Federal Trade Commission issued recommendations for businesses seeking to implement big data solutions. The report summarizes multiple sources -- a public workshop held last September, with four panels tackling different aspects; 65 public comments from citizens, industry representatives, think tanks, consumer groups, privacy advocates, and academics; and an earlier seminar on big data held in 2014.

The main caveat emerging from the report is that companies can inadvertently stumble into discriminatory practices if their big data inferences fall disproportionately on a protected class, especially if a more direct approach could have prevented the harm. For example, a big data decision not to market a good price to a particular postal zone could be perceived as depriving a protected class (defined by race, religion, or gender) of that offer.

Careless big data inferences can, for instance, mark an individual as ineligible for credit, favorable rates and discounts, or beneficial treatment (such as solicitations from top universities) based on geographic or behavioral traits. Because individuals who share certain protected characteristics (race, religion) often share a postal zone or set of shops, inferring an individual's traits from population correlations can put a business at risk of being sued for discriminatory inferences, so to speak.
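
As a purely illustrative sketch (synthetic data, hypothetical rule), the following Python snippet shows how a facially neutral decision keyed to postal zone can still fall disproportionately on a protected group when group membership and zone are correlated:

    # Synthetic illustration: a rule based only on postal zone can still produce a
    # disparate impact when zone and protected-group membership are correlated.
    import random

    random.seed(1)

    # Hypothetical population: protected-group members are concentrated in zone "B".
    people = []
    for _ in range(10_000):
        protected = random.random() < 0.3
        in_zone_b = random.random() < (0.8 if protected else 0.2)
        people.append((protected, "B" if in_zone_b else "A"))

    # A facially neutral big data decision: only zone "A" gets the good offer.
    def gets_offer(zone):
        return zone == "A"

    def offer_rate(group_flag):
        zones = [zone for prot, zone in people if prot == group_flag]
        return sum(gets_offer(z) for z in zones) / len(zones)

    print(f"Offer rate, non-protected group: {offer_rate(False):.1%}")
    print(f"Offer rate, protected group:     {offer_rate(True):.1%}")
    # The gap between these two rates is the kind of disparate impact the FTC
    # report warns companies to check for before acting on big data inferences.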

The potential for bias seems considerable, and the subtitle of the report captures the main concern: "A Tool for Inclusion or Exclusion?" For instance, if data from wearable devices were used to decide where certain civic funding goes (parks, repaving projects), affluent areas could benefit disproportionately, because wealthy people are more likely to own Fitbits than poor people are.

The authors are careful to note that many benefits can accrue from big data used properly, including finding clever ways to pick promising people out of larger pools. For example, some people in impoverished areas are on the right track, and by finding them, new lending, educational, and work opportunities can be created. Some companies are using their big data tools in just this manner, and creating win-win scenarios.

The report has plenty of interesting examples worth contemplating:

  • A credit card company that rated consumers' creditworthiness partly on whether they had paid for marriage counseling, therapy, or tire-repair services -- inferences drawn from its big data set.
  • The ability of companies to identify "suffering seniors" with early Alzheimer's in order to target them with exploitative offers.
  • Employers preferring job applicants who used a browser they had installed themselves (Firefox, Chrome) rather than one that came with the computer, having found that such employees "perform better and change jobs less often."

The references point to a site worth a visit, if only to remind you that correlation does not equal causation: Spurious Correlations. On the day I visited, the featured chart correlated US spending on science, space, and technology with suicides by hanging, suffocation, and strangulation. Maybe the recent increases in the NIH, NOAA, NASA, and NSF budgets aren't the unvarnished good news we initially thought . . .
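
For a sense of how easily a shared trend can masquerade as a relationship, here is a minimal sketch in Python (3.10+ for statistics.correlation) with made-up data: two series that simply drift upward over the same span correlate strongly despite having nothing to do with each other.

    # Illustrative only: two fabricated series that both drift upward over time show
    # a high correlation even though neither has anything to do with the other.
    import random
    from statistics import correlation   # available in Python 3.10+

    random.seed(42)
    n_years = 15

    spending = [100 + 5 * t + random.gauss(0, 3) for t in range(n_years)]
    incidents = [50 + 2 * t + random.gauss(0, 2) for t in range(n_years)]

    print(f"Pearson r = {correlation(spending, incidents):.2f}")
    # Typically prints a value near 1.0 -- the shared trend, not any causal link,
    # is doing all of the work.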

The FTC report is worth a look, if only to remind ourselves of the limitations of big data, which the authors capture succinctly:

Companies should remember that while big data is very good at detecting correlations, it does not explain which correlations are meaningful.

Or which correlations are risk-free.

As businesses become more digital at their core, data will become more central to success. And not just analytics, but rich customer data. Managing these data, using them judiciously and efficiently, and ensuring compliance with various laws and expectations will be vital to long-term strategic change and growth. Companies that begin early will learn first and best if they are diligent.