The (Exposed) Internet of Things

The so-called Internet of Things is certainly an important commercial trend, from Fitbits to Nest thermostats. Yet, for all its potential virtues and benefits, it is proving weak on both security and privacy. A recent examination of Shodan, a search service devoted to crawling the IoT, found that you can see a lot of amazing things -- sleeping children, marijuana grow rooms, classrooms, and so forth -- because security on webcams is either poor or non-existent.

Quoting from the Ars Technica article:

The cameras are vulnerable because they use the Real Time Streaming Protocol (RTSP, port 554) to share video but have no password authentication in place. The image feed is available to paid Shodan members at images.shodan.io. Free Shodan accounts can also search using the filter port:554 has_screenshot:true. Shodan crawls the Internet at random looking for IP addresses with open ports. If an open port lacks authentication and streams a video feed, the new script takes a snap and moves on. While the privacy implications here are obvious, Shodan’s new image feed also highlights the pathetic state of IoT security, and raises questions about what we are going to do to fix the problem.
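For the technically inclined, the kind of query described in the quote can be scripted against Shodan's public API. Here's a minimal sketch using Shodan's official Python client -- the API key is a placeholder, and the exact fields returned may vary by account level:

    # Minimal sketch: query Shodan for hosts exposing RTSP (port 554) that have
    # screenshots. Requires the official client ("pip install shodan") and an API key.
    import shodan

    API_KEY = "YOUR_SHODAN_API_KEY"  # placeholder -- substitute your own key
    api = shodan.Shodan(API_KEY)

    try:
        # The same filter mentioned in the article.
        results = api.search("port:554 has_screenshot:true")
        print("Total exposed hosts reported:", results["total"])
        for match in results["matches"][:5]:
            print(match["ip_str"], match.get("org", "n/a"),
                  match["location"].get("country_name"))
    except shodan.APIError as exc:
        print("Shodan query failed:", exc)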

The root cause of these insecure webcams is the drive for the minimum viable product, along with consumer hesitation to pay much. Manufacturers have scrimped on security to lower prices, and consumers are ignorant of what kind of access this can give to hackers, search engines, or just curious passers-by in the virtual world.

Groups are working on security rating systems so that consumers can gain insights into vulnerabilities before purchase, and the US Federal Trade Commission is becoming aware of the problem. It has already gone after more than 50 companies that did not reasonably secure their devices or networks, and has more on its docket. The US Department of Defense and DARPA are also concerned and involved. However, this is a worldwide problem, and the US government's involvement is only the start.

One model being discussed is the UL model used to set standards for electrical devices. If something is UL-approved, it meets basic standards for safety and construction. Some critics deride this comparison as imposing a lot of bureaucracy for little benefit -- after all, UL listings certify that a device is built well enough for its intended use, but they were never designed to deal with intentional attacks on the electrical grid. The question remains how to build cyber-infrastructure that fends off intentional attacks. As one expert states:

Our dependence on technology is growing faster than our ability to secure it.

Opportunities exist for information providers, and the Ars Technica article does not touch on the number of connected devices being developed for professional and scientific use. What kinds of risks exist? I touched on this in an essay in 2013, which noted that the level of cyberwarfare was already high then. With the growth of tensions around economic and ideological matters since then, the level of hacking has only increased. Publishing, e-commerce, and customer data platforms are under a nearly constant barrage of probing attacks from outside sources with presumably unsavory intentions. As one expert wrote a few years ago:

In 10 years, computers will be everywhere we look, and they’ll all have wireless. Will you be able to compromise someone’s insulin pump through their car? Will you be able to induce seizures by subverting their house lights? Will you be able to run these exploits by cell phone? What’s possible? It’s more like ‘What won’t be possible?’

It's important to see things for what they are. The "Internet of Things" requires placing connected Internet devices all around us. The implications of doing so are not clear, nor is the necessity or value. Technology is not benign, and its benefits are not automatic.

Pushwalls -- A New Phenomenon in Digital Publishing?

The term "paywalls" is commonly used to describe the commercial practice of requiring inbound traffic for a transaction (registration, payment) before granting access to content. However, more and more often, users and publishers alike are encountering elements of the digital ecosystem asking outbound traffic to pass certain standards or meet certain criteria, as well.

I've taken to calling these "pushwalls" -- barriers to getting your content or online business out to customers and users.

It's no longer safe to assume that users or digital publishers have an unfettered, straight path on the open Web to purveying content or even having their content seen or discovered. New elements -- the Great Firewall in China, geoblocking, ad blockers, spam filters, profanity filters -- create new impediments in digital publishing.

In scientific publishing, with tech-savvy users now the norm, ad blocking is particularly concerning for many publishers. Advertising is a way to diversify revenues and lower subscription rates or, for OA publishers, APCs. Whether those ads are in-house promotions -- calls for papers, announcements of meetings, appeals for membership -- or traditional commercial advertising, the percentage of users blocking them should be known. In one case I've seen, it's as high as 20%, meaning 1 in 5 users will not see these messages. That is a significant pushwall, one that can push publishers away from advertising and back toward charging users directly. In effect, by blocking one commercial option, users may simply drive publishers back to another -- asking readers to pay. And this pushwall keeps growing.

The Great Firewall in China also poses barriers to scientific content, which can include materials and opinions the Firewall does not tolerate at first pass. Technology practices can create barriers, too: because of various ongoing and past dustups between Google and China, Google elements often trigger a reaction within the Great Firewall, which blocks APIs and other page components from displaying.

Email continues to be the coin of the realm for driving awareness and the traffic pulse of journal sites. Spam filters have improved immensely in recent years, but some messages still get caught unexpectedly, and changes to email systems and practices have to be made carefully to preserve white-listing and compliance with best practices.

Social media pushwalls aren't as visible, but scientific content can trigger them inadvertently, especially if anatomical parts are described in headlines or included as images. Some systems have a Puritanical streak, and certain articles and headlines crash and burn against their pushwall.

Recognizing where delivery barriers exist, knowing how to overcome or avoid them, and remaining adaptable to a slightly byzantine online publishing infrastructure are now requirements for successful online business and publishing. It's not as simple as hitting "publish." There's a lot more to it these days.

Academic Publishing in Europe 2016 -- Notes from Day 2

Yesterday, I wrote up some notes and observations from Day 1 of the APE 2016 meeting in Berlin. Naturally, there's a bit more to say after Day 2.

Day 2 consisted of three major pieces -- a discussion of the STM Consultation on Article Sharing; a session devoted to pitches from five start-ups with audience evaluation; and a session on publication and research ethics. 

Richard Padley of Semantico facilitated the session on the STM Consultation on Article Sharing and the Voluntary Principles for Scholarly Communication Networks. Fred Dylla, who led the task force that assembled these, was front and center, while Grace Baynes from Springer Nature and I filled out the representation from that group. Charlie Rapple from Kudos and Hannfried von Hindenburg from Elsevier also joined the panel.

The session revolved around a few main points:

  • everyone's acceptance that article sharing and scholarship go hand in hand;
  • that online sharing requires some additional thought and considerations, especially if it is organized for scale;
  • the effective and open process used to generate the voluntary principles, and their subsequent adoption by many scholarly communication networks and publishers;
  • the positive side-effects of the effort, including improvements and softening of language on many major publishers' sites; and
  • the ongoing concern that publishers were playing catch-up with the scholarly communication networks in some respects.

A lack of research into what actually occurs on these scholarly communication networks -- the big three being Academia.edu, ResearchGate, and Mendeley -- was noted. However, research results from work being done by Carol Tenopir, Anthony Watkinson, and others are apparently imminent (a preliminary report is here). In the meantime, some data were shared by people with experience with these networks, and some key attributes were discussed. The main message was that we need to continue to understand the users of the literature as well as the authors, and that perhaps our attention slipped during the past few years amid the emphasis on the "author as customer."

The session spotlighting five new startups included Overleaf, Bookmetrix, Dryad, Publons, and Zapnito. It was an interesting set, focused on a number of gaps in the market. In my estimation, four of the five represent "wide" plays -- you need a lot of customers each paying small fees to make the business work financially. Only one -- Zapnito -- represented a "tall" play, wherein a few customers using its integrated media platform robustly can represent a lot of revenue and market penetration. Of the five, three had working business models -- Overleaf, Dryad, and Zapnito -- while the other two were in various stages of early development. Having personally tried the "wide" approach a couple of times, I can attest that proper positioning within the market is critical. In this regard, Overleaf has a clear advantage, as it is part of Digital Science. Bookmetrix is being cultivated by Springer Nature, so it probably also benefits from the heft of its parent. It will be interesting to see the results of one question the moderator posed -- which of these do you think will be around five years from now?

The final session, on research and publication ethics, was moderated by Bernd Pulverer of EMBO. Covering the concerns emanating from an increasing number of papers, decreasing funding for science, and the competition for short-term reputation this can drive, the session also tackled the place of journals, editors, and publishers in the research ethics system. This is a topic that needs constant attention and updating, so it was good to have a robust set of speakers and an in-depth discussion of these issues. A main theme centered on addressing ethical breakdowns farther upstream in the process, so that they don't arrive on journal doorsteps or, worse, in published articles, which journals then have to deal with.

Once again, this Berlin meeting seems to be a good kick-off to the year, and the energy level -- whether because everyone is rested from the long holidays, or because of the renewed optimism of a new year, or because budgets are fresh and as yet untroubled -- is generally better than at meetings later in the year, when people are more travel-weary, overworked, and grappling with projects going sideways.

Until next year, Berlin . . .

Academic Publishing in Europe 2016 -- Notes from Berlin, Day 1

This is my second Academic Publishing in Europe meeting. Last year, I wrote on the Scholarly Kitchen about my first. I'm happy to be back again, as the meeting possesses a scale and energy I find appealing and helpful for discussions.

Tuesday's sessions focused on a number of interesting topics, including data publishing, scholarly publishing infrastructure, and the financing of science and scientific publishing. Sprinkled within these and other broad topics were strong presentations on publishing technologies, a couple of futuristic visions for academic publishing, and some updates on policy discussions in Germany and Europe. There were good questions from the approximately 200 attendees.

Of course, the best part of a meeting is in the networking, and the APE 2016 meeting has this in spades, with a relaxed atmosphere, a nice dinner, and a good conference space.

I found the most compelling session Tuesday to be the one about scholarly publishing infrastructure, which we often take for granted (that's what you do with infrastructure that's working -- the lights come on, the faucet works). Geoffrey Bilder offered a great reminder of some of the infrastructure we don't see by putting up a silhouette of a scholarly article with the text blacked out. It illustrated that even the scholarly format itself is an infrastructure element -- everyone could see this was a scholarly article, and easily identify the zones for title, authors, abstract, full text, citation information, and so forth, just from the structure of the page. This may help explain why the PDF is so persistently useful -- it captures useful infrastructure scholars rely upon.

Publishers have been helping to build effective new technology infrastructure for years now, from CrossRef to CHORUS. The way these and other similar infrastructure elements have stabilized and enhanced scholarly communication bears contemplation. From archiving to linking to public access, we are now working on a fundamentally improved infrastructure. What's going to come next? And are we sure what we have will stay and remain viable? Most importantly, will the scientific community own it? Or will commercial entities, entering through public-private partnerships, end up capturing the scholarly infrastructure? This is an issue, as academia tends to fund work in fits and starts, while commercial entities with clear goals can execute consistent strategies for years.

It's clear that we're still in the midst of a series of changes, which may never stop, but which instead speed up and slow down depending on how various trends overlap and push one another forward. It seems wise to have a solid, reliable infrastructure that we can take for granted while we deal with other changes.

The World Is Not Flat -- In Fact, It's Adding Hills

When I was running International Business for the New England Journal of Medicine, Thomas Friedman's book "The World Is Flat" was on the bestseller list. The meme of the title had become a strategic calling card of sorts for people interested in international business. Since the Internet had flattened the commercial world, didn't this mean that we could do X and Y and Z for the first time ever?

Our experiences did not line up with the notion that the world had become flat, however. We had a tradition of holding half-day retreats each year to assess our situation and plan for the year ahead, and that year we discussed the cognitive dissonance we were experiencing around this "world is flat" notion. What we came up with still seems right to me:

The information landscape has flattened to a degree, but the business landscape is extremely hilly.

It's not an earth-shattering insight, but it was more accurate and useful to us than Friedman's simplified arguments that we were in a post-modern business era with fluid, borderless markets.

In the years since then, the hills have become steeper, both for information dissemination and for business practices. China is a particularly interesting example, as the Great Firewall makes it harder to get information in, hacker attacks make hosting more expensive and fraught than before, and Chinese business practices and market volatility continue to present moving targets and intricate puzzles. Geoblocking by companies like YouTube, the BBC, and Netflix is being challenged in Europe. And geotargeting of content and advertising is normal -- as I write this from Berlin, my Google results still lapse into German occasionally, and my iPhone presents information as if I were a German speaker.

Other parts of the world have seen their share of distinct local problems. Spain took it on the chin in 2008, and is still reeling. Argentina continues to be unpredictable. The low prices of oil have made Saudi Arabia a more volatile place, with new payment problems and uncertainties. Various local approaches to open access -- the UK, the Netherlands most recently -- also create new regional business-and-information conundrums.

This is disappointing news for those envisioning a seamless approach to global business, with easy growth in highly accessible markets. The upshot is that partnerships at the local level, or with organizations with local expertise, are becoming more important than ever. Local knowledge continues to be a major advantage. Operating as if the world is flat is a perfect way to keep your growth curve flat, too.

New Players, New Costs: Why Is Digital Publishing So Expensive?

Back in 2014, I wrote a post on the Scholarly Kitchen about the confounded complexity of digital and its endless upgrade paths. Since then, it's only become clearer that online businesses -- from banking to publishing to shopping -- are more expensive to run, and require new thinking about how to make them succeed.

The expensive nature of digital publishing runs counter to initial expectations, which seem to persist despite abundant empirical evidence to the contrary. Users expect that online music, online publishing, online banking, and other online ventures will generate savings through scale, through technology costs lower than those of physical goods, or through some other means. There's no paper, no vinyl, no tellers, so of course it must be much cheaper.

But experience trumps expectations, and the experience of the online world is that the new costs -- talented programmers, data facilities, firewalls and security, project management, software licenses, e-commerce systems, SEO, SEM, archiving, data revisions, content revisions, and so forth -- are greater than the costs of the past. In addition, these costs are more often fixed, compared with the more frequently variable costs of the old production environment. That makes the new costs riskier to take on and more challenging to manage. Multi-year contracts, high-salary employees, and other factors limit the nimbleness of management in the digital realm.
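To make the fixed-versus-variable point concrete, here's a minimal sketch with purely hypothetical numbers (not drawn from any publisher's actual financials):

    # A minimal sketch with hypothetical numbers: two cost structures that break
    # even at the same volume behave very differently when volume shifts.

    def annual_profit(price, units, fixed_costs, variable_cost_per_unit):
        """Simple operating-profit model: revenue minus fixed and variable costs."""
        return price * units - fixed_costs - variable_cost_per_unit * units

    # Hypothetical structures: print-like (variable-heavy) vs. digital-like (fixed-heavy).
    print_model   = dict(fixed_costs=100_000, variable_cost_per_unit=40)
    digital_model = dict(fixed_costs=400_000, variable_cost_per_unit=10)

    for units in (12_000, 10_000, 8_000):
        p = annual_profit(50, units, **print_model)
        d = annual_profit(50, units, **digital_model)
        print(f"{units:>6} units -> variable-heavy: {p:>8,.0f}   fixed-heavy: {d:>8,.0f}")

    # Both break even at 10,000 units, but a 20% drop in volume pushes the
    # fixed-heavy model four times deeper into the red -- leverage cuts both ways.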

Organizations often struggle to come to grips with these realities. Evangelists, and the leftover expectations of technophiles, can perpetuate the perception that a digital business is cheaper and easier to run than traditional business lines. These same dreamers can also inflate revenue expectations in their ruminations about what must surely be possible in this brave new world. Yet, without a realistic and actionable path to rationalize expenses and make achievable projections based on actual expenses and revenues, organizations can fall prey to vague strategies that lead to overspending without a rational commercial upside.

An analog worth contemplating is banking. More than 97% of financial transactions occur in the digital world rather than through the exchange of physical money -- from swiping debit cards to e-commerce to wire transfers. Yet, interest rates on deposit accounts trend toward 0%, while fees for transactions are rampant. Credit card companies charge high lending rates to cover the risk exposure they face from hackers, defaults, and card thieves. Customer data protections are expensive and vulnerable. ATMs and other customer conveniences have to be maintained in the physical world, to dispense the small amount of physical currency bank customers still rely upon. New entrants -- Apple Pay, Samsung Pay, PayPal, and Square -- extract fees of their own for offering customer convenience, putting their fingers in the economic pie.

The scenario of more middlemen, more outlets, and larger players isn't alien to academic and scholarly publishing. And all the players in the digital game require support and management. New business relationships have some new players holding more cards (e.g., Google, advertising agencies, funders), and publishers seem to have fewer or worse cards than before. Different data distribution tasks, management of risk around commerce and customer data, and the lingering desire for print and print analogs in a customer base largely protected from the actual costs of information, all contribute to what is a more complex and expensive publishing environment. Meanwhile, leverage continues to shift to enterprises with network-effect scale.

Having solid strategies and a clear view of the financial costs and potential upside is critical during this period. Belief in a more efficient digital business isn't enough -- you have to build that efficient business. Every blind spot in this realm can be costly. And every organization can benefit from a long-term strategy and a proactive approach to digital publishing technology and evolving forms of customer engagement.

Why Caldera?

Since I announced Caldera Publishing Solutions, a number of friends and colleagues have asked why I chose the name "caldera."

I was born in Colorado and grew up in the Rocky Mountains, in a family with many generations involved in building and preserving Yellowstone National Park. As part of this tradition, I spent a week or more each year visiting the park, and have maintained the tradition for my own family.

One of the most interesting features of Yellowstone -- in fact, the feature that largely makes the park so distinctive -- is the caldera beneath it. This remnant of an ancient volcano still simmers, with geyser basins, mud pots, and heated prismatic pools as evidence.

So, when I was thinking of a name for my consulting firm, I hearkened back to this image of a long-standing reservoir of power, tapped sporadically and unpredictably, and ultimately settled on the image of Old Faithful, the most predictable geyser in the park.

To me, the caldera represents a reserve of potential and power, which most organizations possess. My goal with Caldera Publishing Solutions is to help tap into that reserve for the long-term. Old Faithful has been attracting crowds for well over a century. It is a feature of the caldera. As the landscape shifts, however, new features are emerging, and we have to be ready.

It's not a perfect image, but to me, it's inspiring and personal. It brings my family's story to the present day. Taming the caldera is impossible, but benefiting from its power and features is certainly within our ability.

Decimalization, Money, Science, and Libraries

One of the more interesting perspectives I've read recently on the history of science comes from a book called "The History of Money," by Jack Weatherford. It is one of my favorite books because, like it or not, money makes the world go 'round, and this clearly written and well-constructed book, although a bit dated, still illuminates much about a subject relevant to more than we might imagine, science and technology included.

How we value goods and services has been an eons-long project, and quirky systems proliferated for a long time (and persist to this day). However, a familiar and more rigorous approach emerged with the move toward decimalized currency -- units divided into 100 subunits.

The first major move toward decimalizing currency came from Russia in the early 18th century. Other currency systems of the time were based on an odd assemblage of values. The Spanish dollar consisted of eight reales. The English system rested on farthings, crowns, sovereigns, pennies, guineas, and pounds. In this system, a shilling consisted of 12 pence and a pound of 20 shillings, making a pound worth 240 pence. A guinea (a pound and a shilling) was worth 252 pence. This made it easy to perpetrate fraud and obscure price changes, and generally made business harder to conduct.
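To see how awkward that arithmetic was in practice, here's a small illustrative sketch (my own example, not Weatherford's):

    # Illustrative sketch of pre-decimal English currency arithmetic:
    # 1 pound = 20 shillings, 1 shilling = 12 pence, so a pound = 240 pence
    # and a guinea (a pound plus a shilling) = 252 pence.

    PENCE_PER_SHILLING = 12
    SHILLINGS_PER_POUND = 20
    PENCE_PER_POUND = PENCE_PER_SHILLING * SHILLINGS_PER_POUND  # 240

    def to_pence(pounds=0, shillings=0, pence=0):
        """Collapse a pounds/shillings/pence amount into pence."""
        return pounds * PENCE_PER_POUND + shillings * PENCE_PER_SHILLING + pence

    def from_pence(total):
        """Break a pence total back into pounds, shillings, and pence."""
        pounds, rest = divmod(total, PENCE_PER_POUND)
        shillings, pence = divmod(rest, PENCE_PER_SHILLING)
        return pounds, shillings, pence

    guinea = to_pence(pounds=1, shillings=1)            # 252 pence
    price = to_pence(pounds=2, shillings=13, pence=7)   # an arbitrary price
    print(guinea, from_pence(guinea + price))           # 252 (3, 14, 7)
    # In a decimal system, the same sum is a single, transparent addition.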

The Russian system was not adopted in Europe because Russia was viewed as backwards. However, the United States, an ocean away, felt no such qualms and became the first country to adopt a wholly decimal-based coinage system -- 100 pennies = 1 dollar. France soon followed, with its move to decimal currency coinciding nicely with the development of the metric system.

However, the metric system was not an instant success. While the scientific community saw the value of a standardized system of measures, agreeing on which set to use was a challenge. James Watt was a major proponent of standardized systems, but his ideas were adopted incompletely, with the main survivors being the watt and horsepower (about 745.7 watts).

It was within the bounds of international trade shows -- world's fairs and the like -- that commercial and scientific measurement systems commingled and were synthesized and standardized. As Weatherford writes:

At the urging of industrialists and other supporters of the metric system for coins, weights, and measures, scientists gathered for an international statistical meeting during the . . . world's fair, held in Paris in 1855. . . . The Vienna Coin Treaty of January 24, 1857, helped move forward the adoption of the decimal system for coinage, and it encouraged the adoption of metric weights and measures.

There is a particularly rich and rueful section omitted from this simple summary, which is one of many examples from the book showing how Weatherford has a great sense for historical irony.

In another development tying information systems into this commercial and scientific mix, we have this, following the observation that Americans have been loath to adopt the decimal system for their weights and measures:

Americans, however, did apply the decimal system in an unexpected way through the work of an otherwise little known New York librarian and Columbia University professor, Melvil Dewey. He divided library books into ten classifications, which he further divided and redivided into what became known as the Dewey decimal system.

Money, libraries, and measurements all filtered down from the upper reaches of society to the poorest segments, making simple acts like "counting to 100" and "dividing by 10" second nature for all. Stories reflected this shift toward greater awareness of financial matters and the transformative power of harnessing nature: "Jack and the Beanstalk" turns on a seemingly silly barter for beans that ultimately opens the way to a giant's secret wealth, and "Rumpelstiltskin" on the promise that straw can be spun into gold.

Weatherford writes convincingly about how the decimalization of value, the natural world, and our information sphere changed the world:

Money forces humans to reduce qualitative differences to quantitative ones. It forces a numbering of things, and this quantification allows things that are very unalike to be compared. This numbering pushed ahead the development of mathematics, and when applied to other fields, it made science possible. This tendency toward numbering became the basis of modern thought from the rise of mathematics and science through to the increased objectification of law, medicine, teaching, and virtually all other professions.

We continue to see these approaches proliferate, from sabermetrics in baseball to Rotten Tomatoes scores for movies to the ratings on shopping and recipe sites -- they all point to the drive to quantify experiences and activities for the sake of comparison. In scholarly publishing, the impact factor, altmetrics, and a variety of other measures represent similar attempts to allow us to compare.
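As a small illustration of how such comparative measures are constructed, here's the basic arithmetic behind the familiar two-year impact factor, with made-up numbers:

    # Hypothetical sketch of the standard two-year impact factor: citations in
    # year Y to items published in years Y-1 and Y-2, divided by the number of
    # citable items published in those two years.

    def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
        return citations_to_prior_two_years / citable_items_prior_two_years

    # Made-up journal: 1,240 citations in 2015 to content published in 2013-2014,
    # which comprised 310 citable articles and reviews.
    print(round(impact_factor(1240, 310), 2))  # 4.0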

This kind of empirical approach to the world informs success at many levels -- in how information is organized, how we understand the world (including customers and their values), and how we plan for the future.

The Federal Trade Commission Contemplates Big Data

Earlier this month, the US Federal Trade Commission issued recommendations for businesses seeking to implement big data solutions. The report summarizes multiple sources -- a public workshop held last September, with four panels tackling different aspects; 65 public comments from citizens, industry representatives, think tanks, consumer groups, privacy advocates, and academics; and an earlier seminar on big data held in 2014.

The main caveat emerging from the report is that companies can inadvertently stumble into discriminatory practices if their big data inferences disadvantage a protected class, especially when a more careful approach could have prevented the harm. For example, a big data decision not to market a good price to a particular postal zone could also be perceived as depriving a protected class (defined by race, religion, or gender).

Potential problems with careless big data inferences include inferring that certain geographic or behavioral traits make an individual ineligible for credit, favorable rates and discounts, or beneficial treatment (such as solicitations from top universities). Because individuals sharing certain protected demographic features (race, religion) can share a common postal zone or set of shops, inferring an individual's traits from population correlations could put a business at risk for being sued for discriminatory inferences, so to speak.
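One rough way to screen for this kind of risk -- not something the FTC report prescribes, just a widely used rule of thumb borrowed from US employment-selection guidance -- is the "four-fifths" adverse impact ratio, sketched here with hypothetical numbers:

    # Hypothetical sketch of the "four-fifths" (80%) adverse impact check:
    # compare each group's selection rate with the most-favored group's rate.

    def selection_rate(selected, total):
        return selected / total

    # Made-up outcomes of a data-driven credit offer, grouped by postal zone.
    groups = {
        "zone_a": selection_rate(selected=480, total=600),  # 0.80
        "zone_b": selection_rate(selected=270, total=500),  # 0.54
    }

    best_rate = max(groups.values())
    for name, rate in groups.items():
        ratio = rate / best_rate
        flag = "possible adverse impact" if ratio < 0.8 else "ok"
        print(f"{name}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")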

The potential for bias seems considerable, and the subtitle of the report captures the main concern: "A Tool for Inclusion or Exclusion?" For instance, if data from wearable devices are used to determine whether certain civic funding occurs (parks, repaving projects), affluent areas could benefit disproportionately because wealthy people have Fitbits while poor people don't.

The authors are careful to note that many benefits can accrue from big data used properly, including finding clever ways to pick promising people out of larger pools. For example, some people in impoverished areas are on the right track, and by finding them, new lending, educational, and work opportunities can be created. Some companies are using their big data tools in just this manner, and creating win-win scenarios.

The report has plenty of interesting examples worth contemplating:

  • A credit card company that rated consumers' creditworthiness partly on whether they had paid for marriage counseling, therapy, or tire-repair services -- inferences drawn from its big data set.
  • The ability of companies to identify "suffering seniors" with early Alzheimer's in order to exploit them with offers.
  • Employers preferring job applicants who used a browser they had installed themselves (Firefox, Chrome) rather than one that came with the computer, because employees with this trait "perform better and change jobs less often."

The references point to a site worth a visit, if only to remind you that correlation does not equal causation: Spurious Correlations. On the day I visited, the featured chart correlated US spending on science, space, and technology with suicides by hanging, suffocation, and strangulation. Maybe the recent increases in the NIH, NOAA, NASA, and NSF budgets aren't the unvarnished good news we initially thought . . .
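The underlying statistical trap is easy to reproduce: any two series that merely trend in the same direction will show a strong correlation, whether or not they have anything to do with each other. A quick hypothetical sketch:

    # Hypothetical sketch: two unrelated series that both drift upward over a
    # decade still produce a high Pearson correlation -- trend alone does the work.
    import numpy as np

    rng = np.random.default_rng(seed=0)
    years = np.arange(2000, 2010)

    budget = 100 + 5 * (years - 2000) + rng.normal(0, 3, len(years))   # made-up spending series
    metric = 40 + 2 * (years - 2000) + rng.normal(0, 1.5, len(years))  # unrelated made-up series

    r = np.corrcoef(budget, metric)[0, 1]
    print(f"Pearson r = {r:.2f}")  # typically well above 0.9, despite no causal link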

The FTC report is worth a look, if only to remind ourselves of the limitations of big data, which the authors capture succinctly:

Companies should remember that while big data is very good at detecting correlations, it does not explain which correlations are meaningful.

Or which correlations are risk-free.

As businesses become more digital at their core, data will become more central to success. And not just analytics, but rich customer data. Managing these data, using them judiciously and efficiently, and ensuring compliance with various laws and expectations will be vital to long-term strategic change and growth. Companies that begin early will learn first and best if they are diligent.

Review: "The Victorian Internet" by Tom Standage -- and Why It's Relevant

Before TCP/IP, there was tap-tap-tap -- and similar social and economic consequences came with both communication improvements. That's the underlying theme of Tom Standage's "The Victorian Internet," a short book packed with interesting facts, tie-ins, and characters relevant to the modern communications age. The book was first published in 1998, and updated in 2007. It seems very relevant, however, as the history it captures speaks so clearly to the modern era of electronic communication.

The evolution of the electric telegraph from the visual telegraph -- yes, the Victorian Internet actually started with two L-shaped arms waving signals across the French countryside, which the British changed to eight lozenges in two columns -- was not assured. It took new technology, materials, and systems before it came into existence. Even then, it wasn't obvious what to do with the electric telegraph. Military and government use? Commerce? News? Personal communications?

The technical challenges were daunting. Insulating wires, finding impedances, powering signals, laying trans-oceanic cables, and so forth -- it was all new territory, and competing theories and inventors provide a brief, colorful backdrop to descriptions of the electric telegraph's early days. Connections to some parts of the modern world also become clear, such as why there are a number of place names called "telegraph hill" and why AP and UPI are/were called "wire services." How Reuters emerged is particularly interesting, as is the utility of pneumatic tubes as an affordance for decentralized telegraph offices. The prevalence of "handles" in place of names for operators, the use of abbreviations to capture ideas and emotions -- it all seems a bit familiar.

These social aspects of the Victorian Internet are perhaps the most riveting part of Standage's stories. Romance online. Spying and misinformation, as well as encryption. The tension between local and world news. Governments struggling to keep up with the pace of commercial change. The role of electronic communication in preventing war -- and as a tool for warmongers. There were even authors speculating about what it all meant, futurists from the past.

And the Victorian Internet is still with us. Remember the earlier description of the eight lozenges in two columns? Those lozenges could be turned to represent particular characters or concepts. Today, we use a similar scheme, ASCII, in which fixed strings of bits (seven per character in the original code, usually carried in 8-bit bytes) represent characters in the online world. The distance from telegram to Twitter to Instagram is shorter than we may realize.
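To see the parallel, consider that each ASCII character is just a fixed pattern of on/off bits, much as each telegraph signal was a fixed arrangement of open and closed shutters. A minimal sketch:

    # Minimal sketch: each ASCII character is a fixed pattern of bits, loosely
    # analogous to the open/closed shutter positions of the visual telegraph.
    for ch in "APE":
        print(ch, format(ord(ch), "08b"))  # e.g., 'A' -> 01000001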

Written with verve, Standage's book is worth a read, especially as organizations face uncertainty about how to respond to today's publishing challenges. If past is prologue, there are clues here as to how a combination of clear thinking and nimble response helped some firms and individuals thrive in new realms.

Introducing Caldera Publishing Solutions

Today, I'm pleased to announce Caldera Publishing Solutions, a new consulting firm in academic, scholarly, and scientific publishing, designed to help non-profit and for-profit publishers gain market advantages and develop sound long-term strategies for growth and differentiation.

First, let me introduce myself. My name is Kent Anderson. I'm a past-President of the Society for Scholarly Publishing (SSP), former Publisher at AAAS/Science, and former Publishing Director at the New England Journal of Medicine. I've had many roles in scholarly publishing, including serving as a CEO (STRIATUS/JBJS), Board member (SSP), strategic advisor (many different organizations), Managing Editor (American Academy of Pediatrics), and blog writer and editor (Scholarly Kitchen).

For me, Caldera represents an exciting culmination of experience. I hope I can bring what I've learned over the years -- formal training, empirical experience, and general "how to" techniques -- to a broader array of organizations. My forte involves rapidly and accurately assessing a situation, then delivering a framework for future growth and success. Scholarly and scientific publishing is more complex and challenging than ever, and new strategies within and around it abound. I can help you make sense of them in a way that's appropriate to your customer and market context.

Caldera is founded with your growth and success in mind. The central image of the caldera is a vast reserve of tremendous, untapped power and energy, one that can last for eons if vented properly. My goal is to help organizations find the best way to tap into the hidden power of their market position, communities, talent, resources, and product offerings.

To get started, please email Caldera at contact@caldera-publishing.com.