Using audience metrics to measure content impact

Content creators and managers often face a one-to-one need/supply proposition: A need for specific content is identified, and that content is supplied – either by creating new works or by reconfiguring existing modules. Too often, however, consideration of the content’s efficacy ceases once that immediate need is fulfilled.

This is bad business. Even using automation, the act of content creation is still resource and time intensive. To avoid redundancies or ineffective content, content marketers and managers need to measure the impact of the materials they generate. But which metrics offer the most meaningful insight?

Journey mapping: Naming the nameless
Crafting content that connects with its preferred audience requires understanding audience objectives – and how those objectives match enterprise goals. While the process can often be as simple as soliciting direct feedback or taking requests, sometimes the need is more esoteric. Audiences may not know what kind of content will have the most impact and fulfill a direct need – primarily because they may never have encountered content like it before. You can’t identify something that has yet to be named.

“Start by aligning with your audience personas.”

To attempt to identify a need – naming the nameless, so to speak – start by mapping the journey your audience takes with your enterprise. Beginning with your audience personas, use trend data to trace the lifecycle of a customer and their content needs.

The Content Marketing Institute identifies five distinct stages of the customer journey:

  • Awareness. A customer encounters a piece of content.
  • Interest. The customer expresses some form of active engagement with the content or the enterprise behind the content.
  • Evaluation. Both customers and enterprises take a closer look at the content products available and determine value.
  • Decision. Customers purchase and/or implement the content.
  • Retention. Any after-purchase activity, from returning audience members to subscriptions, to organic promotion.

This journey map is by no means fixed: Smashing Magazine, for example, uses only four stages, consolidating evaluation and decision into a single “conversion” stage and dubbing retention “reward.” Regardless of the specific structure or terminology, the goal is a simple lifecycle map that you can use as the bedrock for tracking where each content interaction – or “touch” – occurs.

‘Attention’ metrics
As Keith Grossman, global chief revenue officer at Bloomberg Media, told eMarketer, even if you understand the audience journey, knowing where the most valuable touches took place can be tricky. There is minimal standardization across any content industry by which engagement can be meaningfully measured, leading Grossman to measure the ability to grab “attention” rather than time spent or clicks.

“There’s a slow evolution taking place. The marketplace is trying to understand the right metric to standardize against, but it hasn’t figured itself out yet,” Grossman told eMarketer. “The question is, what is the proper amount of time to measure success and engagement? If we agree on 2 minutes but then I give you a 200-word piece that takes 2 minutes to read, that’s not successful.”

Key performance indicators
At each stage of the journey, there are opportunities to identify key performance indicators as well as gaps – points in the customer experience that are disjointed or painful, causing the customer to drop out of the cycle. Crafting these KPIs is particularly important, since these are the metrics you can use to measure how effective or enticing the content is. Some common metrics CMI identifies include keyword rankings, impressions, overall search visibility, webinar registrations, white paper downloads, conversions, shares, comments, subscription renewals and social media engagement.

“KPIs in an audience journey aren’t the be-all, end-all.”

Credit where credit is due
The presence of these KPIs in an audience journey isn’t the be-all, end-all. Rather, the value of these KPIs as metrics fluctuates based on situational factors and audience persona. What is a valuable measure of engagement for one audience member may not be for another. This is where analytics platforms step in to create an attribution model, weighting KPI value based on profile engagement trends.

This may mean that certain touchpoints in the audience journey carry more value than others: For some audiences, the first touch deserves more credit; for others, it is the last touch that fosters long-term engagement. Weighting touchpoints can help you recognize which attributes of your content resonate most with which audiences.
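To make the weighting concrete, here is a minimal sketch, in Python, of a position-based attribution model that gives extra credit to the first and last touch. The journey and the 40/20/40 split are illustrative assumptions, not a standard that any particular analytics platform prescribes.

# A minimal position-based attribution sketch; weights are assumptions.
def attribute_conversion(touchpoints, first_weight=0.4, last_weight=0.4):
    """Distribute credit for one conversion across an ordered list of touches."""
    if not touchpoints:
        return {}
    if len(touchpoints) == 1:
        return {touchpoints[0]: 1.0}
    credit = {t: 0.0 for t in touchpoints}
    credit[touchpoints[0]] += first_weight    # first touch
    credit[touchpoints[-1]] += last_weight    # last touch
    middle = 1.0 - first_weight - last_weight
    for t in touchpoints[1:-1]:               # split the rest evenly
        credit[t] += middle / (len(touchpoints) - 2)
    return credit

journey = ["blog post", "webinar", "white paper", "pricing page"]
print(attribute_conversion(journey))
# {'blog post': 0.4, 'webinar': 0.1, 'white paper': 0.1, 'pricing page': 0.4}

Raising first_weight or last_weight toward 1.0 shifts this toward a pure first-touch or last-touch model.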

Segmentation to measure performance
Finally, once you have your KPIs and attribution models in place, you can build audience subsets out of your analytics data. Segmenting your channels and measuring engagement rates can help confirm or debunk assumptions born of trend projections. Look at the frequency and depth of these engagements and the dividends they pay. The goal is to match segmentation, KPIs and attribution models with the data derived from your audience journey.
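As a rough illustration, the sketch below groups engagement by channel and persona and averages it per segment; the event records are hypothetical stand-ins for an analytics export.

from collections import defaultdict

# Hypothetical engagement events; an analytics export would supply these.
events = [
    {"channel": "newsletter", "persona": "developer", "engaged_seconds": 140},
    {"channel": "newsletter", "persona": "manager", "engaged_seconds": 35},
    {"channel": "social", "persona": "developer", "engaged_seconds": 20},
    {"channel": "social", "persona": "manager", "engaged_seconds": 95},
]

# Group engagement by (channel, persona) segment.
segments = defaultdict(list)
for e in events:
    segments[(e["channel"], e["persona"])].append(e["engaged_seconds"])

# Average engagement per segment either confirms or debunks the projection.
for segment, seconds in sorted(segments.items()):
    print(segment, "avg seconds:", sum(seconds) / len(seconds))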

Dynamic Delivery vs Dynamic Content

Dynamic content is a staple of modern marketing and customized web experiences. Our ability to manage content in a CCMS at ever higher levels of granularity is matched only by the increasing sophistication of our tools for building and tagging content. Add to these increased automation and machine learning, and it’s clear that content delivery is ready for another round of innovation.

Delivery bottlenecks
With componentized data modules and advanced CCMS enabling fast, smart content configuration on demand, the possibilities related to the creation of personalized content are virtually endless. What is not endless are the ways users encounter content. Websites, social media, ads, emails and newsletters: We can customize content on demand and deliver on these platforms, but we are still running into problems when it comes to controlling delivery on a granular level and avoiding pile-ups related to speed and performance.

“How can we optimize delivery of dynamic content?”

We’ve been here before. In 2001, Greg Parker, CEO and president of SpiderCache, prophesied to ComputerWorld, “There is huge growth occurring in dynamic content. People are moving away from static ties [sic] to dynamic content-driven sites, and that exposes the bottleneck [dynamic content] causes….  [Traditional caches] can’t handle the performance required for the speed of delivery of dynamic content.”

His company sought to address this problem for websites by building a more dynamic page-caching capability, which was good for the time but does not fully address content delivery in the era of apps and messages. So the question is: How can we optimize delivery of dynamic content?

Going where the users are
Optimizing delivery requires identifying where users are interacting with content. Innovative content delivery ideas from 2001 are ready for retirement now that apps are a common platform for content delivery. Apps provide the opportunity for real-time feedback to content delivery engines, so algorithms can use that feedback to send content more intelligently than is possible when sending content to a website. Yet that advantage may not last: Some industry experts caution that the app marketplace may soon collapse given the dominance of platforms like Facebook, so there may be limited utility in investing time in building out an innovative delivery infrastructure for mobile apps.

So where can content delivery be optimized? Newsletters and emails remain a vibrant and engaging delivery method. With audience segmentation rules allowing a CCMS to build tailored content, more granular and specific delivery could come from turning attention to the modules of a newsletter: inserting a customized event calendar based on user location, promoting specific products and services based on individual user engagement metrics, and so on. Rather than separating newsletter delivery into broad “campaigns,” a CCMS can manage user profiles and build a unique product on demand.
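Here is a minimal sketch of that module-level assembly; the module names, profile fields and selection rules are hypothetical, and a real CCMS would resolve them from its own repository and rules engine.

# Hypothetical content modules; a CCMS would store these as components.
MODULES = {
    "events_ny": "<event calendar: New York>",
    "events_london": "<event calendar: London>",
    "promo_gadgets": "<gadget promotions>",
    "promo_apparel": "<apparel promotions>",
}

def build_newsletter(profile):
    """Assemble a unique newsletter from modules matching one user profile."""
    parts = [MODULES["events_" + profile["city"]]]    # location-based module
    for interest in profile["top_interests"]:         # engagement-based modules
        parts.append(MODULES["promo_" + interest])
    return "\n".join(parts)

user = {"city": "ny", "top_interests": ["gadgets"]}
print(build_newsletter(user))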

“Forms essentially act as a type of content.”

Optimizing forms
Forms, where users voluntarily submit data, are among the most valuable assets for content managers. While behavioral and engagement data can inform an algorithm, as well as show and potentially predict patterns, forms offer a direct pipeline of data that can be integrated straight into a CCMS.

The relationship of forms to content and delivery is more than symbiotic. Forms essentially act as a type of content. They can be optimized and made dynamic based on user behavior. For instance, if your profile data shows that a particular shopper on an e-commerce platform has a high bounce rate, you can deliver a simplified form that asks only for the most vital information. A user who returns again and again to a platform and spends a significant amount of time on certain pages can be paired with a form that focuses on data related to his or her browsing patterns. Gathering specific, in-depth data from users in this way yields ever more granular insights about content, which in turn fosters CCMS sophistication, enabling pinpointed, dynamic delivery.
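A sketch of what that behavior-driven form selection might look like; the thresholds and field lists are invented for illustration.

# Hypothetical field sets and thresholds for behavior-driven forms.
FULL_FORM = ["name", "email", "company", "role", "phone"]
MINIMAL_FORM = ["email"]

def choose_form(profile):
    """Serve a simpler form to visitors who tend to bounce quickly."""
    if profile["bounce_rate"] > 0.6:
        return MINIMAL_FORM
    if profile["return_visits"] > 5:
        # Engaged repeat visitors can be asked about browsing preferences.
        return FULL_FORM + ["content_preferences"]
    return FULL_FORM

print(choose_form({"bounce_rate": 0.75, "return_visits": 1}))  # ['email']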

Content strategy best practices

Marketers seeking to foster engagement and content usage have a variety of tools and tricks to make sure their material lands the right way. In this article we explore one of each – a measure of science and a measure of artistry – to ensure that your content strategy delivers valuable material.

Science: Establish your content benchmarks
According to the Content Marketing Institute’s Ahava Leibtag, the key to creating valuable content is to establish specific benchmarks by which you can measure the efficacy of the content you have created. While some of Leibtag’s benchmarks have psychographic elements, most of the challenges of creating high-value content can be traced to technical aspects of CCMS integration, as well as to dispersed teams struggling to follow a comprehensive, coherent plan.

Taking a page from Dr. Atul Gawande’s “The Checklist Manifesto,” Leibtag urges content strategists to break things down into an easy-to-follow workflow available to the entire team, checking off various metrics along the way. The internal checklist serves as a guide for all departments when assembling content for publication. As part of her step-by-step content creation checklist, Leibtag has content teams ask the following questions:

  • Is the content easily found? To ensure findability, Leibtag urges the inclusion of heading structures that produce heading tags (such as the <h1> tag) in HTML, meaningful SEO-optimized metadata, links within body copy and text descriptions that find their way into alt attributes for images; a short audit sketch follows this list.
  • Is the content readable? Design and formatting play a vital role in content readability, which ultimately may determine its relevance.
  • Is the content comprehensible? Distinct from simple readability, comprehension considers the audience’s reading and understanding level as well as writing for that specific audience.
  • Is the content actionable? What kinds of calls to action or invitations to share are associated with the content? Are users compelled to comment? Is feedback solicited?
  • Will users share the content? This ties to the organic integration of a call to action and the ease by which content can be shared via social media.
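As a companion to the findability question above, here is a minimal audit sketch that uses only Python’s standard library to check a page for an <h1> heading, description metadata, body links and image alt text; the checks and the sample markup are illustrative, not Leibtag’s actual tooling.

from html.parser import HTMLParser

class FindabilityAudit(HTMLParser):
    """Tally the findability signals present in an HTML document."""
    def __init__(self):
        super().__init__()
        self.has_h1 = False
        self.has_meta_description = False
        self.link_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.has_h1 = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_meta_description = bool(attrs.get("content"))
        elif tag == "a" and attrs.get("href"):
            self.link_count += 1
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

audit = FindabilityAudit()
audit.feed("<h1>Title</h1><p><a href='/more'>More</a></p><img src='x.png'>")
print(audit.has_h1, audit.link_count, audit.images_missing_alt)  # True 1 1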

Artistry: Emphasize content relevance
Creating relevant content requires a different sort of content tailoring. The first task is to zero in on data that is relevant to your audience and then structure that content to present this information clearly.

“Establishing credibility with your audience is crucial.”

Second, establish credibility with your audience. Choose data and topics that are squarely associated with your area of expertise, leaning heavily on hyperlinked connections to unimpeachable sources and timely thought leadership. By showcasing connections to other content sources within your chosen market, you show that your content is part of the larger ecosystem.

Finally, innovate. Repurposed content has limited value and may invite litigation. Instead, use previously published data to synthesize new, innovative content that challenges previous assumptions and findings, creating content of inherent value and specificity that links directly back to your organization.

Is your content safe from cybercriminals?

Privacy. Passwords. Phishing. Breaches. Trolls. Hackers. Bots. No matter where you look, everyone seems to be using these terms in the context of cybersecurity. Any business with a digital presence is a potential target for cybercrime, which helps to explain why the words used to characterize and describe cybersecurity have crept into our business vernacular and become part of the daily news—and late night comedy—cycles.

No industry sector seems particularly immune from cyberattack. Attacks have been aimed at political organizations, credit bureaus, law enforcement agencies, retailers, universities and schools, entertainment companies, financial, automotive, insurance, pharmaceutical, health and hospital firms.

You might have noticed that most cybersecurity stories have to do with the release of—or unauthorized access to—personally identifiable information or sensitive personal information. That’s because this type of information is a valuable commodity as it can be combined with other data (or utilized on its own) to identify, contact, or locate a single person, or to identify individuals in context.

The National Institute of Standards and Technology defines personally identifiable information as “any information maintained about an individual, including (1) any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.”

Protecting the Confidentiality of Personally Identifiable Information — bit.ly/protectingconfidentiality

50 million Facebook profiles harvested for Cambridge Analytica in major data breach — bit.ly/FacebookBreach

PREVENTING UNAUTHORIZED ACCESS

Unauthorized personal data disclosure is bad for business. It transforms a company’s carefully crafted image into headline fodder. A disclosure redirects corporate resources away from the advancement of products and services and toward the rebuilding of customer trust, brand loyalty, and operational integrity. It forces business leaders to make cybersecurity part of everything they do, and attempt to anticipate and prevent an increasing variety of cybercrimes.

Customers are not affected solely by the disclosure of personally identifiable and sensitive personal data; customers are also impacted by an organization’s content. That’s because content is the intellectual property of the company. When it is insufficiently protected, the entire organization is put at risk.

IT’S NOT JUST PERSONAL DATA: CORPORATE CONTENT SHOULD BE SECURE, TOO

History shows us that a motivated perpetrator of cyber-misdeeds can gain access to content stored in content management systems with the help of free or inexpensive software tools. Such access allows a digital hooligan to change, replace, or delete content that would otherwise be of help to a customer or prospect. Whereas in the past such tools required a certain level of digital wizardry, the widespread availability of free or low-priced hacking-as-a-service (yes, you read that right) “offerings” make it possible for novices to easily get into the mix.

The news media tends to focus most of its storytelling on sloppy data management practices and data theft at big-name brands that impact hundreds of thousands or millions of consumers. Yet there are plenty of examples where the lack of a formal cybersecurity plan made it relatively easy for a company to accidentally leave the data doors open for cyber-attackers to do damage to content.

All of this to say that content should be just as secure as any other type of intellectual property or data a firm collects, stores, and uses to conduct business. To be clear, the popular media’s focus on cybersecurity stories with big numbers and wide impact is designed to attract a lot of eyeballs for the benefit of their respective advertisers. Viewers get all the juicy details about data thefts and accidental disclosures reported by brands like Adobe, Equifax, Bitly, Disqus, Dropbox, Forbes, Home Depot, Yahoo, LinkedIn, and Target. However, the situation is just as serious for the luncheon meat firm with lax content security.

While it may be hard to imagine how poor cybersecurity could damage a luncheon meat company, the examples that follow illustrate how casual corporate content security and lax content governance can have a negative impact on revenue and public relations.

LEAVING CONTENT UNDER-PROTECTED: A FEW EXAMPLES

HARGREAVES & SON: ALTERED LABEL CREATES NEED FOR PRODUCT RECALL
H.R. Hargreaves & Son, the makers of a luncheon meat product sold primarily in the UK, were shocked when they discovered in the news media that the ingredients list on their package had been altered. The primary ingredient was no longer ham; it was dog sh*t. The culprit was a disgruntled employee who had intentionally altered the label as a prank.

The company took a substantial financial hit as a result, having to do image repair and run a product recall.

CLOTHING MAKER: ALTERED LABEL LANDS MANUFACTURER IN POLITICAL TURMOIL
The management of a clothing line realized someone had added an extra line of text to the “Made in the USA” tag sewn inside its garments. The additional slogan: “Don’t blame us, we didn’t vote for him.” The culprit was again a disgruntled employee.

No product recalls were necessary, but the altered label alienated a portion of the company’s customer base and recast its image into something more political than patriotic.

O’REILLY AUTO PARTS: FICTIONAL PRODUCTS INTRODUCED ON WEBSITE
The automotive parts retailer O’Reilly received an increasing number of calls from prospective customers about its catalog item with part number “121G”. O’Reilly doesn’t sell this product – the Flux Capacitor is a time-travel device made famous in the science fiction comedy film “Back to the Future”. The culprit is, at the time of this writing, unknown.

O’Reilly has not yet taken the item out of its online inventory, opting instead to add the words, “This item is not available for purchase” to the listing.

YOUR NEXT ACTION

What would be the impact to your firm if you woke up one day to find out that your entire content catalog had been replaced with bogus information—or worse, erased and no longer available?

As is the case with many content snafus, having a proper response plan in place—and selecting the right tools for the job—are critical success factors that will help you minimize the negative impact content hackers could have on your business.

Take a page from the book of lessons learned from the data thefts and accidental disclosures suffered by the big brands. Reexamine how your content is created, curated, translated, and disseminated, and institute policies and procedures that treat your written intellectual property as just as valuable as your customers’ personally identifiable information.

When a CCMS meets customer intelligence platforms

Developing alongside increasingly sophisticated CCMSs, customer intelligence (CI) platforms are the sibling of content management and delivery software, with a greater focus on marketing. Yet when the two work together – either as an integrated, automated build or as separate departmental concerns – the true potential of effective content can be unlocked.

The Power of Personas
As we’ve written about in the past, one emerging aspect of a modern CCMS is the ability to configure data modules based on audience persona. CI platforms are where the bulk of the persona building can occur, since CI is designed to capture and sort customer data. Using analytics, CI platforms can break consumer data into demographic categories such as the following (a code sketch of a persona record appears after the list):

  • Personal demographics like age, income level, debt level, educational profile, marital status, and other lifestyle features.
  • Geographic demographics such as residential location, country of origin and travel patterns.
  • Attitudinal data, showing preferences and taste.
  • Situational customer data, identifying the specific method and environment in which engagement occurred.
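A minimal sketch of how such a persona record might be structured; the field names and values are hypothetical, and a real CI platform would define its own schema.

from dataclasses import dataclass, field

@dataclass
class Persona:
    # Personal demographics
    age_range: str = ""
    income_level: str = ""
    education: str = ""
    # Geographic demographics
    location: str = ""
    travel_pattern: str = ""
    # Attitudinal data: preferences and taste
    preferences: list = field(default_factory=list)
    # Situational data: how and where the engagement occurred
    device: str = ""
    channel: str = ""

mobile_reader = Persona(age_range="25-34", location="Chicago",
                        preferences=["weather", "local news"],
                        device="mobile", channel="social")
print(mobile_reader.device)  # mobile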

This last category can hold particularly compelling data when building an audience member persona: Recent research has shown that the device by which a customer accesses content can portend certain behavioral patterns, which in turn can inform a trend line and help guide content marketers in pairing content with hungry audiences. For example, according to AddThis, desktop users consume job- and tech-related content at more than 50 percent above baseline, while mobile users consume weather-related content a whopping 355 percent above baseline.

“There are several key data aggregation tools that CI platforms can utilize.”

Sourcing Data
To glean this data, there are several key tools that CI platforms can utilize. The first is direct gathering. Surveys and response boxes built into existing content can effectively gather meaningful data, volunteered directly from the customers themselves. However, this data cannot be deemed 100 percent verifiable, as it gives customers the opportunity to mislead or omit what might otherwise be useful information.

Other means of gathering tend to be indirect. Whether it’s scraping a popular social media platform with an eye for keywords or looking at IP addresses to determine location, indirect gathering can actually offer more verifiable insight – albeit insight that is perhaps more limited in scope and carries deeper implications.

Customer Development
One of the key areas of potential for CI is the concept of Customer Development. As defined by Brant Cooper and Patrick Vlaskovits, authors of The Entrepreneur’s Guide to Customer Development, customer development represents the culmination of CI insights into a responsive understanding of not just behavioral cues, but also what customers are looking for from a piece of content you can provide.

“The results of the customer development process may indicate that the assumptions about your product, your customers, and your market are all wrong,” Cooper and Vlaskovits told the Content Marketing Institute. “And then it is your responsibility, as the idea-generator (read: entrepreneur), to interpret the data you have elicited and modify your next set of assumptions to iterate upon.”

Customer development is still in its infancy, but the capabilities of CI platforms are ever-expanding. With increased CCMS integration, the ability to configure even more effective, impactful content is well within reach.

“Uplifting” to a Content Management System

When it comes to migrating content from an unstructured paradigm to true XML-based authoring in a component content management system, the challenges include changes in content architecture and navigation, forcing you to make important decisions about the design of your content well ahead of migration.

Unstructured content, including what is stored on intranets, can differ dramatically in both form and function from content that is prepared and approved for distribution to customers. According to CMS Wire, those looking to transfer content from an intranet to a CMS are likely to run into one or more of these issues:

  • Complex or counter-intuitive data distribution.
  • Data that is transactional or with a narrow application (and thus incomplete or incomprehensible outside of context).
  • Information that is client-specific or author-specific, requiring careful review and perhaps specific permissions for publication.

Each of these challenges represents an aspect of migration planning that must be addressed by the enterprise and the content architect.

“Each of these challenges represents an aspect of migration planning.”

‘Lift and shift’
So how should enterprises approach a content migration? The simplest answer from an organizational perspective is what many CMS experts dub the “lift and shift”: the manual re-entry of unstructured data into some kind of XML structure. This effort consumes significant man-hours, and often the data quality is compromised by the repetitive, tedious nature of the task.

Automation could be the answer to some migration woes, but before you set about re-keying unstructured data as structured content, developing a coordinated plan of attack is key. Here are the main steps for planning the most efficient migration possible.

Step one: Content inventory
Understanding the breadth and scope of your migration will give you a sense of the resources required and a potential timeline. Access your existing content library, making sure to account for any additional repositories whose content may end up in the structured system.

Inventorying content requires identifying the following (a sketch of a single inventory record appears after the list):

  • Content permissions and authorship requirements.
  • Locations of content.
  • Content formats.
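A minimal sketch of what a single inventory record might capture; the fields and values are hypothetical.

from dataclasses import dataclass

@dataclass
class InventoryItem:
    location: str     # where the content lives today
    fmt: str          # content format
    author: str       # authorship requirements
    permissions: str  # who may view or publish it

inventory = [
    InventoryItem("intranet/hr/onboarding.docx", "docx", "HR team", "internal"),
    InventoryItem("docs/install-guide.pdf", "pdf", "tech pubs", "public"),
]
print(len(inventory), "items inventoried")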

It should be noted that an inventory is just that: a moment prior to a migration where you take stock of what you have. The next step is where you take a more evaluative eye to the content that you will (or will not) be migrating.

“Content auditing involves looking at the content on hand.”

Step two: Content auditing
Content auditing involves looking at inventoried content and making decisions about its relevance. Often, this is where an enterprise puts its content to the test, identifying it as worthy of migration or earmarking it for deletion/archiving.

“Chances are, you have quite a bit of content out there that nobody — user or owner — has looked at in a long time. It happens to everyone,” Alicia Backlund, Product Strategist at Level Five Solutions, told Nielsen Norman Group. “An intranet redesign is the perfect reason and opportunity to take a good, hard look at your content and move only what makes sense to move.”

When auditing, ask:

  • What is the age of the content?
  • When was the last time this content was accessed, either on its own or as a component of another piece of content?
  • Is this content still relevant/accurate? Is it incomplete on its own?
  • Could the user interface for accessing this content have been more intuitive?

Step three: Mapping
Once you’ve made sense of the content – where it currently exists and what data you need to transfer – you need to map your data into the new XML structure. This includes determining pathways to shared content as well as the basic usability plan of the new component content management system.

Consider the migration sequence of your content. Is there operationally essential data that should be moved to the front of the line, giving you leeway to complete the rest?
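One way to keep that sequencing explicit is a simple priority-ordered mapping plan, sketched below with hypothetical source locations and DITA-style target topic types.

# A priority-ordered mapping plan: hypothetical source locations mapped
# to DITA-style topic types, with operationally essential content first.
migration_plan = [
    ("intranet/safety-procedures", "task", 1),
    ("intranet/product-specs", "reference", 2),
    ("intranet/company-news", "concept", 3),
]

for source, topic_type, priority in sorted(migration_plan, key=lambda r: r[2]):
    print(f"{priority}. {source} -> {topic_type} topic")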

Get your plan in place, and then start your migration. Once initiated, track where content ultimately ends up and compare it to your original plan to help ensure that the process is a success.

How “right” were they? A look back at the content and content marketing trends predicted for 2017

2017 was right around the corner when CMS and content marketing specialists began churning out trend projections for the new year. Now that 2017 is done, let’s see how well they saw the future. More weight is given to correctly foreseeing less prominent drivers taking the spotlight; no fair claiming credit for trends that were already visible at the end of 2016, such as increased automation, growing CMS sophistication, and the increased integration of community editing.

Trend: The rise of rich media content.  Prediction Weight: Low.

CMS designers have already jumped to the task of integrating increasingly sophisticated data languages and digital asset managers into content structures. The next frontier is content with native – non-plugin-based – media embedded as modules. Video, interactive tools, games, survey and app extensions are all primed to become the next aspects of content language.

“The next frontier is native media embedded as modules.”

While pundits got this right as the “next frontier” in content, 2017 did not see a wave of new content with embedded native media.

Trend: Content as a tool of native advertising.  Prediction Weight: Low.
The ubiquity of free content has changed the way that content authors approach development models. One of the more unsavory aspects of this evolution – from the consumer perspective – is the idea of “clickbait.”

It was Columbia Law School professor Tim Wu who pointed out that “clickbait” is actually an organic evolution of content marketing and commerce. Without the ability to monetize content directly, authors would naturally become more psychologically savvy in the type of content they develop, designing it to attract the eye and hold attention.

“This attention-merchant model has spread to so many areas of our life, where we’re completely used to everything being free,” Wu explained. “But then the payoff, or the exchange, is that then we also agree to stuff that is compromised, because it is always trying to get us to click on ads at the same time.”

“Experts” warn that users are growing savvier and more cynical about brute-force advertising efforts – but there is likely to be a shift to different monetization models that may help abate some of its more unpleasant effects. The Content Marketing Institute points out that native advertising – where ads are part of the content, albeit relatively unobtrusive and embraced by authors – could prove an effective countermeasure to clickbait that is minimally disruptive to the content itself.

Perhaps.  But as a prediction, “native advertising” was a flop.

“Social media will be further divided into factions.”

Trend: The social media split.  Prediction Weight: Medium.
Social media consultant and trainer Andrew Davis warns that social media content is heading for a major fork in the road, based on behavioral patterns of users. Similar to the warnings that previous content experts have given about messaging apps supplanting traditional web pages, Davis sees a future where social media – already acting as the hub for nearly all web-based interactions – will be further divided into factions.

“Social media will be split into two areas: the visual web and the community focused web,” Davis tells Writtent. “Visual content will increase at a rapid pace but so will messaging platforms.”

This split was particularly evident in the Russian meddling in the 2016 US presidential election. The so-called “Internet Research Agency” exploited the trend toward factionalism to develop a series of websites and personas that attracted the interest of scores of like-minded US voters. Whether or not this tactic affected the election itself is the subject of vigorous debate, but to some extent that debate is moot given the level of participation and engagement these websites and personas engendered.

The Internet’s broad reach allows people of similar persuasions to connect easily even if they live in comparative isolation. It will be challenging for many brands, accustomed to mass marketing, to adapt successfully when they find themselves cut off from their consumers. As closed-loop messaging subverts content development and marketing algorithms, Davis sees many brands retreating to traditional web and visual content advertising, rather than doing the hard work of niche advertising.

Trend: Reevaluation of content efficacy.  Prediction Weight: Low.
According to the Sword and the Script, anywhere between 60 and 70 percent of B2B content developed by brands and organizations goes unused. This points to the ease and low cost of modern content development, but it also underscores that brands haven’t yet mastered creating content that connects with users. Focusing on additional automation and perfecting user personas may mitigate some of this waste, leading many experts to say that 2017 would be a peak year for content authors who could guarantee effective content.

Regrettably, 2017 wasn’t the peak year for effective content. Furthermore, there will be no peaking of content that piques anyone’s interest until the incentives of content authors are aligned with the interests of content consumers. As it stands now – and has stood for several years – the majority of content authors are compensated based on the amount of material they produce; there is no gating factor imposed by the expense or technical challenge of putting that content on the web. So there is little incentive to produce content that people want to read.

If you want to see a peak in effective content, tie a content creator’s compensation to the number of page-views and the net increase in backlinks.  With a properly incentivized writing team in place, you will definitely see a reevaluation of content efficacy.

Retrenching the content ecosystem: Smaller ponds for more manageable content management

Over the past few years, the greater content ecosystem has expanded dramatically – pushing content management to its farthest reaches. What used to be small ponds of data has become an ocean, leaving many content management system designers struggling to handle all the user-generated content and data modules now flooding in. While improvements in automation to better handle high data volumes may be on the horizon, the fact that they are beyond our current capabilities means that CMS design may be temporarily better focused elsewhere: on retrenching internal data ecosystems into smaller, more relevant ponds.

Connected themes making connections 
Content creation and discovery begets new content, as data is processed by new users and reconfigured into forms that meet a niche. This is, fundamentally, the purpose of good content and is woven into the structure of modern content languages: inter-connectivity, hyperlinking, the ability to be shared and reconfigured on demand, while still preserving some modicum of authorship tracking. Without inter-connectivity, content represents a dead end for users – a nearly insurmountable obstacle in the modern ecosystem.

“Unprecedented ease in authorship comes with a distinct drawback.”

Here comes the flood
Of course, this time of unprecedented connectivity comes with a distinct drawback. Users are very nearly drowning in a flood of content, with the influx showing no signs of abating any time soon. According to CSC, data production will grow a staggering 4,300 percent by 2020. With this much data, focusing on the interrelationships between the data that matter most to us is paramount.

While many assume that this interconnection between data is the source of the volume, this is more of a situation where the tail wags the dog. Content authorship has expanded due to simplified creation tools and platforms – breaking down traditional barriers to entry – making interconnection the only means we have to navigate such high volumes of data.

Creating the smaller pond
Since the operational challenges of dealing with the unrelenting torrent of data are beyond the capabilities of even the most sophisticated automation and web-scraping capabilities, content marketers and CMS designers may have to turn to a counterintuitive solution: creating smaller, proprietary ecosystems.

This “smaller pond” tactic may seem like a step backward in the overall pursuit of a larger, more integrated content ecosystem, but it does help mitigate some of the issues related to handling such high volumes of data. By building up an organization’s internal content ecosystem, you are essentially damming off the data influx – allowing it to flow through automated gatekeepers for processing and integration. This is a key step in converting raw data to hypertext.

By setting up an in-house ecosystem, organizations can more effectively scale their storage and CMS needs based on the data they have on hand. While this may pose some limitations when it comes to linking to data and content outside the parameters of the CMS, it may prove more responsive and nimble in the long run. Rather than wading into the deep end of the content ocean, setting up your own space allows for easier, more intuitive innovation, thought leadership and more straightforward marketing of your content insights.

How DITA and XML facilitate managing regulation revisions

Since the start of publicly distributed legislation, the U.S. government has sought to make it as easy as possible to read and distribute non-classified government documents. This has led to a sometimes tentative embrace of content languages as they are developed and deployed across different industries. In June 2016, House Speaker Paul Ryan (R-WI) spoke to attendees of the 2016 Legislative Data and Transparency Conference and emphasized the importance of translating all legislative measures into a standardized format like XML. Ryan framed this as an effort to promote governmental transparency.

“Now we’re working to go further, and publish even more current and past documents in XML,” he told the assembled. “I’ve asked our team to keep moving ahead by publishing all legislative measures in a standard format. That means enrolled measures, public laws, and statutes at large. We want this data to be as accessible as possible throughout the legislative cycle.”

“The goal is simplicity – something that XML models excel at.”

Keeping it simple and accessible
As with all forms of communication, the goal is simplicity – something that XML and DITA models excel at. The Federal Register affirms the importance of making regulations readable with stylistic guidance on how to author legislative documents.

“Readable regulations help the public find requirements quickly and understand them easily,” writes the Register. “They increase compliance, strengthen enforcement, and decrease mistakes, frustration, phone calls, appeals, and distrust of government. Everyone gains.”

This focus on compliance and limiting confusion – and the accompanying administrative nightmare – is a key way that DITA and XML can make legislation and regulations less of a hassle. Since the law governing any particular industry is a living document – made up of countless, frequently revised laws that dictate everything from tax codes to prohibited transactions – ensuring that documents are not only accessible but also find their way into the most relevant hands can be easier said than done.

A ‘quality control nightmare’
This was a particular challenge that Chris Drake, deputy legal counsel to Connecticut Governor Dannel Malloy, identified and sought to make less troublesome. In 2014 – prior to Speaker Ryan’s comments – Drake and the governor’s office attempted to launch an “e-Regulation” program, moving away from the traditional, inefficient paper-based authoring process toward something that would allow users to more easily interact with legislative content.

“Some agencies didn’t know where the most recent text-edited version of a regulation was,” Drake told GCN. “It was a quality control nightmare. We needed a system that was more transparent and accessible.”

“Lawmakers may be less than experienced with content authoring platforms.”

This e-Regulation system was pioneered with the help of Fairfax Data Systems to convert PDFs into DITA XML, with the goal of eventually authoring legislation directly in XML to limit potential conversion errors and inefficiency. This in itself posed a challenge: Lawmakers and their staff are typically less than experienced with content authoring platforms, creating a steep learning curve. To compensate, the e-Regulation initiative broke authorship into a two-stage process, with the first stage relying on automation.

“Extraction is a mostly automated process,” Mark Gross, president and CEO of DCL, a company also assisting with the conversion, told GCN. “The trick is to do it in a consistent manner, which is not that easy.”

Following extraction, the documents were edited and approved in XML draft form by humans. While still time- and resource-intensive, the process will in the long run save countless hours of converting documents into new formats over and over again.

“If we had tried this six or seven years ago we might not have been able to find a solution that does this,” Drake said.

The ongoing value of DITA legislation
Of course, beyond accessibility, the virtue of an XML content framework is the ability to integrate live regulatory changes into existing and future content. As legislative content is converted into DITA, each element becomes a component. If a law is amended, changed or struck from the books, the components of that law as they relate to technical documents like work manuals and safety training can be automatically reconfigured to match the most up-to-date regulatory guidance. With agencies like the Occupational Safety and Health Administration on board with publishing all guidance in DITA, both companies subject to the regulations and the regulators themselves can work from the same page, without extensive redrafting every time the law changes.
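To illustrate the mechanics, here is a minimal Python sketch of conref-style reuse, assuming DITA-like markup with hypothetical IDs: the manual references the regulation component instead of copying its text, so an amendment to the source flows into every downstream document on the next build.

import xml.etree.ElementTree as ET

# The regulation source, maintained once; the ID is hypothetical.
regulations = ET.fromstring(
    '<regulations><p id="osha-1910-147">Lockout/tagout: amended 2018 text'
    '</p></regulations>'
)

# A work manual that reuses the regulation by reference, conref-style.
manual = ET.fromstring('<task><p conref="regulations.dita#osha-1910-147"/></task>')

# Resolve each reference by copying the current text of the component.
for p in manual.iter("p"):
    ref = p.get("conref")
    if ref:
        target_id = ref.split("#")[1]
        source = regulations.find(f".//p[@id='{target_id}']")
        p.text = source.text

print(ET.tostring(manual, encoding="unicode"))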

The impact of corporate interests on content development and marketing

The idea that we live in a world driven by niche interests – particularly when it comes to the creation of new content – may not reflect the whole picture. True, the content landscape is broader and more multivariate than it’s ever been. Yet amid this niche content renaissance, the pressure to monetize has never been greater, leading to the encroachment of corporate influence.

As part of an announcement related to its 2017 layoffs, Medium, the online publishing company started by Twitter co-founder Ev Williams, described how ad-driven online media is a “broken system” and how this is undermining the company’s bottom line. Williams took to the company’s blog to defend the layoffs as a step away from the ad-driven business model and a means of renewing the company’s focus on content.

“The vast majority of articles, videos, and other ‘content’ we all consume on a daily basis is paid for — directly or indirectly — by corporations who are funding it in order to advance their goals,” Williams wrote. “And it is measured, amplified and rewarded based on its ability to do that. Period. As a result, we get … well, what we get. And it’s getting worse. That’s a big part of why we are making this change today.”

A detriment to content … and society? 
While Williams remains coy about how exactly Medium will shift its business model to rely less on ad dollars, he isn’t alone in his assessment that ad-driven material may have a negative impact on the quality of content. Speaking to Harvard Business School’s Working Knowledge blog, Feng Zhu, assistant professor of business administration at Harvard, had equally strong words about the impact of ad-driven content creation models.

“Ads may have a negative impact on the quality of content.”

“Many media scholars think this revenue model is detrimental to society because it provides incentive for the content provider to produce only popular content that can attract lots of eyeballs,” said Zhu. “Content providers are serving advertisers rather than the audience, and consumers with niche preferences will be out of luck because the content they’re seeking only caters to a small group of people.”

Zhu, alongside fellow researcher Monic Sun, sought to study the impact of ad-revenue-sharing programs on bloggers and content creators. Looking at a data set from a leading Chinese media website that offers a range of services, including blogging, Zhu and Sun were able to compare posts written by authors taking part in an ad-based profit model with posts by authors who did not.

Comparing the two populations, Zhu and Sun were able to determine that the posts supported by ad revenue showed a significant uptick in content focusing on “popular” topics, such as the stock market, salacious content and celebrities. Interestingly, while the topics became more culturally homogenous, the ad-supported blogs were typically longer, published more frequently and included more photos and video clips than those not ad-supported.

What can we take from this data, as well as the warnings issued by Williams? The lesson here may be that content backed by advertising facilitates a certain level of depth and innovation not easily achieved without some form of sponsorship – yet this comes at a price. The key for content creators and advertisers looking to work together and leverage a content strategy is identifying the niche they are writing for and determining the demand and – ideally – value of the content before bringing it to market.