Using audience metrics to measure content impact

Content creators and managers often face a simple one-to-one need/supply proposition: a need for specific content is identified, and that content is supplied, either by creating new works or by reconfiguring existing modules. Too often, however, consideration of the content’s efficacy ceases to be a concern once that immediate need is fulfilled.

This is bad business. Even with automation, content creation remains resource- and time-intensive. To avoid redundant or ineffective content, content marketers and managers need to measure the impact of the materials they generate. But which metrics offer the most meaningful insight?

Journey mapping: Naming the nameless
Crafting content that connects with its intended audience requires understanding audience objectives, and how those objectives match enterprise goals. While the process can often be as simple as soliciting direct feedback or taking requests, sometimes it is more elusive. Audiences may not know what kind of content will have the most impact and fulfill a direct need, primarily because they may never have encountered content like it before. You can’t identify something that has yet to be named.

“Start by aligning with your audience personas.”

To identify a need (naming the nameless, so to speak), start by mapping the journey your audience takes with your enterprise. Starting from your audience personas, use trend data to trace the lifecycle of a customer and their content needs.

The Content Marketing Institute identifies five distinct stages of the customer journey:

  • Awareness. A customer encounters a piece of content.
  • Interest. The customer expresses some form of active engagement with the content or the enterprise behind the content.
  • Evaluation. Both customers and enterprises take a closer look at the content products available and determine value.
  • Decision. Customers purchase and/or implement the content.
  • Retention. Any after-purchase activity, from returning audience members to subscriptions, to organic promotion.

This journey map is by no means fixed: Smashing Magazine, for example, uses only four stages, consolidating evaluation and decision into a single “conversion” stage and dubbing retention “reward.” Regardless of the specific structure or terminology, the goal is a simple lifecycle map that you can use as the bedrock for tracking where each content interaction, or “touch,” occurs.
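These stages lend themselves to a simple data structure. As a minimal sketch in Python (the stage names follow CMI’s five-stage model; the class and touch descriptions are illustrative), a lifecycle map for logging touches might look like:

```python
from collections import defaultdict

# Ordered lifecycle stages per CMI's five-stage model
STAGES = ["awareness", "interest", "evaluation", "decision", "retention"]

class JourneyMap:
    """Record where each content interaction ('touch') occurs."""
    def __init__(self):
        self.touches = defaultdict(list)  # stage -> list of touch descriptions

    def record_touch(self, stage, description):
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.touches[stage].append(description)

    def summary(self):
        # Count of touches at each stage, in lifecycle order
        return {stage: len(self.touches[stage]) for stage in STAGES}

journey = JourneyMap()
journey.record_touch("awareness", "blog post view")
journey.record_touch("interest", "newsletter signup")
print(journey.summary())
```

A consolidated four-stage map like Smashing Magazine’s would simply swap in a different `STAGES` list.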

‘Attention’ metrics
As Keith Grossman, global chief revenue officer at Bloomberg Media, told eMarketer, even if you understand the audience journey, knowing where the most valuable touches took place can be tricky. There is minimal standardization across any content industry by which engagement can be meaningfully measured, leading Grossman to measure the ability to grab “attention” rather than time spent or clicks.

“There’s a slow evolution taking place. The marketplace is trying to understand the right metric to standardize against, but it hasn’t figured itself out yet,” Grossman told eMarketer. “The question is, what is the proper amount of time to measure success and engagement? If we agree on 2 minutes but then I give you a 200-word piece that takes 2 minutes to read, that’s not successful.”

Key performance indicators
At each stage of the journey, there are opportunities to identify key performance indicators as well as gaps: points in the customer experience that are disjointed or painful, causing the customer to drop out of the cycle. Crafting these KPIs is particularly important, since these are the metrics you can use to measure how effective or enticing the content is. Common metrics CMI identifies include keyword rankings, impressions, overall search visibility, webinar registrations, white paper downloads, conversions, shares, comments, subscription renewals and social media engagement.

“KPIs in an audience journey aren’t the be-all, end-all.”

Credit where credit is due
The presence of these KPIs in an audience journey isn’t the be-all, end-all. Rather, the value of these KPIs as metrics fluctuates based on situational factors and audience persona. What is a valuable engagement metric for one audience member may not be for another. This is where analytics platforms step in to create an attribution model, weighting KPI value based on profile engagement trends.

This may mean that certain touchpoints in the audience journey carry more value than others: for some audiences, the first touch deserves the most credit; for others, it is the last touch that fosters long-term engagement. Weighting touchpoints helps reveal which attributes of your content resonate most with which audiences.
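First-touch, last-touch and linear attribution can be sketched in a few lines of Python; the model names and touchpoint labels here are illustrative, not tied to any particular analytics platform:

```python
def attribute(touchpoints, model="linear"):
    """Split credit for a conversion across an ordered list of touchpoints."""
    n = len(touchpoints)
    if n == 0:
        return {}
    if model == "first":       # all credit to the first touch
        weights = [1.0] + [0.0] * (n - 1)
    elif model == "last":      # all credit to the last touch
        weights = [0.0] * (n - 1) + [1.0]
    elif model == "linear":    # equal credit to every touch
        weights = [1.0 / n] * n
    else:
        raise ValueError(f"unknown model: {model}")
    credit = {}
    for touchpoint, weight in zip(touchpoints, weights):
        credit[touchpoint] = credit.get(touchpoint, 0.0) + weight
    return credit

path = ["blog post", "webinar", "white paper"]
print(attribute(path, "linear"))  # each touch gets one third of the credit
```

Real attribution models weight touchpoints from observed engagement trends rather than fixed rules, but the shape of the computation is the same.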

Segmentation to measure performance
Finally, once you have your KPIs and attribution models in place, you can build audience subsets out of your analytics data. Segmenting your channels and measuring engagement rates can help confirm or debunk assumptions born out of trend projections. Look at the frequency and depth of these engagements and the dividends they pay. The goal is to match segmentation, KPIs and attribution models with the data derived from your audience journey.
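As a minimal illustration of measuring engagement rates per channel segment (the channel names and events are hypothetical):

```python
def engagement_by_segment(events):
    """events: list of (segment, engaged) pairs; returns engagement rate per segment."""
    totals, engaged = {}, {}
    for segment, did_engage in events:
        totals[segment] = totals.get(segment, 0) + 1
        engaged[segment] = engaged.get(segment, 0) + (1 if did_engage else 0)
    # Rate = engaged touches / total touches, per segment
    return {s: engaged[s] / totals[s] for s in totals}

events = [("email", True), ("email", False), ("social", True), ("social", True)]
print(engagement_by_segment(events))  # {'email': 0.5, 'social': 1.0}
```

Comparing these rates against the rates your trend projections predicted is what confirms or debunks the underlying assumptions.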

Dynamic Delivery vs Dynamic Content

Dynamic content is a staple of modern marketing and customized web experiences. Our ability to manage content in a CCMS at ever higher levels of granularity is matched only by the increasing sophistication of our tools for building and tagging content. Add to these increased automation and machine learning, and it’s clear that content delivery is ready for another round of innovation.

Delivery bottlenecks
With componentized data modules and advanced CCMS enabling fast, smart content configuration on demand, the possibilities related to the creation of personalized content are virtually endless. What is not endless are the ways users encounter content. Websites, social media, ads, emails and newsletters: We can customize content on demand and deliver on these platforms, but we are still running into problems when it comes to controlling delivery on a granular level and avoiding pile-ups related to speed and performance.

“How can we optimize delivery of dynamic content?”

We’ve been here before. In 2001, Greg Parker, CEO and president of SpiderCache, prophesied to ComputerWorld, “There is huge growth occurring in dynamic content. People are moving away from static ties [sic] to dynamic content-driven sites, and that exposes the bottleneck [dynamic content] causes….  [Traditional caches] can’t handle the performance required for the speed of delivery of dynamic content.”

His company sought to address this problem for websites by building a more dynamic page-caching capability, which was good for the time but does not fully address content delivery in the era of apps and messages. So the question is: How can we optimize delivery of dynamic content?

Going where the users are
Optimizing delivery requires identifying where users are interacting with content. The innovative delivery ideas of 2001 are ready for retirement now that apps are a common platform for content delivery. Apps provide the opportunity for real-time feedback to content delivery engines, so algorithms can use that feedback to send content more intelligently than is possible when sending content to a website. Yet that may be changing soon: some industry experts caution that the app marketplace may collapse given the dominance of platforms like Facebook, so there may be limited utility in investing the time into building out an innovative delivery infrastructure for mobile apps.

So where can content delivery be optimized? Newsletters and emails remain a vibrant and engaging delivery method. With audience segmentation rules allowing a CCMS to build tailored content, more granular and specific delivery could come from turning attention to the modules of a newsletter: inserting a customized event calendar based on user location, promoting specific products and services based on individual engagement metrics, and so on. Rather than separating newsletter delivery into broad “campaigns,” a CCMS can manage user profiles and build a unique product on demand.
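A rule-driven newsletter build might be sketched as follows; the module builders and profile fields are hypothetical stand-ins for components a real CCMS would pull from its repository:

```python
# Hypothetical module builders keyed by rule; a real CCMS would assemble
# managed components instead of these stub functions.
def event_calendar(profile):
    return f"Events near {profile['location']}"

def product_promo(profile):
    # Promote whatever the user engages with most
    top = max(profile["engagement"], key=profile["engagement"].get)
    return f"Featured: {top}"

RULES = [
    (lambda p: "location" in p, event_calendar),
    (lambda p: bool(p.get("engagement")), product_promo),
]

def build_newsletter(profile):
    """Assemble a unique newsletter on demand from per-user rules."""
    return [builder(profile) for matches, builder in RULES if matches(profile)]

profile = {"location": "Denver", "engagement": {"webinars": 9, "white papers": 3}}
print(build_newsletter(profile))  # ['Events near Denver', 'Featured: webinars']
```

The point of the sketch is the shape: each rule maps a slice of the user profile to a content module, so every recipient gets a distinct product rather than one of a few campaign variants.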

“Forms essentially act as a type of content.”

Optimizing forms
Forms, where users voluntarily submit data, are among the most valuable assets available to content managers. While behavioral and engagement data can inform an algorithm and reveal, and potentially predict, patterns, forms offer a direct pipeline of data that can be integrated straight into a CCMS.

The relationship of forms to content and delivery is more than symbiotic. Forms essentially act as a type of content: they can be optimized and made dynamic based on user behavior. For instance, if a shopper’s profile on an e-commerce platform shows a high bounce ratio, you can deliver a simplified form that asks only for the most vital information. A user who returns again and again to a platform and spends significant time on certain pages can be paired with a form that focuses on data related to his or her browsing patterns. The more specific, in-depth data you can gather from users, the more granular the insights you can glean about your content. This in turn fosters CCMS sophistication, enabling pinpointed, dynamic delivery.
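The bounce-ratio example above can be sketched as a simple form selector; the thresholds and field names are illustrative assumptions, not recommendations:

```python
def select_form(profile):
    """Pick a form variant from behavioral signals (thresholds are illustrative)."""
    if profile.get("bounce_ratio", 0) > 0.6:
        # High-bounce visitors get the shortest possible form.
        return ["email"]
    if profile.get("return_visits", 0) >= 5:
        # Loyal, high-dwell users can be asked about their browsing interests.
        return ["email", "name", "interests", "preferred_topics"]
    # Default variant for everyone else
    return ["email", "name"]

print(select_form({"bounce_ratio": 0.8}))  # ['email']
print(select_form({"return_visits": 12}))  # the long, interests-focused variant
```

In a real deployment the thresholds themselves would be tuned from the same engagement data the forms help collect.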

Content strategy best practices

Marketers seeking to foster engagement and content usage have a variety of tools and tricks to make sure the content lands the right way. In this article we explore one of each, a scientific practice and an artful one, to ensure that your content strategy delivers valuable material.

Science: Establish your content benchmarks
According to the Content Marketing Institute’s Ahava Leibtag, the key to creating valuable content is to establish specific benchmarks by which you can measure the efficacy of the content you have created. While some of Leibtag’s benchmarks have psychographic elements, most of the challenges of creating high-value content can be traced to technical aspects of CCMS integration, as well as to dispersed teams struggling to follow a comprehensive, coherent plan.

Taking a page from Dr. Atul Gawande’s “The Checklist Manifesto,” Leibtag urges content strategists to break things down into an easy-to-follow workflow available to the entire team, checking off various metrics along the way. The internal checklist serves as a guide for all departments when assembling content for publication. As part of her step-by-step content creation checklist, Leibtag has content teams ask the following questions:

  • Is the content easily found? To ensure findability, Leibtag urges the inclusion of heading structures that produce heading tags (such as the <h1> tag) in HTML, meaningful SEO-optimized metadata, links within body copy, and text descriptions that find their way into @alt attributes in HTML for images.
  • Is the content readable? Design and formatting play a vital role in content readability, which ultimately may determine its relevance.
  • Is the content comprehensible? Distinct from simple readability, comprehension considers the audience’s reading and understanding level as well as writing for that specific audience.
  • Is the content actionable? What kinds of calls to action or invitations to share are associated with the content? Are users compelled to comment? Is feedback solicited?
  • Will users share the content? This ties to the organic integration of a call to action and the ease by which content can be shared via social media.
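Two of the findability items, a heading tag and @alt text on images, can be checked mechanically. A small sketch using Python’s standard-library HTML parser (the page snippet is hypothetical):

```python
from html.parser import HTMLParser

class FindabilityCheck(HTMLParser):
    """Flag two of Leibtag's findability items: an <h1> tag and image alt text."""
    def __init__(self):
        super().__init__()
        self.has_h1 = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.has_h1 = True
        # Count <img> tags with a missing or empty alt attribute
        if tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

page = '<h1>Title</h1><img src="chart.png"><img src="logo.png" alt="logo">'
check = FindabilityCheck()
check.feed(page)
print(check.has_h1, check.images_missing_alt)  # True 1
```

A check like this slots naturally into the pre-publication checklist, turning two of the questions into an automated gate.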

Artistry: Emphasize content relevance
Creating relevant content requires a different sort of content tailoring. The first task is to zero in on data that is relevant to your audience, and then structure your content to present that information clearly.

“Establishing credibility with your audience is crucial.”

Second, establish credibility with your audience. Choose data and topics that are squarely associated with your area of expertise, leaning heavily on hyperlinked connections to unimpeachable sources and timely thought leadership. By showcasing connections to other content sources within your chosen market, you show that your content is part of the larger ecosystem.

Finally, innovate. Repurposed content has limited value and may invite litigation. Instead, use previously published data to synthesize new, innovative content that challenges previous assumptions and findings, creating content of inherent value and specificity that links directly back to your organization.

Astoria Software Promotes New Techniques for Sharing Content

SAN FRANCISCO, CA—April 17, 2018 – Astoria Software, a division of TransPerfect, announces two upcoming events demonstrating the integration of the Astoria Component Content Management System with WittyParrot. The first event is a webinar on April 18, 2018, hosted by Scott Abel, The Content Wrangler, entitled, “Repurposing DITA Content for Microsoft Office Users” (click here to register: https://www.brighttalk.com/webcast/9273/313615). The second event is a “Test Kitchen” session on Tuesday, April 24, at the CMS/DITA North America conference entitled, “No More XML for the Masses: A New Way to Share Content with MS Office Users” (click here for details: https://cm-strategies.com/2018-cms-conference/day2-agenda/astoria/). Each event highlights how the combined product of Astoria and WittyParrot allows non-XML content creators in Marketing, Sales, and Customer Support departments to reuse corporate intellectual property encoded in XML by the Technical Documentation department.

Join the Free Webinar, Ask Questions

It’s one thing to understand the value of content reuse. It’s another to find a solution that defeats tool-driven content silos. Teams that don’t have access to—or knowledge of—XML editing tools should still be able to effectively reuse and repurpose XML content. Astoria Software, in partnership with Scott Abel, The Content Wrangler, will present a webinar showing techniques that technical content teams can use to share and repurpose DITA-style XML content with team members who create content using Microsoft Office software. The April 18th webinar will feature an extensive question-and-answer session so attendees can understand the details presented in the demonstration.

The free webinar is Wednesday, April 18, at 10:00 a.m. PACIFIC. To register, click here: https://www.brighttalk.com/webcast/9273/313615.

Come to the Conference, Try it Yourself

The CMS/DITA North America conference, celebrating its 20th anniversary this year, provides attendees a track to see and touch technology in action. These sessions, each one called a “Test Kitchen”, feature live demonstrations of capabilities and ideas where attendees can also get their hands dirty using the software. Astoria Software’s “Test Kitchen” will let a Technical Documentation user working in DITA share content effectively with another user working in a Microsoft Office application. Here’s the trick: the Microsoft Office user won’t know a thing about DITA or XML. It won’t be magic, but it will look like magic.

Astoria Software’s Test Kitchen session is Tuesday, April 24, at 8:00 a.m. To register for the conference, click here: https://cm-strategies.com/2018-cms-conference/day2-agenda/astoria/.

About Astoria Software

Astoria Software is the world’s most successful Enterprise solution for XML Component Content Management to help companies engage with their customers. Cisco Systems, Xylem, ITT, Siemens Healthcare, Northrop Grumman, Kohler, GE Digital, and other Forbes Global 2000 organizations rely on the Astoria platform to build meaningful, purposeful customer experiences around complex, business-critical content and documents. The Astoria platform’s reach extends to its web-based portal and to its mobile-device apps, forming an end-to-end solution that includes authoring, content management, and rendering systems, fully integrated and delivered on public or private clouds. Astoria Software, a division of TransPerfect, Inc., is based in San Francisco. For more information, visit http://www.AstoriaSoftware.com/.

About WittyParrot

WittyParrot is a disruptive, intelligent micro-content automation, collaboration and communication platform for Marketing, Sales, and Support organizations. WittyParrot improves knowledge-worker consistency in communication, productivity and responsiveness by making information nuggets available through bots and widgets. The company’s investments in artificial intelligence, machine learning, and data science enable it to automate both effectiveness and messaging consistency in all knowledge-worker communications. WittyParrot is fully integrated with Microsoft Office and Microsoft Office 365, several CRM platforms and various chat-bot technologies. WittyParrot has offices in Silicon Valley, California and in Bangalore, India. For more information, visit www.WittyParrot.com.

Astoria Software at LocWorld 37 Warsaw in June 2018

Track: Content Management
Date: Thursday, June 7, 2018
Time: 2:45pm – 4:00pm
Room: E

Michael Rosinski, President and CEO of Astoria Software, will head a panel discussion at LocWorld37 in Warsaw, Poland, on Thursday, June 7th, 2018, at 2:45 p.m.

The panel, entitled “How to Influence Top Management — Finding Funding for Digital Content Transformation Projects”, will give attendees an executive perspective on setting funding priorities for content-related projects.

Attendees will hear how top management evaluates content transformation projects, how executives balance competing priorities in tight-funding environments, and the key points in a project pitch that capture executive attention and interest.

CMS/DITA North America 2018

CMS/DITA North America 2018
April 23-25, 2018
Hilton Denver City Center
Denver, CO

Join Astoria Software for the 20th anniversary conference of one of the most influential gatherings for content curation professionals anywhere in the world.

We’ll be there with our “Test Kitchen” entry, “No more XML for the masses: A new way to share content with MS Office users,” and our forward-looking presentation, “Mathematical models for systems that manage chatbot strings”.

Whether you’re a Novice or a Master, you’ll find 98 conference sessions to improve your skills, inspire your mind, and invigorate your career.

When a CCMS meets customer intelligence platforms

Developing alongside the increased sophistication of CCMSs, customer intelligence (CI) platforms represent the sibling to content management and delivery software, with more focus on the marketing aspect. Yet when the two work together – either as an integrated, automated build or as different departmental concerns – the true potential of effective content can be unlocked.

The Power of Personas
As we’ve written about in the past, one emerging aspect of a modern CCMS is the ability to configure data modules based on audience persona. CI platforms are where the bulk of the persona building can occur, since CI is designed to capture and sort customer data. Using analytics, CI platforms can break consumer data into demographic categories, such as:

  • Personal demographics like age, income level, debt level, educational profile, marital status, and other lifestyle features.
  • Geographic demographics such as residential location, country of origin and travel patterns.
  • Attitudinal data, showing preferences and taste.
  • Situational customer data, identifying the specific method and environment in which engagement occurred.

This last category can hold particularly compelling data when building an audience persona: recent research has shown that the device by which a customer accesses content can portend certain behavioral patterns, which in turn can inform a trend line and help content marketers pair content with hungry audiences. For example, according to AddThis, desktop users consume job- and tech-related content at more than 50 percent above baseline, while mobile users consume weather-related content a whopping 355 percent above baseline.
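The four demographic categories might be bundled into a persona record like this sketch (the field names are illustrative, not from any particular CI platform):

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """Bundle the four demographic categories a CI platform might capture."""
    personal: dict = field(default_factory=dict)      # age, income, education...
    geographic: dict = field(default_factory=dict)    # location, travel patterns
    attitudinal: dict = field(default_factory=dict)   # preferences, taste
    situational: dict = field(default_factory=dict)   # device, channel, context

p = Persona(
    personal={"age_band": "35-44"},
    geographic={"region": "US-West"},
    situational={"device": "mobile"},
)
# Device is the situational signal highlighted above.
print(p.situational["device"])  # mobile
```

Keeping situational data in its own slot makes it easy for downstream rules, like the AddThis device findings, to key off it directly.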

“There are several key data aggregation tools that CI platforms can utilize.”

Sourcing Data
To glean this data, there are several key tools that CI platforms can utilize. The first is direct gathering. Surveys and response boxes built into existing content can effectively gather meaningful data, volunteered directly from the customers themselves. However, this data cannot be deemed 100 percent verifiable, as it gives customers the opportunity to mislead or omit what might otherwise be useful information.

Other means of gathering tend to be indirect. Whether it’s scraping a popular social media platform with an eye for keywords or looking at IP addresses to determine location, indirect gathering can actually offer more verifiable insight, albeit insight that is perhaps more limited in scope and carries deeper implications.

Customer Development
One of the key areas of potential for CI is the concept of Customer Development. As defined by Brant Cooper and Patrick Vlaskovits, authors of The Entrepreneur’s Guide to Customer Development, customer development represents the culmination of CI insights into a responsive understanding of not just behavioral cues, but also what customers are looking for from a piece of content you can provide.

“The results of the customer development process may indicate that the assumptions about your product, your customers, and your market are all wrong,” Cooper and Vlaskovits told the Content Marketing Institute. “And then it is your responsibility, as the idea-generator (read: entrepreneur), to interpret the data you have elicited and modify your next set of assumptions to iterate upon.”

Customer development is still in its infancy, but the capabilities of CI platforms are ever-expanding. With increased integration with CMS, the ability to configure even more effective, impactful content is well within reach.

“Uplifting” to a Content Management System

When it comes to migrating content from an unstructured paradigm to true XML-based authoring in a component content management system, the challenges include changes in content architecture and navigation, forcing you to make important decisions about the design of your content well ahead of migration.

Unstructured content, including what is stored on intranets, can differ dramatically from content that is prepared and approved for distribution to customers in both form and function. According to CMS Wire, those looking to transfer content from an intranet to a CMS are likely to run into one or more of these issues:

  • Complex or counter-intuitive data distribution.
  • Data that is transactional or with a narrow application (and thus incomplete or incomprehensible outside of context).
  • Information that is client-specific or author-specific, requiring careful review and perhaps specific permissions for publication.

Each of these challenges represents an aspect of migration planning that must be addressed by the enterprise and the content architect.

“Each of these challenges represents an aspect of migration planning.”

‘Lift and shift’
So how should enterprises approach a content migration? The simplest answer from an organizational perspective is what many CMS experts dub the “lift and shift”: the manual re-entry of unstructured data into some kind of XML structure. This effort consumes significant man-hours, and often the data quality is compromised by the repetitive, tedious nature of the task.

Automation could be the answer to some migration woes, but before you set about re-keying unstructured data as structured content, developing a coordinated plan of attack is key. Here are the four main steps for planning the most efficient migration possible.
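As a toy illustration of the automation step, plain-text paragraphs can be wrapped in a minimal XML topic; the element names here are illustrative, not a real DITA schema:

```python
import xml.etree.ElementTree as ET

def uplift(title, paragraphs):
    """Wrap plain-text paragraphs in a minimal XML topic.

    The topic/title/body/p structure is illustrative only; a real migration
    would target the schema chosen during mapping.
    """
    topic = ET.Element("topic")
    ET.SubElement(topic, "title").text = title
    body = ET.SubElement(topic, "body")
    for text in paragraphs:
        ET.SubElement(body, "p").text = text
    return ET.tostring(topic, encoding="unicode")

print(uplift("Install guide", ["Unpack the kit.", "Run the installer."]))
```

Automation of this kind removes the re-keying drudgery, but the inventory, audit and mapping steps below still decide what gets uplifted and where it lands.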

Step one: Content inventory
Understanding the breadth and scope of your migration will give you a sense of the resources required and a potential timeline. Assess your existing content library, making sure to include any additional repositories that may feed into the structured system.

Inventorying content requires identifying:

  • Content permissions and authorship requirements.
  • Locations of content.
  • Content formats.

It should be noted that an inventory is just that: a moment prior to a migration where you take stock of what you have. The next step is where you take a more evaluative eye to the content that you will (or will not) be migrating.

“Content auditing involves looking at the content on hand.”

Step two: Content auditing
Content auditing involves looking at inventoried content and making decisions about its relevance. Often, this is where an enterprise puts its content to the test, identifying it as worthy of migration or earmarking it for deletion/archiving.

“Chances are, you have quite a bit of content out there that nobody — user or owner — has looked at in a long time. It happens to everyone,” Alicia Backlund, Product Strategist at Level Five Solutions, told Nielsen Norman Group. “An intranet redesign is the perfect reason and opportunity to take a good, hard look at your content and move only what makes sense to move.”

When auditing, ask:

  • What is the age of the content?
  • When was this content last accessed, either on its own or as a component of another piece of content?
  • Is this content still relevant/accurate? Is it incomplete on its own?
  • Could the user interface for accessing this content have been more intuitive?
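The audit questions above can be turned into a simple triage rule; the one-year idle threshold, field names, and fixed reference date are illustrative assumptions:

```python
from datetime import date

def audit(item, today=date(2018, 4, 1), max_idle_days=365):
    """Earmark content for migration or archiving based on the audit questions.

    `today` is fixed here for reproducibility; a real audit would use the
    current date and likely many more signals.
    """
    idle = (today - item["last_accessed"]).days
    if idle > max_idle_days or not item["accurate"]:
        return "archive"
    return "migrate"

item = {"last_accessed": date(2017, 1, 15), "accurate": True}
print(audit(item))  # 'archive' -- untouched for more than a year
```

Running a rule like this over the inventory from step one produces the move/don’t-move list that step three maps into the new structure.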

Step three: Mapping
Once you’ve made sense of the content – where it currently exists and what data you need to transfer – you need to map your data into the new XML structure. This includes determining pathways to shared content as well as the basic usability plan of the new component content management system.

Consider the migration sequence of your content. Is there operationally essential data that should be moved to the front of the line, giving you leeway to complete the rest?

Get your plan in place, and then start your migration. Once initiated, track where content ultimately ends up and compare it to your original plan to help ensure that the process is a success.

How “right” were they? A look back at the content and content marketing trends predicted for 2017

2017 was right around the corner when CMS and content marketing specialists began churning out trend projections for the new year. Now that 2017 is done, let’s see how well they saw the future. More weight is given to correctly foreseeing less prominent drivers taking the spotlight; no fair claiming credit for trends that were already visible at the end of 2016, such as increased automation, growing CMS sophistication, and the increased integration of community editing.

Trend: The rise of rich media content.  Prediction Weight: Low.

CMS designers have already jumped to the task of integrating increasingly sophisticated data languages and digital asset managers into content structures. The next frontier is content with native, non-plugin-based, media embedded as modules. Video, interactive tools, games, surveys and app extensions are all primed to become the next aspects of content language.

“The next frontier is native media embedded as modules.”

While pundits got this right as the “next frontier” in content, 2017 did not see a wave of new content with embedded native media.

Trend: Content as a tool of native advertising.  Prediction Weight: Low.
The ubiquity of free content has changed the way that content authors approach development models. One of the more unsavory aspects of this evolution – from the consumer perspective – is the idea of “clickbait.”

It was Columbia Law School professor Tim Wu who pointed out that “clickbait” is actually an organic evolution of content marketing and commerce. Without the ability to monetize content directly, authors naturally become more psychologically savvy in the type of content they develop, designing it to attract the eye and hold attention.

“This attention-merchant model has spread to so many areas of our life, where we’re completely used to everything being free,” Wu explained. “But then the payoff, or the exchange, is that then we also agree to stuff that is compromised, because it is always trying to get us to click on ads at the same time.”

“Experts” warn that users are growing savvier and more cynical about brute-force advertising efforts, but a shift to different monetization models may help abate some of their more unpleasant effects. The Content Marketing Institute points out that native advertising, where ads are part of the content yet relatively unobtrusive and embraced by authors, could prove an effective countermeasure to clickbait that is minimally disruptive to the content itself.

Perhaps.  But as a prediction, “native advertising” was a flop.

“Social media will be further divided into factions.”

Trend: The social media split.  Prediction Weight: Medium.
Social media consultant and trainer Andrew Davis warns that social media content is heading for a major fork in the road, based on behavioral patterns of users. Similar to the warnings that previous content experts have given about messaging apps supplanting traditional web pages, Davis sees a future where social media – already acting as the hub for nearly all web-based interactions – will be further divided into factions.

“Social media will be split into two areas: the visual web and the community focused web,” Davis tells Writtent. “Visual content will increase at a rapid pace but so will messaging platforms.”

This split was particularly evident in the Russian meddling in the 2016 US Presidential elections. The so-called “Internet Research Agency” exploited the trend toward factionalism to develop a series of websites and personas that attracted the interest of scores of like-minded US voters. Whether this tactic affected the election itself is the subject of vigorous debate, but to some extent that debate is moot given the level of participation and engagement these websites and personas engendered.

The Internet’s broad reach allows people of similar persuasions to connect easily even if they live in comparative isolation. It will be challenging for many brands, accustomed to mass marketing, to adapt successfully when they find themselves cut off from their consumers. As closed-loop messaging subverts content development and marketing algorithms, Davis sees many brands retreating to traditional web and visual content advertising rather than doing the hard work of niche advertising.

Trend: Reevaluation of content efficacy.  Prediction Weight: Low.
According to the Sword and the Script, anywhere between 60 and 70 percent of B2B content developed by brands and organizations goes unused. This points to the ease and low cost of modern content development, but it also underscores that brands haven’t yet mastered creating content that connects with users. Focusing on additional automation and perfecting user personas may mitigate some of this waste, leading many experts to say that 2017 would be a peak year for content authors who can guarantee effective content.

Regrettably, 2017 wasn’t the peak year for effective content. Furthermore, there will be no peaking of content that piques anyone’s interest until the incentives of content authors are aligned with the interests of content consumers. As it stands now, and has stood for several years, the majority of content authors are compensated on the amount of material they produce; there is no gating factor imposed by the expense or technical challenge of putting that content on the Web. So there is little incentive to produce content that people want to read.

If you want to see a peak in effective content, tie a content creator’s compensation to the number of page-views and the net increase in backlinks.  With a properly incentivized writing team in place, you will definitely see a reevaluation of content efficacy.

Retrenching the content ecosystem: Smaller ponds for more manageable content management

Over the past few years, the greater content ecosystem has expanded dramatically, pushing content management to its farthest reaches. What used to be small ponds of data has become an ocean, leaving many content management system designers struggling to handle all the user-generated content and data modules now flooding in. While improvements in automation to better handle high data volumes may be on the horizon, the fact that they are beyond our current capabilities means that CMS design may be temporarily better focused elsewhere: on retrenching internal data ecosystems into smaller, more relevant ponds.

Connected themes making connections 
Content creation and discovery begets new content, as data is processed by new users and reconfigured into forms that meet a niche. This is, fundamentally, the purpose of good content and is woven into the structure of modern content languages: inter-connectivity, hyperlinking, the ability to be shared and reconfigured on demand, while still preserving some modicum of authorship tracking. Without inter-connectivity, content represents a dead end for users – a nearly insurmountable obstacle in the modern ecosystem.

“Unprecedented ease in authorship comes with a distinct drawback.”

Here comes the flood
Of course, even this time of unprecedented connectivity comes with a distinct drawback. Users are very nearly drowning in a flood of content, with the influx showing no signs of abating any time soon. According to CSC, data production will grow a staggering 4,300 percent by 2020. With this much data, focusing on the interrelationships between the data that matter most to us is paramount.

While many assume that this interconnection between data is the source of the volume, this is more of a situation where the tail wags the dog. Content authorship has expanded due to simplified creation tools and platforms – breaking down traditional barriers to entry – making interconnection the only means we have to navigate such high volumes of data.

Creating the smaller pond
Since the operational challenges of dealing with the unrelenting torrent of data are beyond the capabilities of even the most sophisticated automation and web-scraping tools, content marketers and CMS designers may have to turn to a counter-intuitive solution: creating smaller, proprietary ecosystems.

This “smaller pond” tactic may seem like a step backward in the overall pursuit of a larger, more integrated content ecosystem, but it does help mitigate some of the issues related to handling such high volumes of data. By building up an organization’s internal content ecosystem, you are essentially damming off the data influx, allowing it to flow through automated gatekeepers for processing and integration. This is a key step in converting raw data to hypertext.

By setting up an in-house ecosystem, organizations can more effectively scale their storage and CMS needs based on the data that they have on hand. While this may pose some limitations when it comes to linking to data and content outside the parameters of the CMS, it may ultimately prove more responsive and nimble in the long run. Rather than wading into the deep end of the content ocean, setting up your own space allows for easier and more intuitive innovation, thought leadership and more straightforward marketing of your content insights.