From Earth to the Moon and Back 6x: How Data Drives Digital


For businesses today, we live in a Data Economy, where high-quality data has become one of the most valuable commodities in the pursuit of global growth. Nowhere is that demand more prominent than in digital marketing.

Today companies are spending in the neighborhood of $11 billion a year on digital data to support media, advertising and technology needs. Sound like a lot? The digital universe is doubling every two years. And according to EMC, by 2020 we’ll be wrangling around 44 trillion gigabytes of information, which is enough to stretch from the Earth to the Moon and back 6x! Imagine how much more will be spent on data and services to support digital and programmatic marketing by 2020…

No wonder “Driving the Data Economy” was a hot panel topic at last week’s 2016 Adobe Summit, where one particular panel expertly drilled down on digital data. Rich Phillips, Adobe Senior Manager of Business Development, moderated a discussion with Anudit Vikram, Dun & Bradstreet SVP and Chief Product Officer; Mike Scafidi, PepsiCo Director of Marketing Technology and Digital Operations; and Ryan LaMirand, Acxiom Director of Global Partner Development. Their key topic: how data should be regarded as a significant asset that digital and programmatic marketers must produce, use and safeguard.

The transformational amount of money being spent on commercial and barter transactions for digital marketing and advertising data provides an outstanding opportunity. Every interaction we have with customers generates a data point, or a ‘footprint.’ Using that footprint to inform the next action marketers have with that customer can be the difference between delight and disaster.

The panelists ran the gamut – including a B2B data provider (Dun & Bradstreet), a B2C data provider (Acxiom) and a B2C data consumer (PepsiCo). While digital data and digital targeting techniques are on the bleeding edge of growth marketing, the panelists agreed on some very common-sense marketing rules of thumb for choosing what data to use.

Data sourcing takes due diligence and companies should consider four measures before making a data purchase:

  1. Review data accuracy and refresh schedules.
  2. Look for data providers who are transparent in their collection practices.
  3. Determine your objectives, and find data that best supports what you have in mind.
  4. Look at costs. Not all data is created equal. Don’t skimp on data asset quality and don’t buy what’s not needed.

The biggest question that digital marketers need to answer is how to quantify relevant data assets to achieve goals.

As Vikram said, “Data is measured not by CPM, but by the value it provides.”

Because the modern buyer’s journey has evolved into a nonlinear combination of online and offline behaviors and touch points, B2B marketers need to take into account where companies and individuals are in the engagement process. The right combination of first-, second- and third-party data enlightens marketers about their audiences with more precision and helps them synchronize online advertising with broader programs to meet their business objectives and measure their outcomes.

The right partnerships are critical in making this work. Dun & Bradstreet’s strategic partnership on the Adobe Audience Marketplace is one example within the expanded Adobe Exchange program. Joint customers like American Express, Lenovo and Samsung license Dun & Bradstreet’s Audience Solutions data to improve targeting for both digital and programmatic marketing.

Just trying to do our part in that data economy.

***

Learn more about our Adobe Exchange and Adobe Marketing Cloud Partnership here.

 

Photo by ShutterStock – copyright by Sergey Nivens

Think Data First, Platform Second – Why Data Fuels MDM


As the volume of data coming into organizations – from both internal and external sources – continues to grow and makes its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with the market expected to nearly triple from $9.4B to $26.8B by 2020, according to analysts.

But while everyone is investing serious cash into the tools to manage the data, few are putting any thought into the data itself. This is akin to purchasing a luxury sports car and fueling it with water. Sure it looks great, but it won’t get you very far.

 

The underlying concept of MDM is surprisingly simple: get everyone “on the same page” looking at the same data and ensure it is accurate. Yet master data and its management continue to be a universal challenge across many industries. Organizations of all shapes and sizes share similar problems related to master data and can all reap benefits from solving them. That means concentrating on the quality of the data before going shopping for the sexiest MDM platform. In essence, you must master data before you can manage it. Ensuring the data’s quality, structure and integrability is your responsibility; your MDM platform won’t do that for you. It’s like purchasing a top-of-the-line oven and expecting it to produce a delectable meal. You are responsible for what goes into it.

Master Data Defined

Master Data is the foundational information on customers, vendors and prospects that must be shared across all internal systems, applications, and processes in order for your commercial data, transactional reporting, and business activity to be optimized and accurate. Because individual businesses and departments need to plan, execute, monitor and analyze these common entities, multiple versions of the same data can reside in separate departmental systems. This results in disparate data, which is difficult to integrate across functions and quite costly to manage in terms of resources and IT development. Cross-channel initiatives, buying and planning, merger and acquisition activity, and content management all create new data silos. Major strategic endeavors, part of any business intelligence strategy, can be hampered or derailed if fundamental master data is not in place. In reality, master data is the only way to connect multiple systems and processes both internally and externally.

Master data is the most important data you have. It’s about the products you make and services you provide, the customers you sell to and the vendors you buy from. It is the basis of your business and commercial relationships. A primary focus area should be your ability to define your foundational master data elements (entities, hierarchies and types) and then the data that is needed (both to be mastered and to be accessible) to meet your business objectives. If you focus on this before worrying about the solution, you’ll be on the right course for driving success with MDM. Always remember: think data first and platform second.
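The “single view” idea is easier to see in miniature. Below is a minimal sketch, assuming two hypothetical departmental record sets keyed on email and a deliberately naive survivorship rule (keep the first non-empty value seen); a real MDM platform’s matching and survivorship logic is far more sophisticated.

```python
# Minimal sketch: consolidating departmental records into a "golden" master
# record per customer. Field names and survivorship rules are hypothetical.
from collections import defaultdict

crm_records = [
    {"email": "pat@example.com", "name": "Pat Lee", "phone": None},
]
billing_records = [
    {"email": "pat@example.com", "name": "Patricia Lee", "phone": "555-0100"},
]

def build_master(*sources):
    merged = defaultdict(dict)
    for source in sources:
        for rec in source:
            master = merged[rec["email"].lower()]  # match on a shared key
            for field, value in rec.items():
                # Naive survivorship: keep the first non-empty value seen
                if value and not master.get(field):
                    master[field] = value
    return dict(merged)

master = build_master(crm_records, billing_records)
print(master["pat@example.com"])
```

Even this toy version shows why the hard work lives in the data itself: pick a different matching key or survivorship rule and you get a different “truth.”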

Will Businesses Like Facebook’s New Reaction Buttons?


New Data Will Reveal How Customers Really Feel

If you recently updated your Facebook status, you may see a wide array of responses that goes beyond the fabled “like” riposte the social media platform has become known for. Last month Facebook unveiled “Reactions,” which offers its users five additional ways to express their opinions on everything from pictures of your cats to your brash political musings. While it will certainly give users more choices in how they interact with friends, it will also give businesses deeper insights into customer sentiment.

Reactions are essentially an extension of the like button, with six different buttons, or “animated emojis” (ask a millennial): “Like,” “Love,” “Haha,” “Wow,” “Sad” or “Angry.” Despite the public outcry for a “dislike” button, Facebook did not include one because the company felt it could be construed as negative. But that won’t stop people from using the “angry” button when they do not like something.

While this may upset a friend, it can actually help companies react and respond to complaints. What’s more, it may even help us predict trends and threats we may have not previously seen.

 

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions.”

-Nipa Basu, Chief Analytics Officer, Dun & Bradstreet

 

Dun & Bradstreet’s Chief Analytics Officer, Nipa Basu, believes the new “Reactions” will be an opportunity for businesses to better understand their customers, but notes it will take a deep level of understanding to make perfect sense of it.

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions,” explains Basu. “These ‘Reactions’ are typically based on real-time human emotion. Sometimes it will be fair. Sometimes it will be unfair. But if you have a large sample of a lot of people’s emotions reflected then you can begin to ask if that says something about the customer experience or the product or service from that business, and go from there.

“Then comes the second part. Does it matter? Looking deeper at what the comments suggest and how they correlate with the different types of ‘Reactions’ being received, a good analyst will be able to draw more accurate insights. Looking at both types of responses together will help understand what customers really felt.”

This is what Basu believes will make social sentiment analysis that much more effective. Not only will it open the door for brands to assess the success or relevance of their content and measure customer satisfaction, it may also paint a deeper picture of total risk and opportunity across industries that could benefit others.

“When you utilize a company like ours, where we already have our pulse on the health of global businesses and industries, and combine it with these social ‘Reactions,’ we can start to understand the correlation they have with business growth or decline,” said Basu. “Can looking at the amount of ‘angry’ comments predict the future health of a particular business or industry? Quite possibly. This is paving the way for even richer data that can drive even smarter analytics.”
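One simple way to start quantifying Basu’s point is to collapse a post’s Reaction counts into a single rough score. The weights below are purely illustrative assumptions, not anything Facebook publishes, and per Basu’s caveat they would need careful validation against comment text before drawing conclusions.

```python
# Illustrative sketch: converting Reaction counts on a post into a rough
# sentiment score. The weights are invented for this example.
REACTION_WEIGHTS = {
    "like": 1.0, "love": 2.0, "haha": 0.5,
    "wow": 0.5, "sad": -1.0, "angry": -2.0,
}

def sentiment_score(reaction_counts):
    """Weighted average sentiment in [-2, 2]; None if there are no reactions."""
    total = sum(reaction_counts.values())
    if total == 0:
        return None
    weighted = sum(REACTION_WEIGHTS[r] * n for r, n in reaction_counts.items())
    return weighted / total

post = {"like": 120, "love": 30, "haha": 5, "wow": 2, "sad": 8, "angry": 40}
print(round(sentiment_score(post), 2))
```

A score like this only becomes meaningful at the large sample sizes Basu describes, and after correlating it with what the accompanying comments actually say.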

Is Anticipatory Analytics the Path Toward Future Truth?


New Whitepaper Explores the Arrival of Anticipatory Analytics

Most everyone is familiar with the image of the eccentric fortune-teller gazing into her crystal ball to boldly predict the future. In the business world, teams of analytics experts are doing this every day; they’re just using data instead of a crystal ball to get a glimpse into the future.

Thanks to advanced analytics, organizations are able to understand potential outcomes and evaluate how issues can be addressed. By generating predictive models based on all the data being captured, a new level of transparency and foresight has been created that helps shape future business strategy based on historical trends. This is called predictive analytics, and it is “the fastest growing segment of the business intelligence (BI) and analytics software market,” according to Information Management.

But for all of the promise around predictive analytics, there is some criticism. For instance, since environments and people are always changing, critics argue that relying on historical trends is too simplistic and sterile to say with a great degree of certainty whether something will or will not happen. But a new analytic approach has emerged that may be better at grasping future outcomes.

As technology has evolved, so has our ability to process data at an incredible rate, making it possible to perform what has become known as anticipatory analytics. While still a relatively new concept, anticipatory analytics is gaining prevalence as a methodology. It leapfrogs traditional predictive analytics by identifying change, acceleration and deceleration in market dynamics, enabling companies to forecast future behaviors more quickly.
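In miniature, “identifying change, acceleration and deceleration” amounts to watching differences of differences in a metric over time. This sketch uses an invented demand series and a toy decision rule purely for illustration:

```python
# Hedged sketch of the core idea: instead of looking only at historical levels,
# track the rate of change (velocity) and its rate of change (acceleration)
# to flag a shift before the level itself looks unusual.

def diffs(series):
    """Period-over-period differences of a numeric series."""
    return [b - a for a, b in zip(series, series[1:])]

monthly_demand = [100, 102, 104, 107, 112, 120, 131]  # invented data

velocity = diffs(monthly_demand)       # change per period
acceleration = diffs(velocity)         # change in the rate of change

# Toy rule: sustained non-negative acceleration, ending positive,
# suggests momentum is building.
if all(a >= 0 for a in acceleration) and acceleration[-1] > 0:
    print("accelerating")
```

Real anticipatory models are far richer than second differences, but the instinct is the same: the signal lives in how fast things are changing, not just where they stand.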

In order to make this possible, the right mixture of data, processing tools, technology and expertise plays a central role. The following developments play key roles in being able to address the future, today.

Key Enablers of Anticipatory Analytics

4 trends are making anticipatory analytics a reality.

To gain a deeper understanding of the emergence of anticipatory analytics, and how it should be utilized in your organization, check out this detailed guide that outlines the differences between anticipatory and predictive.

INFOGRAPHIC: Exploring Inter-Connected Business Relationships

How Understanding Relationships Drives Better Data and Analytics

How much do you really know about the companies you do business with day in and day out? Sure, you may understand how your growth is affected by dealings with that supplier you’ve worked with for over a decade. But what about the association that supplier has with another company you may have never even heard of? Don’t believe the way they do business can inadvertently affect you? Think again.

Among the entities you do business with lie potentially crucial insights and information that can be critical in assessing total risk and opportunity. While not obvious at first glance, these insights become visible when you dive down to explore the information that links these entities to you, as well as their connections to other businesses. By doing so, you’ll be able to understand the full potential of your relationships with customers, prospects, suppliers and partners. This is relationship data: information about two or more entities brought together, along with their business activities, to inform an implied business impact or outcome. By interpreting the right signal data and applying advanced analytics to it, unmet needs come to light, hidden dangers surface and new opportunities can be identified.

Every company has relationship data; they just need to know where to look for it, or who to partner with to obtain the right information. The infographic below describes the different types of relationships that exist; some you easily see, while other relationships are harder to decipher, but just as important to your bottom-line. Understanding the way in which two or more entities are connected is the foundation of this data.

The more you connect and expose entities across your databases, the greater your visibility into the cross-company interactions with these enterprises. The ability to uncover previously hidden associations inside the data provides a catalyst for business transformation and insights. Exposing relationships across product lines, branches and countries creates opportunities to evaluate sales coverage, modify compensation plans, renegotiate terms and conditions, adjust compliance policies, improve customer experiences, build advanced segmentation categories and uncover hidden supply chain risk.

Dive down to discover the many sources of relationship data.

Exploring the Relationship Data Iceberg

 

It is important to remember that relationships can be one-to-one, one-to-many or many-to-many. They can be uni-directional or bi-directional in nature. Understanding the differences can be key to the types of questions you ask and what insights you draw from the data.
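The hidden-association idea can be sketched as a tiny directed graph. The company names and the single uni-directional relationship type (“supplies”) below are invented for illustration; real relationship data spans many relationship types and directions.

```python
# Minimal sketch: modeling business relationships as a directed graph and
# surfacing indirect (second-degree) connections to a target company.

supplies = {  # edge A -> B means "A supplies B" (uni-directional)
    "Acme Metals": ["Widget Corp"],
    "Widget Corp": ["Your Co"],
    "Your Co": [],
}

def indirect_suppliers(graph, target):
    """Companies that reach `target` only through an intermediary."""
    direct = {a for a, bs in graph.items() if target in bs}
    found = set()
    for a, bs in graph.items():
        for b in bs:
            if b in direct and a not in direct:
                found.add(a)
    return found

print(indirect_suppliers(supplies, "Your Co"))
```

Here “Acme Metals” never appears in your own records, yet a disruption there flows straight through your direct supplier, which is exactly the kind of below-the-surface connection the iceberg describes.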

The deeper you go in connecting the associated entities and the information that aligns to their business practices, the richer the insights you’ll uncover.  Ultimately, these richer data points enable you to move beyond simple modeling based on internal historical data and produce sophisticated business models grounded in multifaceted business connections.

As more businesses point to smart data as a conduit to growth, it’s important to ask the right questions of your data in order to extract meaningful insights to propel your business. That means going beneath the surface of what you normally see and exploring your business relationships to fully understand the cause and effect in your very own ecosystem.

3 Reasons to Be Thankful for Data


What better time to think about all that we’re thankful for than Thanksgiving? For businesses, there has never been a better time to be grateful for data and all it can do to help deliver new opportunities. Like a moist, juicy roast turkey, data is the one thing everyone seems to be eating up. Let’s look at three reasons why data is so delectable. We promise you won’t feel sleepy after consuming this list.

 

Bountiful Amounts of Data

The Pilgrims arrived in the New World during a particularly harsh winter, making it very difficult for them to find food and shelter. Of course we know they persevered despite the shortage of resources after they learned to harvest crops that produced an abundance of food for them to not only survive, but thrive.  Fast-forward to today, and we gorge ourselves on turkey and various complex carbohydrates in celebration of the early settlers’ efforts.

Like the sustenance which was once so hard to come by, data too has gone from scarce to abundant, and there’s plenty of it to feast on. Just as the Pilgrims struggled to live off the land, data scientists once had to work especially hard to cultivate meaningful bits of data. When data was scarce, they had to extract, track and survey to get at it. The objective was always to find out how and where to get more of it. Today, data is everywhere and the new goal is understanding how to make sense of it.

Technology has been one of the biggest catalysts of the data explosion. With the digitization of business, the amount of information we are collecting is growing so large that there are debates on just how big it is – it has probably grown even more since you read that sentence. Both structured and unstructured in nature, this wealth of information has made it possible to produce insights and achieve outcomes that were previously inconceivable. Businesses can better identify trends and risk, organizations can tackle health and wellness issues, and governments can solve economic and social challenges.

While we should certainly be grateful for the abundance of data, we must be careful how we use the information. It is important not to overindulge or hoard it. Instead we must recognize the type of data that will sustain us and avoid the empty calories that may lead us astray. Just as the Pilgrims planted the right seeds to bring them sustenance, we must choose the kernels of data that will drive meaningful value and insights.

 

Data Freedom

The Pilgrims came to America from England in search of religious freedom. They yearned for an open society characterized by a flexible structure, freedom of belief and the dissemination of information. We are witnessing a similar evolution in the way data is accessed and shared. The concept of data sharing is officially defined as making a certain piece of data free to use, reuse and redistribute – subject only, at most, to the requirement to attribute and/or share-alike. In other words, we’re at a point in history where some information can be freely used to solve key problems or make progress toward specific goals.

There are many examples of data that can be openly shared. However, it’s not just the numbers, but the insights that come along with them, that offer the most benefit when freely distributed. This concept offers advantages across both the private and public sectors. Businesses can gain a new level of transparency into new opportunities for services and goods, make better decisions based on more accurate information, and reduce costs for data conversion. But perhaps the biggest advantage of an open data ecosystem is for individual citizens. That’s because the sharing of information between governments can help with everything from increasing economic activity to addressing natural disasters more swiftly and even reducing health issues.

There are several types of data that can be shared among governmental functions. First, there is the sharing of data among governmental agencies within a single country. Second is the sharing of data across borders between international governments. And lastly, there is the sharing of data between businesses and government; this refers to voluntary sharing of data, beyond legal reporting obligations to governments.

So what exactly should governments be sharing? Observations are crucial – such as the National Weather Service issuing a Hurricane watch. It’s about sharing conclusions that can help different agencies better prepare for that projected weather event. Ultimately, this is the value of open data: multiple organizations, with mutual interests, sharing their independent observations in a manner that lets each organization draw more accurate, informed and insightful conclusions.

Unfortunately, in many cases the decisions made by government officials are not based on all of the information that might be pertinent to the situation. The same goes for businesses, which are somewhat reluctant to let go of their first-party data. With the freedom and ease to share information at our fingertips, we have the opportunity to achieve maximum economic and social benefits.

At the end of the day, data is nothing without analytics to help make sense of it. We should always be cognizant of the ways in which specific pieces of shared data are used to address specific questions and help establish new ideas. The Pilgrims likely did not use all of their food sources to cook a single meal. We shouldn’t use all the data available to solve a problem just because it’s in front of our faces.

 

A Brave New World of Data

As hard as I tried, I could not come up with a very clever parallel between the Pilgrims and the Internet of Things (IoT). But, this new buzz word represents such a major data opportunity to be thankful for, and, well, the Pilgrims had, umm, things, so here we go.

The IoT refers to the growing network of interconnected objects – devices and machines that are not only connected but aware of, and discoverable to, other devices and machines around them. It’s mobile, virtual and instantaneous. Data can be gathered and shared from anything from a car to a refrigerator, which means we’ll be witnessing a huge increase in the amount of data being generated – huge streams of data that will enable new types of insight. We now have an opportunity to organize this information in compelling ways to reach new conclusions.

The IoT has the opportunity to fundamentally change industries: from the automotive industry, where new data signals from cars may help improve safety conditions, to the supply chain, where the real-time passing of information can avoid disruptions in manufacturing. Given the rate at which digital technologies are connected and constantly evolving, organizations will quickly see transformational change in their business models.

As beneficial as the IoT will be, it will not flourish without the right data management and analytics capabilities. Organizations will need to invest time and resources in order to mine valuable insights from the data generated by the interactions that occur between machines.

 

In Summary

These are just three examples of how data is changing the face of business and frankly, society. There are certainly countless other reasons to be thankful for data, depending on what your business goals are and what you want to achieve. As I’ve noted, within each of these instances, while there is much to be thankful for, it is vital we be cautious and smart when taking advantage of new data opportunities. Just like the Pilgrims, we are using this new frontier to create a new world of endless possibilities.

Data: Trick or Treat?


Come October 31, hordes of miniature ghosts, ghouls and goblins will come knocking at your door for some sugar-filled treats. Of course, it’s not what it seems. You know that behind the colorful masks and spooky costumes stand harmless children celebrating Halloween.

The question is, can you say the same about your data when it appears in front of you? Data, too, often is not what it appears at first sight.

Unlike enthusiastic trick-or-treaters roaming the streets in search of candy, data does not intentionally wear a disguise to trick you, yet so many of us are fooled by it on a daily basis. It’s no surprise that data can often blind us to the truth – there’s just so much of it out there. With the amount of data growing exponentially each and every day, it’s often hard to find time to make sense of it all. Instead, we take what it says on the surface without considering whether or not we are asking the right questions of the data.

Take unemployment, for example. After one of the worst recessions in modern history, we’re now hearing much celebration in the media about how unemployment is down to almost record lows – 5.1% as of this writing. That certainly seems worthy of praise; after all, the data doesn’t lie. But while the numbers are technically right, they do not factor in a ton of variables that may actually change the overtly upbeat conversation about the current economic climate.

Please don’t think of me as a ‘Debbie Downer’ trying to rain down on the recovery parade, but this is a classic example of the data not telling the complete story. The Department of Labor measures unemployment by the number of people receiving unemployment benefits. But that, of course, can be misleading since unemployment benefits expire, leaving the jobless without a way to be measured. It also does not take into consideration those who may be working part-time jobs but are actively seeking full-time work to support their families and pay back student loans – the same loans they took out in hopes of landing full-time jobs.

These are the types of factors we need to look at in order to pose the right questions before we take the data for what it’s worth. Of course it’s easy for politicians to look at the numbers without digging deeper because it makes them look good.

“It’s human nature,” says Anthony Scriffignano, Dun & Bradstreet’s Chief Data Scientist. “We have a tendency to try and organize the world around us. It’s how we probably have survived. The world around us is so chaotic so we try to find patterns in it; we try to find things that we think are there and want to believe.”

Unfortunately, Scriffignano believes, we tend to use data to make decisions based on ego or make rush judgments in response to answers we believe others want to hear instead of digging deeper to discover what really lies underneath the data. “There’s a name for it,” he says. “It’s called confirmation bias.”

Getting back to the unemployment example, it’s really not ethical to make a broad-based assumption on the data without looking at other evidence, not when that assertion clearly contradicts what many people are actually experiencing. Unfortunately there are countless examples of this happening all the time, and in much more precarious instances across business and government.

As Gartner’s senior vice president and global head of research said at this year’s Gartner ITxpo, “Data is inherently dumb. It does not actually do anything unless you know how to use it.” That doesn’t stop most of us from seeking out even more data when we don’t see the answers we want. But reaching an honest conclusion goes beyond data volume. It takes an inquisitive approach and relentless curiosity to turn data into dependable insights.

Don’t be tricked by taking data only at face value. Look deeper.