Will Businesses Like Facebook’s New Reaction Buttons?



New Data Will Reveal How Customers Really Feel

If you updated your Facebook status recently, you may have noticed a wider array of responses that go beyond the famous “like” button the social media platform is known for. Last month Facebook unveiled “Reactions,” which gives users five additional ways to express their opinions on everything from pictures of your cats to your brash political musings. While Reactions will certainly give users more choices in how they interact with friends, it will also give businesses deeper insights into customer sentiment.

Reactions are essentially an extension of the like button, offering six different buttons, or “animated emojis” (ask a millennial): “Like,” “Love,” “Haha,” “Wow,” “Sad” and “Angry.” Despite public outcry for a “dislike” button, Facebook did not include one because the company felt it could be construed as negative. But that won’t stop people from using the “angry” button when they do not like something.

While this may upset a friend, it can actually help companies react and respond to complaints. What’s more, it may even help us predict trends and threats we may not have previously seen.


“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions.”

-Nipa Basu, Chief Analytics Officer, Dun & Bradstreet


Dun & Bradstreet’s Chief Analytics Officer, Nipa Basu, believes the new “Reactions” will be an opportunity for businesses to better understand their customers, but notes that it will take a deep level of understanding to make sense of the data.

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions,” explains Basu. “These ‘Reactions’ are typically based on real-time human emotion. Sometimes it will be fair. Sometimes it will be unfair. But if you have a large sample of a lot of people’s emotions reflected then you can begin to ask if that says something about the customer experience or the product or service from that business, and go from there.

“Then comes the second part. Does it matter? Looking deeper at what the comments suggest and how they correlate with the different types of ‘Reactions’ being received, a good analyst will be able to draw more accurate insights. Looking at both types of responses together will help understand what customers really felt.”

This is what Basu believes will make social sentiment analysis that much more effective. Not only will it open the door for brands to assess the success and relevance of their content and measure customer satisfaction; it may also paint a deeper picture of total risk and opportunity across industries that could benefit others.

“When you utilize a company like ours, where we already have our pulse on the health of global businesses and industries, and combine that with these social ‘Reactions,’ we can start to understand the correlation between those reactions and business growth or decline,” said Basu. “Can looking at the number of ‘angry’ reactions predict the future health of a particular business or industry? Quite possibly. This is paving the way for even richer data that can drive even smarter analytics.”
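As a rough illustration of how a large sample of reaction counts could be distilled into a single sentiment signal, here is a minimal sketch. The weights are invented for illustration only; they are not Facebook’s or Dun & Bradstreet’s methodology.

```python
# Toy sentiment score from Facebook-style reaction counts.
# The weights below are illustrative assumptions.
REACTION_WEIGHTS = {
    "like": 1.0, "love": 2.0, "haha": 1.0,
    "wow": 0.5, "sad": -1.0, "angry": -2.0,
}

def sentiment_score(reactions):
    """Return a weighted average in [-2, 2]; 0.0 when there are no reactions."""
    total = sum(reactions.values())
    if total == 0:
        return 0.0
    weighted = sum(REACTION_WEIGHTS[name] * count
                   for name, count in reactions.items())
    return weighted / total

post = {"like": 120, "love": 30, "haha": 5, "wow": 10, "sad": 2, "angry": 40}
print(round(sentiment_score(post), 3))
```

A real analysis would, as Basu suggests, also correlate these scores with comment text and business outcomes before drawing conclusions.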

Skills That Matter in a World Awash in Data

Original content found on www.linkedin.com/skills-matter-world-awash-data

There is a paradox put forth by the French philosopher Jean Buridan which is commonly referred to as Buridan’s Ass. One interpretation goes something like this: Take a donkey and stake it equidistant between two identical piles of hay. Since the donkey is incapable of rational choice and the piles of hay are indistinguishable, the donkey will die of hunger. Of course, in the real world, we all presume the donkey would somehow “pick” a pile. We accept these situations all around us: fish seem to “choose” a direction to swim, birds of the same species seem to “decide” whether or not to migrate, and data seems to “suggest” things that we wish to prove. Which of these is not like the others? The answer is the data. Data has no ability to “act” on its own. We can use it or not, and it simply doesn’t care. The choice is entirely ours. The challenge is how we decide rationally what data to use and how to use it, when we have enough data, and when we have the “right” data. Making the wrong choice has serious consequences. Making the right choice can lead to enormous advantage.

Let’s look at the facts. We know that we are living in a world awash in data. Every day, we produce more data than the previous day, and at a rate which is arguably impossible to measure or model because we have lost the ability to see the boundaries. Data is not only created in places we can easily “see” such as the Internet, or on corporate servers. It is created in devices, it is created in the cloud, it is created in streams that may or may not be captured and stored, and it is created in places intentionally engineered to be difficult or impossible to perceive without special tools or privileges. Things are now talking to other things and producing data that only those things can see or use. There is no defensible premise that we can simply scale our approach to data from ten years ago to address the dynamic nature of data today.

This deluge of data is resulting in three inconvenient truths:

  1. Organizations are struggling to make use of the data already in hand, even as the amount of “discoverable” data increases at unprecedented rates.
  2. The data which can be brought to bear on a business problem is effectively unbounded, yet the requirements of governance and regulatory compliance make it increasingly difficult to experiment with new types of data.
  3. The skills we need to understand new data never before seen are extremely nuanced, and very different from those which have led to success so far.

Data already in hand – think airplanes and peanut butter.

Recently, I was on a flight which was delayed due to a mechanical issue. In such situations, the airline faces a complex problem, trying to estimate the delay and balance regulations, passenger connections, equipment availability, and many other factors. There is also a human element as people try to fix the problem. All I really wanted to know was how long I had in terms of delay. Did I have time to leave the gate and do something else? Did I have time to find a quiet place to work? In this situation, the answer was yes. The flight was delayed 2 hours. I wandered very far from the gate (bad idea). All of a sudden, I got a text message that as of 3:50PM, my flight was delayed to 3:48PM. I didn’t have time to wonder about time travel… I sprinted back to the gate, only to find a whole lot of nothing going on. It seemed that the airline systems that talk to each other to send out messaging were not communicating correctly with the ones that ingested data from the rest of the system. Stand down from red alert… No plane yet. False alarm.

While the situation is funny in retrospect, it wasn’t at the time. How many times do we do something like this to customers or colleagues? How many times do the complex systems we have built speak to one another in ways that were not intended and reach the wrong conclusions or send the wrong signals? I am increasingly finding senior executives who struggle to make sense out of the data already on-hand within their organization. In some cases, they are simply not interested in more data because they are overwhelmed with the data on hand.

This position is a very dangerous one to take. We can’t just “pick a pile of hay.” There is no logical reason to presume that the data in hand is sufficient to make any particular decision without some sort of analysis comparing three universes: data in hand, data that could be brought to bear on the problem, and data that we know exists but which is not accessible (e.g. covert, confidential, not disclosed). Only by assessing the relative size and importance of these three distinct sets of data in some meaningful way can we rationally make a determination that we are using sufficient data to make a data-based decision.

There is a concept known as the “dispositive threshold” – the point at which sufficient information exists to make a decision. It does not, however, determine that there is sufficient information to make a repeatable decision, or an effective decision. Imagine that I asked you if you liked peanut butter and you had never tasted it. You don’t have enough information. After confirming that you don’t have a peanut allergy, I give you a spoonful of peanut butter. You either “like” it or you don’t. You may feel you have enough information (the dispositive threshold) until you learn that there is creamy and chunky peanut butter and you have only tasted one type, so you ask for a spoonful of the other type. Now you learn that some peanut butter is salted and some isn’t. At some point, you step back and realize that all of these variations are not changing the essence of what peanut butter is. You can make a reasonable decision about how you feel about peanut butter without tasting all potential variations. You can answer the question “do you like peanut butter” but not the question “do you like all types of peanut butter.” The moral here, without getting into lots of math or philosophy, is this:

It is possible to make decisions with data if we are mindful about what data we have available. However, we must at least have some idea of the data we are not using in the decision-making process and a clear understanding of the constraints on the types of decisions we can make and defend.

Governance and regulatory compliance – bad guys and salad bars.

Governance essentially boils down to the three time-worn pieces of advice: “say what you’re going to do, do it, say you did it.” Of course, in the case of data-based decision making, there are many nuances in terms of deciding what you are going to do. Even before we consider rules and regulations, we can look at best practice and reasonableness. We must decide what information we will allow in the enterprise, how we will ingest it, evaluate it, store it, and use it. These become the rules of the road and governance is the process of making sure we follow those rules.

So far, this advice seems pretty straightforward, but consider what happens when the governance system gets washed over by a huge amount of data that has never been seen before. Some advocates of “big data” would suggest ingesting the data and using techniques such as unsupervised learning to tell us what the data means. This is a dangerous strategy akin to trying to eat everything on the salad bar. There is a very real risk that some data should never enter the enterprise. I would suggest that we need to take a few steps first to make sure we are “doing what we said we will do.” For example: have we looked at the way in which the data was created, what it is intended to contain, and a small sample of the data in a controlled environment to make sure it lives up to the promised content? Small steps before ingesting big data can avoid big, possibly unrecoverable mistakes.

Of course, even if we follow the rules very carefully, the system changes. In the case of governance, we must also consider the changing regulatory environment. For example, the first laws concerning expectations of privacy in electronic communication were in place before the Internet changed the way we communicate with one another. Many times, laws lag quite significantly behind technology, or lawmakers are influenced by changes in policy, so we must be careful to continuously re-evaluate what we are doing from a governance perspective to comply not only with internal policy, but also with evolving regulation. Sometimes, this process can get very tricky.

Consider the situation of looking for bad behavior. Bad guys are tricky. They continue to change their behavior, even as systems and processes evolve to detect bad behavior. In science, these are sometimes called “observer effects,” where the thing being observed changes by virtue of being observed. Even the definition of “bad” changes over time or from the perspective of different geographies and use cases. When we create processes for governance, we look at the data we may permissibly ingest. When we create processes for detecting (or predicting) bad behavior, the paradox is that we must use data in permissible ways to detect malfeasant acts that are unconstrained by those same rules. So in effect, we have to use good data in good ways to detect bad actors operating in bad ways. The key take-away here is a tricky one:

We must be overt and observant about how we discover, curate and synthesize data to discover actions and insights that often shape or redefine the rules.

The skills we need – on change and wet babies.

There is an old saying that only wet babies like change all the time. The reality is that the massive amounts of data facing an enterprise are forcing leaders to look very carefully at the skills they are hiring into the organization. It is not enough to find people who will help “drive change” in the organization – we have to ensure we are driving the right change, because the cost of being wrong is significant when the pace of change is so fast. I was once in a meeting where a leader was concerned about having to provide a type of training to a large group because their skill level would increase. “They are unskilled workers. What happens if we train them, and they leave?” he shouted. The smartest consultant I ever worked with crystallized the situation with the perfect reply: “What happens if you don’t train them and they stay?” Competitors and malefactors will certainly gain ground if we spend time chasing the wrong paths of inquiry, yet we can just as easily become paralyzed by analysis and do nothing, which is itself a decision that has a cost – often the most damaging one.

The key to driving change in the data arena is to balance the needs of the organization in the near term with the enabling capabilities that will be required in the future. Some skills, like the ability to deal with unstructured data, non-regressive methods (such as recursion and heuristic evaluation), and adjudication of veracity will require time to refine. We must be careful to spend some time building out the right longer-term capabilities so that they are ready when we need them.

At the same time, we must not ignore skills that may be needed to augment our capability in the short term. Examples might include better visualization, problem formulation, empirical (repeatable) methodology, and computational linguistics. Ultimately, I recommend three categories to consider from the perspective of skills in the data-based organization:

Consider what you believe, how you need to behave, and how you will measure and sustain progress.

Ultimately, the skills that matter are those that will drive value to the organization and to the customers served. As leaders in a world awash in data, we must be better than Buridan’s Ass. We must look beyond the hay. We live in an age where we will learn to do amazing things with data or become outpaced by those who gain better skills and capability. The opportunity goes to those who take a conscious decision to look at data in a new way, unconstrained and full of opportunity if we learn how to use it.

Data and Instinct – Inspiration’s Yin and Yang


With all the momentum around the easy access to customer data across the digital landscape and the variety of tools that let marketers make quick sense of that data, some might think human intuition is becoming obsolete.

We’re here to say human instinct is alive and well in the world of IoT and Big Data. Marketing Land’s article Human Intuition Vs. Marketing Data: Forging A New Alliance states,


“The human brain remains one of the most flexible and potent sources of computing on the planet…And the most effective marketing analytics solutions are those that recognize that human intuition – and perhaps most importantly, human curiosity – plays a critical role in moving from data to insight to decision to action.”

Netflix Decides with Data and Human Instinct

Netflix gets a lot of attention for its use of data to drive its powerful recommendation engine. It’s often cited as a key to the company’s continued global success, helping people discover lesser-known content that will delight them.

But along with streaming a large catalogue of movies and TV shows, Netflix has been gradually shifting its business model to emphasize more of its own original TV shows and movies. These Netflix originals (think House of Cards and Orange Is The New Black) have become cultural phenomena in their own right.

Netflix has said it uses data analytics in researching the development of new programming. But at the recent DLD Conference in Munich, Germany, CEO Reed Hastings noted that the company still leans just as hard on “gut instinct” in the realm of selecting its original content.

“We start with the data,” Hastings said. “But the final call is always gut. It’s informed intuition.”

It’s a good reminder for just about anyone working in the realm of Big Data. Yes, data can drive exciting new insights and learning. It’s a revolution. But that doesn’t mean it’s time to forget the human side of the equation.

And it’s notable that this argument is made by Hastings. Not only is he one of Silicon Valley’s most successful CEOs, he’s also a former artificial intelligence engineer. So he knows a thing or two about Big Data and how it works under the hood.

In the case of Netflix, that means the company bets big on data. But it also means, according to Hastings, that the people in position to make decisions have to be able to interpret that data, know its limits and make those hard calls on “gut instinct.”

Data won’t provide all the answers, and at some point the human in charge has to decide.

Of course, data picks right up again after those decisions have been made. For Netflix, that has provided interesting and unexpected insights into programming and viewing behavior.

For example, the company is funding development of an original French-language series, called Marseille, because data shows that French TV is popular around the world. That might come as a surprise to people in France, where the reputation of television shows lags far behind that of the country’s films.

Netflix also discovered through its data that many shows develop audiences in ways that might seem unpredictable. One new show, Narcos, is set in South America, was created by a French studio, and became a massive hit in Germany, of all places.

Discovering what drives those connections, Hastings said, will be a key tool in Netflix’s long-range goal of revolutionizing the way we make, watch and discover content.

Marketing Land’s article hits the nail on the head: “Creative interpretation of the data is always going to be essential.”

The inspiration companies find to solve their biggest challenges hinges on people knowing what questions to ask and what priorities to set. Data is essential in helping answer those questions and achieving those goals. When human instinct and the right data work in concert, the yin and yang of inspiration comes to life in the best ways possible.

Data Dialogue: CareerBuilder Consolidates CRMs, Identifies Hierarchies and Provides Actionable Insight for Sales Enablement



A Q&A with CareerBuilder’s Director of Sales Productivity  

CareerBuilder, the global leader in human capital solutions, has evolved over recent years to better meet marketplace demands. “We’ve moved from transactional advertising sales to a software solution, which is much more complex with a longer sales process and a longer sales cycle,” says Maggie Palumbo, Director of Sales Productivity for CareerBuilder.

With that, it is critical that the sales teams Palumbo works with be armed not only with information on which accounts to target, but also with intelligence they can use to approach – and engage – the potential customer.

Moreover, as the company continues to expand globally, its need for consistent data and CRM consolidation has become a priority. Establishing one system in which employees all see the same information better positions the company for continued global expansion through more informed decision-making.

We talked with Palumbo about how she has been leading this effort.

What are you focused on now as you look to your priorities moving forward?

In the past, we did not have much in the way of governance. We had loose rules and accounts ended up not being fully optimized. We’re focused now on better segmentation to figure out where we can get the best return.

We use tools from Dun & Bradstreet to help us accomplish this goal. Specifically, we rely on geographical information, SIC codes for industry, employee totals and other predictive elements. Then we bring it all together to get a score that tells us which accounts to target and where each belongs within our business.
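The kind of scoring Palumbo describes can be sketched in a few lines. The feature names and weights below are hypothetical stand-ins for the geographic, SIC-code and employee-count signals, not CareerBuilder’s or Dun & Bradstreet’s actual model.

```python
# Illustrative account-scoring sketch with hypothetical features/weights.
WEIGHTS = {"region_fit": 0.3, "industry_fit": 0.4, "size_fit": 0.3}

def score_account(features):
    """Combine normalized 0-1 feature scores into a single target score."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

accounts = {
    "Acme Corp": {"region_fit": 1.0, "industry_fit": 0.8, "size_fit": 0.5},
    "Globex":    {"region_fit": 0.2, "industry_fit": 0.9, "size_fit": 0.9},
}

# Rank accounts so reps can focus on the highest-scoring ones first.
ranked = sorted(accounts, key=lambda a: score_account(accounts[a]), reverse=True)
print(ranked)
```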

How do you make the information actionable?

We are unique in that we take the full D&B marketable file, companies with more than 11 employees, and pass it along to a sales representative. Some reps, those with accounts with fewer than 500 employees, have thousands of accounts. What they need to know is where to focus within that account base. Scoring and intelligence allow us to highlight the gems within the pile that may have otherwise been neglected. The reps at the higher end of the business have fewer accounts and require more intelligence on each account to enable more informed sales calls.

Because we’ve moved from a transactional advertising sales model to one based on software solutions, our sales process is now more complex. The reps need intelligent information that they can rely on as they adjust their sales efforts to this new approach.

How do you make sure you’re actually uncovering new and unique opportunities?

We’ve gone more in-depth with our scoring and now pull in more elements. We also work with third party partners who do much of the manual digging. With that, we’re confident that we’re uncovering opportunities others may not necessarily see.

How do you marry together the data with the technology?

When the economy went south, our business could have been in big trouble. We are CareerBuilder. When your focus is on hiring and you’re in an economy where far fewer companies are hiring, where do you go with that?

Our CEO is forward-thinking and had already started to expand our business to include human capital data solutions. With that, it became clear that we needed standardized data, which we achieve via our data warehouse. Once the data is normalized and set up properly, it can be pushed into the systems. Pulling the information from different sources together into one record is the challenge. We use Integration Manager for that; the D-U-N-S Number serves as a framework for how we segment our data, and we rely heavily on that insight.
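The consolidation step described here – pulling information from different sources into one record – can be sketched minimally as a merge on a shared match key, in the spirit of the D-U-N-S Number. The field names and records below are hypothetical.

```python
# Minimal sketch of consolidating records from multiple source systems
# on a shared match key. Records and fields are invented for illustration.
crm_a = [{"duns": "001", "name": "Acme Corp", "region": "US"}]
crm_b = [{"duns": "001", "revenue": 5_000_000},
         {"duns": "002", "revenue": 750_000}]

def consolidate(*sources):
    """Merge records sharing a 'duns' key into one golden record each."""
    merged = {}
    for source in sources:
        for record in source:
            merged.setdefault(record["duns"], {}).update(record)
    return merged

golden = consolidate(crm_a, crm_b)
print(golden["001"])
```

In practice, the match key lets later sources enrich, rather than duplicate, earlier records; conflicting field values would also need survivorship rules, which this sketch omits.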

How does Dun & Bradstreet data help within your sales force?

Data we get from Dun & Bradstreet provides us with excellent insight. While the reps may not care about the D-U-N-S Number, per se, they do care about the hierarchical information it may reveal—particularly with the growth of mergers and acquisitions over the past few years.

What other aspects of your work with Dun & Bradstreet have evolved?

We are a global company, but we are segmented. As we move everyone across the globe onto one CRM platform, we are creating more transparency. That is our goal. In order for people within the organization to effectively partner with each other, they must see the same information, including hierarchical structure. D&B has helped us bring that information together and identify those global hierarchies.

Tell me more about how linkage has helped you.

We used to be all over the place, with multiple CRMs across the organization globally. Some were even homegrown. We also wanted a better view of hierarchies. We lacked insight into what was going on with a company across its related entities and therefore couldn’t leverage that information. We had to bring it all together through linkage. We’ve made great progress in terms of the global hierarchy and global organizational structure, and we couldn’t have done it without D&B.

Parting Words

As CareerBuilder continues to grow globally and evolve as a company to meet customer needs and demands of the marketplace, aligning the sales process with actionable intelligence is critical to its forward-moving trajectory. “It’s all about fully optimizing the data that’s available to us so that we can focus our efforts on where we can get the most return for the company,” said Palumbo.

The Chief Analytics Officer Takes Center Stage



From coast to coast, the business world is lauding the emergence of the Chief Analytics Officer (CAO). That’s right, we said Chief ANALYTICS Officer. Perhaps you were thinking about that other C-level role that recently dominated the headlines – the Chief Data Officer? Nope, the CDO is so 2015. Despite the CDO being called the hottest job of the 21st century, a new contender has entered the fray: the CAO.

All joking aside, the role of CDO has certainly not lost any of its luster; it remains an important position within the enterprise – it’s just that the CAO has now become just as significant. As both roles coexist side by side, they face similar challenges, many of which were common themes during two recent industry events. Judging by the massive turnout and the passionate discussions coming out of the Chief Analytics Officer Forum in New York and the Chief Data & Analytics Officer Exchange in California, it is apparent that the CAO will play a pivotal role in utilizing the modern organization’s greatest asset – data.

IDC predicts that the market for big data technology and services will grow at a 27% clip annually through 2017 – about six times faster than the overall IT market. But that investment is not just going toward ways to obtain more data; it’s going toward the right strategies to help make sense of data – an increasingly common perspective. As a result, everyone is trying to figure out how to scale their analytical approach and drive more value across the organization.

Data alone is no longer the focal point for businesses – not without analytics to accompany it. We’re seeing the term ‘data analytics’ popping up more frequently. In fact, it’s the number one investment priority for Chief Analytics Officers over the next 12-24 months, according to a Chief Analytics Officer Forum survey. That means increased investment in data analytics and predictive analytics software tools. Not surprisingly, given the increased investment planned around these initiatives, the ability of data analytics to meet expectations across the company is the number one thing keeping CAOs up at night, according to the same study.

The lively discussions during the course of both events featured some of the industry’s smartest minds discussing common challenges and objectives. Here are some of the most prevalent topics that were discussed.

  • The Need for C-Level Support: Much like the CDO, the CAO will need to secure buy-in from the C-level to make a real impact. Many speakers at both events expressed similar frustrations with getting the C-level to provide the budget and resources needed to do their jobs. One approach shared during a session was to build a business case that clearly quantifies the business value analytics will drive against specific goals. If you can tie everything back to ROI, you will have the ear of the CEO.
  • Breaking Down Silos: Even if you have attained support from the C-level, it is critical to partner with cross-functional departments. Whether it’s sales, marketing, finance, etc., tying the business value that analytics can drive to their specific goals will help the working relationship. These teams need to feel they are being included in your work. This theme was evident in many sessions, with speakers giving examples of how they partnered with their colleagues to influence business strategy and turn insights into action. At the end of the day, analytics is only as good as the data you have, and you need to ensure you are leveraging all of it across the enterprise.
  • Becoming a Storyteller: It was widely acknowledged that 80-85 percent of business executives who claim to understand analytics actually don’t. Hence, the ability to simplify the message is critical to your success. There was a lot of discussion around what makes a better storyteller. Staying on point, avoiding technical jargon, relying on words rather than numbers, and clearly quantifying and measuring business value were agreed-upon ways to help the analytics group communicate clearly with the C-level.
  • Building the Right Team: Of all the discussions during both events, this was one of the most prominent themes and one of the biggest challenges shared by attendees. Where do you find the right talent? Views ranged from outsourcing and offshoring strategies to partnering with universities to develop a combined curriculum for undergrads and graduate students.

Everyone agreed the right candidate should have four distinct skills:

  • Quant skills, i.e. math/stats qualification
  • Business acumen
  • Coding skills/visualization
  • Communication and consulting skills

Since it is very difficult to find all four skills in a single person, the consensus was that an effective analytics team should be built from three tiers of analytics resources:

  • Statisticians, modelers and PhDs
  • Computer scientists
  • Measurement and reporting

From talent to strategy, the past two analytics-focused events underline the importance of employing a CAO in the enterprise. As data and analytics continue to be the core drivers of business growth, the CAO will not only need a prominent seat at the table, they will need the freedom and resources to help turn analytics into actionable insights for the entire enterprise.

Is Anticipatory Analytics the Path Toward Future Truth?


New Whitepaper Explores the Arrival of Anticipatory Analytics

Most everyone is familiar with the image of the eccentric fortune-teller gazing into her crystal ball to boldly predict the future. In the business world, teams of analytics experts are doing this every day; they’re just using data instead of a crystal ball to get a glimpse into the future.

Thanks to advanced analytics, organizations are able to understand potential outcomes and evaluate how issues can be addressed. Predictive models built on all the data being captured create a new level of transparency and foresight that helps shape future business strategy based on historical trends. This is called predictive analytics, and it is “the fastest growing segment of the business intelligence (BI) and analytics software market,” according to Information Management.

But for all of the promise around predictive analytics, there is some criticism. For instance, since environments and people are always changing, critics argue that relying on historical trends is too simplistic to say with great certainty that something will or will not happen. But a new analytic approach has emerged that may be better at grasping future outcomes.

As technology has evolved, so has our ability to process data at an incredible rate, making it possible to perform what has become known as anticipatory analytics. While still a relatively new concept, anticipatory analytics is gaining traction as a methodology. It leapfrogs traditional predictive analytics by enabling companies to forecast future behaviors more quickly, identifying change, acceleration and deceleration in market dynamics.
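The “change, acceleration and deceleration” idea can be illustrated with simple first and second differences over a metric’s time series. The series below is invented for illustration; real anticipatory models are far richer than this.

```python
# First differences capture change; second differences capture whether
# that change is speeding up or slowing down. Data is made up.
def diffs(series):
    """Return consecutive differences of a numeric series."""
    return [b - a for a, b in zip(series, series[1:])]

monthly_demand = [100, 104, 110, 120, 134]
change = diffs(monthly_demand)   # period-over-period change
acceleration = diffs(change)     # is the change itself accelerating?
print(change, acceleration)
```

Positive second differences, as here, signal accelerating demand – the kind of early inflection an anticipatory approach tries to catch before a trend fully materializes.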

In order to make this possible, the right mixture of data, processing tools, technology and expertise plays a central role. The following developments play key roles in being able to address the future, today.

Key Enablers of Anticipatory Analytics

4 trends are making anticipatory analytics a reality.

To gain a deeper understanding of the emergence of anticipatory analytics, and how it should be utilized in your organization, check out this detailed guide that outlines the differences between anticipatory and predictive.

INFOGRAPHIC: Exploring Inter-Connected Business Relationships

How Understanding Relationships Drives Better Data and Analytics

How much do you really know about the companies you do business with day in and day out? Sure, you may understand how your growth is affected by dealings with that supplier you’ve worked with for over a decade. But what about the association that supplier has with another company you may have never even heard of? Don’t believe how they do business can inadvertently affect you? Think again.

Among the entities you do business with lie potentially crucial insights that can be critical in assessing total risk and opportunity. While not obvious at first glance, these insights become visible when you dive down to explore the information that links these entities to you, as well as their connections to other businesses. By doing so, you’ll be able to understand the full potential of your relationships with customers, prospects, suppliers and partners. This is relationship data: information about two or more entities brought together along with their business activities to inform an implied business impact or outcome. By interpreting the right signal data and applying advanced analytics to it, unmet needs arise, hidden dangers surface and new opportunities can be identified.

Every company has relationship data; they just need to know where to look for it, or who to partner with to obtain the right information. The infographic below describes the different types of relationships that exist; some are easy to see, while others are harder to decipher but just as important to your bottom line. Understanding the way in which two or more entities are connected is the foundation of this data.

The more you connect and expose entities across your databases, the greater your visibility into the cross-company interactions with these enterprises. The ability to uncover previously hidden associations inside the data provides a catalyst for business transformation and insights. Exposing relationships across product lines, branches and countries creates opportunities to evaluate sales coverage, modify compensation plans, renegotiate terms and conditions, adjust compliance policies, improve customer experiences, build advanced segmentation categories and uncover hidden supply chain risk.

Dive down to discover the many sources of relationship data.

Exploring the Relationship Data Iceberg


It is important to remember that relationships can be one-to-one, one-to-many or many-to-many. They can be uni-directional or bi-directional in nature. Understanding the differences can be key to the types of questions you ask and what insights you draw from the data.
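One rough way to picture these distinctions is to model relationship data as a directed graph, where each edge carries a direction and entities can have one or many connections. The company names and relationship labels below are invented for illustration.

```python
from collections import defaultdict

# Sketch: business relationships as a directed graph.
# A bi-directional relationship is stored as two edges;
# one-to-many appears as one source with several targets.
relationships = [
    ("AcmeCo", "supplies", "RetailCorp"),        # uni-directional
    ("RetailCorp", "partners_with", "ShipFast"),  # bi-directional pair...
    ("ShipFast", "partners_with", "RetailCorp"),  # ...stored as two edges
    ("AcmeCo", "supplies", "ShipFast"),           # one-to-many from AcmeCo
]

graph = defaultdict(list)
for source, relation, target in relationships:
    graph[source].append((relation, target))

def reachable(graph, start):
    """Entities linked to `start` through any chain of relationships --
    the indirect associations beyond first-degree connections."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for _, neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                stack.append(neighbor)
    return seen

print(sorted(reachable(graph, "AcmeCo")))
# -> ['RetailCorp', 'ShipFast']
```

Traversing the graph beyond first-degree edges is what surfaces the “company you may have never even heard of”: an entity with no direct tie to you that still sits in your supplier’s network.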

The deeper you go in connecting the associated entities and the information that aligns to their business practices, the richer the insights you’ll uncover.  Ultimately, these richer data points enable you to move beyond simple modeling based on internal historical data and produce sophisticated business models grounded in multifaceted business connections.

As more businesses point to smart data as a conduit to growth, it’s important to ask the right questions of your data in order to extract meaningful insights to propel your business. That means going beneath the surface of what you normally see and exploring your business relationships to fully understand the cause and effect in your very own ecosystem.

Five Surprising Big Ideas about the Future

Market research firm Gartner is in the business of trying to make forecasts about how markets and technologies will change. But each year, the firm tries to go a step further with its Maverick program.

The Maverick program lets Gartner analysts step outside their comfort and coverage zones and identify big, sweeping ideas and trends that might be over the horizon. It was the Maverick process, for instance, that led Gartner to make early predictions of the importance of things like cloud computing, social media, and the consumerization of IT in the workplace.

Not a bad track record – and so it’s worth reading about what they think is coming next. In their latest summary of their Mavericks sessions, here are five things Gartner says we should all be watching.

1. Big Data

This one may be particularly surprising. Gartner says we are not being skeptical and critical enough when it comes to the “insights” we think we are getting from Big Data. The analysts worry that data scientists who lack business backgrounds may be interpreting data in ways that don’t actually reflect what’s happening in the real world, or the way businesses work. “A naïve faith in the infallibility of big data analytics and decision-making algorithms does more harm than good,” Gartner writes. “People must rediscover a questioning and skeptical approach that asserts their authority over these useful tools and stops the tail from wagging the dog.”

2. Machine Learning

The good news is that machines that teach themselves are going to allow businesses to automate a wide range of labor-intensive tasks. The fear, however, is that this wipes out large classes of jobs. Will the productivity gains be offset by sluggish or even declining job growth? Gartner is optimistic that over time, “the sum of employment gains and losses will come out positive”.

3. IT Careers

This one may seem totally counterintuitive: “IT departments will be forced to fire or reassign most of their existing workers over the next several years.” Huh? Anyone who uses IT at work likely wishes their IT department had more people and more resources to keep things running smoothly. But Gartner suspects that the advent of more agile, cloud-based development will allow IT departments to move much more quickly with far fewer people. “The transformational nature of this change requires IT departments to eliminate most of their existing positions and define new ones to be filled,” Gartner says.

4. Quantified Self

We’re all hearing plenty about wearables that let us track our exercise, heart rate and calorie intake. But the next step, according to Gartner, may be that these new types of data get formally incorporated into the health insurance industry, creating scenarios where the results of this information determine the type and cost of coverage we can get. “People will be held accountable for living healthy lifestyles — enabled by analyzing data about what they do and buy,” Gartner says.

5. Digital Economics of Personal Data

As we agree to put ever more valuable and sensitive personal data online, a kind of data bank will emerge that allows us to store and protect that information. At the same time, these banks will create ways for us to “lend” out this data to create value, or even get paid in some form. “The system will lend this data under carefully controlled and limited conditions to others at the owner’s discretion to generate an income stream for the owner,” Gartner says.

Interesting ideas, all.

Image credit: Jay Bergesen

1010data Feeds D&B Data to Capital Markets

Dun & Bradstreet and 1010data announced a strategic partnership earlier this week. And that’s exciting for two reasons.

For Dun & Bradstreet, this is its first partnership in the area of capital markets analytics – a giant opportunity where small improvements can yield big results. For 1010data, it can now offer its users – hedge fund and asset managers – additional business performance data to help them analyze investment opportunities more thoroughly and take action.

Starting this week, 1010data will offer Dun & Bradstreet commercial data within its Big Data Discovery and data sharing platform. 1010data’s platform helps fast-moving users find and act on data from many sources, which is especially helpful when they’re considering complex transactions with significant upside – and potential risk. Dun & Bradstreet business performance and health data is now part of that mix, helping investors create more complete strategies and models for investing in their sectors.

More than 700 of the world’s largest retail, manufacturing, telecom, and financial services enterprises trust 1010data to manage and analyze over 20 trillion rows of data. Hats off to 1010data and its excellent technology. We look forward to helping them help their customers in any way we can.

Image credit: Sam Valadi