Think Data First, Platform Second – Why Data Fuels MDM



As the volume of data coming into organizations – from both internal and external sources – continues to grow and make its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with analysts expecting the market to nearly triple from $9.4B to $26.8B by 2020.

But while everyone is investing serious cash in the tools to manage the data, few are putting any thought into the data itself. This is akin to purchasing a luxury sports car and fueling it with water. Sure, it looks great, but it won’t get you very far.


The underlying concept of MDM is surprisingly simple: get everyone “on the same page” looking at the same data and ensure it is accurate. Yet master data and its management continue to be a universal challenge across many industries. Organizations of all shapes and sizes share similar problems related to master data and can all reap benefits from solving them. That means concentrating on the quality of the data before going shopping for the sexiest MDM platform. In essence, you must master data before you can manage it. Ensuring the quality, structure, and integrability of your data is your responsibility; your MDM platform won’t do that for you. It’s like purchasing a top-of-the-line oven and expecting it to produce a delectable meal. You are responsible for what goes into it.

Master Data Defined

Master Data is the foundational information on customers, vendors and prospects that must be shared across all internal systems, applications, and processes in order for your commercial data, transactional reporting, and business activity to be optimized and accurate. Because individual businesses and departments need to plan, execute, monitor and analyze these common entities, multiple versions of the same data can reside in separate departmental systems. This results in disparate data, which is difficult to integrate across functions and quite costly to manage in terms of resources and IT development. Cross-channel initiatives, buying and planning, merger and acquisition activity, and content management all create new data silos. Major strategic endeavors, part of any business intelligence strategy, can be hampered or derailed if fundamental master data is not in place. In reality, master data is the only way to connect multiple systems and processes both internally and externally.

Master data is the most important data you have. It’s about the products you make and services you provide, the customers you sell to and the vendors you buy from. It is the basis of your business and commercial relationships. A primary focus area should be your ability to define your foundational master data elements (entities, hierarchies and types) and then the data that is needed – both to be mastered and to be accessible – to meet your business objectives. If you focus on this before worrying about the solution, you’ll be on the right course for driving success with MDM. Always remember: think data first and platform second.

Will Businesses Like Facebook’s New Reaction Buttons?



New Data Will Reveal How Customers Really Feel

If you have recently updated your Facebook status, you may have seen a wide array of responses that goes beyond the fabled “like” button the social media platform has become known for. Last month Facebook unveiled “Reactions,” which offers its users five additional ways to express their opinions on everything from pictures of your cats to your brash political musings. While it will certainly give users more choices in how they interact with friends, it will also give businesses deeper insights into customer sentiment.

Reactions are essentially an extension of the like button: six different buttons, or “animated emojis” (ask a millennial) – “Like,” “Love,” “Haha,” “Wow,” “Sad” and “Angry.” Despite the public outcry for a “dislike” button, Facebook did not include one because the company felt it could be construed as negative. But that won’t stop people from using the “angry” button when they do not like something.

While this may upset a friend, it can actually help companies react and respond to complaints. What’s more, it may even help us predict trends and threats we may have not previously seen.


“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions.”

-Nipa Basu, Chief Analytics Officer, Dun & Bradstreet


Dun & Bradstreet’s Chief Analytics Officer, Nipa Basu, believes the new “Reactions” will be an opportunity for businesses to better understand their customers, but notes it will take a deep level of understanding to make perfect sense of it.

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions,” explains Basu. “These ‘Reactions’ are typically based on real-time human emotion. Sometimes it will be fair. Sometimes it will be unfair. But if you have a large sample of a lot of people’s emotions reflected then you can begin to ask if that says something about the customer experience or the product or service from that business, and go from there.

“Then comes the second part. Does it matter? Looking deeper at what the comments suggest and how they correlate with the different types of ‘Reactions’ being received, a good analyst will be able to draw more accurate insights. Looking at both types of responses together will help understand what customers really felt.”

This is what Basu believes will make social sentiment analysis that much more effective. Not only will it open the door for brands to assess the success or relevance of their content, as well as measure customer satisfaction, it may paint a deeper picture about total risk and opportunity across industries that could benefit others.

“When you utilize a company like ours, where we already have our pulse on the health of global businesses and industries, and combine it with these social ‘Reactions,’ we can start to understand the correlation it has between business growth or degeneration,” said Basu. “Can looking at the amount of ‘angry’ comments predict the future health of a particular business or industry? Quite possibly. This is paving the way for even richer data that can drive even smarter analytics.”
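Basu’s point about doing careful analytics before drawing conclusions can be sketched in code. The weights below are purely hypothetical – neither Facebook nor Dun & Bradstreet publishes a mapping from reactions to sentiment – but they show how a raw mix of “Reactions” might be reduced to one comparable score before any correlation analysis begins:

```python
# Hypothetical weights per reaction type. These values are an assumption
# for illustration only, not a published scheme.
REACTION_WEIGHTS = {
    "like": 1.0, "love": 2.0, "haha": 0.5,
    "wow": 0.5, "sad": -1.0, "angry": -2.0,
}

def sentiment_score(reaction_counts):
    """Collapse a dict of reaction counts into one normalized score.

    Returns a value in roughly [-2.0, 2.0]; 0.0 for a post with no reactions.
    """
    total = sum(reaction_counts.values())
    if total == 0:
        return 0.0
    weighted = sum(REACTION_WEIGHTS.get(reaction, 0.0) * count
                   for reaction, count in reaction_counts.items())
    return weighted / total

# A post with mostly positive reactions but a notable "angry" share:
post = {"like": 120, "love": 30, "angry": 50}
print(sentiment_score(post))  # 0.4
```

As Basu notes, a single score like this is only the starting point; comparing it against the accompanying comments is what separates a fair signal from an unfair one.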

Skills That Matter in a World Awash in Data


There is a paradox put forth by the French philosopher Jean Buridan which is commonly referred to as Buridan’s Ass. One interpretation goes something like this: Take a donkey and stake it equidistant between two identical piles of hay. Since the donkey is incapable of rational choice and the piles of hay are indistinguishable, the donkey will die of hunger. Of course, in the real world, we all presume the donkey would somehow “pick” a pile. We accept these situations all around us: fish seem to “choose” a direction to swim, birds of the same species seem to “decide” whether or not to migrate, and data seems to “suggest” things that we wish to prove. Which of these is not like the others? The answer is the data. Data has no ability to “act” on its own. We can use it or not, and it simply doesn’t care. The choice is entirely ours. The challenge is how we decide rationally what data to use and how to use it, when we have enough data, and when we have the “right” data. Making the wrong choice has serious consequences. Making the right choice can lead to enormous advantage.

Let’s look at the facts. We know that we are living in a world awash in data. Every day, we produce more data than the previous day, and at a rate which is arguably impossible to measure or model because we have lost the ability to see the boundaries. Data is not only created in places we can easily “see” such as the Internet, or on corporate servers. It is created in devices, it is created in the cloud, it is created in streams that may or may not be captured and stored, and it is created in places intentionally engineered to be difficult or impossible to perceive without special tools or privileges. Things are now talking to other things and producing data that only those things can see or use. There is no defendable premise that we can simply scale our approach to data from ten years ago to address the dynamic nature of data today.

This deluge of data is resulting in three inconvenient truths:

  1. Organizations are struggling to make use of the data already in hand, even as the amount of “discoverable” data increases at unprecedented rates.
  2. The data which can be brought to bear on a business problem is effectively unbounded, yet the requirements of governance and regulatory compliance make it increasingly difficult to experiment with new types of data.
  3. The skills we need to understand new data never before seen are extremely nuanced, and very different than those which have led to success so far.

Data already in hand – think airplanes and peanut butter.

Recently, I was on a flight which was delayed due to a mechanical issue. In such situations, the airline faces a complex problem, trying to estimate the delay and balance regulations, passenger connections, equipment availability, and many other factors. There is also a human element as people try to fix the problem. All I really wanted to know was how long I had in terms of delay. Did I have time to leave the gate and do something else? Did I have time to find a quiet place to work? In this situation, the answer was yes. The flight was delayed 2 hours. I wandered very far from the gate (bad idea). All of a sudden, I got a text message that as of 3:50PM, my flight was delayed to 3:48PM. I didn’t have time to wonder about time travel… I sprinted back to the gate, only to find a whole lot of nothing going on. It seemed that the airline systems that talk to each other to send out messaging were not communicating correctly with the ones that ingested data from the rest of the system. Stand down from red alert… No plane yet. False alarm.

While the situation is funny in retrospect, it wasn’t at the time. How many times do we do something like this to customers or colleagues? How many times do the complex systems we have built speak to one another in ways that were not intended and reach the wrong conclusions or send the wrong signals? I am increasingly finding senior executives who struggle to make sense out of the data already on-hand within their organization. In some cases, they are simply not interested in more data because they are overwhelmed with the data on hand.

This position is a very dangerous one to take. We can’t just “pick a pile of hay.” There is no logical reason to presume that the data in hand is sufficient to make any particular decision without some sort of analysis comparing three universes: data in hand, data that could be brought to bear on the problem, and data that we know exists but which is not accessible (e.g. covert, confidential, not disclosed). Only by assessing the relative size and importance of these three distinct sets of data in some meaningful way can we rationally make a determination that we are using sufficient data to make a data-based decision.

There is a phenomenon in computer science known as the “dispositive threshold.” This is the point at which sufficient information exists to make a decision. It does not, however, determine that there is sufficient information to make a repeatable decision, or an effective decision. Imagine that I asked you if you liked peanut butter and you had never tasted it. You don’t have enough information. After confirming that you know you don’t have a peanut allergy, I give you a spoon of peanut butter. You either “like” it or you don’t. You may feel you have enough information (dispositive threshold) until you learn that there is creamy and chunky peanut butter and you have only tasted one type, so you ask for a spoon of the other type. Now you learn that some peanut butter is salted and some isn’t. At some point, you step back and realize that all of these variations are not changing the essence of what peanut butter is. You can make a reasonable decision about how you feel about peanut butter without tasting all potential variations of peanut butter. You can answer the question “do you like peanut butter” but not the question “do you like all types of peanut butter.” The moral here, without getting into lots of math or philosophy, is this:

It is possible to make decisions with data if we are mindful about what data we have available. However, we must at least have some idea of the data we are not using in the decision-making process and a clear understanding of the constraints on the types of decisions we can make and defend.

Governance and regulatory compliance – bad guys and salad bars.

Governance essentially boils down to the three time-worn pieces of advice: “say what you’re going to do, do it, say you did it.” Of course, in the case of data-based decision making, there are many nuances in terms of deciding what you are going to do. Even before we consider rules and regulations, we can look at best practice and reasonableness. We must decide what information we will allow in the enterprise, how we will ingest it, evaluate it, store it, and use it. These become the rules of the road and governance is the process of making sure we follow those rules.

So far, this advice seems pretty straightforward, but consider what happens when the governance system gets washed over by a huge amount of data that has never been seen before. Some advocates of “big data” would suggest ingesting the data and using techniques such as unsupervised learning to tell us what the data means. This is a dangerous strategy akin to trying to eat everything on the salad bar. There is a very real risk that some data should never enter the enterprise. I would suggest that we take a few steps first to make sure we are “doing what we said we will do.” For example, have we looked at the way in which the data was created, what it is intended to contain, and a small sample of the data in a controlled environment to make sure it lives up to the promised content? Small steps before ingesting big data can avoid big, possibly unrecoverable mistakes.

Of course, even if we follow the rules very carefully, the system changes. In the case of governance, we must also consider the changing regulatory environment. For example, the first laws concerning expectations of privacy in electronic communication were in place before the Internet changed the way we communicate with one another. Many times, laws lag quite significantly behind technology, or lawmakers are influenced by changes in policy, so we must be careful to continuously re-evaluate what we are doing from a governance perspective to comply not only with internal policy, but also with evolving regulation. Sometimes, this process can get very tricky.

Consider the situation of looking for bad behavior. Bad guys are tricky. They continue to change their behavior, even as systems and processes evolve to detect bad behavior. In science, these types of problems are called “quantum observation” effects, where the thing being observed changes by virtue of being observed. Even the definition of “bad” changes over time or from the perspective of different geographies and use cases. When we create processes for governance, we look at the data we may permissibly ingest. When we create processes for detecting (or predicting) bad behavior, the dichotomy is that we must use data in permissible ways to detect malfeasant acts that are unconstrained by those same rules. So in effect, we have to use good data in good ways to detect bad actors operating in bad ways. The key take-away here is a tricky one:

We must be overt and observant about how we discover, curate and synthesize data to discover actions and insights that often shape or redefine the rules.

The skills we need – on change and wet babies.

There is an old saying that only wet babies like change all the time. The reality is that all of the massive amounts of data facing an enterprise are forcing leaders to look very carefully at the skills they are hiring into the organization. It is not enough to find people who will help “drive change” in the organization – we have to ensure we are driving the right change because the cost of being wrong is quite significant when the pace of change is so fast. I was once in a meeting where a leader was concerned about having to provide a type of training to a large group because their skill level would increase. “They are unskilled workers. What happens if we train them, and they leave?” he shouted. The smartest consultant I ever worked with crystallized the situation with the perfect reply, “What happens if you don’t train them and they stay!” Competitors and malefactors will certainly gain ground if we spend time chasing the wrong paths of inquiry, yet we can just as easily become paralyzed with analysis and do nothing, which is itself a decision that has cost (the cost of doing nothing is often the most damaging).

The key to driving change in the data arena is to balance the needs of the organization in the near term with the enabling capabilities that will be required in the future. Some skills, like the ability to deal with unstructured data, non-regressive methods (such as recursion and heuristic evaluation), and adjudication of veracity will require time to refine. We must be careful to spend some time building out the right longer-term capabilities so that they are ready when we need them.

At the same time, we must not ignore skills that may be needed to augment our capability in the short term. Examples might include better visualization, problem formulation, empirical (repeatable) methodology, and computational linguistics. Ultimately, I recommend three categories to consider from the perspective of skills in the data-based organization:

Consider what you believe, how you need to behave, and how you will measure and sustain progress.

Ultimately, the skills that matter are those that will drive value to the organization and to the customers served. As leaders in a world awash in data, we must be better than Buridan’s Ass. We must look beyond the hay. We live in an age where we will learn to do amazing things with data or become outpaced by those who gain better skills and capability. The opportunity goes to those who take a conscious decision to look at data in a new way, unconstrained and full of opportunity if we learn how to use it.

Data and Instinct – Inspiration’s Yin and Yang


With all the momentum around the easy access to customer data across the digital landscape and the variety of tools that let marketers make quick sense of that data, some might think human intuition is becoming obsolete.

We’re here to say human instinct is alive and well in the world of IoT and Big Data. Marketing Land’s article Human Intuition Vs. Marketing Data: Forging A New Alliance states,


“The human brain remains one of the most flexible and potent sources of computing on the planet…And the most effective marketing analytics solutions are those that recognize that human intuition – and perhaps most importantly, human curiosity – plays a critical role in moving from data to insight to decision to action.”

Netflix Decides with Data and Human Instinct

Netflix gets a lot of attention for its use of data to drive its powerful recommendation engine. It’s often cited as a key to the company’s continued global success, helping people discover lesser-known content that will delight them.

But along with streaming a large catalogue of movies and TV shows, Netflix has been gradually shifting its business model to emphasize more of its own original TV shows and movies. These Netflix originals (think House of Cards and Orange Is The New Black) have become cultural phenomena in their own right.

Netflix has said it uses data analytics in researching the development of new programming. But at the recent DLD Conference in Munich, Germany, CEO Reed Hastings noted that the company still leans just as hard on “gut instinct” in the realm of selecting its original content.

“We start with the data,” Hastings said. “But the final call is always gut. It’s informed intuition.”

It’s a good reminder for just about anyone working in the realm of Big Data. Yes, data can drive exciting new insights and learning. It’s a revolution. But that doesn’t mean it’s time to forget the human side of the equation.

And it’s notable that this argument is made by Hastings. Not only is he one of Silicon Valley’s most successful CEOs, he’s also a former artificial intelligence engineer. So he knows a thing or two about Big Data and how it works under the hood.

In the case of Netflix, that means the company bets big on data. But it also means, according to Hastings, that the people in position to make decisions have to be able to interpret that data, know its limits and make those hard calls on “gut instinct.”

Data won’t provide all the answers, and at some point the human in charge has to decide.

Of course, data picks right up again after those decisions have been made. For Netflix, that has provided interesting and unexpected insights into programming and viewing behavior.

For example, the company is funding development of an original French-language series, called Marseille, because data shows that French TV is popular around the world. That might come as a surprise to people in France, where the reputation of the country’s TV shows remains far below that of its films.

Netflix also discovered through its data that many shows develop audiences in ways that might seem unpredictable. One new show, Narcos, is set in South America, was created by a French studio, and became a massive hit in Germany, of all places.

Discovering what drives those connections, Hastings said, will be a key tool in Netflix’s long-range goal of revolutionizing the way we make, watch and discover content.

Marketing Land’s article hits the nail on the head: “Creative interpretation of the data is always going to be essential.”

The inspiration companies find to solve their biggest challenges hinges on people knowing what questions to ask and what priorities to set. Data is essential in helping answer those questions and achieving those goals. When human instinct and the right data work in concert, the yin and yang of inspiration comes to life in the best ways possible.

Data Dialogue: CareerBuilder Consolidates CRMs, Identifies Hierarchies and Provides Actionable Insight for Sales Enablement



A Q&A with CareerBuilder’s Director of Sales Productivity  

CareerBuilder, the global leader in human capital solutions, has evolved over recent years to better meet marketplace demands. “We’ve moved from transactional advertising sales to a software solution, which is much more complex with a longer sales process and a longer sales cycle,” says Maggie Palumbo, Director of Sales Productivity for CareerBuilder.

With that, the sales teams Palumbo works with must be armed not only with information on which accounts to target, but also with intelligence they can use to approach – and engage – potential customers.

What’s more, as the company continues to expand globally, its need for consistent data and CRM consolidation has become a priority. Establishing one system whereby employees all see the same information better positions the company for continued global expansion through more informed decision-making.

We talked with Palumbo about how she has been leading this effort.

What are you focused on now as you look to your priorities moving forward?

In the past, we did not have much in the way of governance. We had loose rules and accounts ended up not being fully optimized. We’re focused now on better segmentation to figure out where we can get the best return.

We use tools from Dun & Bradstreet to help us accomplish this goal. Specifically, we rely on geographical information, SIC codes for industry, employee totals and other predictive elements. Then we bring it all together to get a score that tells us which accounts to target and where each belongs within our business.
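A toy version of that kind of scoring might look like the following. The field names, thresholds and weights are all invented for illustration – they are not CareerBuilder’s actual model or Dun & Bradstreet’s field definitions – but they show how several firmographic signals can be combined into a single targeting score:

```python
def score_account(account):
    """Combine simple firmographic signals into a 0-100 targeting score.

    All thresholds and weights below are hypothetical.
    """
    score = 0
    employees = account.get("employee_total", 0)
    if employees >= 500:
        score += 40            # larger accounts get more weight
    elif employees >= 50:
        score += 20
    # SIC major group 73 covers business services (illustrative choice)
    if account.get("sic_code", "").startswith("73"):
        score += 30
    # Hypothetical priority geographies
    if account.get("region") in {"US", "CA", "UK"}:
        score += 30
    return score

acct = {"employee_total": 800, "sic_code": "7372", "region": "US"}
print(score_account(acct))  # 100
```

In practice a model like this would be fit from historical win data rather than hand-tuned, but the idea is the same: the score highlights “the gems within the pile” so reps know where to focus.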

How do you make the information actionable?

We are unique in that we take the full D&B marketable file, companies with more than 11 employees, and pass it along to a sales representative. Some reps, those with accounts with fewer than 500 employees, have thousands of accounts. What they need to know is where to focus within that account base. Scoring and intelligence allow us to highlight the gems within the pile that may have otherwise been neglected. The reps at the higher end of the business have fewer accounts and require more intelligence on each account to enable more informed sales calls.

Because we’ve moved from a transactional advertising sales model to one based on software solutions, our sales process is now more complex. The reps need intelligent information that they can rely on as they adjust their sales efforts to this new approach.

How do you make sure you’re actually uncovering new and unique opportunities?

We’ve gone more in-depth with our scoring and now pull in more elements. We also work with third party partners who do much of the manual digging. With that, we’re confident that we’re uncovering opportunities others may not necessarily see.

How do you marry together the data with the technology?

When the economy went south, our business could have been in big trouble. We are CareerBuilder. When your focus is on hiring and you’re in an economy where far fewer companies are hiring, where do you go with that?

Our CEO is forward-thinking and had already started to expand our business to include human capital data solutions. With that, it became clear that we needed to have standardized data, which we do via our data warehouse. Once the data is normalized and set up properly, it can be pushed into the systems. Pulling the information from different sources together into one record is the challenge. We use Integration Manager for that; the D-U-N-S Number serves as a framework for how we segment our data and we rely heavily on that insight.
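The consolidation step Palumbo describes – pulling partial records from different source systems into one record keyed on the D-U-N-S Number – can be sketched roughly like this. The field names and the “first source wins” conflict rule are assumptions for illustration, not how any particular integration tool behaves:

```python
def consolidate(sources):
    """Merge per-system records into one golden record per D-U-N-S Number.

    `sources` is a list of record lists, ordered by trust: when two systems
    disagree on a field, the earlier source's value is kept.
    """
    golden = {}
    for system in sources:
        for record in system:
            duns = record["duns"]
            merged = golden.setdefault(duns, {})
            for field, value in record.items():
                merged.setdefault(field, value)  # keep first value seen
    return golden

# Two systems hold partial views of the same company:
crm = [{"duns": "150483782", "name": "Acme Corp", "owner": "rep_12"}]
billing = [{"duns": "150483782", "name": "ACME Corporation", "arr": 54000}]

print(consolidate([crm, billing])["150483782"])
```

The key design choice is the stable join key: because both systems carry the same D-U-N-S Number, the merge is a lookup rather than a fuzzy name match.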

How does Dun & Bradstreet data help within your sales force?

Data we get from Dun & Bradstreet provides us with excellent insight. While the reps may not care about the D-U-N-S Number, per se, they do care about the hierarchical information it may reveal—particularly with the growth of mergers and acquisitions over the past few years.

What other aspects of your work with Dun and Bradstreet have evolved?

We are a global company, but we are segmented. As we move everyone across the globe onto one CRM platform, we are creating more transparency. That is our goal. In order for people within the organization to effectively partner with each other, they must see the same information, including hierarchical structure. D&B has helped us bring that information together and identify those global hierarchies.

Tell me more about how linkage has helped you.

We used to be all over the place, with multiple CRMs across the organization globally. Some were even homegrown. We also wanted to get a better view on hierarchies. We lacked insight into what was going on with a company across multiple systems and therefore couldn’t leverage that information. We had to bring it all together through linkage. We’ve made great progress in terms of the global hierarchy and global organizational structure, and we couldn’t have done it without D&B.

Parting Words

As CareerBuilder continues to grow globally and evolve as a company to meet customer needs and demands of the marketplace, aligning the sales process with actionable intelligence is critical to its forward-moving trajectory. “It’s all about fully optimizing the data that’s available to us so that we can focus our efforts on where we can get the most return for the company,” said Palumbo.

Breaking Down the Business Relationships in Breaking Bad



How Saul Goodman Can Teach Businesses About the Value of Understanding Relationships

This week, TV viewers witnessed the return of Jimmy McGill, a scrappy and indefatigable attorney struggling for respect and reward. Spoiler alert: If you watched Breaking Bad, you know the upstart Albuquerque lawyer goes on to become Saul Goodman, the lawyer and adviser for eventual meth kingpin Walter White, or as he’s known on the street, Heisenberg.

Now in its second season, AMC’s hit show Better Call Saul shows the transformation of the naïve McGill into what would become one of the city’s most notorious criminal defense attorneys. But it doesn’t happen by chance. His ability to understand and manipulate relationships plays a huge role, something many businesses can learn a thing or two from. But before I proceed, if you have not watched Breaking Bad, I implore you to do so immediately. Go on, watch it now, and then come back and read this article; otherwise you’re going to be a bit lost.

In Breaking Bad we learn that Saul Goodman is a key player in Walter White’s evolution from everyday chemistry teacher to criminal mastermind, repeatedly getting him out of sticky situations over the course of his drug business operations. Goodman helps Walt stay one step ahead of the police and competing drug czars because of his extensive connections within the criminal underworld, serving as a go-between connecting drug distributors, evidence removers, impersonators, and other criminals-for-hire.

What makes Goodman so successful is his network of relationships. He knows all the players and how they are connected to others and uses that knowledge to his advantage. Ultimately, it’s what probably keeps him and his clients alive for so long. Other entities in the Breaking Bad world are not so lucky. Shotgun blasts and burning faces aside, I’m talking about the businesses that were ultimately crippled by the chain of events that were set off by Walter White’s maniacal obsession for power.

The Breaking Bad series finale shows us the fate of all the major characters, but what about everyone else that has some underlying connection to what went down?

We learned that Walt’s meth empire was funded by a multifaceted conglomerate headquartered in Germany called Madrigal Electromotive. According to Wikia, Madrigal is highly diversified in industrial equipment, manufacturing, global shipping, construction and fast food; the most notorious being the American fried chicken chain, Los Pollos Hermanos.

Founded by Gustavo Fring, the Los Pollos Hermanos restaurant chain had fourteen locations throughout the southwest and was a subsidiary of Madrigal. As we learned during the course of the show, the restaurant provided money-laundering and logistics for illegal activities. It’s safe to assume that following the death of its founder and his reported connection to engineering a billion-dollar drug empire, business suffered. Every enterprise that was directly doing business with the fried chicken chain likely cut ties with them as soon as the news broke. From the factory providing the meat to the manufacturer supplying the utensils, these businesses were aware that Los Pollos Hermanos would suffer and were able to plan in advance for a revenue downfall.

But what about the other suppliers that did not realize they were working with entities that had connections to Los Pollos Hermanos’ parent company? Madrigal is spread across 14 divisions, including a massive investment in fast food. The fast-food division, formerly run by Herr Peter Schuler, encompasses a stable of 7 fast-food eateries, including Whiskerstay’s, Haau Chuen Wok, Burger Matic, and Polmieri Pizza. Following the breaking news of the drug ring, the resulting investigation likely sent shockwaves throughout the entire Madrigal enterprise and subsequently hurt all of its businesses in some shape or form. But let’s look at the supplier of dough for Polmieri Pizza for example. Do you think they knew the pizza shop they do business with was a subsidiary of Madrigal and would be a casualty of the meth trade? Very unlikely.

Because Polmieri Pizza is a subsidiary of Madrigal, it will be at least indirectly affected. While its parent company is in damage control – a change of management, a freeze on funds, etc. – the innocuous pizza shop will be impacted, even if only in the short term. During this time, the dough supplier has no clue about the perilous relationship the pizza shop has to Madrigal, or that it should expect some change in how it works with the eatery. Had it known there was any connection, it might have planned ahead, cut down on production and accounted for fewer orders. Instead, it is caught by surprise and left overstocked and under water.

This could have been avoided if the dough manufacturer leveraged its relationship data. Every company has relationship data; they just need to know where to look for it, or who to partner with to obtain the right information.

Relationship data is information about two or more entities, brought together with their business activities, that informs an implied business impact or outcome. By interpreting the right signal data and applying advanced analytics to it, unmet needs, hidden dangers and new opportunities can all be uncovered.
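To make this concrete, the indirect exposure described above can be modeled as a simple graph-traversal problem. The sketch below is purely illustrative – the entity names are hypothetical, drawn from the fictional example – but it shows the core idea: walk a corporate-relationship graph outward from a troubled parent company and surface every entity with direct or indirect exposure to it.

```python
from collections import deque

# Hypothetical corporate-relationship graph: each key is an entity,
# each value is the set of entities it is directly linked to
# (ownership, subsidiary, or supply relationships).
links = {
    "Madrigal": {"Los Pollos Hermanos", "Polmieri Pizza", "Whiskerstay's"},
    "Polmieri Pizza": {"Dough Supplier"},
    "Los Pollos Hermanos": {"Meat Factory", "Utensil Manufacturer"},
}

def exposed_entities(root, graph):
    """Walk the relationship graph breadth-first and return every
    entity reachable from `root` - i.e. everyone with direct or
    indirect exposure to it."""
    seen = {root}
    queue = deque([root])
    while queue:
        current = queue.popleft()
        for neighbor in graph.get(current, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    seen.discard(root)  # exposure excludes the root itself
    return seen

# The dough supplier surfaces even though it never dealt with Madrigal directly.
print(sorted(exposed_entities("Madrigal", links)))
```

In practice the hard part is not the traversal but assembling the graph itself – which is exactly where curated relationship data comes in.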

Of course, this is just an example of the importance of being able to recognize business relationships, based on a fictional show. But failing to do so could prove a grave reality for businesses of all shapes and sizes. If the companies with business connections to Madrigal’s vast enterprise had had a sense of relationship data, what would they have seen?

If you can take anything away from the Saul Goodmans of the world, it is this: know how all your relationships are connected, and you will know how to solve problems, manage revenue – and stay out of trouble.

The Chief Analytics Officer Takes Center Stage



From coast to coast, the business world is lauding the emergence of the Chief Analytics Officer (CAO). That’s right, we said Chief ANALYTICS Officer. Perhaps you were thinking about that other C-level role that recently dominated the headlines – the Chief Data Officer? Nope, the CDO is so 2015. Despite the CDO being called the hottest job of the 21st century, it seems a new contender has entered the fray: the CAO.

All joking aside, the role of CDO has certainly not lost any of its luster; it remains an important position within the enterprise – the CAO has simply become just as significant. The two roles will need to coexist side by side, and they face similar challenges, many of which were common themes during two recent industry events. Judging by the massive turnout and the passionate discussions at the Chief Analytics Officer Forum in New York and the Chief Data & Analytics Officer Exchange in California, it is apparent that the CAO will play a pivotal role in utilizing the modern organization’s greatest asset – data.

IDC predicts that the market for big data technology and services will grow at a 27% clip annually through 2017 – about six times faster than the overall IT market. But that windfall is not just going toward obtaining more data; increasingly, it is being invested in the right strategies to make sense of data. Everyone, therefore, is trying to figure out how to scale their analytical approach and drive more value across the organization.

Data alone is no longer the focal point for businesses, not without analytics to accompany it. We’re seeing the term ‘data analytics’ popping up more frequently. In fact, it’s the number one investment priority for Chief Analytics Officers over the next 12-24 months, according to a Chief Analytics Officer Forum survey. That means increased investment in data analytics and predictive analytics software tools. Not surprisingly, given the investment planned around these initiatives, the ability of data analytics to meet expectations across the company is the number one thing keeping CAOs up at night, according to the same study.

The lively discussions during the course of both events featured some of the industry’s smartest minds discussing common challenges and objectives. Here are some of the most prevalent topics that were discussed.

  • The Need for C-Level Support: Very similar to the challenge facing the CDO, the CAO will need to secure buy-in from the C-level to make a real impact. Many speakers at both events expressed similar frustrations with getting the C-level to provide the budget and resources needed to do their jobs. One good approach, shared during a session, is to build a business case that clearly quantifies the business value analytics will drive against specific goals. If you can tie everything back to ROI, you will have the ear of the CEO.
  • Breaking Down Silos: Even if you have attained support from the C-level, it is critical to partner with cross-functional departments. Whether it’s sales, marketing, finance, etc., tying the business value that analytics can drive to their specific goals will help the work relationship. These teams need to feel they are being included in your work. This theme was evident in many sessions, with speakers giving examples how they partnered with their colleagues to influence their business strategy and turn insights into action. At the end of the day, analytics is only as good as the data you have, and you need to ensure you are leveraging all of it across the enterprise.
  • Becoming a Storyteller: It was widely acknowledged that 80-85 percent of business executives who claim to understand analytics actually don’t. Hence, the ability to simplify the message is critical to your success. There was a lot of discussion around what makes a better storyteller. Staying on point, avoiding technical jargon, relying on words over numbers, and clearly quantifying and measuring business value were the agreed-upon paths to helping the analytics group communicate clearly with the C-level.
  • Building the Right Team: Of all the discussions during both events, this was one of the most prominent themes and one of the biggest challenges shared by attendees. Where do you find the right talent? Views ranged from outsourcing and offshoring strategies to partnering with universities to develop a combined curriculum for undergrads and graduate students.

Everyone agreed the right candidate should have 4 distinct skills:

  • Quant skills, i.e. math/stats qualification
  • Business acumen
  • Coding skills/visualization
  • Communication and consulting skills

Since it is very difficult to find all four skills in a single person, the consensus was that the ideal analytics team should consist of three tiers of resources:

  • Statisticians, modelers and PhDs
  • Computer scientists
  • Measurement and reporting

From talent to strategy, the past two analytics-focused events underline the importance of employing a CAO in the enterprise. As data and analytics continue to be the core drivers of business growth, the CAO will not only need a prominent seat at the table, they will need the freedom and resources to help turn analytics into actionable insights for the entire enterprise.

Prioritizing Capital Markets Data Management: Should we be concerned?



I read the Enterprise Data Management Council’s (EDMC) 2015 Data Management Industry Benchmark Report with great interest and am not sure if I should be encouraged or worried. I am encouraged because the study was well done, and the report was chock full of great insight into the progress of important data management initiatives in our financial institutions. However, I am also concerned that the inability of industry leaders to effectively communicate the importance of data management initiatives to all constituents will inhibit the ability of our financial institutions to execute on their strategic priorities.

A Historically Low Priority IT Activity

I was involved in data management in the 1980s and 1990s as a technology executive for investment banks, and I believe that data management—at the time a function of the technology department—was viewed as a low priority among industry management. A lot has changed since then to raise the importance of data management, most obviously the damage of the 2008 credit crisis and the stifling regulation that has resulted from it.

Now that I’m at Dun & Bradstreet, the leading provider of commercial data, I surely see progress.

The EDMC report indicates that data management has “gained a strong and sustainable foothold” in the industry and that “data is … essential in order to facilitate process automation, support financial engineering and enhance analytical capabilities.”

Capital markets institutions have made undeniable improvements—such as building faster and better models for decision making, deploying highly intelligent trading algorithms and reducing trade breaks and fails—that have elevated their business. But adoption of reference data for enhanced insights has not made a prominent impact in this growth, in large part because it has not gained prominence in these institutions.

Data management historically has resided in organizational technology silos, which greatly inhibits the collaboration that is required to maximize the benefit from analysis of the complex concepts of reference data. Ownership of reference data has not been fully integrated into operational processes. More importantly, it has not been sufficiently evangelized and its value not articulated as part of an overall strategy.

Time to Spread the Data Management Gospel

The report calls it spreading “the data management gospel.” Indeed, the successful integration of data management into a corporate or enterprise function will surely improve acceptance and adoption. As the report states, “Stakeholder buy-in increases significantly and resource satisfaction is highest in those circumstances.”

Two things will get us to data management adoption:

One is for management to spread the word. Resources need to hear—and believe—that data management is a priority. In the past, it’s been given lip service and has then predictably faded in the shadow of the latest trading technology or low-latency market data solution, or has given way under the weight of unending regulatory mandates. As a result, because so many have heard this before, it is natural for them to greet statements about the importance of data management with a skeptical eye.

Indeed, the EDMC report confirms this, saying that while the industry has a sufficient level of resources ready, it has a low level of satisfaction with support for data management initiatives, and it refers to the industry’s tendency to ‘haircut’ data management program resources for other operational activities.

That experience explains the industry’s struggle today to secure sufficient resources to meet objectives. Now that financial institutions need to become smarter in their knowledge of the market, this lack of commitment and the resulting resource shortfall are seen as a primary cause. Organizations such as the EDM Council itself have already benefitted from the progress of this communication, generating consistent dialogue on the most important initiatives while offering a platform for executives to share their ideas for the best solutions.

So that’s where the second thing comes in — secondary drivers. Financial institutions are rapidly recognizing the value of data management for the processing part of the business. The EDMC report states that operational efficiency is cited by 68% of respondents as being a significant benefit while business value/analytics is noted by 46%. With reducing operations and processing costs being such an important part of capital markets’ strategy (supported by such initiatives as reducing the settlement cycle and investigation of distributed ledger solutions), the ability to improve efficiency will raise data management to the level it needs to attract resources.

Say It Like You Mean It

However, as the leaders of financial institutions adopt these tenets, their challenge lies in communication to others in this business. No longer can capital markets afford another “false start” and more lip service to the importance of data management.

In its introduction, the EDMC report accurately states:

 “There is no getting around the inherent difficulties associated with either altering organizational behavior or managing wholescale transformation of the data content infrastructure. And while the challenges are real, the global financial industry has clearly taken a giant step closer to achieving a data management control environment.”

It is indeed a daunting task and one that has been central to the jobs of data executives for decades.

Further, I agree completely with the report’s statement that, “we would expect to see the importance of communication clearly articulated as part of data management strategy and various approaches being created to ‘spread the data management gospel.’”

This means that organizations such as SIFMA, the FISD and the EDMC itself, as well as individual institutions and data providers like Dun & Bradstreet, should firm up the dialogue with everyone in the industry. This will pave the way for sufficient resources to be dedicated to the data management problem.

We’ve been hearing for years about the importance of data management and have witnessed its steady, if still slow, progress to becoming a prominent business initiative. Now it’s time for executives to make the biggest push yet to attract the resources required to execute on this strategy.

Read the full report here: 2015 Data Management Industry Benchmark Report


Learn more about Dun & Bradstreet’s perspectives on the kinds of data organizations in capital markets need to help make better decisions.

IoT Is Here – And Accelerating Faster Than Predicted

We’ve been hearing about the Internet of Things for what feels like forever, and so it’s easy to start tuning it out.

Yes, we know everything will have a sensor and be connected to the Internet. And yes, we know the Internet will mostly be machines talking to machines. And yes, the number of connected gadgets is massive. Yada, yada, yada.

But it’s time to snap out of the stupor and tune back in. IoT is starting to match all the hype. That means there’s a data tidal wave gathering force, and it’s about to hit us all.

IDC, the marketing research firm, recently issued two IoT reports that point to the accelerated pace of adoption. In one survey of corporate decision makers, IDC found that “73% of respondents have already deployed IoT solutions or plan to deploy in the next 12 months.”

That puts this way beyond critical mass.

And the people driving these deployments don’t simply believe they are jumping on some bandwagon. IDC, in a second report, found that 58% considered IoT to be a “strategic initiative,” and 24% felt it would be “transformative.”

Optimism is particularly high among those in manufacturing and healthcare, IDC notes.

“The Internet of Things is enabling organizations to reinvent how they engage with their customers, helping them to accelerate the speed at which they deliver their products and services, and effectively reinventing industry processes,” said Carrie MacGillivray, Vice President, Internet of Things & Mobile at IDC, in the report.

A separate survey of U.S. and U.K. business leaders from Aeris Communications also found that IoT is already having a big impact on businesses.

For instance, 74% in the Aeris survey felt that IoT “provides the opportunity to meet key business objectives” and 70% said IoT will “help them achieve a competitive edge.”

Now, for the cautionary note: Aeris found that 72% “find it difficult to analyze sensor and connectivity data to obtain useful insights.” Really, that’s the same challenge for everyone.

It doesn’t matter how big the data pile is. What matters is making it meaningful.

In my recent blog post, I shared findings from “What’s the Big Deal With Data?” published by the Software Alliance. This quote was especially poignant and reinforces the promise and challenge of data being spawned from IoT:

“Our challenge is to harness data and put it to work, using our ingenuity to make sense of the valuable learnings locked within it. It is this ability to process data and transform observations into insights, and insights into answers, which enables us to achieve meaningful solutions to today’s significant challenges.”

This data challenge reality means that IoT deployments are just a first step. The reason to embrace IoT is that it can generate new sources of information about customers, partners and your business.

In the end, all the cool sensors and gadgets won’t matter unless your business finds a way to translate this new data into insights.

As IoT accelerates and unleashes this unprecedented wave of data, a data management strategy is not just important, it’s imperative. It is the only way you can shape and share a meaningful source of truth across your organization.

That’s what helps you get to your end game – solving your most pressing challenges, delivering the best customer experience, and planning for the brightest future in our IoT world.


Image credit: via Flickr Creative Commons.

With Data, Size Matters, but it’s Really About the Relationships

I sat in the audience at Waters USA 2015 watching my colleague Anthony Scriffignano, Dun & Bradstreet’s Chief Data Scientist, capture the undivided attention of the delegates with the “inconvenient truths” about decisions we make in today’s data-driven environment. It was intellectually powerful stuff – even for a very tuned-in audience of technology leaders and C-level executives at financial institutions who had gathered to learn more about emerging IT innovations and solutions they can adopt in their organizations for better decision making and optimization.

As Anthony spoke, I thought of how all of us – in both our personal and business lives – consume data at an alarming rate. Yet despite the consumption pace, we crave even more, causing us to run endlessly faster in order to simply keep up.

As a parent, my concern grows when witnessing my children become overly focused on the data that is available to them, linking them tightly to their devices, gobbling up whatever social or structured data is in their line of sight. Try as I might, I struggle to convince my high schooler that not every text message or Snapchat alert requires immediate attention. She believes that, whatever the problem, she can get the answer by seeking data from her phone.

“You’re better off being thoughtful and engaging your closest friends and colleagues in discussion, rather than responding to every question or message by checking your phone,” I say, to a predictable teenage eye roll and cynical smirk.

It occurs to me that this lament is similar to that of most of today’s Wall Street Chief Data Officers. Their firms have mountains of data, more computing power than they could possibly need, and billions of dollars of compensation incentives driving them to perfect their correlations and analysis. The natural reaction to solving problems is often to get more data, when in fact sometimes it muddies the waters and obscures the causes.

I think of what my colleague Anthony stressed in his presentation: many data managers run faster and faster to get just one millisecond ahead of the competition. Financial institutions continue to accumulate greater amounts of data, much of it unstructured and social-media related, to help gain intelligence on their markets. Their processing is ultra-fast. They hire the smartest quants in the world.

So with all this data and all the knowhow and computing power to run all these scenarios, what do they have now?

Our markets are generally more volatile and move unpredictably, while “flash crashes,” fraud and flawed market structure drive unprecedented regulation. Our institutions have volumes more data to analyze than ever before. But has that made our markets more stable? More trustworthy? Easier to understand? Are we more confident in our markets’ fairness or performance? Few believe so.

Participants’ reactions to the markets’ problems are disproportionately addressed by creating new regulation that restricts activity and often amounts to throwing the baby out with the bathwater by curbing positive activity. Regulation may touch on some of the causes; however, it is often so broad that it masks the real problem rather than pinpointing its source. There’s just too much data at hand, and without analyzing the relationships more closely, it has become impossible to target the specific data needed to identify and solve precise source issues.

So, despite all the data and intelligence, the current approach is missing something. Much of the industry is still performing data analysis like they always have (albeit faster and more effectively) – by crunching more data.

The real answer is to resist the urge to get more data for data’s sake.

The best CDOs are shifting their strategy to one that enables them to better recognize the importance of the relationship between the data, our flawed market structure, and ineffective risk management.

While the improvements in data analysis have created new opportunities for capital markets institutions, flaws remain in the system, illustrating that this data still has gaps that are not properly acknowledged. In part, this is the result of the industry’s effort to apply the most sophisticated algorithms and analytics to the most data it can find.

However, the inconvenient truth cited by Anthony is – simply – that more data is not necessarily better data. The answer lies in thinking differently about how the market consumes data and studies relationships in that data. This represents a marked change over the approach most CDOs took just a decade ago.

Now if I can only get my teenagers to think the same way…

Learn more about Dun & Bradstreet’s perspectives on the kinds of data organizations in capital markets need to help make better decisions.