Think Data First, Platform Second – Why Data Fuels MDM


As the volume of data coming into organizations – from both internal and external sources – continues to grow and make its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with the market expected to nearly triple from $9.4B to $26.8B by 2020, according to analysts.

But while everyone is investing serious cash into the tools to manage the data, few are putting any thought into the data itself. This is akin to purchasing a luxury sports car and fueling it with water. Sure, it looks great, but it won’t get you very far.

 

The underlying concept of MDM is surprisingly simple: get everyone “on the same page,” looking at the same data, and ensure it is accurate. Yet master data and its management continue to be a universal challenge across many industries. Organizations of all shapes and sizes share similar problems related to master data and can all reap benefits from solving them. That means concentrating on the quality of the data before going shopping for the sexiest MDM platform. In essence, you must master data before you can manage it. Ensuring the data’s quality, structure, and fitness for integration is your responsibility; your MDM platform won’t do that for you. It’s like purchasing a top-of-the-line oven and expecting it to produce a delectable meal. You are responsible for what goes into it.

Master Data Defined

Master data is the foundational information on customers, vendors and prospects that must be shared across all internal systems, applications, and processes in order for your commercial data, transactional reporting, and business activity to be optimized and accurate. Because individual businesses and departments need to plan, execute, monitor and analyze these common entities, multiple versions of the same data can reside in separate departmental systems. The result is disparate data that is difficult to integrate across functions and costly to manage in terms of resources and IT development. Cross-channel initiatives, buying and planning, merger and acquisition activity, and content management all create new data silos. Major strategic endeavors, part of any business intelligence strategy, can be hampered or derailed if fundamental master data is not in place. In reality, master data is the only way to connect multiple systems and processes, both internally and externally.

Master data is the most important data you have. It’s about the products you make and services you provide, the customers you sell to and the vendors you buy from. It is the basis of your business and commercial relationships. A primary focus area should be your ability to define your foundational master data elements (entities, hierarchies and types) and then the data that is needed – both to be mastered and to be accessible – to meet your business objectives. If you focus on this before worrying about the solution, you’ll be on the right course for driving success with MDM. Always remember: think data first and platform second.

Will Businesses Like Facebook’s New Reaction Buttons?


New Data Will Reveal How Customers Really Feel

If you recently updated your Facebook status, you may see a wide array of responses that go beyond the fabled “like” button the social media platform has become known for. Last month Facebook unveiled “Reactions,” which offers users five additional ways to express their opinions on everything from pictures of your cats to your brash political musings. While it will certainly give users more choices in how they interact with friends, it will also give businesses deeper insights into customer sentiment.

Reactions are essentially an extension of the like button, with six different buttons, or “animated emojis” (ask a millennial): “Like,” “Love,” “Haha,” “Wow,” “Sad” or “Angry.” Despite the public outcry for a “dislike” button, Facebook did not include one because the company felt it could be construed as negative. But that won’t stop people from using the “angry” button when they do not like something.

While this may upset a friend, it can actually help companies react and respond to complaints. What’s more, it may even help us predict trends and threats we may have not previously seen.

 

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions.”

-Nipa Basu, Chief Analytics Officer, Dun & Bradstreet

 

Dun & Bradstreet’s Chief Analytics Officer, Nipa Basu, believes the new “Reactions” present an opportunity for businesses to better understand their customers, but notes it will take a deep level of understanding to make sense of them.

“I think it’s definitely possible to draw true insight from this, but you’ll need to do some very careful analytics before forming any meaningful conclusions,” explains Basu. “These ‘Reactions’ are typically based on real-time human emotion. Sometimes it will be fair. Sometimes it will be unfair. But if you have a large sample of a lot of people’s emotions reflected then you can begin to ask if that says something about the customer experience or the product or service from that business, and go from there.

“Then comes the second part. Does it matter? Looking deeper at what the comments suggest and how they correlate with the different types of ‘Reactions’ being received, a good analyst will be able to draw more accurate insights. Looking at both types of responses together will help understand what customers really felt.”

This is what Basu believes will make social sentiment analysis that much more effective. Not only will it open the door for brands to assess the success and relevance of their content and measure customer satisfaction, it may also paint a deeper picture of total risk and opportunity across industries that could benefit others.

“When you utilize a company like ours, where we already have our pulse on the health of global businesses and industries, and combine it with these social ‘Reactions,’ we can start to understand the correlation they have with business growth or decline,” said Basu. “Can looking at the amount of ‘angry’ comments predict the future health of a particular business or industry? Quite possibly. This is paving the way for even richer data that can drive even smarter analytics.”
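To make the idea concrete, here is a minimal sketch of how reaction counts might be rolled up into a crude sentiment indicator. The weights are purely illustrative assumptions, not a validated model, and, as Basu cautions, any such score is only meaningful over large samples and alongside analysis of the comments themselves.

```python
# Illustrative reaction weights; these are assumptions for the sketch,
# not an endorsed sentiment model.
REACTION_WEIGHTS = {"Like": 1, "Love": 2, "Haha": 1, "Wow": 0,
                    "Sad": -1, "Angry": -2}

def sentiment_score(reaction_counts: dict) -> float:
    """Weighted average sentiment, ranging from -2 (angry) to +2 (love)."""
    total = sum(reaction_counts.values())
    if total == 0:
        return 0.0
    weighted = sum(REACTION_WEIGHTS[r] * n for r, n in reaction_counts.items())
    return weighted / total

# A burst of "Angry" reactions drags an otherwise positive post negative.
print(round(sentiment_score({"Like": 120, "Love": 30, "Angry": 200}), 2))  # -0.63
```

Tracked over time, even a toy score like this could flag the kind of sentiment shifts Basu describes, which an analyst would then validate against the accompanying comments.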

Data Dialogue: CareerBuilder Consolidates CRMs, Identifies Hierarchies and Provides Actionable Insight for Sales Enablement


A Q&A with CareerBuilder’s Director of Sales Productivity  

CareerBuilder, the global leader in human capital solutions, has evolved over recent years to better meet marketplace demands. “We’ve moved from transactional advertising sales to a software solution, which is much more complex with a longer sales process and a longer sales cycle,” says Maggie Palumbo, Director of Sales Productivity for CareerBuilder.

With that, it is critical that the sales teams Palumbo works with be armed not only with information on which accounts to target, but also with intelligence they can use to approach – and engage – potential customers.

What’s more, as the company continues to expand globally, its need for consistent data and CRM consolidation has become a priority. Establishing one system in which employees all see the same information better positions the company for continued global expansion through more informed decision-making.

We talked with Palumbo about how she has been leading this effort.

What are you focused on now as you look to your priorities moving forward?

In the past, we did not have much in the way of governance. We had loose rules and accounts ended up not being fully optimized. We’re focused now on better segmentation to figure out where we can get the best return.

We use tools from Dun & Bradstreet to help us accomplish this goal. Specifically, we rely on geographical information, SIC codes for industry, employee totals and other predictive elements. Then we bring it all together to get a score that tells us which accounts to target and where each belongs within our business.
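As a purely illustrative sketch of the kind of composite scoring Palumbo describes, the snippet below combines a few normalized signals into a single score. The field names and weights are hypothetical, not CareerBuilder’s or Dun & Bradstreet’s actual model.

```python
# Hypothetical weights over normalized 0-1 signals; illustrative only.
ILLUSTRATIVE_WEIGHTS = {
    "region_fit": 0.25,    # geographic match to a territory
    "industry_fit": 0.35,  # derived from the account's SIC code
    "size_fit": 0.25,      # derived from total employee count
    "predictive": 0.15,    # other predictive elements
}

def score_account(signals: dict) -> float:
    """Collapse normalized signals into a single 0-100 targeting score."""
    total = sum(ILLUSTRATIVE_WEIGHTS[k] * signals.get(k, 0.0)
                for k in ILLUSTRATIVE_WEIGHTS)
    return round(100 * total, 1)

# Example: a mid-sized account in a high-fit industry and region.
print(score_account({"region_fit": 0.8, "industry_fit": 0.9,
                     "size_fit": 0.5, "predictive": 0.6}))  # 73.0
```

The resulting score would then drive both which accounts get targeted and where each account is routed within the business.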

How do you make the information actionable?

We are unique in that we take the full D&B marketable file – companies with more than 11 employees – and pass it along to our sales representatives. Some reps, those covering accounts with fewer than 500 employees, have thousands of accounts. What they need to know is where to focus within that account base. Scoring and intelligence allow us to highlight the gems within the pile that might otherwise have been neglected. The reps at the higher end of the business have fewer accounts and require more intelligence on each account to enable more informed sales calls.

Because we’ve moved from a transactional advertising sales model to one based on software solutions, our sales process is now more complex. The reps need intelligent information that they can rely on as they adjust their sales efforts to this new approach.

How do you make sure you’re actually uncovering new and unique opportunities?

We’ve gone more in-depth with our scoring and now pull in more elements. We also work with third party partners who do much of the manual digging. With that, we’re confident that we’re uncovering opportunities others may not necessarily see.

How do you marry together the data with the technology?

When the economy went south, our business could have been in big trouble. We are CareerBuilder. For a business whose focus is on hiring, operating in an economy where far fewer companies are hiring, where do you go with that?

Our CEO is forward-thinking and had already started to expand our business to include human capital data solutions. With that, it became clear that we needed standardized data, which we achieve via our data warehouse. Once the data is normalized and set up properly, it can be pushed into the systems. Pulling the information from different sources together into one record is the challenge. We use Integration Manager for that; the D-U-N-S Number serves as a framework for how we segment our data, and we rely heavily on that insight.
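As a minimal sketch of the consolidation idea, assuming hypothetical field names and a simple first-non-empty precedence rule (not how Integration Manager actually works), records from different CRMs can be merged into one golden record keyed on the D-U-N-S Number:

```python
from collections import defaultdict

# Hypothetical records for the same company from two regional CRMs.
crm_records = [
    {"duns": "804735132", "source": "CRM-US", "name": "Acme Corp",
     "country": "US", "phone": ""},
    {"duns": "804735132", "source": "CRM-EMEA", "name": "Acme Corporation",
     "country": "", "phone": "+44 20 7946 0000"},
]

def consolidate(records):
    """Merge records sharing a D-U-N-S Number into one golden record."""
    merged = defaultdict(dict)
    for rec in records:
        golden = merged[rec["duns"]]
        for field, value in rec.items():
            # Keep the first non-empty value seen for each field.
            if value and not golden.get(field):
                golden[field] = value
    return dict(merged)

print(consolidate(crm_records)["804735132"])
# {'duns': '804735132', 'source': 'CRM-US', 'name': 'Acme Corp',
#  'country': 'US', 'phone': '+44 20 7946 0000'}
```

The point of the identifier-first design is that source systems can disagree on names and addresses, yet the shared D-U-N-S Number still pulls the fragments together into a single record.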

How does Dun & Bradstreet data help within your sales force?

Data we get from Dun & Bradstreet provides us with excellent insight. While the reps may not care about the D-U-N-S Number, per se, they do care about the hierarchical information it may reveal—particularly with the growth of mergers and acquisitions over the past few years.

What other aspects of your work with Dun & Bradstreet have evolved?

We are a global company, but we are segmented. As we move everyone across the globe onto one CRM platform, we are creating more transparency. That is our goal. In order for people within the organization to effectively partner with each other, they must see the same information, including hierarchical structure. D&B has helped us bring that information together and identify those global hierarchies.

Tell me more about how linkage has helped you.

We used to be all over the place, with multiple CRMs across the organization globally. Some were even homegrown. We also wanted a better view of hierarchies. We lacked insight into what was going on with a single company across multiple systems and therefore couldn’t leverage that information. We had to bring it all together through linkage. We’ve made great progress in terms of the global hierarchy and global organizational structure, and we couldn’t have done it without D&B.

Parting Words

As CareerBuilder continues to grow globally and evolve as a company to meet customer needs and demands of the marketplace, aligning the sales process with actionable intelligence is critical to its forward-moving trajectory. “It’s all about fully optimizing the data that’s available to us so that we can focus our efforts on where we can get the most return for the company,” said Palumbo.

Breaking Down the Business Relationships in Breaking Bad


How Saul Goodman Can Teach Businesses About the Value of Understanding Relationships

This week, TV viewers witnessed the return of Jimmy McGill, a scrappy and indefatigable attorney struggling for respect and reward. Spoiler alert: If you watched Breaking Bad, you know the upstart Albuquerque lawyer goes on to become Saul Goodman, the lawyer and adviser for eventual meth kingpin Walter White, or as he’s known on the street, Heisenberg.

Now in its second season, AMC’s hit show Better Call Saul traces the transformation of the naïve McGill into what would become one of the city’s most notorious criminal defense attorneys. But it doesn’t happen by chance. His ability to understand and manipulate relationships plays a huge role – something many businesses could learn a thing or two from. But before I proceed, if you have not watched Breaking Bad, I implore you to do so immediately. Go on, watch it now, and then come back and read this article; otherwise you’re going to be a bit lost.

In Breaking Bad we learn that Saul Goodman is a key player in Walter White’s evolution from everyday chemistry teacher to criminal mastermind, repeatedly getting him out of sticky situations over the course of his drug business operations. Goodman is effective in helping Walt stay one step ahead of the police and competing drug czars because of his extensive connections within the criminal underworld, and because he serves as a go-between connecting drug distributors, evidence removers, impersonators, and other criminals-for-hire.

What makes Goodman so successful is his network of relationships. He knows all the players and how they are connected to others, and he uses that knowledge to his advantage. Ultimately, it’s probably what keeps him and his clients alive for so long. Other entities in the Breaking Bad world are not so lucky. Shotgun blasts and burning faces aside, I’m talking about the businesses that were ultimately crippled by the chain of events set off by Walter White’s maniacal obsession with power.

The Breaking Bad series finale shows us the fate of all the major characters, but what about everyone else with some underlying connection to what went down?

We learned that Walt’s meth empire was funded by a multifaceted conglomerate headquartered in Germany called Madrigal Electromotive. According to Wikia, Madrigal is highly diversified in industrial equipment, manufacturing, global shipping, construction and fast food; the most notorious being the American fried chicken chain, Los Pollos Hermanos.

Founded by Gustavo Fring, the Los Pollos Hermanos restaurant chain had fourteen locations throughout the southwest and was a subsidiary of Madrigal. As we learned during the course of the show, the restaurant provided money-laundering and logistics for illegal activities. It’s safe to assume that following the death of its founder and his reported connection to engineering a billion-dollar drug empire, business suffered. Every enterprise that was directly doing business with the fried chicken chain likely cut ties as soon as the news broke. From the factory providing the meat to the manufacturer supplying the utensils, these businesses knew that Los Pollos Hermanos would suffer and were able to plan in advance for a revenue downfall.

But what about the other suppliers that did not realize they were working with entities connected to Los Pollos Hermanos’ parent company? Madrigal is spread across 14 divisions, including a massive investment in fast food. The fast-food division, formerly run by Herr Peter Schuler, encompasses a stable of 7 fast-food eateries, including Whiskerstay’s, Haau Chuen Wok, Burger Matic, and Polmieri Pizza. Following the breaking news of the drug ring, the resulting investigation likely sent shockwaves throughout the entire Madrigal enterprise and subsequently hurt all of its businesses in some shape or form. But consider the supplier of dough for Polmieri Pizza. Do you think they knew the pizza shop they do business with was a subsidiary of Madrigal and would be a casualty of the meth trade? Very unlikely.

Because Polmieri Pizza is a subsidiary of Madrigal, it will be at least indirectly affected. While its parent company is in damage control – a change of management, a freeze on funds, etc. – the innocuous pizza shop will be impacted, even if only in the short term. During this time, the dough supplier has no clue about the pizza shop’s troublesome relationship to Madrigal, or that it should expect a change in how it works with the pizza eatery. If it had known there was any connection, it could have planned ahead, cut down on production and accounted for fewer orders. Instead, it is caught by surprise and left overstocked and underwater.

This could have been avoided if the dough manufacturer leveraged its relationship data. Every company has relationship data; they just need to know where to look for it, or who to partner with to obtain the right information.

Relationship data is information about two or more entities, brought together along with their business activities to reveal an implied business impact or outcome. By interpreting the right signal data and applying advanced analytics to it, unmet needs, hidden dangers and new opportunities can all be surfaced.
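As a concrete illustration, here is a minimal sketch of how a supplier might walk a corporate-linkage graph to spot indirect exposure to a troubled parent. The entities are fictional (borrowed from the show), and the flat dictionaries are a deliberately simplified stand-in for real family-tree linkage data:

```python
# Hypothetical parent/subsidiary links and supplier relationships.
PARENT_OF = {
    "Los Pollos Hermanos": "Madrigal Electromotive",
    "Polmieri Pizza": "Madrigal Electromotive",
}
SUPPLIES_TO = {
    "Dough Supplier": ["Polmieri Pizza"],
    "Utensil Manufacturer": ["Los Pollos Hermanos"],
}

def exposed_suppliers(troubled_parent: str) -> set:
    """Return suppliers whose customers roll up to the troubled parent."""
    exposed = set()
    for supplier, customers in SUPPLIES_TO.items():
        for customer in customers:
            entity = customer
            while entity is not None:      # walk up the ownership chain
                if entity == troubled_parent:
                    exposed.add(supplier)
                    break
                entity = PARENT_OF.get(entity)
    return exposed

print(exposed_suppliers("Madrigal Electromotive"))
# {'Dough Supplier', 'Utensil Manufacturer'}
```

Had the dough supplier run even this simple a check when the Madrigal news broke, it could have scaled back production before the orders stopped coming.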

Of course, this is just an example of the importance of being able to recognize business relationships, based on a fictional show. But failing to do so could prove a grave reality for businesses of all shapes and sizes. If the companies with business connections to Madrigal’s vast enterprise had had a handle on relationship data, what would they have seen?

If you can take anything away from the Saul Goodmans of the world, it is this: know how all your relationships are connected and you will know how to solve problems, manage revenue – and stay out of trouble.

The Chief Analytics Officer Takes Center Stage


From coast to coast, the business world is lauding the emergence of the Chief Analytics Officer (CAO). That’s right, we said Chief ANALYTICS Officer. Perhaps you were thinking about that other C-level role that recently dominated the headlines – the Chief Data Officer? Nope, the CDO is so 2015. Despite the CDO being called the hottest job of the 21st century, it seems a new contender has entered the fray: the CAO.

All joking aside, the role of CDO has certainly not lost any of its luster; it remains an important position within the enterprise. It’s just that the CAO has now become just as significant. The two roles will need to coexist side by side, and they face similar challenges, many of which were common themes during two recent industry events. Judging by the massive turnout and the passionate discussions at the Chief Analytics Officer Forum in New York and the Chief Data & Analytics Officer Exchange in California, it is apparent that the CAO will play a pivotal role in utilizing the modern organization’s greatest asset – data.

IDC predicts that the market for big data technology and services will grow at a 27% clip annually through 2017 – about six times faster than the overall IT market. But that windfall is not just going toward ways to obtain more data; it’s about investing in the right strategies to help make sense of data – an increasingly common perspective. Everyone, therefore, is trying to figure out how to scale their analytical approach and drive more value across the organization.

Data alone is no longer the focal point for businesses – not without analytics to accompany it. We’re seeing the term ‘data analytics’ popping up more frequently. In fact, it’s the number one investment priority for Chief Analytics Officers over the next 12-24 months, according to a Chief Analytics Officer Forum survey. That means increased investment in data analytics and predictive analytics software tools. Not surprisingly, given the investment planned around these initiatives, the ability of data analytics to meet expectations across the company is the number one thing keeping CAOs up at night, according to the same study.

The lively discussions during the course of both events featured some of the industry’s smartest minds discussing common challenges and objectives. Here are some of the most prevalent topics that were discussed.

  • The Need for C-Level Support: Very similar to the challenge facing the CDO, the CAO will need to secure buy-in from the C-level to make a real impact. Many speakers at both events expressed similar frustrations with getting the C-level to provide the budget and resources needed to do their jobs. One good approach, shared during a session, is to build a business case that clearly quantifies the business value analytics will drive against specific goals. If you can tie everything back to ROI, you will have the ear of the CEO.
  • Breaking Down Silos: Even if you have attained support from the C-level, it is critical to partner with cross-functional departments. Whether it’s sales, marketing, finance or another function, tying the business value that analytics can drive to their specific goals will help the working relationship. These teams need to feel they are being included in your work. This theme was evident in many sessions, with speakers giving examples of how they partnered with their colleagues to influence business strategy and turn insights into action. At the end of the day, analytics is only as good as the data you have, and you need to ensure you are leveraging all of it across the enterprise.
  • Becoming a Storyteller: It was widely acknowledged that 80-85 percent of business executives who claim to understand analytics actually don’t. Hence, the ability to simplify the message is critical to your success. There was a lot of discussion around what makes a better storyteller. Staying on point, avoiding technical jargon, relying on words rather than numbers, and clearly quantifying and measuring business value were the agreed-upon ways to help the analytics group communicate clearly with the C-level.
  • Building the Right Team: Of all the discussions during both events, this was one of the most prominent themes and one of the biggest challenges shared by attendees. Where do you find the right talent? Views ranged from outsourcing and offshoring strategies to partnering with universities to develop a combined curriculum for undergrads and graduate students.

Everyone agreed the right candidate should have 4 distinct skills:

  • Quant skills, i.e. math/stats qualification
  • Business acumen
  • Coding skills/visualization
  • Communication and consulting skills

Since it is very difficult to find all four skills in a single person, the consensus was that the ideal analytics team should be built from three tiers of resources:

  • Statisticians, modelers and PhDs
  • Computer scientists
  • Measurement and reporting

From talent to strategy, these two analytics-focused events underline the importance of employing a CAO in the enterprise. As data and analytics continue to be core drivers of business growth, the CAO will need not only a prominent seat at the table but also the freedom and resources to help turn analytics into actionable insights for the entire enterprise.

Prioritizing Capital Markets Data Management: Should we be concerned?


Original content found on www.linkedin.com/prioritizing capital markets data management

I read the Enterprise Data Management Council’s (EDMC) 2015 Data Management Industry Benchmark Report with great interest and am not sure whether I should be encouraged or worried. I am encouraged because the study was well done, and the report was chock-full of great insight into the progress of important data management initiatives in our financial institutions. However, I am also concerned that the inability of industry leaders to effectively communicate the importance of data management initiatives to all constituents will inhibit the ability of our financial institutions to execute on their strategic priorities.

A Historically Low Priority IT Activity

I was involved in data management in the 1980s and 1990s as a technology executive for investment banks, and I believe that data management – at the time a function of the technology department – was viewed as a low priority by industry management. A lot has changed since then to raise the importance of data management, most obviously the damage of the 2008 credit crisis and the stifling regulation that has resulted from it.

Now that I’m at Dun & Bradstreet, the leading provider of commercial data, I can certainly see progress.

The EDMC report indicates that data management has “gained a strong and sustainable foothold” in the industry and that “data is … essential in order to facilitate process automation, support financial engineering and enhance analytical capabilities.”

Capital markets institutions have made undeniable improvements – such as building faster and better models for decision making, deploying highly intelligent trading algorithms and reducing trade breaks and fails – that have elevated their business. But the adoption of reference data for enhanced insights has contributed little to this growth, in large part because reference data has never gained prominence within these institutions.

Data management has historically resided in organizational technology silos, which greatly inhibits the collaboration required to get the most from analysis of complex reference data. Ownership of reference data has not been fully integrated into operational processes. More importantly, it has not been sufficiently evangelized, and its value has not been articulated as part of an overall strategy.

Time to Spread the Data Management Gospel

The report calls it spreading “the data management gospel.” Indeed, the successful integration of data management into a corporate or enterprise function will surely improve acceptance and adoption. As the report states, “Stakeholder buy-in increases significantly and resource satisfaction is highest in those circumstances.”

Two things will get us to data management adoption:

One is for management to spread the word. Resources need to hear – and believe – that data management is a priority. In the past, it has been given lip service, then predictably faded in the shadow of the latest trading technology or low-latency market data solution, or given way under the weight of unending regulatory mandates. Because so many have heard it repeatedly, it is natural for them to greet statements about the importance of data management with a skeptical eye.

Indeed, the EDMC report confirms this, saying that while the industry has a sufficient level of resources ready, it has a low level of satisfaction with support for data management initiatives, and it refers to the industry’s tendency to ‘haircut’ data management program resources for other operational activities.

That history explains why the industry struggles today to secure sufficient resources to meet its objectives. Now that financial institutions need to become smarter in their knowledge of the market, this lack of commitment and the resulting resource shortfall stand out as a primary cause of that struggle. Organizations such as the EDM Council itself have already benefitted from improved communication, generating consistent dialogue on the most important initiatives while offering a platform for executives to share ideas for the best solutions.

That’s where the second thing comes in: secondary drivers. Financial institutions are rapidly recognizing the value of data management for the processing part of the business. The EDMC report states that operational efficiency is cited by 68% of respondents as a significant benefit, while business value/analytics is noted by 46%. With reducing operations and processing costs being such an important part of capital markets’ strategy (supported by initiatives such as shortening the settlement cycle and investigating distributed ledger solutions), the ability to improve efficiency will raise data management to the level it needs to attract resources.

Say It Like You Mean It

However, as the leaders of financial institutions adopt these tenets, their challenge lies in communicating them to the rest of the business. Capital markets can no longer afford another “false start” or more lip service to the importance of data management.

In its introduction, the EDMC report accurately states:

 “There is no getting around the inherent difficulties associated with either altering organizational behavior or managing wholescale transformation of the data content infrastructure. And while the challenges are real, the global financial industry has clearly taken a giant step closer to achieving a data management control environment.”

It is indeed a daunting task and one that has been central to the jobs of data executives for decades.

Further, I agree completely with the report’s statement that “we would expect to see the importance of communication clearly articulated as part of data management strategy and various approaches being created to ‘spread the data management gospel.’”

This means that organizations such as SIFMA, the FISD and the EDMC itself, as well as individual institutions and data providers like Dun & Bradstreet, should firm up our dialogue for communicating with everyone in the industry. This will pave the way for sufficient resource dedication to address the data management problem.

We’ve been hearing for years about the importance of data management and have witnessed its steady, if still slow, progress to becoming a prominent business initiative. Now it’s time for executives to make the biggest push yet to attract the resources required to execute on this strategy.

Read the full report here: 2015 Data Management Industry Benchmark Report

***

Learn more about Dun & Bradstreet’s perspectives on the kinds of data organizations in capital markets need to help make better decisions.

3 Reasons to Be Thankful for Data


What better time to think about all that we’re thankful for than Thanksgiving? For businesses, there has never been a better time to be grateful for data and all it can do to help deliver new opportunities. Like a moist, juicy roast turkey, data is the one thing everyone seems to be eating up. Let’s look at three reasons why data is so delectable. We promise you won’t feel sleepy after consuming this list.

 

Bountiful Amounts of Data

The Pilgrims arrived in the New World during a particularly harsh winter, which made it very difficult for them to find food and shelter. Of course, we know they persevered despite the shortage of resources, once they learned to harvest crops that produced an abundance of food that let them not only survive but thrive. Fast-forward to today, and we gorge ourselves on turkey and various complex carbohydrates in celebration of the early settlers’ efforts.

Like the sustenance that was once so hard to come by, data too has gone from scarce to abundant, and there’s plenty of it to feast on. Just as the Pilgrims struggled to live off the land, data scientists once had to work especially hard to cultivate meaningful bits of data. When data was scarce, they had to extract, track and survey to get at it. The objective was always to find out how and where to get more of it. Today, data is everywhere, and the new goal is understanding how to make sense of it.

Technology has been one of the biggest catalysts of the data explosion. With the digitization of business, the amount of information we are collecting is growing so large that there are debates on just how big it is – and it has probably grown even more since you read that sentence. Both structured and unstructured in nature, this wealth of information has made it possible to produce insights and achieve outcomes that were previously inconceivable. Businesses can better identify trends and risk, organizations can tackle health and wellness issues, and governments can solve economic and social challenges.

While we should certainly be grateful for the abundance of data, we must be careful how we use the information. It is important not to overindulge or hoard it. Instead we must recognize the type of data that will sustain us and avoid the empty calories that may lead us astray. Just as the Pilgrims planted the right seeds to bring them sustenance, we must choose the kernels of data that will drive meaningful value and insights.

 

Data Freedom

The Pilgrims came to America from England in search of religious freedom. They yearned for an open society characterized by a flexible structure, freedom of belief and the free dissemination of information. We are witnessing a similar evolution in the way data is accessed and shared. The concept of open data is commonly defined as making a certain piece of data free to use, reuse and redistribute – subject only, at most, to the requirement to attribute and/or share-alike. In other words, we’re at a point in history where some information can be freely used to solve key problems or make progress toward specific goals.

There are many examples of data that can be openly shared. However, it’s not just the numbers, but the insights that come along with them, that offer the most benefit when freely distributed. This concept offers benefits across both the private and public sectors. Businesses can gain a new level of transparency into opportunities for services and goods, make better decisions based on more accurate information, and reduce costs for data conversion. But perhaps the biggest advantage of an open data ecosystem is for individual citizens. That’s because the sharing of information between governments can help with everything from increasing economic activity to addressing natural disasters more swiftly and even reducing health issues.

There are several types of data sharing among governmental functions. First is the sharing of data among governmental agencies within a single country. Second is the sharing of data across borders between national governments. And lastly, there is the sharing of data between businesses and government; this refers to voluntary sharing of data, beyond legal reporting obligations.

So what exactly should governments be sharing? Observations are crucial – such as the National Weather Service issuing a hurricane watch. It’s about sharing conclusions that can help different agencies better prepare for a projected weather event. Ultimately, this is the value of open data: multiple organizations, with mutual interests, sharing their independent observations in a manner that lets each organization draw more accurate, informed and insightful conclusions.

Unfortunately, in many cases the decisions made by government officials are not based on all of the information that might be pertinent to the situation. The same goes for businesses, which are somewhat reluctant to let go of their first-party data. With the freedom and ease to share information at hand, we have the opportunity to achieve maximum economic and social benefit.

At the end of the day, data is nothing without analytics to help make sense of it. We should always be cognizant of the ways in which specific pieces of shared data are used to address specific questions and help establish new ideas. The Pilgrims likely did not use all of their food sources to cook a single meal. We shouldn’t use all the data available to solve a problem just because it’s in front of our face.

 

A Brave New World of Data

As hard as I tried, I could not come up with a very clever parallel between the Pilgrims and the Internet of Things (IoT). But this new buzzword represents such a major data opportunity to be thankful for, and, well, the Pilgrims had, umm, things, so here we go.

The IoT refers to the growing network of interconnected objects – devices and machines that are not physically connected but are aware of, and discoverable to, other devices and machines around them. It’s mobile, virtual and instantaneous. Data can be gathered and shared from anything from a car to a refrigerator, which means we’ll be witnessing a huge increase in the amount of data being generated – huge streams of data that will enable new types of insight. We now have an opportunity to organize this information in compelling ways to reach new conclusions.

The IoT has the opportunity to fundamentally change industries, from the automotive industry, where new data signals from cars may help improve safety conditions, to the supply chain, where the real-time passing of information can help avoid disruptions in manufacturing. Given the rate at which digital technologies are connecting and evolving, organizations will quickly see transformational change in their business models.

As beneficial as the IoT will be, it will not flourish without the right data management and analytics capabilities. Organizations will need to invest time and resources in order to mine valuable insights from the data generated by the interactions that occur between machines.

 

In Summary

These are just three examples of how data is changing the face of business and, frankly, society. There are certainly countless other reasons to be thankful for data, depending on your business goals and what you want to achieve. As I’ve noted, while there is much to be thankful for in each of these instances, it is vital that we be cautious and smart when taking advantage of new data opportunities. Just like the Pilgrims, we are using this new frontier to create a new world of endless possibilities.

The Fact on FATCA: How Third-Party Data Reduces Time, Cuts Costs and Improves Customer Experience

By its very name, “big data” sounds like something that always makes business more complex. But here’s one instance, from the world of tax compliance, where the opposite is the case.

The Foreign Account Tax Compliance Act, or FATCA, requires foreign financial institutions (FFIs) to report to the IRS information about financial accounts held by U.S. taxpayers, or by foreign entities in which U.S. taxpayers hold a substantial ownership interest. The June 2016 deadline for FFIs to review existing accounts is fast approaching, just as many of those institutions are also planning their Common Reporting Standard (CRS) account review approach.

The time is now for them to determine the best way to use data to improve efficiency and process. Why? The main reason: customer service. A large portion of the FATCA and CRS effort requires financial institutions to reach out to preexisting account holders for new tax forms. Obtaining this client tax documentation presents numerous challenges to a financial institution and often leads to a process of multiple iterations that frustrates clients.

FATCA and CRS allow an alternative to the standard approach of soliciting and validating client tax documentation, one that can also help automate change-in-circumstance monitoring and, on top of that, further improve an FFI’s data quality. Third-party data can be leveraged to classify certain types of entity statuses for FATCA and CRS, easing the effort by eliminating the need to collect and validate IRS Tax Forms W-8/W-9 or entity self-certifications. Classifying accounts without the need to obtain tax documentation can save organizations the cost of soliciting and validating that documentation and improve the customer experience.

Dun & Bradstreet is working with global tax leader KPMG to help financial firms make some of these results a reality, and here’s what you need to know.

Regulations Overview – What You’re Allowed to Do

FATCA states that a withholding agent may rely on documentation collected by a third-party data provider with respect to an entity. If certain conditions are met, that third-party data can be used to classify offshore entity clients of certain entity types, such as Active Non-Financial Foreign Entities (NFFEs) and international organizations.

FATCA regulations also state that preexisting accounts, which generally include accounts opened at a financial institution prior to July 1, 2014, and potentially entity accounts opened between July 1, 2014 and December 31, 2014, must be properly reviewed, documented and classified by June 30, 2016.

The CRS due diligence procedures also provide an exception to the requirement to obtain a self-certification where the financial institution can reasonably determine, based on information in its possession or that is publicly available, that the Account Holder is not a Reportable Person. By utilizing data, an FFI can determine that a client is not reportable and therefore does not have to solicit a CRS self-certification.

Data that can be utilized for a non-reportable entity includes:

  • Information published by an authorized government body of a jurisdiction. For example, the list of Foreign Financial Institutions published by the US tax administration;
  • Information in a publicly accessible register maintained or authorized by an authorized government body of a jurisdiction;
  • Information disclosed on an established securities market;
  • Information previously recorded in the files of the financial institution;
  • A publicly accessible classification based on a standardized industry coding system. This will include any coding system employed by the financial institution which is based on such a standardized industry coding system.

Where the financial institution relies on such information, it must retain a notation of the type of information reviewed and the date on which the review was carried out. For CRS 2017 adopters, the review of preexisting entity accounts is to be completed by December 31, 2017; 2018 adopters have until the end of 2018 to review these accounts.
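As a hedged sketch of how such an exception might be applied in practice, the snippet below checks an entity against hypothetical stand-ins for two of the public sources listed above and, when a match is found, records the notation the rules require (what was reviewed and when). The list contents and labels are illustrative assumptions, not real reference data:

```python
from datetime import date

# Hypothetical extracts of public sources; a real implementation would
# consume the published IRS FFI list, public registers, and so on.
PUBLISHED_FFI_LIST = {"EXAMPLE BANK PLC"}
LISTED_ON_EXCHANGE = {"EXAMPLE HOLDINGS AG"}

def classify_account(name: str):
    """Return a non-reportable classification with the required notation,
    or None if tax documentation must still be solicited."""
    key = " ".join(name.upper().split())
    if key in PUBLISHED_FFI_LIST:
        basis = "Government-published list (IRS FFI list)"
    elif key in LISTED_ON_EXCHANGE:
        basis = "Information disclosed on an established securities market"
    else:
        return None
    # Retain what was reviewed and the date of review, as required.
    return {"entity": key, "status": "non-reportable",
            "information_reviewed": basis,
            "review_date": date.today().isoformat()}

print(classify_account("Example Bank plc"))
```

Accounts the function cannot classify simply fall back to the standard solicitation of tax forms or self-certifications.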

The Tools to Help

An FFI can use technology tools that utilize third-party data to classify preexisting accounts for FATCA and CRS. These tools match and compare an FFI’s client account information to a third-party data source and, once compared, can catalog account records under certain FATCA and CRS classifications. This reduces the need to solicit and validate tax documentation, saving one costly step in the process.

Once linked, the accounts can be monitored for changes in circumstance against the third-party data source, improving accuracy and providing confidence in the classifications. Along the way, the same technology gives an FFI the chance to compare its client account information to a third-party source, supporting quality assurance and improvement of the account data.
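The matching step itself might look something like the sketch below: accounts are joined to a third-party reference file on a normalized name-plus-country key, missing fields are filled from the matched record, and unmatched accounts are flagged for follow-up. Real matching engines use far more sophisticated fuzzy logic; the field names here are illustrative assumptions:

```python
def norm(s: str) -> str:
    """Normalize a name for matching: uppercase, collapse whitespace."""
    return " ".join(s.upper().split())

# Hypothetical third-party reference file keyed on (name, country).
REFERENCE = {
    (norm("Example Bank plc"), "GB"): {"address": "1 King St, London",
                                       "classification": "Active NFFE"},
}

def enrich(account: dict) -> dict:
    """Match an account to the reference file and fill gaps in place."""
    key = (norm(account["name"]), account.get("country", ""))
    match = REFERENCE.get(key)
    if match:
        for field, value in match.items():
            account.setdefault(field, value)  # fill missing fields only
        account["matched"] = True
    else:
        account["matched"] = False            # flag for manual review
    return account

print(enrich({"name": "example  bank PLC", "country": "GB"}))
```

This mirrors the two benefits described above: classifications are applied without soliciting new documentation, and gaps in the account data (a missing address, say) get filled from the reference source.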

The Payoff

Two large global financial institutions recently saw strong results after providing Dun & Bradstreet with a random sampling of their preexisting entity account data to determine which accounts could be classified for FATCA using D&B data.

First, the financial institutions’ data was uploaded into a tool that matched their data to ours. Based on the sampling of more than 7,000 records, almost 40% of the accounts were classified for FATCA via the D&B data. For those accounts, FATCA status was clearly established, eliminating the need for the financial institutions to reach out to their clients and request tax forms. Adding to the benefits, during the proof of concept the FFIs were able to improve their customer account data by comparing it to the Dun & Bradstreet data files. For example, a number of records lacked data elements, such as address or country, and the D&B data filled in many of the gaps.

By doing what is permissible under CRS and FATCA, an FFI can make a cumbersome documentation process simpler for customers, more efficient for the institution itself – and, on top of it all, improve overall data quality.

Note: This information does not constitute tax or legal advice and is provided for informational purposes only. Please consult your tax advisor for more information.

Data: Trick or Treat?


Come October 31, hordes of miniature ghosts, ghouls and goblins will come knocking at your door for some sugar-filled treats. Of course, it’s not what it seems. You know that behind the colorful masks and spooky costumes stand harmless children celebrating Halloween.

The question is, can you say the same about your data when it appears in front of you? Data, too, often is not what it appears at first sight.

Unlike enthusiastic trick-or-treaters roaming the streets in search of candy, data does not intentionally wear a disguise to trick you, yet so many of us are fooled by it on a daily basis. It’s no surprise that data can often blind us to the truth – there’s just so much of it out there. With the amount of data growing exponentially every day, it’s often hard to find time to make sense of it all. Instead, we take what it says at face value without considering whether we are asking the right questions of the data.

Take unemployment, for example. After one of the worst recessions in modern history, we’re now hearing much celebration in the media about how unemployment is down to near-record lows – 5.1% as of this writing. That certainly seems worthy of praise; after all, the data doesn’t lie. But while the number is technically right, it does not factor in several variables that may change the overtly upbeat conversation about the current economic climate.

Please don’t think of me as a ‘Debbie Downer’ trying to rain on the recovery parade, but this is a classic example of the data not telling the complete story. The headline unemployment rate counts only people who are actively looking for work. That can be misleading: once unemployment benefits expire and the jobless stop searching, many simply drop out of the count altogether. It also counts as employed those working part-time jobs while actively seeking full-time work to support their families and pay back student loans – the same loans they took out in hopes of landing full-time jobs.
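To see how much the answer depends on the question, here is a toy calculation with purely illustrative numbers, loosely mirroring the difference between a headline unemployment measure and a broader one that also counts discouraged and involuntarily part-time workers:

```python
# Illustrative figures in millions; not actual BLS data.
employed = 149.0
unemployed_searching = 8.0    # actively looking: the headline numerator
discouraged = 2.0             # stopped looking: excluded from the headline
part_time_want_full = 6.0     # counted as employed in the headline rate

labor_force = employed + unemployed_searching
headline = unemployed_searching / labor_force

broad_labor_force = labor_force + discouraged
broad = (unemployed_searching + discouraged + part_time_want_full) / broad_labor_force

print(f"headline rate: {headline:.1%}")  # 5.1%
print(f"broader rate:  {broad:.1%}")     # 10.1%
```

Same labor market, two very different stories – which is exactly why the question you ask of the data matters as much as the data itself.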

These are the types of factors we need to look at in order to pose the right questions before we take the data for what it’s worth. Of course it’s easy for politicians to look at the numbers without digging deeper because it makes them look good.

“It’s human nature,” says Anthony Scriffignano, Dun & Bradstreet’s Chief Data Scientist. “We have a tendency to try and organize the world around us. It’s probably how we have survived. The world around us is so chaotic that we try to find patterns in it; we try to find things that we think are there and want to believe.”

Unfortunately, Scriffignano believes, we tend to use data to make decisions based on ego, or to make rushed judgments in response to the answers we believe others want to hear, instead of digging deeper to discover what really lies underneath the data. “There’s a name for it,” he says. “It’s called confirmation bias.”

Getting back to the unemployment example, it’s really not ethical to make a broad-based assumption on the data without looking at other evidence – not when that assertion clearly contradicts what many people are actually experiencing. Unfortunately, there are countless examples of this happening all the time, and in far more precarious instances across business and government.

As Gartner’s senior vice president and global head of research said at this year’s Gartner ITxpo, “Data is inherently dumb. It does not actually do anything unless you know how to use it.” That doesn’t stop most of us from seeking out even more data when we don’t see the answers we want. But reaching an honest conclusion goes beyond data volume. It takes an inquisitive approach and relentless curiosity to turn data into dependable insights.

Don’t be tricked by taking data at face value. Look deeper.

The UN’s Lessons in Data Inspiration


Five Smart Elements of a Global “Data Revolution”

The United Nations recently held an event called “Data Playground” at the Microsoft Technology Center. The goal, according to the UN, was to “celebrate momentum around a ‘Data Revolution’ for sustainable development.”

(The tweets were particularly interesting and you can find them here: #dataplayground)

As goals go, that’s a pretty lofty one. Over the course of the evening, attendees heard presentations about crowdsourcing of data, interactive data visualizations, big data analysis and how to creatively tell stories based on data. In other words, they were having the same kind of conversations that are happening throughout executive suites around the world.

Of course, in this case, the goals are more social than business. UN representatives talked about how opening data to the planet can be democratizing – and how, in zones affected by strife, the simple act of asking people questions can be as much a part of providing dignity as acting on the results and insights gathered.

To this end, there were maps of tweets after the massive Nepal earthquake. A team from Microsoft presented a climate visualization project. But a big part of the evening was spent explaining how the UN itself was already using Big Data to support 17 new sustainable development goals.

These goals were adopted by the UN’s 193 member states. While governments and community organizations are key, the UN has also said that “the new Global Goals cannot be achieved over the next fifteen years without data innovation” and “effective data collection, curation and utilization can enable a people-centered approach to sustainable development.”

The connection of this kind of data inspiration to what businesses must manage was not lost on us. And as we looked closely at the number of Global Projects the UN already has in place to further its development and data goals (you can see the full list here), we found five to be particularly notable:

  1. Crowdsourcing food prices in Indonesia: Food prices, and sudden spikes or drops in prices, can impact developing communities in big ways and lead to economic and security issues. In many rural areas, where food is sold in stalls or local markets, governments have no way to monitor prices. So the UN enlisted a group of citizen reporters, armed them with mobile phones, and built an app that lets them record and track prices of food.
  2. Monitoring biodiversity in Zimbabwe: The reduction in biodiversity, where plants and trees become more homogenous, can be problematic, as it leaves them more susceptible to environmental issues or disease that could wipe them out. So the UN has built a data visualization map to make it easier to track changes in populations of animals and vegetation threatened by fire or poachers, in the hope it will help policy leaders make better decisions.
  3. Citizen feedback for governments in Indonesia: The national government wants to decentralize and hand more power and decision-making to local governments. But these smaller bodies have fewer resources and less tech savvy. So the UN created a project that lets citizens offer feedback that can be rapidly analyzed to help local leaders make policy decisions. The feedback was also available on a public digital dashboard to foster more transparency.
  4. Using social media to support forest fire management: Also in Indonesia, the UN created a system to monitor tweets about forest and peat fires. The UN found that in many cases, local residents were tweeting early about things like haze and visibility, possible indicators that fires had broken out. The hope is that earlier detection could help local firefighters react more quickly.
  5. Mobile phone data to track migration: In Senegal, a group from the UN began monitoring anonymous mobile phone data to observe large changes in mobility – that is, when a large group of people suddenly decides to move elsewhere in a very short time frame. The UN hopes this can be an early-detection system for potential humanitarian issues, like conflict or food scarcity. Again, the sooner a problem is spotted, the faster relief agencies or policy makers can respond.

As several speakers noted, many of these projects are relatively new. But the hope is that they can be scaled across more regions over time to have a bigger impact.

While the issues are different, the way the UN is striving to experiment, report results transparently, and discuss failures and successes honestly is a good model for any executive leader tackling the challenges presented by Big Data.