Think Data First, Platform Second – Why Data Fuels MDM

As the volume of data coming into organizations – from both internal and external sources – continues to grow and make its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with the market expected to nearly triple from $9.4B to $26.8B by 2020, according to analysts.

But while everyone is investing serious cash in the tools to manage the data, few are putting any thought into the data itself. This is akin to purchasing a luxury sports car and fueling it with water. Sure, it looks great, but it won’t get you very far.

 

The underlying concept of MDM is surprisingly simple: get everyone “on the same page” looking at the same data and ensure it is accurate. Yet master data and its management continue to be a universal challenge across many industries. Organizations of all shapes and sizes share similar problems related to master data and can all reap benefits from solving them. That means concentrating on the quality of the data before going shopping for the sexiest MDM platform. In essence, you must master data before you can manage it. Ensuring its quality, structure, and integrability is your responsibility; your MDM platform won’t do that for you. It’s like purchasing a top-of-the-line oven and expecting it to produce a delectable meal. You are responsible for what goes into it.

Master Data Defined

Master Data is the foundational information on customers, vendors, and prospects that must be shared across all internal systems, applications, and processes in order for your commercial data, transactional reporting, and business activity to be optimized and accurate. Because individual businesses and departments need to plan, execute, monitor, and analyze these common entities, multiple versions of the same data can reside in separate departmental systems. This results in disparate data that is difficult to integrate across functions and quite costly to manage in terms of resources and IT development. Cross-channel initiatives, buying and planning, merger and acquisition activity, and content management all create new data silos. Major strategic endeavors, part of any business intelligence strategy, can be hampered or derailed if fundamental master data is not in place. In reality, master data is the only way to connect multiple systems and processes, both internally and externally.

Master data is the most important data you have. It’s about the products you make and services you provide, the customers you sell to and the vendors you buy from. It is the basis of your business and commercial relationships. A primary focus area should be your ability to define your foundational master data elements (entities, hierarchies, and types) and then the data that is needed – both to be mastered and to be accessible – to meet your business objectives. If you focus on this before worrying about the solution, you’ll be on the right course for driving success with MDM. Always remember: think data first and platform second.
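
To make “entities, hierarchies and types” concrete, here is a toy sketch – illustrative only, not a prescribed MDM schema – of what a single mastered entity might look like. All of the field names are hypothetical.

    # A toy "golden record" for one master entity - illustrative only.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MasterEntity:
        """One agreed-upon record shared by every departmental system."""
        master_id: str                    # single enterprise-wide identifier
        legal_name: str                   # the one agreed name for the entity
        entity_type: str                  # e.g. "customer", "vendor", "prospect"
        parent_id: Optional[str] = None   # hierarchy: link to a corporate parent
        source_ids: dict[str, str] = field(default_factory=dict)  # system -> local ID

    # The same company as seen by two departmental systems:
    acme = MasterEntity(
        master_id="M-000123",
        legal_name="Acme Corp.",
        entity_type="customer",
        source_ids={"crm": "CRM-9981", "billing": "B-44720"},
    )

The point is simply that the entity, its type, and its place in a hierarchy are defined once, while each departmental system keeps only a pointer back to the golden record.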

4 Wishes Data-Inspired Leaders Want This Holiday

What Every Data-Inspired Leader Wants This Holiday

With the holidays in full swing, everyone is busy making their lists and checking them twice. But while electronics and toys routinely top the wish lists for most, the data-inspired leaders of the world have some unique desires that can’t easily be purchased from your favorite store.

Whether you’ve been naughty (an online hookup site for married couples was breached by the hacking outfit The Impact Team, and the personal details of 37 million users were made public, leaving many men sleeping on the couch) or nice (Data Science for Social Good, a program at the University of Chicago that connects data scientists with governments, is working to predict when officers are at risk of misconduct, with the goal of preventing incidents before they happen), chief data officers, data scientists, and all data stewards want better and safer ways to do their jobs.

Instead of playing Santa and asking them to sit on my lap and tell me what they want for the holidays, I figured I’d simply share some of the top things we’ve heard on data leaders’ wish lists this year.

1. A Better Way to Find Truth in Data

Mark Twain famously said, “There are three kinds of lies: lies, damned lies, and statistics.” One of the biggest problems we face every day is trying to make sense of the data we have. In a perfect world, the answer to all of our questions would lie smack dab in the data itself, but that’s not the case. Getting closer to a single version of the truth through data is harder than it first appears. But that hasn’t stopped us from trying to form conclusions from the data presented to us. Sometimes we rush to conclusions in the face of mounting pressure from others who demand answers.

What we really need is a source of truth to compare data against; otherwise it is very hard to know what the truth actually is. Unfortunately, that is often an impossible goal – finding truth in a world of ambiguity is not as simple as looking up a word in the dictionary. Consider Malaysia Airlines Flight 370, which tragically disappeared in 2014: there were several conflicting reports claiming to show where the downed airliner would be found. Those reports were based on various data sets, which essentially led to multiple versions of proposed “truth.” Until searchers finally found pieces of the wreckage, they were looking in multiple disconnected spots because that was what the “data” said. But without anything to compare it to, there was no way to know what was true. This is just one example of how data can be used to get the answer we all want. The same thing happens in business every day, so the takeaway here is that we need to stop rushing to conclusions and first try to understand the character, quality, and shortcomings of our data and what can be done with it. Good data scientists are data skeptics and want better ways to measure the truthfulness of data. They want a “veracity meter,” if you will – a better method to help overcome the uncertainty and doubt often found in data.

2. A Method for Applying Structure to Unstructured Data

Unstructured data – information that is not organized in a pre-defined manner – is growing significantly, outpacing structured data. Experts generally agree that 80-85% of data is unstructured. As the amount of unstructured data continues to grow, so do the complexity and cost of attempting to discover, curate, and make sense of it. However, there are real benefits when it is managed right.

This explosion of data is providing organizations with insights they were previously not privy to – and cannot yet fully understand. When faced with data signals from numerous sources, the first inclination is to break out the parts that are understood. This is often referred to as entity extraction. Understanding those entities is a first step toward drawing meaning, but unstructured data can sometimes inform new insights that were not previously visible in the structured data, so additional skills are needed.

For example, social media yields untapped opportunities to derive new insights. Social media channels that carry user ratings and narrative reviews offer a treasure trove of intelligence, if you can figure out how to make sense of it all. At Dun & Bradstreet, we are building capabilities that give us some insight into the hidden meaning in unstructured text. Customer reviews provide new details on satisfaction with a business that may not previously be visible in structured data. By understanding how to correlate negative and positive comments as well as ratings, we hope to inform future decisions about total risk and total opportunity.
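
As a toy illustration of the idea – not Dun & Bradstreet’s production approach – a first pass at structuring a review might pull out candidate entities and compute a crude sentiment score. The word lists and patterns below are deliberately naive placeholders; real systems use trained models.

    # Toy illustration: a crude first pass at structuring a free-text review.
    import re

    POSITIVE = {"great", "excellent", "fast", "friendly", "reliable"}
    NEGATIVE = {"slow", "rude", "broken", "late", "poor"}

    def extract_entities(text: str) -> list[str]:
        """Naive entity extraction: runs of capitalized words."""
        return [m.strip() for m in re.findall(r"(?:[A-Z][a-z]+ ?)+", text)]

    def sentiment_score(text: str) -> int:
        """Positive words add one, negative words subtract one."""
        words = re.findall(r"[a-z']+", text.lower())
        return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

    review = "Acme Plumbing was friendly and fast, but billing was slow."
    print(extract_entities(review))  # ['Acme Plumbing']
    print(sentiment_score(review))   # 1  (friendly + fast - slow)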

With unstructured data steadily becoming part of the equation, data leaders need to find a better way to organize the unorganized without relying solely on the traditional methods of the past, because those methods won’t work on all of the data. A better process or system that could manage much or all of our unstructured data is certainly at the top of the data wish list.

3. A Global Way to Share Insights

Many countries around the world are considering legislation to ensure certain types of data stay within their borders. They do this out of security concerns, which are certainly understandable. They’re worried about cyber-terrorism and spying and simply want to maintain their sovereignty. Not surprisingly, it’s getting harder and harder to know what you may permissibly do in the global arena. We must be careful not to create “silos” of information that undermine our ability to use information, even as we carefully control the behaviors that are undesirable.

There’s a convention in the scientific community: when you make a discovery, you publish your results in a peer-reviewed journal for the world to see. It’s a way to share knowledge for the greater good. Of course, not all knowledge is shared that way. Some of it is proprietary, and data commonly falls into the category of knowledge that is not shared. But data can be very valuable to others and should be shared appropriately.

That concept of publishing data is still confusing and often debated. Open data is one example, but there are many more nuanced approaches. Sharing data globally requires a tremendous amount of advice and consent to be done permissibly. The countries of the world have to mature in allowing the permissible use of data across borders, in ways that address our concerns about malfeasance but do not undermine the human race’s ability to move forward in using this tremendous asset it is creating.

4. Breeding a Generation of Analytical Thinkers

If we are going to create a better world through the power of data, we have to ensure our successors can pick up where we leave off and do things we never thought possible. As data continues to grow at an incredible rate, we’ll be faced with complex problems we can’t even conceive of right now, and we’ll need the best and brightest to tackle these new challenges. For that to happen, we must first teach the next generation of data leaders how to be analytically savvy with data, especially new types of data that have never been seen before. McKinsey has predicted that by 2018, the U.S. alone may face a 50% to 60% gap between supply and demand for deep analytic talent.

Today we teach our future leaders the basics of statistics. For example, we teach them regression, which is based on longitudinal data sets. Those are certainly valuable skills, but they don’t teach students how to be analytically savvy with new types of data. Being able to look at data and tell a story takes years of training – training that is just not happening at the scale we need.

High on the wish list for all data stewards – and really organizations across the globe, whether they realize it or not – is for our educational institutions to teach students to be analytical thinkers, which means becoming proficient with methods of discovering, comparing, contrasting, evaluating and synthesizing information. This type of thinking helps budding data users see information in many different dimensions, from multiple angles. These skills are instrumental in breeding the next generation of data stewards.

Does this reflect your own data wish list? I hope many of these will come true for us in 2016 and beyond. Until then, wishing you the very best for the holiday season…

3 Reasons to Be Thankful for Data

What better time to think about all that we’re thankful for than Thanksgiving? For businesses, there has never been a better time to be grateful for data and all it can do to help deliver new opportunities. Like a moist, juicy roast turkey, data is the one thing everyone seems to be eating up. Let’s look at three reasons why data is so delectable. We promise you won’t feel sleepy after consuming this list.

 

Bountiful Amounts of Data

The Pilgrims arrived in the New World facing a particularly harsh winter, which made it very difficult for them to find food and shelter. Of course, we know they persevered despite the shortage of resources once they learned to cultivate crops that produced an abundance of food – enough not only to survive, but to thrive. Fast-forward to today, and we gorge ourselves on turkey and various complex carbohydrates in celebration of the early settlers’ efforts.

Like the sustenance which was once so hard to come by, data too has gone from scarce to abundant, and there’s plenty of it to feast on. Just as the Pilgrims struggled to live off the land, data scientists once had to work especially hard to cultivate meaningful bits of data. When data was scarce, they had to extract, track and survey to get at it. The objective was always to find out how and where to get more of it. Today, data is everywhere and the new goal is understanding how to make sense of it.

Technology has been one of the biggest catalysts of the data explosion. With the digitization of business, the amount of information we are collecting is growing so large that there are debates on just how big it is – and it has probably grown even more since you read that sentence. Both structured and unstructured in nature, this wealth of information has made it possible to produce insights and achieve outcomes that were previously inconceivable. Businesses can better identify trends and risk, organizations can tackle health and wellness issues, and governments can solve economic and social challenges.

While we should certainly be grateful for the abundance of data, we must be careful how we use the information. It is important not to overindulge or hoard it. Instead, we must recognize the type of data that will sustain us and avoid the empty calories that may lead us astray. Just as the Pilgrims planted the right seeds to bring them sustenance, we must choose the kernels of data that will drive meaningful value and insights.

 

Data Freedom

The Pilgrims came to America from England in search of religious freedom. They yearned for an open society characterized by a flexible structure, freedom of belief, and the dissemination of information. We are witnessing a similar evolution in the way data is accessed and shared. The concept of open data is commonly defined as making a certain piece of data free to use, reuse, and redistribute – subject only, at most, to the requirement to attribute and/or share alike. In other words, we’re at a point in history where some information can be freely used to solve key problems or make progress toward specific goals.

There are many examples of data that can be openly shared. However, it’s not just the numbers, but the insights that come along with them, that offer the most benefit when freely distributed. This concept offers benefits across both the private and public sectors. Businesses can gain a new level of transparency into opportunities for services and goods, make better decisions based on more accurate information, and reduce data conversion costs. But perhaps the biggest advantage of an open data ecosystem is for individual citizens. That’s because the sharing of information between governments can do everything from increasing economic activity to addressing natural disasters more swiftly and even reducing health issues.

There are several types of data sharing among governmental functions. First, there is the sharing of data among governmental agencies within a single country. Second, there is the sharing of data across borders between national governments. And lastly, there is the sharing of data between businesses and government – the voluntary sharing of data beyond legal reporting obligations.

So what exactly should governments be sharing? Observations are crucial – such as the National Weather Service issuing a hurricane watch. It’s about sharing conclusions that can help different agencies better prepare for that projected weather event. Ultimately, this is the value of open data: multiple organizations, with mutual interests, sharing their independent observations in a manner that lets each organization draw more accurate, informed, and insightful conclusions.

Unfortunately, in many cases the decisions made by government officials are not based on all of the information that might be pertinent to the situation. The same goes for businesses, which are somewhat reluctant to let go of their first-party data. With the freedom and ease to share information now at hand, we have the opportunity to achieve maximum economic and social benefit.

At the end of the day, data is nothing without analytics to help make sense of it. We should always be cognizant of the ways in which specific pieces of shared data are used to address specific questions and help establish new ideas. The Pilgrims likely did not use all of their food sources to cook a single meal. We shouldn’t use all the data available to solve a problem just because it’s in front of our faces.

 

A Brave New World of Data

As hard as I tried, I could not come up with a very clever parallel between the Pilgrims and the Internet of Things (IoT). But this new buzzword represents such a major data opportunity to be thankful for, and, well, the Pilgrims had, umm, things, so here we go.

The IoT refers to the growing network of interconnected objects – devices and machines that were not traditionally connected but are now aware of, and discoverable to, other devices and machines around them. It’s mobile, virtual, and instantaneous. Data can be gathered and shared from anything, from a car to a refrigerator, which means we’ll be witnessing a huge increase in the amount of data being generated – huge streams of data that will enable new types of insight. We now have an opportunity to organize this information in compelling ways to reach new conclusions.

The IoT has the opportunity to fundamentally change industries – from the automotive industry, where new data signals from cars may help improve safety conditions, to the supply chain, where the real-time passing of information can avoid disruptions in manufacturing. Organizations will quickly realize transformational change in their business models, given the rate at which digital technologies are connected and constantly evolving.

As beneficial as the IoT will be, it will not flourish without the right data management and analytics capabilities. Organizations will need to invest time and resources in order to mine valuable insights from the data generated by the interactions that occur between machines.

 

In Summary

These are just three examples of how data is changing the face of business and, frankly, society. There are certainly countless other reasons to be thankful for data, depending on what your business goals are and what you want to achieve. As I’ve noted, within each of these instances, while there is much to be thankful for, it is vital that we be cautious and smart when taking advantage of new data opportunities. Just like the Pilgrims, we are using this new frontier to create a new world of endless possibilities.

Data: Trick or Treat?

Come October 31, hordes of miniature ghosts, ghouls, and goblins will come knocking at your door for some sugar-filled treats. Of course, it’s not what it seems. You know that behind the colorful masks and spooky costumes stand harmless children celebrating Halloween.

The question is, can you say the same about your data when it appears in front of you? Data, too, often is not what it appears at first sight.

Unlike enthusiastic trick-or-treaters roaming the streets in search of candy, data does not intentionally wear a disguise to trick you, yet so many of us are fooled by it on a daily basis. It’s no surprise that data can often blind us to the truth – there’s just so much of it out there. With the amount of data growing exponentially each and every day, it’s often hard to find time to make sense of it all. Instead, we take what it says on the surface without considering whether or not we are asking the right questions of the data.

Take unemployment, for example. After one of the worst recessions in modern history, we’re now hearing much celebration in the media about how unemployment is down to near-record lows – 5.1% as of this writing. That certainly seems worthy of praise; after all, the data doesn’t lie. But while the number is technically right, it does not factor in a number of variables that may actually change the overtly upbeat conversation about the current economic climate.

Please don’t think of me as a ‘Debbie Downer’ trying to rain on the recovery parade, but this is a classic example of the data not telling the complete story. The headline unemployment rate counts only people who are out of work and have recently looked for a job. That, of course, can be misleading, since discouraged workers who stop searching drop out of the labor force and are no longer counted. It also does not take into consideration those who may be working part-time jobs but are actively seeking full-time work to support their families and pay back student loans – the same loans they took out in hopes of landing full-time jobs.

These are the types of factors we need to look at in order to pose the right questions before we take the data for what it’s worth. Of course it’s easy for politicians to look at the numbers without digging deeper because it makes them look good.

“It’s human nature,” says Anthony Scriffignano, Dun & Bradstreet’s Chief Data Scientist. “We have a tendency to try and organize the world around us. It’s how we probably have survived. The world around us is so chaotic so we try to find patterns in it; we try to find things that we think are there and want to believe.”

Unfortunately, Scriffignano believes, we tend to use data to make decisions based on ego or make rush judgments in response to answers we believe others want to hear instead of digging deeper to discover what really lies underneath the data. “There’s a name for it,” he says. “It’s called confirmation bias.”

Getting back to the unemployment example, it’s really not ethical to make a broad-based assumption from the data without looking at other evidence – not when that assertion clearly contradicts what many people are actually experiencing. Unfortunately, there are countless examples of this happening all the time, and in much more precarious instances across business and government.

As Gartner’s senior vice president and global head of research said at this year’s Gartner ITxpo, “Data is inherently dumb. It does not actually do anything unless you know how to use it.” That doesn’t stop most of us from seeking out even more data when we don’t see the answers we want. But reaching an honest conclusion goes beyond data volume. It takes an inquisitive approach and relentless curiosity to turn data into dependable insights.

Don’t be tricked by taking data at face value. Look deeper.

System Integrators Fuel Customization

Home-builder D.R. Horton is the largest residential construction company in the United States. Like many companies, it has optimized its business model using simple principles: Build a simple product, make it customizable and make it easy to support.

With this strategy, D.R. Horton has realized 84 consecutive quarters of growth. There are a variety of reasons the company is thriving, including strong leadership. Let’s take a closer look at its three principles.

Simple Product: D.R. Horton sells a simple product: a home. Its core competency is building homes. It usually offers three or four models in each suburban community it serves. Each home is built from the same materials and designed by the same architect. Each caters to a spectrum of buyers, including those willing to pay a premium for high-end finishes, upgraded appliances, and designer paint colors. And that is where it ends.

Customizable Aftermarket: There is a cycle of goodness around each new housing development. Retail stores like Home Depot move in, ready to serve the needs of new homeowners. Local landscapers and service providers thrive with new business, outfitting new homes with different fixtures or adding a deck, a spa or a cabana. Spending by new homeowners stimulates the local economy, the town grows, and the housing market flourishes as new buyers settle in and existing home buyers return for more.

Maintenance Services and Annuities: Often new homes are part of a loosely organized homeowners’ association. Homeowners pay a monthly fee that covers ongoing maintenance of community assets like sidewalks, lighting and water, trees and grass. The business of collecting fees and distributing proceeds back to D.R. Horton, the city, and contractors is outsourced. The fees generate revenue like a silent tax on buyers, who are beholden to their basic desire for a nice park, clean sidewalks, ample street lighting, and cut grass.

Applying the Lesson Plan

The case study of D.R. Horton is interesting because there’s a strong parallel between how it approaches the housing market and how Dun & Bradstreet addresses the market for commercial data and analytics.

Product: D&B sells products such as D&B360, DNBi, and D&B Direct 2.0. Features and functions cater to use cases and price points, with customers willing to pay a premium for confidence in the veracity and richness of our commercial business insight. The architecture for each product is standardized, the materials largely the same.

Customizable Aftermarket: When buyers want customization, we leverage our partners to fulfill those specific needs. Our system integrator (SI) partners – part of the Dun & Bradstreet Global Alliance Network program – have the specialized skills needed to stitch together the fabric of software and applications, Dun & Bradstreet commercial data, and the many workflows and business processes that fuel an enterprise. System integrators develop a myriad of features our customers want.

Maintenance Services and Support Annuity: At its core, Dun & Bradstreet maintains and supports products like D&B Direct 2.0, DNBi, and D&B360. As more customizations glom onto our products, we increasingly turn to SIs to maintain the growing labyrinth of custom applications and services hanging off our products. SIs collect a recurring fee to maintain these enhancements, and customers are happy because someone is servicing their ongoing need for customization and support.

The business potential here is the same: a virtuous cycle that connects buyers to the products and services they want and need. The markets are different – homes vs. data – but the approach is similar. Figure out how much of the buyer’s need your business can address, and then refer, recommend and partner to make sure your product is going to market with the support and services needed to make it a ripping success. Because that’s how markets – and economies – grow.

Image credit: Brock Builders

 

Data as a Service: From Flat Files to Real-Time Insight

Time is money. So, increasingly, is data. Here’s the catch: data’s value actually ‘rusts’ over time. Consider this: every 60 minutes, businesses are hit with an average of 399 lawsuits, D&B research shows. In the same amount of time, 148 new businesses are started, while 9 file for bankruptcy – and that is just in the U.S. And as business processes continue to accelerate, data rusts faster and faster with each new quarter.

Yet many organizations still rely solely on data delivered in flat files via file transfer protocol (FTP). While this is appropriate for large-scale analyses, it’s no longer enough to compete effectively in a fast-changing, agile world, because it’s difficult to make good decisions based on information that may be months – or quarters – out of date.

Here’s why. FTP is still a fantastic way to process multiple files quickly, in one pass. But configuring these transfers requires manual intervention. In other words, they take time, often a couple of weeks. As a result, many overworked IT departments limit updates to once a quarter, or even less frequently.
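
For illustration, a batch pull of this kind often amounts to little more than a scheduled script. This is a minimal sketch; the host, credentials, and file names are hypothetical placeholders, not an actual D&B delivery setup.

    # A minimal sketch of a scheduled batch pull over FTP.
    from ftplib import FTP

    def pull_flat_file(remote_name: str, local_name: str) -> None:
        """Download one flat file from a (hypothetical) data provider."""
        with FTP("ftp.example.com") as ftp:  # hypothetical host
            ftp.login(user="acme_data", passwd="secret")
            with open(local_name, "wb") as f:
                ftp.retrbinary(f"RETR {remote_name}", f.write)

    # Typically run from a scheduler (e.g., cron) - and if that schedule is
    # quarterly, so is the freshness of your data.
    pull_flat_file("business_records_q3.csv", "business_records_q3.csv")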

Turning on the Spigot

Enter Data as a Service (DaaS), the latest entry in the constantly expanding arena of ‘as a Service’ technologies. Like its popular cousin Software as a Service, DaaS enables delivery of content in real time via the cloud. Think of it as a constant stream of data – what you need, when you need it – rather than an intermittent bucketful.

The D&B Direct 2.0 API is the engine that drives D&B Direct’s DaaS offerings. The API allows you to pour all kinds of business data – financial, supplier, customer, social – directly into business applications, in real time. Once you configure your initial API call logic in D&B Direct, you can receive ongoing updates instantly. No more latency periods for cleanup and integration. Meanwhile, you can be sure records are clean from the moment they enter your systems.
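
To make the pattern concrete, here is a minimal sketch of what an on-demand record pull looks like in code. The endpoint, parameters, and response fields below are hypothetical placeholders for illustration, not the documented D&B Direct 2.0 contract.

    # A minimal sketch of an on-demand record pull from a DaaS-style REST API.
    import requests

    API_BASE = "https://api.example.com/v2"  # hypothetical DaaS endpoint
    API_TOKEN = "your-token-here"            # credential issued at registration

    def fetch_company(duns: str) -> dict:
        """Fetch the latest record for one business at the moment of need."""
        response = requests.get(
            f"{API_BASE}/organizations/{duns}",
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"fields": "name,address,riskScore"},  # hypothetical fields
            timeout=10,
        )
        response.raise_for_status()  # fail loudly rather than use stale data
        return response.json()

    record = fetch_company("123456789")  # placeholder DUNS number

The contrast with the FTP sketch above is the point: instead of waiting for the next quarterly file drop, the application asks for one record exactly when a decision needs it.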

In other words: all fresh data, all the time. And the potential uses are almost as boundless as data itself.

  • Deal-making due diligence
  • Credit portfolio management and collections prioritization
  • Integrated financial decision-making
  • Compliance
  • Source-cost optimization, and supplier and distributor performance
  • Order management
  • Segmentation and targeting optimization
  • Advanced marketing automation and nurture
  • Social and digital intelligence
  • And many more

Keep the Baby and the Bathwater

Best of all, you can move your organization toward DaaS without forgoing the convenience of FTP transfers. For example, if you’re performing a system migration, flat files remain an ideal way to match and cleanse large volumes of records at once.

In fact, many D&B customers make a gradual transition to DaaS, using flat files for initial cleansing of large data sets, then keeping data fresh via continuous updates in real time with D&B Direct. In other words, it’s a fast and efficient way to give your teams the right data at the point of decision – whenever and wherever they need it.

Cloud-Based ERP Gains Ground

This blog post originally appeared on the Bizmology blog on Nov. 4, 2014.

The enterprise resource planning (ERP) software market has traditionally been dominated by Tier 1 vendors such as SAP and Oracle. However, the emergence of cloud-based, SaaS ERP applications has led to Tier 2 players making inroads through innovation and increased market share, which has resulted in fierce competition. Among the fastest-growing participants in the ERP space are Tier 2 players such as Workday and Workforce Software. Learn more about trends and opportunities in our new Enterprise Resource Planning Software industry profile.

Enterprise Resource Planning Software ($25 billion annual global revenue) is one of six new industry reports added by First Research editors in October. First Research has an ongoing process of adding new industry content; see my colleague Amy Schein’s recent post to learn about profiles added in September. First Research customers use our industry insight to better understand business in a given market.

Take a look at some of the trends, challenges, and opportunities from our newest profiles.

Accounting & Finance Software (U.S. is a large market)

Business Intelligence — Interest in business intelligence gives accounting and finance software vendors the opportunity to offer tools that can help customers collect and organize data and use it to improve their businesses. For example, clients may want to generate custom reports or dashboards incorporating financial metrics. Business intelligence and financial management software may be sold together or integrated into broader enterprise resource planning (ERP) applications.

Asphalt Products Manufacturing ($22 billion annual U.S. revenue)

Roofing Material Trends — Asphalt shingles will remain the dominant roofing material in new residential construction. But the use of modified bitumens, which are reinforced sheets of glass fiber, polyester, and/or polyethylene that have been factory-coated into roll roofing, has grown rapidly. Metal roofing today is well-accepted in commercial buildings and becoming more common in residential structures. Energy-saving cool roofs, which use highly reflective materials instead of sun-absorbing black materials, are gaining popularity in some areas as well, creating competition for asphalt roofing product companies.

Database & File Management Software (U.S. is a large market)

NoSQL Development — Although relational data management remains the most prevalent model for databases, adoption of nonrelational, or NoSQL, databases is increasing. Proponents of NoSQL systems tout greater scalability and flexibility, easier administration, and less expensive hardware requirements compared to relational DBMS. Providers of relational DBMS, some of which also provide NoSQL products, argue that the models are complementary, and relational databases remain the preferred technology for applications such as transaction processing and large-scale data warehousing.

Diagnostic Substance Manufacturing ($13 billion annual U.S. revenue)

Infectious Disease Outbreaks — The use of diagnostic tests is a crucial time-saver in the effective management and containment of infectious disease outbreaks. Increased global cases of HIV, flu, and malaria during 2013 resulted in a 17 percent increase in sales of infectious disease tests for one major manufacturer. International aid organizations such as the World Health Organization and the Gates Foundation promote the use of point-of-care tests to monitor disease outbreaks in developing nations.

Multimedia, Graphics & Publishing Software ($15 billion global market)

Data Visualization Tools — As more content is published online and via social and mobile platforms, the use of charts, graphs, maps, and infographics is becoming a more popular way to present data. To meet demand for visual data, multimedia, graphics, and publishing software providers are developing new tools. Recent examples include ZingChart, a tool for building interactive Flash or HTML5 charts, and InstantAtlas, which enables the creation of visualizations around map data.

More about First Research

First Research now offers insight on more than 450 industries to help guide professionals in their business decisions. Profiles contain a comprehensive set of data in an easy-to-digest format. In a hurry? Our call-prep sheet is a popular way to quickly get up to speed on an industry.

Where can you find First Research content? Visit the First Research website and learn about getting a First Research add-on to a Hoover’s subscription or delivery via D&B CRM solutions.

 

Come See Us at Oracle OpenWorld 2014

D&B is charged up for this week’s Oracle OpenWorld conference in San Francisco. First, there’s our new partnership with Oracle. That work is coming to fruition quickly – Oracle today announced Oracle Data as a Service (DaaS) for Sales, a new Cloud component that includes D&B commercial content. Oracle’s goal? Meet its customers’ ongoing needs to qualify customers and improve the productivity of sales and marketing team members, for ultimate profitability.

Expect a lot of talk at the show about the potential of DaaS to improve data quality inside applications. We’ll also be participating in a Content Session with Oracle called Big Data and DaaS in Business-to-Business Applications. D&B’s General Manager and EVP of Partner Solutions, Mike Sabin, will be speaking alongside Oracle’s head of DaaS, Omar Tawakol, and Niraj Deo, Senior Director of Oracle Cloud Product Management.

Make time in your schedule to listen in on the conversation: Wed. Oct 1 at 11:30am-12:15pm, Room 2002, Moscone West. And be sure to see Oracle’s demonstration center in Moscone West, 2nd floor to learn more about Oracle’s DaaS plans, and check out these other Oracle DaaS sessions.

Stop by the D&B booth (#1637) in Moscone South to see how D&B’s commercial data will flow into Oracle’s Cloud applications, and see our early work with Oracle’s in-house tool, Value Navigator. We’ll also be giving away credit card-sized phone chargers and snacks. Don’t be shy! We’re always looking to have great conversations with folks interested in data quality.

D&B and Oracle Team Up to Enrich Cloud-based Business Apps

D&B is taking its new “outside-in” approach seriously, signing a strategic partnership agreement with Oracle that makes D&B data available to sales and marketing professionals through Oracle’s cloud-based Data as a Service for Business (Oracle DaaS for Business).

The combined solution is designed to create an ‘intelligent data layer’ — a way to add rich sources of third-party data, including D&B’s commercial business data, social media and professional contact information, to Oracle’s cloud-based applications.

The pre-integrated solution places the power directly in the hands of end users, enabling them to access data in native formats and to view high-quality data and insights within their application ecosystem. Using Oracle DaaS for Business with D&B data, companies can make headway on key strategic initiatives such as:

  • Data-driven decision-making
  • Omni-channel marketing
  • Gaining a 360-degree view of the customer
  • Sales optimization
  • Social media marketing
  • Global data management

The combined solution will be offered first to Sales and Marketing Cloud customers and then will be extended to the full suite of Oracle DaaS applications.

Check out today’s press release to find out more about the strategic partnership between Oracle and D&B. If you’re heading to Oracle OpenWorld, stop by the D&B booth for a sneak peek of the new capabilities in Oracle DaaS for Business.

Image credit: Theaucitron

 

Data as a Service Comes of Age

D&B has been talking about Data as a Service (DaaS) for more than a year now. And we’re thrilled that more and more companies are embracing the concept.

Think about it: what makes more sense than cleaning and verifying the data sitting in your data stores as your teams use it? Better yet, DaaS makes data management snarls invisible to end users – they just get accurate data about companies, contacts, and customers, right in the applications they already use.

Research firm Ovum agrees with the DaaS model. In a white paper published this summer, Data-as-a-Service: The Next Step in the As-a-Service Journey, it calls DaaS the “instinctive next step in the evolution of as-a-service”.  With data volumes and complexity on the rise, the DaaS approach shields end users from IT complexity and offers easy-to-use, function-focused tools that deliver the true value of the data to the business user. A good DaaS solution can remedy significant data challenges, like a lack of common identifiers across data sources and low data quality, so data analysts can sidestep the noise and get to actionable insights. By decoupling data from applications, the DaaS model makes it possible to clean up data in any application, on any platform.

The time is right for DaaS to enter the limelight, for one reason: there is competitive advantage in data. Oracle knows it. Earlier this year, the company added DaaS as a “fourth pillar” of the Oracle Cloud, so now it talks about SaaS, PaaS, IaaS, and DaaS. You can read more about Oracle’s vision on the Oracle DaaS microsite.

Through its partnerships, D&B has real-world DaaS solutions in the market today. Microsoft, Sugar, and Salesforce Data.com are all building momentum with their customers. Together, our companies can give end users access to better quality, squeaky clean data within CRM systems – saving them time, helping them build better relationships and creating happier customers. And that makes it a no-brainer as a valuable next step for any organization looking to bring data-driven methodologies to sales and marketing functions.

Image credit: Lukas Kastner