A Q&A with Red Hat’s Director of Operations Data Management
Suzy Steele is honest with herself. “I own the data at Red Hat, the operations data, the strategy, and how we store the data,” says the open source technology leader’s operations director. “But I’m not a data person. I didn’t come with a data background.”
She’s quick to point this out, but not as a confession. She sees it as a key advantage in her work. “I came with a business background, and I think that’s critical because that aligns to where it is that Red Hat wants to go,” she says. “It’s not just simply a matter of having clean data because you’re a data purist, and because you want everything to tick and tie across the systems.”
So, yes, clean data matters. But how much? Steele says the answer to that is driven by a company’s ability to connect data to business growth goals.
“Every data person, data steward, data leader has a responsibility for understanding what it is the organization needs,” Steele says, “and make sure that they’re providing that service.”
Steele leads a Red Hat data strategy focused on making these connections. In this edition of Data Dialogue, she highlights what they’ve accomplished – and where they’re headed.
You are operations director at Red Hat. What does Red Hat do, and why do their operations need direction?
Red Hat is the leading open-source technology company. We deal in the infrastructure side of technology. We don’t do the applications or the cool things that people interact with. We do all of the underlying stuff that is necessary to make technology happen, like software and operating systems and cloud enablement. Those are the new things that are coming up now.
So in operations specifically at Red Hat, we don’t produce something that gets shipped out to a customer. Instead, we make sure that customers get access to their technology virtually, and we provide them with the support, the updates, and the ticketing resolution that they need to make sure that their systems are solid.
How has the role evolved?
At every minute, it is different, especially right now, with a huge shift away from traditional on-premise technology. Under that model, a company has servers sitting in a server farm and runs applications on top of them. That’s very expensive and very laborious. Now the technology is shifting toward cloud enablement.
So you as a customer, you don’t have to have a server sitting on-site if you want to launch a new technology. You can go out to a cloud provider, and you can get access to it, and you can turn it on and turn it off at will, and you can scale your business by purchasing bigger chunks of that cloud technology.
But it means that we’re running very fast. We have to understand who that customer is, the partners that they’re engaging with, how they want to buy with us, and we have to have it now. We have to understand it immediately because that customer wants to purchase something three seconds from now and not three weeks from now.
How has it shifted your customers’ buying cycles?
Open source has become more of a viable solution. People are understanding the value that it brings. And so, before we had major customers that we would engage with, and we would sell big chunks of technology to them because they understood us. And now it is becoming more mainstream, so you have more customers that are opening up to it, and again, it’s more on demand, so customers are saying, “OK, I need Red Hat’s products and services, and I need it right now because I need to deliver something on the cloud.”
Before, you’d be focused on these big corporate relationships, and spend months building a solution, and let’s get you some people on-site to help you figure out what your solutions need to be. Now it’s more on-demand. Customers are saying, “I need your products and services because I understand, and I’m good with it, and I’m good with the open source technology. Just give me what I need right now.”
So, not only does it shift your buying turnaround, but it also shifts the level of engagement with your sales reps. So it is less about holding that customer’s hand now and more about, how do we enable our partners to sell and support them because more than likely these customers are coming in through alternate routes.
Given that dynamic, how do you make sure you understand their needs?
Different from the traditional model, where you would sit down with customers and understand where they’re going, it requires us to have a deeper understanding of the technologies that are out there and what the customers are running before we ever engage in that conversation with the customer. We have to know what products they’re running, what kind of services they offer, and what they’re looking for.
And we have to do that information gathering ahead of time, so that when you sit down with them, you’re ready to hit the ground running. Our portfolio is very robust. You’re not coming out to pitch a solution that has absolutely nothing to do with what they need, or you may pitch a solution that they think is what they need, but you could do so much better of a job if you understand where they’re heading in the future.
So we rely more heavily on data that gives us not just an understanding of the customer, what their industry is, and how they operate, overlaid with our understanding of where that industry is going. We also rely on third-party vendors that will tell us things like what technologies they’re running, what spend is coming up, and what they’ve said in analyst calls.
You want your data to tell you where the customer is headed?
Yes, end of story, that makes sales a lot easier. But specifically at Red Hat, our focus on data in the past couple of years has been on two things: one, what are the things that we could be doing to better engage and align with where the customer wants to head, and two, where can we invest as a result. So right now one of our focus areas is, we really want to build models. We want to understand what our engagement with our customer looks like. What does that model of that customer look like, and how can we emulate that someplace else?
And so, one of the problems that we have is getting that aggregated view of the customer, and linking it in with our own data, our Red Hat transactional data, so that we can build that model and be able to answer questions like, if a customer buys product x and product y, they’re seven times more likely to delve into this new, emerging market.
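A figure like “seven times more likely” can be read as the lift of an association rule computed over exactly the kind of aggregated transactional data Steele describes. A minimal sketch of that calculation; the product names and sample accounts below are illustrative assumptions, not Red Hat data:

```python
# A minimal sketch of lift: how much more likely a customer who owns an
# antecedent product set is to also own a consequent product, versus the
# base rate. Product names and records below are illustrative assumptions.

def lift(transactions, antecedent, consequent):
    """Lift = P(consequent | antecedent) / P(consequent)."""
    with_ante = [t for t in transactions if antecedent <= t]
    p_consequent = sum(1 for t in transactions if consequent in t) / len(transactions)
    p_given_ante = sum(1 for t in with_ante if consequent in t) / len(with_ante)
    return p_given_ante / p_consequent

# One set of owned products per customer account (toy data).
accounts = [
    {"RHEL", "OpenShift", "Ansible"},
    {"RHEL", "OpenShift"},
    {"RHEL"},
    {"OpenShift", "Ansible"},
    {"RHEL", "Satellite"},
]

# Accounts owning RHEL + OpenShift adopt Ansible at 50% vs a 40% base
# rate, so the lift is about 1.25.
print(lift(accounts, {"RHEL", "OpenShift"}, "Ansible"))
```

A lift well above 1 is what turns a hunch into a vetted “customers who buy x and y” claim, which is why the mastered, linked view of the customer has to come first.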
How are you creating that integrated view of your data?
We’re in the infancy of that. We’ve just finished a couple of years’ worth of projects to get our data mastered, so that you remove the data quality and the data mastery piece from the equation. A couple years ago, we wanted to get to predictive modeling, but we’d have 17 different systems that contain critical information about our customer, and those customers don’t even remotely look the same across those systems. So you’re relying more on hunch-based decision-making than on actually vetting it.
What we’ve done is spend the last few years developing a mastered system of our customer data, so that we have one centralized hub in a hub-and-spoke type of model, and it delivers the right information out to the downstream systems from the get-go. This is not reactive; this is proactively delivering that information about the customer to the downstream systems. And then, when the time comes and you need to link those different systems together and do your shopping list of, I want to measure from here and I want to measure from there, the system’s already ready for that.
So what we’ve provided to Red Hat to enable them to get to that state is the clean view of the customer. We’ve overlaid key, critical vendors that help us provide that clean information, like Dun & Bradstreet, and now the organization is positioned so that we’ve got the right foundation. We have the data in a very pure, very clean form. And now we’re, literally, within weeks of starting to be able to do the magic of, let’s build these models and see what comes out of it.
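The mastering step Steele describes, collapsing the differing views of one customer across many systems into a single hub record, can be sketched with a toy matching rule. A real MDM match engine (or a vendor like Dun & Bradstreet) uses far richer matching; the normalization list, similarity threshold, and sample records below are illustrative assumptions:

```python
# A toy sketch of customer-record mastering: collapse name variants from
# several source systems into one hub record. The suffix list, threshold,
# and sample records are illustrative assumptions, not a real MDM engine.
from difflib import SequenceMatcher

DROP = {"inc", "inc.", "llc", "co", "co.", "corp", "corp.", "corporation", "ltd"}

def normalize(name):
    """Lowercase, strip commas, and drop common legal suffixes."""
    tokens = name.lower().replace(",", " ").split()
    return " ".join(t for t in tokens if t not in DROP)

def same_customer(a, b, threshold=0.85):
    """Treat two names as one customer if normalized similarity is high."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

def master_records(records, threshold=0.85):
    """Greedy clustering: each (system, name) record joins the first
    cluster whose representative name it matches, else starts a new one."""
    clusters = []  # list of (representative_name, member_records)
    for system, name in records:
        for rep, members in clusters:
            if same_customer(rep, name, threshold):
                members.append((system, name))
                break
        else:
            clusters.append((name, [(system, name)]))
    return clusters

records = [
    ("crm",     "Acme, Inc."),
    ("billing", "ACME Inc"),
    ("support", "Globex Corporation"),
    ("crm",     "Globex Corp."),
]

for rep, members in master_records(records):
    print(rep, "->", members)
```

The hub-and-spoke part is then a matter of pushing each mastered cluster's representative record back out to the downstream systems, rather than letting each spoke keep its own variant.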
How has Dun & Bradstreet helped with your strategy?
When we designed the architecture of these systems and how we feed data down to the downstream systems, Dun & Bradstreet helped us with that design. We actually had solution architects from Dun & Bradstreet that came in, participated in design, participated in the testing effort to say here’s what’s working well and here’s what’s not. And then they’re also understanding where we have our gaps, so that as we’re looking at the next phases of these projects, or the next pieces of data that we want to integrate, D&B already knows what’s there and knows where we’re heading, and they can say, “Hey, no, no, no, wait. We’ve got another product over here that you should look at,” or, “We’ve got a better way of doing it.”
And the partnerships that you’re building through the Data Exchange are also helping this a lot. For instance, there’s a vendor that you guys have signed up with that gives us technology information about the customer: what servers are they running, what’s their IT spend? And that’s simply a matter of clicking a button and getting that information in, instead of going through a massive cleanup or integration with a completely different vendor.
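The “click a button and get that information in” step amounts to merging vendor enrichment attributes (technology stack, IT spend) onto the mastered hub record without overwriting the fields that have already been cleaned. A minimal sketch; the field names and payload here are hypothetical, not an actual Data Exchange schema:

```python
# A minimal sketch of merging third-party enrichment into a mastered hub
# record without clobbering already-cleaned fields. Field names and the
# payload are hypothetical, not an actual Data Exchange schema.

def merge_enrichment(hub_record, enrichment, protected=("duns", "name")):
    """Copy vendor attributes onto the hub record, skipping mastered
    (protected) fields so enrichment never overwrites clean data."""
    merged = dict(hub_record)
    for key, value in enrichment.items():
        if key not in protected:
            merged[key] = value
    return merged

hub = {"duns": "123456789", "name": "Acme"}
vendor_payload = {"name": "ACME INTL", "servers": 42, "it_spend_usd": 1_500_000}

# The mastered name survives; the new technology attributes come in.
print(merge_enrichment(hub, vendor_payload))
```

Keeping a protected-field list is one way to honor the governance point Steele closes on: enrichment feeds add attributes, but the mastered identity fields stay under the hub's control.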
That’s the value that we see. You guys are helping to drive where we want our data technology to go.
Steele leaves us with one very important thought. She reminds us that for all the value and investment in data itself, it is nothing without having a robust governance framework built into your process. Without it, she says, “The minute you go live, the minute you clean something, you’re going to have a problem.”