CDO to CDO Podcast: Ray Deiotte

March 31, 2021 Chris Knerr

The CDO to CDO podcast is hosted by Chris Knerr, Chief Digital Officer of Syniti.

 

 


Subscribe Here:  Listen to all the episodes in the Syniti Podcast Series


 

 

Chris Knerr (00:23):

Hello, and welcome to the CDO Magazine interview series. I'm Chris Knerr, Chief Digital Officer of Syniti, a world leader in enterprise data software, and we're partnering with CDO Magazine, MIT CDOIQ, and the International Society of Chief Data Officers to bring you this series of interviews with thought leaders in data and analytics. Today, it's my great pleasure to welcome to the interview series Ray Deiotte, CDO of NetApp. Welcome Ray. Terrific to meet you.

 

Ray Deiotte (00:48):

Thanks Chris. It's a pleasure to be here.

 

Chris Knerr (00:51):

So Ray, I'd like to dive right in and get your take on streaming and IoT. Obviously, NetApp is a platform company, your products are core to the whole concept of data fabric, and for your clients you enable a huge volume of streaming data. For the audience, the IoT and streaming narrative, as you know, is very hot. What's your perspective? Should we believe the hype? Are your customers seeing value from the integration and utilization of streaming and IoT data?

 

Ray Deiotte (01:25):

Yeah. The hype is there. Where we need to keep working is on the use cases, rather than building data for data's sake. But as I look at the industries we work with, especially healthcare and life sciences, streaming and IoT are really starting to build a head of steam, because we're seeing more realtime actions that need to be taken, whether that's in response to COVID on the healthcare side of the house, or in response to production optimization on the life sciences side with our pharma clients. We're definitely seeing a rise in IoT, and it's starting to drive new decisions. Even internally at NetApp, we use a lot of our own capabilities across edge, core, and cloud to give our customers realtime insight into what their systems are doing and to get ahead of any issues we might see with those deployed systems.

 

Chris Knerr (02:26):

Okay. That's really helpful. I like the emphasis on use cases, and I like the clarity on good criteria. And if I follow your line of [inaudible 00:02:39] in real time, that's where we're going to get more bang for the buck.

 

Ray Deiotte (02:45):

Yeah. We need to look for that realtime value, but we have to bring the whole ecosystem together, because if we're generating data in realtime but have no way to consume it and really act on it in realtime, that becomes an issue. At that point, we're just collecting data for data's sake. So one of the partners we're working with now has a sneakernet problem that we're instrumenting with realtime IoT in order to accelerate them. But the reason we can accelerate them is that they're hungry for that data. They've already got the consumption mechanism built up, so the data will go right into that consumption algorithm and they aren't sitting around waiting for data anymore. Five years ago, they probably wouldn't have been ready to do that, and sneakernetting data into a batch so that they could do batch analytics was probably perfectly acceptable for their use case.

 

Chris Knerr (03:37):

Yeah. No, that's a really helpful distinction, and I agree. You've basically just simplified it: you've got to have the [crosstalk 00:03:45] update in realtime, but if you have no way of looking at it or bringing it to the point of consumption, visualization, and so on, there's not much point. I want to go back to your point about the whole ecosystem, because one of my observations from working in a lot of large enterprises is that... I think we've had a story in the industry, particularly over the last five to 10 years, that cloud makes everything simpler, that you have all these different technologies and everything's really interoperable. In real life, what I've found is that data integration and data interoperability continue to be a huge challenge for most enterprises.

 

 

So maybe, with respect to some of those examples, you could talk about what's significant from an IT ops standpoint. For example, I've got realtime streaming data about an asset, but on the backend I've still got to have an asset master, and I've still got to have master data management that's properly curated. Otherwise I might have a lot of neat data, but I don't even know what I'm looking at, and I have no way to tie it back to that broader functional framework and IT operational framework. Am I on the right track that that's still a problem, and what's your thinking about that?

 

Ray Deiotte (04:59):

Yeah. Interop and integration will always be a problem in any industry, because nobody has arrived at the panacea of one form of data to rule them all, if you will. But what you're getting at is really what's necessary across any data ecosystem, and that is a good set of governance. Understanding how that data is going to be mastered, used, curated, and stewarded through the system will help address all of the things you mention about interoperability, integration, and matching to a device master. All of those bits and pieces need to be formed under an enterprise governance program that then brings together your static data, your historical data, and your streaming data in a form that can be utilized to solve the questions of the day.

 

Chris Knerr (05:53):

Yeah, I'm 100% on the same page. The G word, governance, always continues to be a challenge. I've found extensively, again in enterprises, that a lot of education is still required, particularly at the senior executive level, that governance doesn't mean something slow and expensive; it actually means something fast. So maybe to paraphrase what you said, and I agree with this, streaming and realtime data actually underscore the need for governance. Does this make it an even bigger challenge as it relates to executive education and the process and organizational readiness of data operations and data governance teams to cope?

 

Ray Deiotte (06:49):

I think it does, because everything is becoming more and more instrumented. In healthcare, we look at consumer wearables and IoT devices, and the more we deal with healthcare at home and remote sensing on patients, the larger this problem becomes. So what it speaks to is not just the need for governance, but the need for, as you said earlier, more of an agile governance model, where we can adapt rapidly to those new tools coming in, whether it's even a new glucose monitor for our traveling nurses. All of those bits and pieces should fall into a dynamic governance model that takes systems thinking, marries it with agile development and agile integration, and does so in a way that solutions can be rapidly integrated and transform the way we're providing care, or providing capability to our customers.

 

Chris Knerr (07:51):

I'm going to put you on the spot. I love this idea of agile governance, but I think a lot of people would say that sounds like a paradox. I think you just gave some granular characteristics, but if you were to describe in your mind, what would be three or four major characteristics of agile data governance? What would they look like?

 

Ray Deiotte (08:13):

Yeah, it's all about taking it from a use case perspective. Instead of saying, "Hey, we need to master all of the data across the enterprise," which obviously is the end goal, it's about taking incremental steps, looking at use cases and the areas of the data they touch, and running the governance process on them. And when we talk about agile here, we're not talking about two to four-week sprints, but about something that behaves in a more agile way: very accomplishable, very repeatable steps to generate master data that falls under the governance of the organization. So the way we've implemented it in the past is that we start with use cases and generate what we think the governance should be around that information, how it should be curated, how it should be stewarded.

 

 

Then we take a subset of the data we know we have in that domain, we master it, we do everything we need to, and then we circle back iteratively and capture the lessons learned from building out that now-curated set of data and information that can be leveraged across the enterprise. We take those lessons learned and feed them back into this design thinking and agile process to really iterate and do this rapidly, so that you're not trying to take on the millions or billions of [inaudible 00:09:28] environment.

 

Chris Knerr (09:30):

I really like this idea a lot. I want to replay it with two thoughts. When I help people with data strategy work, I have a point of view that I'm not going to do strategy work for more than a few months, because it's basically not going to end up being meaningful; I have to get into doing some real-life project work, and that informs what we're doing. And then what I really like about this idea of agile governance is it's almost like the same one on [inaudible 00:10:00] the requirements. So we're absolving ourselves of the illusion, or delusion, that we can actually figure out a priori everything that we need to do.

 

 

We figure out enough to get going, we motivate it with some real-life work, and then we iterate from that. Over time, done correctly with the right kind of leadership and oversight, that should lead us to some medium-sized building blocks of a data governance framework, and to a way of explaining why governance doesn't mean something slow and ineffective and expensive. It actually means something built from the ground up in a way that's going to be useful and produce dollars in business value.

 

Ray Deiotte (10:42):

Yeah, absolutely. And the reason it's come about, Chris, is that when we look at everything that's going on, and I know we started with streaming, but as we move into the domain of augmented intelligence and advanced analytics, when we bring inference to streaming data, or to data generally, we don't necessarily have a well-governed, well-curated dataset on which to build that, especially in healthcare and life sciences. So the goal of this agile governance model is to go side by side with these high-profile, high-margin use cases around AI and advanced analytics, with both streaming and batch data, to help build up the foundation. That way you can make advances and build transformation into the operations of the organization without having to do an all-stop and build out the entire platform and foundation just to support your one or two AI or advanced analytics use cases. You can now do it simultaneously and start seeing the value of that advanced technology adoption faster than you could before.

 

Chris Knerr (11:55):

This is really interesting. Let me go a little further into one area of this. When I first started working heavily, and this is going six or seven years back, on what at the time was called big data, there was a whole school of thought of, I'd call it, if you build it, they will come. What you were calling before, data for data's sake. What I think I just heard you say, and I want to play this back, is that doing this correctly in an agile governance mode is almost like enlightened schema on read. That's schema on read in the sense of, I'll just dump everything and we'll figure out what we want to do with it later, but then you have a crayon drawing of the use cases you may want to pursue. You have some real-life proof points, so again, back to the requirements management point, you're neither assuming you can do everything a priori, nor assuming that if you just collect an infinite amount of data, you'll find something neat in it. Is that a fair interpretation?

 

Ray Deiotte (12:56):

Yeah, I think that's perfectly fair. And to your point, it allows incremental success to be driven by incremental progress. Eventually you'll get to the point where you can say, "I want to look at the entirety of our data and see if there are any unknowns lurking in there." But that's not where you start; you start with the value. And the value is in the problems that harass the organization day to day, the ones you can put a finger on but need to build toward, and you do that agilely, from both the governance perspective and the product development perspective.

 

Chris Knerr (13:26):

Yeah. No, that's brilliant. This is really interesting, and I really like your thought process on this. Let me shift gears, because you mentioned something else that intrigued me. When we started, you were talking about the lens you have on your customers at NetApp. So, a two-part question: does NetApp run on NetApp? And then maybe the other part of this is, as I said, I love talking to people who span the tech sector and the non-tech sector. What do tech sector folks know that corporate IT folks should know, and vice versa? If you could walk a mile in the other person's shoes, what would each person see in that alternate-universe mirror?

 

Ray Deiotte (14:14):

First of all, NetApp runs everything on NetApp. Everything that we put out, we run internally. So we're a fantastic story of drinking our own Kool-Aid, eating our own eggs, whatever metaphor you want to put in there; we do it and we do it right. And that extends from edge to core to cloud. We partner with all the hyperscalers, and we leverage our capabilities in the hyperscalers, in our data centers, and on the edge so that we can, like I said earlier, get ahead of any issues that might be coming. But having straddled the fence you described, as I look at corporate IT, I wish they would understand that the things the tech sector puts out are only as good as the adoption and the marriage of value in corporate IT.

 

 

When we go to a customer, we're not trying to sell a box or sell a [inaudible 00:15:15]. We're trying to work on a problem that has high value for the organization, and I think sometimes that translation is really lost. And vice versa, if we look at what corporate IT could tell the tech sector, it's: "Look for the places of value. Understand the industry you're pushing into. Know the personas and the problems those personas have, and then work that way." I think that's why my role at NetApp exists, and why the team I'm on exists, because we bring that corporate IT subject matter expertise around healthcare and life sciences back into the tech sector to help us better build products and solutions, and then marry that with the value generation we can do within our customer set. So it's all about communication, knowing one another, and knowing the business of one another, especially when it comes to what the tech sector should know about corporate IT.

 

Chris Knerr (16:14):

Yeah. No, that's fascinating, and that absolutely makes sense. I've had this observation a lot from a talent mindset: a lot of the time your rock stars end up being cross-functional folks, programmers who went to the business, business people who got interested in technology. I think the cross-pollination between the corporate sector and the technology sector is a really, really interesting lens on that.

 

Ray Deiotte (16:43):

Yeah. There was an article a handful of years ago now about the value of business translators, the people who understand enough of the tech side and enough of the business side that they can form that bridge. It was very much framed around the adoption of artificial intelligence and advanced analytics, but I think that paradigm now applies to all of tech. There have to be people who understand the implications for the business and understand the tech well enough to make that bridge and join those two pieces together in a very meaningful, effective manner.

 

Chris Knerr (17:21):

Yeah. No, I agree. I've often joked about what you hear from technology folks, especially corporate technology folks. If you go to internal or external conferences, a theme for a long time has been show me the money, focus on the value, what's the business value. And I feel like, maybe after many years of repeating that, we're finally getting to a breakthrough where all of the leaders I talk to are focused on this, and one would hope it's percolating down through the organization, both on the software company side and on the corporate IT side. You mentioned the importance of industry business value and use cases. Let me, as we're wrapping up here, talk about some fundamentals. If you think about the tactical lens, the human lens, the innovation lens, and what's happening with digital transformation over the next couple of years, how important is that, as they say when you're doing college applications, that pointiness around industry versus generalization at the technology layer or the platform level? Are there any thoughts or predictions in some of the spaces we've talked about that are top of mind for you?

 

Ray Deiotte (18:50):

Yeah. I really think that transformation and innovation, again, are all focused around data and data exploitation, but it's not going to look anything like it does today. There's an entire paradigm of people that has to change in those sectors in order to bring about the innovation that has already been embraced and is evolving in other vertical industries. Part of that is simply the regulation of healthcare and life sciences, but that can't be all of it, because you look at what's going on in fintech, which is just as highly regulated, or DoD and aerospace, which are just as highly regulated, and they're making tremendous strides in innovation. So I think what's going to happen over the next couple of years is that we're going to see a mind shift as people move into retirement, and people who are growing up with this technology of cloud and IoT and AI bring that early adoption, or that ease of embracing it, to the innovation landscape.

 

 

So we're going to see tremendous strides being taken in the augmentation of people and the augmentation of decision making, when there is too much data to consider. We talk a lot about exploring all of the data within healthcare or all of the data within life sciences. Well, that's great, but the next step and the next phase is to bring in all of the other data from outside your own little fiefdom and your own little ecosystem, so that you can look at things you could never look at before. That transformation, of technology but also of mindset from the people perspective, only exists in pockets today. So adoption of this ubiquitous 360-degree view of patient, provider, consumer, system, whatever you want to demarcate, is going to be the next drive.

 

 

The drive is to form up this complete continuum of knowledge about an entity that we can then exploit in any number of ways, whether it's precision medicine leveraging genomics, or population medicine and understanding risk factors that then impact the work being done on the precision front in healthcare. Bringing all of those pieces together with realtime streaming information, realtime sensing, and distributed care models is really going to transform the industry. But it has to start with the mindset changing, evolving into people who are ready to consume that and make those transformational changes.

 

Chris Knerr (21:32):

Very interesting. I have just an observation that I'll share. I have a life sciences background as well, on the manufacturing side, so I've spent a few years in the industry. One of the things, and I'll say this is a US-centric comment, but that's the context of the discussion: life sciences has regulatory firewalls, as you point out, but then you look at financial services and you're like, "They have the same thing." There's something about healthcare economics and implicit financial firewalls that slows down the pace of tactical innovation.

 

 

So without finger-pointing, I think it wouldn't be outrageous to suggest that, the way the healthcare economics system is set up today, there actually are disincentives to data interoperability. That's one observation. The second observation, which I agree with, is that if you go back to classic IEOR, in healthcare the only thing I can staple myself to is myself; I'm the unit of work. So I think your population demographics motivation for... maybe that's the breakthrough, where we as professional consumers of healthcare, which we all care passionately about, actually have to be the catalyst as humans to break through those artificial firewalls.

 

Ray Deiotte (23:02):

To your latter point, I think you absolutely see what we should be driving for, especially as consumers: to be the managers of our own data in the healthcare space. No longer should the IDNs, the hospital systems, and the academic research facilities be the shepherds, stewards, and curators of our data. Rather, we should be the stewards of our data ourselves. That way, the whole paradigm shift and monetary shift you alluded to in your earlier comment, about the difficulty of the fiscal and compensation models in healthcare, starts to change dramatically as you shift the nexus of ownership in healthcare. Now we can start looking at better optimization and better efficiencies, which is what we should be looking at in healthcare anyway. We know the Hippocratic Oath and compensation by the federal government or the payer side are typically in conflict with one another. But as we look at this, and at the way you can apply the data and the information the system has, driving efficiencies should negate all of that.

 

 

Being able to drive efficiencies and effectiveness by introducing augmentative technologies should get us past that point. I hate that excuse. I hate the excuse of, "Hey, we're trying to drive OpEx up by leveraging our CapEx models and doing all of these other things to make sure that our fiscals look right." Doing that by pushing more people through the door is one way to do it. The other way is by streamlining internal processes. Even by leveraging the data and the analytics we already have, we can really start to streamline things to the tune of millions of dollars a year, just by looking at simple use cases across the ecosystem of healthcare. So by bringing together that internal look at optimization and efficiency and then transitioning to a consumer-owned data model, I think we'll actually see more transformation in the healthcare space than we have in the past 100 years.

 

Chris Knerr (25:13):

I think you're spot-on, and it's going to be really fascinating to watch this evolve, because I see it, I think, the same way you do: it's like points of light in a constellation. I can't quite see the mythical figure yet, but it's emerging slowly over time. And I think this idea of the professional consumer of your own health data being the manager of, and accountable for, that data has to be a wedge that changes something about the economic model in a way that becomes a force multiplier for the operational efficiency, EBITDA, and growth lenses, for anyone working on the payer, provider, or manufacturing side. So I think all the pieces are coming into focus, and I think that, like so many things, data is going to be critical.

 

Ray Deiotte (26:15):

CMS has come out with its ruling on interoperability and data sharing and all of those things. All of those pieces now start, to your point, aligning into those constellations. We just haven't brought them all together yet, but I think that's on the horizon for the next decade.

 

Chris Knerr (26:32):

Yeah, absolutely. Well, I think that's a great place for us to wrap up. Can people find you online, Ray, if they'd like to connect with you?

 

Ray Deiotte (26:45):

I don't know what you want to call it, but I typically keep myself to LinkedIn. You can reach me on LinkedIn. I've got a blog running around that you can find via LinkedIn as well, and that's the best way to hit me up.

 

Chris Knerr (26:58):

Really a fascinating conversation. Terrific to meet you. Again, for our audience, my guest today was Ray Deiotte, a CDO (excuse me, he's waiting for the promotion) of NetApp, and you can find additional interviews in the podcast series at cdomagazine.tech. Thanks again, Ray, and I hope you have a terrific day.

 

Ray Deiotte (27:18):

Thanks Chris. You too.

About the Author

Chris Knerr

Chris Knerr is Syniti's Chief Digital Officer. As a former Fortune 50 client executive sponsor for large-scale data migrations at Johnson & Johnson, as well as a Syniti alliance partner, Chris brings a proven track record, background, and experience that help accelerate Syniti's data strategy, analytics organization, and offerings.

