Hey, guys. Welcome back to the Cloud Computing Insider, where we talk about the real world of cloud computing and how to make it work for your enterprise. I'm your host, David Linthicum, author, speaker, B-list geek. And joining me today, Rackspace is in the house. We're going to talk about private clouds and how to make AI work within them, which is basically much of what you've been talking and asking about for the last couple of years as we try to move into this AI stuff: make it work, but also get to the affordability of it, the manageability of it, and the compliance control of it. Joining me today, we have two experts. We have Bryan Litchford. He's Vice President of Private Cloud Solutions at Rackspace Technology, where he's responsible for the strategy and direction of the company's private cloud offering. And Joseph Vito, Senior Vice President of Strategic Alliance Partnerships at Rackspace Technology, who brings more than twenty-five years of experience in technology leadership, cloud transformation, and enterprise solutions. So Bryan, I'm going to start with you. Tell us what you do at Rackspace.

Thanks for having us on, David. Really excited to talk about this. It's one of the most exciting changes in the industry right now, without a doubt, and it has dominated the headlines. It's a privilege to work on bleeding-edge tech every single day. As you mentioned, I'm responsible for Rackspace's private cloud product management and engineering portfolio. My team is focused not just on making sure the platforms we have for customer workloads today are stable, resilient, highly performant, and of course cost effective, but also on how we help our customers into that next era of technology. And of course, that is really focused on the AI space today.
We've seen a number of things, from pilots to POCs, happening all across the world. I think we're now getting into a mode where enterprises are getting really serious about integrating AI into their workflows, and I'm excited to talk a little bit about that today.

I can't wait to discuss it. So Joe, tell us what you do at Rackspace.

Yeah, David, thanks for having me. At Rackspace, I head up strategic alliances and partnerships. That covers our ecosystem of partners who represent the private cloud, the Dells and VMwares of the world, as well as some of our professional services partners and, as Bryan just referenced, our newest set of partnerships with our AI platforms. We treat it as an ecosystem and try to manage it as such on behalf of our customers. As you can imagine, Bryan and I overlap in our efforts, so what I'm doing with the alliances and how that translates into our product creation go hand in hand.

Let's get into the discussion, because everybody's interested in this. I was at RSA last week and had lots of discussions with the hyperscalers and with the security folks. The core question was really how to deploy AI into your enterprise without bankrupting your enterprise. I just did some cost analysis for a client, and when you move into many of the hyperscalers and some of the more popular solutions out there, you're going to end up paying six to seven times that of some of the alt-cloud solutions, so this is going to be very important. That's not minor money, by the way. In building these AI systems, and I made a living architecting AI systems for a long period of time, they're ten to twenty times the cost of traditional systems. So we've got a real barrier here.
If we're not able to figure out how to do this in a more secure, economical way, where we're able to maintain more control, this stuff is not going to work for us. And I see one of the few options out there being the private cloud approach and the capabilities there. So Bryan, I'm going to go to you first. AI is now a business priority for most organizations, but many still struggle to turn that priority into measurable results at scale. Why is that? What are we missing?

I think you hit the first barrier to scale and entry right on the head. The hyperscalers do an amazing job innovating and creating new tools and capabilities, but theirs is an ideal environment for kicking the tires with a new piece of technology, not necessarily one you're fully committing to running twenty-four seven. That's where some of that cost pain comes into play. For a POC or a pilot, it makes total sense to leverage those tools, pay for a few hours at a time, and not be beholden to a long-term commitment. But when you then want to actually run these workloads twenty-four seven, three sixty-five, you realize that on-demand or even reserved pricing is just way higher and more impactful to the bottom line than you want it to be. That's where I think private clouds start to come into play. You have opportunities when designing AI workflows: some need to be real time, always on, and some tasks can be scheduled and run in off hours. With a private cloud solution, you get the economic benefit of committing to the whole box, the whole card, et cetera, and the security and privacy that come with being the only organization using that piece of equipment.
But you also have full and complete control over how and when that workload runs and flexes across any number of use cases, whether it's an employee-facing or customer-facing chatbot or back-office workflow management and improvement. You get much more flexibility in how and when those workflows run and at what capacities, and, to sum it up, you get more bang for your buck.

I agree. Great points. Joe, any follow-up from you?

Yeah, Bryan touched on a couple of things. From my time with hyperscalers over my career: lots of tools, and you have to make lots of decisions and build this stuff. That's even before you get it out the door and really productionize it. In the private cloud space, the way Rackspace does it, we simplify that drastically and cut the time down drastically, so you can focus on what the business wants: what are the use cases I'm looking at, what are the agents I'm trying to create, and how do I deploy them? The way we've operationalized our partnerships around the various layers of the stack, we've taken that difficulty away from the CIOs or CTOs, whoever's making these decisions. We're saying: focus on your data, focus on getting your data prepared, focus on modeling, and there are lots of different ways you can enter into our AI platforms, and then get agents deployed. That's business value. It's not pilots, and it's not all the other stuff that comes before it. We just accelerate time to market in this private cloud space, and we do it by rightsizing what most of our customers are going to need, not overbuilding, and not providing a ridiculous number of options.

Yeah, I know.
Some of my clients will push back on that. When we talk about an AI private cloud, they say, we don't want to maintain our own equipment; we want to get out of the data center business. And I'm not really suggesting that. I'm suggesting that you use the architecture but make it somebody else's problem, people who know how to operationalize and maintain this stuff and are able to do so at scale. And by the way, you're able to do so at a cost point that's comparable to, and sometimes better than, owning your own equipment. So to me, it's kind of a no-brainer as an option. Speaking of operationalization, Joe, I'm going to go to you first. What are some of the issues around operationalizing AI in enterprise environments when using a private cloud, or when using cloud computing in general?

Yeah. If you look at doing it yourself versus coming to us as a managed private cloud, you're going to have to make decisions on hardware and hardware configurations. You've now got a whole inventory of options around chips, GPU versus CPU. When do you actually use one versus the other? How do you contextualize that utilization? Then think about the private cloud software, and then your AI platforms sitting on top of it. When we say operationalize, that comes stacked, pre-engineered, and deployed, and we manage the environment for you: the security of it, the regulatory compliance of it. That makes it a lot easier on customers. Bryan and I spend all day with our partners making these configuration decisions and understanding the use cases; the use cases drive a certain type of workload, which drives a certain type of consumption pattern. We go through that exercise with our partners, and we get this set up ahead of your need or consumption of it.
And we'll obviously continue to refine that as we see AI workloads moving into inferencing, or they're already there. Inferencing and fine-tuning, these AI use cases are all circulating around, and we're managing that and managing the build-out of the stacks. Bryan, what do you think?

Yeah, I think Joe's spot on here. For the CIOs, CTOs, and executives looking to embed AI workflows into their daily business, side by side with mission-critical applications, there aren't a lot of tech stack decisions that have to be made. You can get bogged down when you're talking about operationalizing AI. You can get bogged down in these smaller decisions that don't have the same impact as being really crisp and clear on your use cases, making sure you've got your data in the right place to make it effective, and building a more differentiated strategy for your organization on how you're going to leverage AI. That's where I think the real magic happens, and Rackspace is super passionate about building and operating those platforms, working with partners, and pulling together the very best of the technology that's available for our customers. I would encourage leaders looking to build AI into their workflows to consider the trade-offs. You can go and design everything from the dirt under the data center all the way up to the packet as it leaves a switch, or you can really lean into getting clarity on the business priorities and which AI use cases make the most sense. I can tell you from our own experience within Rackspace: we started kicking around some of this stuff a few years ago, more in the generative AI space, and the ideas were limitless. There was no lack of ideas.
I think the most effective executives, and where we've seen the most traction, are where leaders really nail down: is an investment in workflow X, or back-office process Y, the very best use of our time? And does it give us a differentiated experience relative to the other companies in our market or industry? That's where Rackspace becomes a great partner. We can take care of everything below that decision and turn it into a quick-to-deploy, manageable stack, while leaders focus on the value they're getting out of it.

Great. I'm going to go back to you, Bryan, on something I hear a lot. People talk about the time to value of AI systems and say, well, the public cloud is going to be the easy button for all AI systems, but that's not necessarily the case. So tell me about time to value as it relates to the way you do private cloud deployments for AI.

Yeah. As Joe mentioned, we've got a fairly opinionated stack, but one with some flexibility in it. We think of enabling that time to value in a few different streams; I'll call them four entry points into the AI ecosystem that you can stitch into your organization. The first is the most simplistic: direct to chip. Maybe you're an enterprise, a leader, with a ton of engineering time, energy, and talent, and the differentiation you're going to get out of AI is actually building from the software layer up on top of the stack. You really just need help procuring and managing, because we know data center space is painful to come by and chips are really hard to come by. You need somebody who can take care of the physical atoms while you work on the bits that sit on top of them.
So we think of that as a direct-to-chip model. The next is our developer-ready, or app-ready, suite of services. This is where Rackspace takes that hardware and gives you a software layer on top of it, whether it's to immediately turn on an inferencing endpoint via APIs, or maybe to fine-tune an open source model on a dataset that you have. All of the tools and widgets that data engineers and scientists need to build up these capabilities, we've prepackaged and built as opinionated endpoints that you can just spin up. You're still leveraging a private cloud on the back end; I think that's the one unique thing. You're not just in a sea of GPUs, only getting a slice of those cycles each time. You're still on that dedicated hardware, that dedicated chip, but we're giving you the tools to go a little faster. For the next two, we come with more of an outcome-based focus. The third is what I would call our AI platforms: the partnerships we have with Palantir and Uniphore, where they come with a full suite of higher-level systems across data, model training, AI agent management, et cetera. And last but not least is an outcome-driven service where Rackspace actually takes the reins with customers, works side by side, understands the business problem and the workflow, and a Rackspace forward-deployed engineer goes and builds that for you and delivers that outcome. The thing that makes Rackspace special in this space is that all four of those routes to an AI platform are unified underneath. It's Rackspace data centers, Rackspace-managed capacity.
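The developer-ready entry point Bryan describes, an inferencing endpoint you reach via APIs, can be pictured with a short sketch. Most managed inference services expose an OpenAI-compatible chat-completions route, so a sketch of that style is shown below; the base URL, model name, and API key are hypothetical placeholders, not a documented Rackspace API.

```python
# Sketch of calling a private, OpenAI-compatible inference endpoint.
# BASE_URL, MODEL, and the API key are hypothetical placeholders; a
# managed private cloud deployment would supply its own values.
import json
import urllib.request

BASE_URL = "https://inference.example.internal/v1"  # hypothetical endpoint
MODEL = "llama-3-8b-finetuned"                      # hypothetical model id

def build_chat_request(prompt: str, temperature: float = 0.2) -> dict:
    """Assemble an OpenAI-style chat-completion payload."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to the endpoint and return the parsed JSON reply."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    payload = build_chat_request("Summarize last quarter's support tickets.")
    print(json.dumps(payload, indent=2))
```

The appeal of the "opinionated endpoint" model is that the application code above stays identical whether the model behind the URL is shared, dedicated, or fine-tuned; only the endpoint and model id change.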
We're handling all of the physical aspects of it and making it available in a variety of ways, so that organizations that want to do more themselves can, turnkey, do more themselves, and organizations that just want to come with their business problem, Rackspace can help them solve it straight away. The burden's on us to make sure the results are delivered, and doing that faster and faster with each iteration is, I think, really important.

Great answer. Anything to add to that, Joe?

No, I think Bryan articulated the use case scenarios, for lack of a better term, and that's exactly how we talk to our customers. We go in and say, here's our private cloud, here are these use cases, what are the outcomes you're trying to drive? Each of our AI platforms is a starting point to meet certain personas from a customer standpoint. A Uniphore platform might be really good for somebody who understands the use case they're looking for in a specific industry, in a specific area. They'll come in with what we call SLMs, small language models, contextualized to that use case, and then we contextualize the customer data and get the agent out the door. Very targeted, and really great for customers who are just getting comfortable with AI and want a more consumable, what we call a business AI, dashboard. Our Palantir customer may have lots of sources they're trying to aggregate into a common ontology, then start to use that in modeling, and they want the full platform capability to monitor the ecosystem. So they might be the more mature customers. We think we've got AI platforms for different types of customers and personas, to meet different needs. And then obviously Bryan referenced the inferencing-scenario customer.
In all of this, again, we have the conversation around models, data, and what agents you want to get out the door to change your operating model. That's where the conversation is. It's not about which chip you're using or what the stack looks like. We want to remove that from your vernacular, because the business doesn't care; the business wants outcomes. Our entire focus with our customers is: how do we accelerate your business outcomes?

So Joe, I'm going to go back to you. Compliance and your ability to have governance, domain, and dominion over data, I hear that a lot, and many folks are looking to private clouds to provide those capabilities. Tell me about that advantage and what's involved with it. How is that deployed on a private cloud versus in a public cloud instance?

Yeah. For customers in specific industries, there are a number of healthcare segments that say, hey, listen, we don't want this data out in the public domain. We want it in a secure, dedicated private cloud with all the security wrapped around it that we would expect, and we want it regulatory-compliant for that specific industry. Private cloud fits perfectly. It's an easier conversation with your CISO, and it's an easier conversation with the business: we're not deploying this content out into a public cloud. Then there are other parts of the world. If you go to EMEA, to the UK, it's a sovereignty play. They do not want to use hyperscalers; they want to stay very focused on private clouds, or in this particular case, the sovereign version of that. So we're meeting the need there.
There's lots of that, and quite frankly, even in financial services we've found that some customers may have some things in the public cloud, but for the specific things around AI that they deem a competitive advantage, they don't want exposure of that data. They want it in a private cloud and secured. So we think we're meeting that need for security and regulatory compliance across different types of industries, and we'll map our controls to those regulatory statements or control objectives. We're meeting the privacy need that we're seeing more and more in the market. Out of the gate, everybody said, hey, let's go to the public cloud with it. People are now realizing you don't have to. You can go to a private cloud and get the same level of security that you would presumably have in your own data center.

Bryan, what are your thoughts?

Yeah. To take this one step further, this goes to one of the advantages, and one of the time-to-value problems, in operationalizing some of this: it's one thing to stand up a service, but it's another to make sure it meets everything from baseline SOC 2 to PCI to HITRUST, et cetera. You've got to go through all of that work. One of the benefits Rackspace brings to the table is that we can take that core platform and deliver it to organizations across the various lanes I talked about earlier, and our customers get to inherit the controls from, like I say, the dirt under the data center all the way up to where our management stops.
They can then focus their compliance efforts on their unique application or their specific industry, and they get to benefit and piggyback off the work we've done ensuring that the platforms are secure, private, and manageable. The other advantage of private clouds is for research work. Rackspace has continued to expand services into the healthcare space, and what we've found is that a lot of hospitals, especially university hospitals, have a research arm. The sheer ease with which you can effectively firewall and air-gap an AI solution from any other system in a private cloud is just a turnkey capability; it comes right out of the box. So for those highly regulated areas or experimental workloads where you really don't want any intermingling with other systems or capabilities, but you still need all the horsepower and all the compliance rigor that goes with them, private cloud again becomes a really great option for getting things up and running and moving quickly. And as the world moves away from everybody racing to create their own model from scratch, and the workflow instead becomes taking one of the available open source models, or even a licensed model, and fine-tuning it with domain-specific knowledge, you don't have to go and find multi-hundred-kilowatt cabinet locations to run those intense foundational model training workloads. You can do this on the same configurations that are used for inferencing. So these stacks become very flexible for organizations to do research, then immediately turn it into a production-like application and move it out of that air-gapped solution and into the more mainstream, mission-critical side of the house. If that makes sense.

It does make sense. And Bryan, I'm going to come back to you.
One of the complaints I hear from my hyperscale clients is the noisy neighbor performance issue. In other words, performance wasn't consistent: in many instances, a job took an hour to run one day, two hours another day, and half an hour the day after that, but of course the bill's exactly the same. Tell me about performance and private clouds. What are some of the advantages you get when running systems on things that you, in essence, have dominion over?

Yeah. Oversubscription is a thing, right? In any multi-tenant solution, oversubscription is a key part of the architecture, and it affords you some cost advantages. But when you're in a space like we see with AI workloads, where latency is paramount to success, and I don't mean latency in terms of network connectivity from one location out to an end user, but latency within the system itself, then the closer you are to the bare metal, effectively having control over that piece of metal, the more return you get for your dollars. We've taken the approach, from a private cloud perspective, that from the GPU up we want a customer to have that effectively dedicated, reserved stack. You get a couple of benefits. One, you probably don't need as many cards; even as somebody selling these stacks, I'll say each customer might need slightly fewer cards to outperform the workload they already have. But also, with that control, you get to decide quota management. You get to decide quality of service across multiple applications within your own stack. You might be okay with a run taking a little longer, so you dial down the quota for that particular job and dial up the quota for something that's customer facing.
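The quota dial Bryan describes can be sketched in a few lines: on a dedicated pool, you decide how the fixed GPU count splits across jobs. This is an illustrative weighted allocation, not a real Rackspace or scheduler API; the job names and weights are made up.

```python
# Minimal sketch of the quota management a dedicated stack allows:
# divide a fixed GPU pool across jobs by weight, so a customer-facing
# service can be dialed up while a batch run is dialed down.

def allocate_gpus(pool_size: int, weights: dict[str, int]) -> dict[str, int]:
    """Split pool_size GPUs across jobs proportionally to their weights,
    handing any leftover GPUs to the highest-weighted jobs first."""
    total = sum(weights.values())
    alloc = {job: pool_size * w // total for job, w in weights.items()}
    leftover = pool_size - sum(alloc.values())
    for job in sorted(weights, key=weights.get, reverse=True):
        if leftover == 0:
            break
        alloc[job] += 1
        leftover -= 1
    return alloc

# Favor the customer-facing chatbot over an overnight fine-tuning job:
print(allocate_gpus(8, {"chatbot": 3, "fine-tune": 1}))
# {'chatbot': 6, 'fine-tune': 2}
```

Changing the weights is the "dial": bumping `fine-tune` up for an overnight window and back down in the morning reuses the same eight cards, which is the flexibility argument in miniature.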
Those are controls you get to have, where you don't have the same granular, fine-tuned control in a hyperscaler or, in some cases, even in the neoclouds. We are really building these AI stacks for the enterprise, not for somebody to come and play with a GPU for an hour and then log off. That's what I think creates some of that noisy neighbor experience that others get. We're treating an AI workload like a twenty-four seven, three sixty-five, mission-critical part of the stack. It's not an afterthought. It's not a science experiment. It's truly a production component of an organization's business.

Great. Anything to add, Joe?

No, I think it's that whole dedicated nature. And the way we'll manage it, with you and for you, is just an easier exercise. As you said, some of the noise that exists in the hyperscaler space can get frustrating for customers, and it's unpredictable in some cases. We'd like to offer more predictability and more access, with more flexibility in that access. I think that gives some reassurance to sponsors, whether business or tech, that they can achieve their business outcomes in a timely manner.

Great. So Bryan, I'm going to go to you. Define the Rackspace private cloud AI solution for us. Where can we go on the web to find out more about it? Who can we contact? Where do we get more information?

Yeah. I'm sure we'll drop a link, or maybe do some AV magic and get something to pop up here with a QR code and all that fun stuff, for some sites and forums where you can go and ask more questions. You can always reach Joe or me via LinkedIn, or just our first name dot last name at rackspace dot com.
Send us an email if you want. Joe might not be happy that I said that, but he'll survive. Again, we are focused, like I mentioned, on that production workload. We're focused on helping customers solve those use case or business problems and driving that into an integral part of the IT stack, the three-tier architecture that we know and love and that has been around for twenty years. Stitching AI into that is a really important part. Like I mentioned, we have everything from a direct-to-chip offering, to developer-ready capabilities, to AI platforms through the partnerships Joe mentioned with Uniphore and Palantir, and of course Rackspace performing end-to-end work: defining, designing, and deploying a use case into production on behalf of our customers and embedding with their teams. Any of those avenues becomes a unique opportunity to partner with Rackspace across the globe. We're deploying platforms from the US to EMEA and the UK, and even some workloads in APAC. So I'm really excited about how this space is going to continue to grow, and to see what problems we're able to solve together.

Yeah, and I will put a link and a QR code up on the screen so you can find Rackspace and reach out to them directly. So people can reach out to you on LinkedIn, Bryan?

Of course. LinkedIn, email, you name it. I'm more than happy to respond.

What about you, Joe?

Same thing. Please reach out. This is a changing space, and people are creating new use cases, derivative use cases, every day. We're creating SLMs to fit those use cases. So we can't hear enough from the customers.
Everybody in your audience who wants to share ideas, whether specific industry ideas or on the technical side, we're more than happy to respond. And as everybody already knows, this is moving so fast; things are changing fast. So please reach out in any venue, whether it's email or LinkedIn.

Yeah, I can't stress this enough. As an architect, you have to have every technology on the table as you deploy and build these AI solutions, and you have to get to some business value very quickly. If we're throwing lots of money at it, the business value is just not going to be there. And if we have a competent technology that's able to do what we need it to do, gives us more control, and offers some other options, that's something enterprises should check out right now, as we get ready to spend many trillions of dollars on AI infrastructure while enterprises go through transformation. So make sure to reach out to Rackspace and understand the private cloud offerings, the architectural patterns, and how they differ from the other deployment patterns, the alt-cloud stuff as well as the hyperscalers out there. It's going to be very important. Anyway, don't forget to like and subscribe and check out my other videos here. Check out my Unreal Cloud Computing blog, my LinkedIn Learning courses, and our other stuff. And also make some time to check out what Rackspace has to offer, because I think it's a very important lesson in some of these alternatives out there that are going to be hugely advantageous to enterprises. So until next week, you guys take care. I'll talk to you soon. Cheers.
Webinar:
From AI Strategy to Results: How Private Cloud Accelerates Enterprise AI
David Linthicum, Founder & Lead Researcher, Linthicum Research | Joseph Vito, Senior Vice President of Strategic Alliance Partnerships, Rackspace Technology | Bryan Litchford, Vice President of Private Cloud Solutions, Rackspace Technology
April 6, 2026 | 31 mins
AI is a priority, but for many organizations, measurable results still feel too slow, too complex, and too hard to scale. If you’re under pressure to turn AI investments into faster insights, greater efficiency, and real business value, this is a webinar you need to watch.
Join David Linthicum and Rackspace experts Joseph Vito, Senior Vice President of Strategic Alliance Partnerships, and Bryan Litchford, Vice President of Private Cloud Solutions, to see how a Private Cloud approach can help you move from AI strategy to execution faster, while maintaining the control, security, and performance enterprise workloads demand. Built on Rackspace’s deep private cloud expertise and powered by Dell Technologies, including Dell AI Factory and the Dell Data Platform, this approach delivers a secure, high-performance foundation for enterprise AI at scale.
You’ll walk away with practical insight into how to accelerate time to value, reduce deployment complexity, and build an AI environment designed to deliver outcomes, not just experimentation.
The webinar will include:
- How Private Cloud can accelerate AI time to value through faster deployment, stronger control, and performance optimized for enterprise workloads
- How to drive measurable business outcomes with AI by improving efficiency, accelerating insights, and maximizing ROI
- How Rackspace, built on Dell AI Factory, helps organizations design, deploy, and scale Private Cloud AI solutions with greater confidence, speed, and operational readiness
Related resources
Partner solutions: Palantir
Deploy and run Palantir Foundry and AIP in production with the engineering and operational discipline enterprise AI requires
Learn more
Partner solutions: Uniphore
Rackspace Technology and Uniphore enable regulated enterprises to deploy full-stack AI platforms — securely, efficiently, and at scale
Learn more

Power your AI initiatives
Rackspace Private Cloud AI helps harness the power of AI in a secure environment, leveraging a full AI software stack and the latest AI-optimized hardware.
Get Started
Ready to Turn AI Strategy into Real Results?
Connect with our experts to explore how Private Cloud can accelerate your AI initiatives. Whether you're looking to reduce complexity, improve performance, or drive measurable business outcomes, we’re here to help you move forward with confidence.