77: Nutanix Weekly: Deliver Enterprise Grade Edge AI/ML with Nutanix

Nov 14, 2023

In the past, the hyperscalers have always been positioned as the default, "you can't go wrong" deployment zone for AI/ML workloads. But this is not true for many use cases; see why AI and machine learning are drifting away from the cloud.

Real-world examples from many ISVs and customers tell a different story, with multiple drivers: low latency, data privacy, and cost (especially at scale) that make the core DC (owned or hosted) a better choice. With leading market analysts agreeing on explosive data growth at storefront, branch, and customer service center locations, for edge AI/ML use cases, on-prem deployments are a no-brainer.

Blog: https://www.nutanix.dev/2023/10/12/deliver-enterprise-grade-edge-ai-ml-with-nutanix/

Host: Philip Sellers
Co-host: Harvey Green
Co-host: Jirah Cox
Co-host: Ben Rogers

Philip Sellers: Hello, and welcome to Episode 77 of Nutanix Weekly. I'm your host for today, Phil Sellers, a solutions architect here at XenTegra.
Philip Sellers: I have a great slate of folks joining me today. Harvey Green, CEO of XenTegra-GOV. Harvey, how are you doing today?

Harvey Green: I'm doing great. How are you?

Philip Sellers: Hanging in there. It's Monday. But we're almost to the point where we've survived the Monday.

Harvey Green III: Yes.

Harvey Green III: Yes. A good way to cap it off, right?

Philip Sellers: Yeah, it's a good way to cap it off with some good friends on the team. We're doing it with Jirah Cox. Jirah, how are you?

Jirah Cox: Philip, good. And happy Monday.

00:00:47.920 –> 00:01:01.630
Philip Sellers: Happy Monday to you, too. We've also got Ben Rogers with us. Ben, how are things going?

Ben Rogers: Hey, gentlemen, how are you doing? It's been an episode or two since I have been here. But doing well. And thank you guys for having me back.

00:01:01.790 –> 00:01:14.079
Philip Sellers: Yeah, welcome back. We've had a few episodes where the great Ben Rogers hasn't been able to join. But we're trying to keep things going. Ben, what's been keeping you busy out there?

00:01:14.730 –> 00:01:32.579
Ben Rogers: Well, just my day-to-day job, you know, trying to service my customers and look at what their needs are, especially from a sales mentality, of where they're pushing the envelope. And, man, I've been going to a few tech events, trying to learn myself. I had the pleasure of

00:01:32.610 –> 00:01:57.089
Ben Rogers: sitting down with one of our leaders last week for dinner, Pi, and if you've ever had the chance to meet Pi, Pi is our technical evangelist. But, more importantly, what he gets to do is talk to customers. He gets to kind of see where trends in the industry are going. So as we get into this topic this afternoon, I found it interesting that my dinner conversation really was around

00:01:57.090 –> 00:02:10.540
Ben Rogers: me asking, you know, where is this ball moving? And where are companies really trying to leverage our products to make sure that they have success in the future? And to my surprise, a lot of it was

00:02:10.972 –> 00:02:16.599
Ben Rogers: around APIs, you know, I mean, how we can digest and ingest

00:02:16.660 –> 00:02:36.569
Ben Rogers: API infrastructure and automate that. And then, you know, man, our database products, and just the core of what we do best, which is data and how we handle the zeros and ones, and how the birth of our product has come out of that strong goodness of

00:02:37.150 –> 00:02:56.160
Ben Rogers: you know, our data control. And as you look to the future, you're looking at AI data governance, where you have machines in the background that are gonna have to have some kind of policies and rules of regulation for where they can replicate data and guard data.

00:02:56.160 –> 00:03:07.470
Ben Rogers: When you sit down with some of our leadership and kind of dig into where this ball is going, you really quickly find out how fascinating our product set is.

00:03:07.470 –> 00:03:34.700
Ben Rogers: And so it was something that, man, I was glad that I had the pleasure to sit down with a mind like Pi's, so that I can get myself wrapped around where things are going, and what kind of next-generation tools are coming out, not only with AI, but with data governance, automation, all these things. And you really begin to be thankful that you work for a company like this, that's tackling these next-generation problems

00:03:34.700 –> 00:03:57.229
Ben Rogers: and putting some sense around it. So I kind of wanted to do that as a trailer for our conversation today. But also, you know, a lot of what I've been into is trying to make sure we're meeting our customers' goals today, but understanding where our customers are going. And better yet, understanding the technology that our company is developing to meet those goals.

00:03:57.260 –> 00:04:20.350
Philip Sellers: Yeah, Harvey, there you go. It reminds me of a conversation we're having with a research organization along with XenTegra. You know, it is about the data. It's about data sets and being able to automate those data sets. I mean, you're looking at how we can use Nutanix in those ways. Anything that you'd want to tag in there as well?

00:04:22.210 –> 00:04:27.199
Harvey Green III: I mean, I plan to

00:04:27.230 –> 00:04:46.949
Harvey Green III: If we keep going like this, we'll never get into it. But ultimately, you know, the conversation that Ben was just having around it being about the data, very much so in the category of research. And, you know, making sure that

00:04:46.970 –> 00:04:48.640
Harvey Green III: things are

00:04:48.850 –> 00:05:00.159
Harvey Green III: secure, but also enabling the users to go and do what they need to do around the data and pulling insights from the data.

00:05:00.410 –> 00:05:19.499
Harvey Green III: So there's a whole lot that goes into that. But I won't belabor the point. I'll let you get in there. It is a lot of things. It is the security aspects of it, which we can deliver on. It's the access to the data, which we can deliver on.

00:05:19.500 –> 00:05:46.000
Philip Sellers: And then it's also, you know, just the ability to decipher and run workloads around that data that help us solve really interesting problems for customers. And I think that's the huge win here: it is a full-breadth solution that we can present and really drive value with for research organizations, healthcare organizations,

00:05:46.000 –> 00:05:49.669
Philip Sellers: universities, things like that. So I'm pretty stoked about it.

00:05:49.940 –> 00:05:51.130
Harvey Green III: Yes.

00:05:51.220 –> 00:06:06.179
Philip Sellers: You know, I do wanna go ahead and throw in our tagline, a little bit of an advertisement here at the front of the podcast. This is new for some of our podcasts, but really, what we're trying to do here at XenTegra is podcasting with context.

00:06:06.220 –> 00:06:11.110
Philip Sellers: That's why we love the conversations about what's going on in the real world,

00:06:11.290 –> 00:06:21.500
Philip Sellers: what our customers are asking for, how we're using the technology to be able to deliver great solutions, great outcomes for our customers. And that's the difference. Yeah, we're taking

00:06:21.700 –> 00:06:30.270
Philip Sellers: an article, a blog post, but we really want to wrap it with that context of how we're leveraging this in the real world.

00:06:30.650 –> 00:06:33.620
Philip Sellers: And on that same note, you know, if you're working with a partner

00:06:33.680 –> 00:07:02.339
Philip Sellers: that's not giving you that kind of advice, it's not really giving you advice. Then maybe you're working with the wrong partner, and we'd love a chance to talk with you. So, you know, that is our shtick here at XenTegra. We're trying to bring real-world experience to our customers. We're trying to add value to their world. So if you're not partnered with a value-added reseller that is doing that sort of activity and adding value to your world, reach out. Give us a call.

00:07:02.460 –> 00:07:19.120
Philip Sellers: We'd love to work with you. So that's the end of the ad. You know, today, Jirah, you've brought to us a blog post for delivering enterprise-grade edge AI and ML with Nutanix. Why did you choose that topic today?

00:07:19.560 –> 00:07:23.780
Jirah Cox: I think it's exciting. I think it's super,

00:07:24.590 –> 00:07:26.090
Jirah Cox: I shouldn't say exciting again.

00:07:26.360 –> 00:07:43.180
Jirah Cox: It's top of mind, right? So it comes up in a bunch of conversations, like Ben alluded to there, around, you know, more intelligent workloads running at the edge, for customers in tons of verticals, right? No matter what you do,

00:07:43.360 –> 00:07:50.910
Jirah Cox: all the fascinating stuff, all the good data comes from out there. So what do I do with it there? What do I send back to HQ, into the data center, to the cloud?

00:07:50.970 –> 00:07:54.020
Jirah Cox: So doing smart stuff at the edge matters to customers.

00:07:54.770 –> 00:08:12.989
Philip Sellers: Yeah, it's sort of like a sorting activity, isn't it? I mean, you've created so much data out at the edge. I think about cameras, video. There's so many hours and hours of video data created in stores, out on the highways. You know, we've got cameras everywhere.

00:08:13.180 –> 00:08:17.690
Philip Sellers: Backhauling all of that data, storing all of that data.

00:08:17.950 –> 00:08:29.710
Philip Sellers: It's kind of useless. You need something to help you make sense of what's valuable and then transmit that back somewhere for long-term storage. You don't need to

00:08:29.950 –> 00:08:33.409
Philip Sellers: see 3 hours of nothing happening on the roadway.
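[Editor's note: to make the sorting Philip describes concrete, here is a minimal, hypothetical sketch, not a Nutanix API. A local model scores each frame, and only high-scoring events are backhauled for long-term storage; the function and threshold are invented for illustration.]

```python
# Illustrative only: filter edge camera data so just the "interesting"
# events travel back to the core data center instead of hours of nothing.

def filter_events(frame_scores, threshold=0.8):
    """Keep only frames whose event score crosses the threshold.

    frame_scores: list of (timestamp, score) pairs from an on-site model.
    Returns the small subset worth backhauling for long-term storage.
    """
    return [(ts, score) for ts, score in frame_scores if score >= threshold]

# Hours of empty roadway score low and get dropped at the edge.
scores = [(0, 0.01), (1, 0.02), (2, 0.95), (3, 0.03)]
events = filter_events(scores)  # only the frame at timestamp 2 survives
```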

00:08:33.659 –> 00:08:35.640
Jirah Cox: You're completely right. And

00:08:35.659 –> 00:09:01.820
Jirah Cox: and the edge is where you kind of act on it, right? You change what a sign shows to steer traffic differently. You send someone a message on their app: hey, you just walked past the green beans, and they're on sale this week, and maybe I know that they're on your shopping list, or something like that. It gets creepy at some point, but, like, better Thanksgiving, right? Thanksgiving green beans should be on all of our shopping lists, so I had good odds on that one. But

00:09:01.890 –> 00:09:19.849
Jirah Cox: yeah, you know, having an associate respond to, you know, someone who's dwelling near the high-dollar TVs, or something like that, right, is where you have to act. So then can we make those smarter decisions more locally and have less data to drag over the pipe each way to get analyzed and acted on?

00:09:20.610 –> 00:09:21.810
Philip Sellers: Absolutely.

00:09:22.350 –> 00:09:35.500
Philip Sellers: Ben, as you talked with Pi, you know, you had a great conversation about the way that Nutanix sort of sees the world and the way you're approaching these problems.

00:09:35.560 –> 00:09:43.599
Philip Sellers: As we kind of dive into the topic, what sort of stands out from that conversation that you'd want folks listening to know?

00:09:43.900 –> 00:10:09.150
Ben Rogers: So it was interesting, because I will be the first to admit I'm not an AI expert. I know enough to be dangerous and to have a limited conversation. So that's kind of bugged me, you know. We've got this GPT-in-a-Box product. I wanna make sure that I'm talking about it effectively, and that, you know, my customers trust that I've got their interests at heart. And

00:10:09.160 –> 00:10:12.780
Ben Rogers: the thing that Pi really explained to me was that,

00:10:13.180 –> 00:10:37.249
Ben Rogers: you know, a workload is a workload is a workload. I understand the basic concept of our HCI products: how our clusters work, how workloads run on a cluster. So the first thing he said to me is, don't overcomplicate what the workload is. Of course the workload, the app, is gonna be doing all these marvelous things. But from an architectural, conceptual standpoint,

00:10:37.330 –> 00:11:03.149
Ben Rogers: the workload is like any other workload running on one of our clusters, so that made me feel good that I understand the foundation. Now, the edge. What was interesting there is, you look at the different industries that are starting to depend on AI. You know, healthcare is one. But look at manufacturing, look at quality control, where they're actually making decisions in real time, on the fly,

00:11:03.150 –> 00:11:11.540
Ben Rogers: on things with products. Like, one of our customers was talking about how they were looking at using AI to look at things like, okay,

00:11:11.540 –> 00:11:38.880
Ben Rogers: a location is buying these products, but what products are they not buying? And are we delivering products that are just sitting on the shelf or in the back of a warehouse doing nothing? So that's something that's gonna be on the edge, that's gonna need to be tabulated in somewhat of a real-time experience. And that's where I really see our solution coming into play. The other big thing about AI that Pi was kind of telling me about is,

00:11:38.990 –> 00:11:50.089
Ben Rogers: security has become a big concern for people. Because, as cool as it is for AI to be able to reach out to the data sets, you know, you have to have data behind these engines to drive them.

00:11:50.890 –> 00:11:53.790
Ben Rogers: Controlling that access to data.

00:11:53.950 –> 00:12:09.709
Ben Rogers: And, you know, when you start looking at protected data or data that's very crucial. Like, I think of research data at the company I came from. They only wanted a select few people or processes to get to that data,

00:12:09.710 –> 00:12:39.569
Ben Rogers: knowing that AI can help them analyze it. You gotta be careful who's allowed into that pipeline or not. So, kind of setting the stage: from an architectural understanding, this is just like any other workload that would run on our clusters, plus the security that needs to wrap around that, and how it's being used at the edge, and how it really can make a difference to an organization in their operation when they can bring this in real time and run it locally on one of our clusters.

00:12:39.850 –> 00:12:54.329
Philip Sellers: Yeah, you pull out security, and I think that's an interesting one to talk about, because you talk about the security of the data set, who has access to that. We've been down this road before, you know, when SharePoint brought universal search

00:12:54.550 –> 00:13:14.830
Philip Sellers: and we pointed it at our file shares. Well, people had access to a lot of data that they didn't know they had access to, but now, because of search, they're seeing it. And so we got better at securing our file shares because of that problem. I think the same thing is true when we talk about data governance

00:13:14.980 –> 00:13:30.369
Philip Sellers: and AI, you know. I think we have that unknown problem of too much permission for so many users in our organizations, and it's going to surface that. But you're right. You have to have data sets sitting behind it.
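[Editor's note: the governance concern Philip raises can be sketched as a simple pre-filter: check the requester's permissions against each document's ACL before an AI search is allowed to surface it. The ACL table, principals, and function names below are invented for illustration, not any real product API.]

```python
# Illustrative only: gate what an AI-powered search may surface by
# intersecting the requester's principals with each document's ACL.

ACLS = {
    "research/results.csv": {"alice", "research-team"},
    "public/handbook.pdf": {"everyone"},
}

def visible_documents(user, groups):
    """Return only the documents this user (or their groups) may see."""
    principals = {user} | set(groups) | {"everyone"}
    return [doc for doc, allowed in ACLS.items() if allowed & principals]

# A user outside the research team never sees the research data,
# no matter how good the search index is.
```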

00:13:30.700 –> 00:13:36.480
Philip Sellers: Jirah, as you hear Ben talk through that, I mean, what are the advantages here

00:13:36.560 –> 00:13:42.760
Philip Sellers: to the Nutanix stack, and particularly to the edge AI and ML use case?

00:13:42.960 –> 00:13:54.730
Jirah Cox: Well, conveniently for all four of us, there's a great list here, even sorted for us, with those strengths. So to paint a broad picture for our listeners here, right, you can find this blog post on our dev

00:13:54.780 –> 00:14:05.369
Jirah Cox: blog, nutanix.dev in your browser. So we've touched on security, right? We've

00:14:05.870 –> 00:14:09.830
Jirah Cox: understood that for a long time, that it's an inherent strength of the platform, the way that we ship

00:14:09.940 –> 00:14:26.680
Jirah Cox: our controller VMs, which run the storage fabric, the control plane, of course, even the hypervisor when you pick our hypervisor, all in a secure fashion. So that one's been thoroughly covered. On form factors, right? Since we're just a software-defined solution

00:14:26.780 –> 00:14:50.240
Jirah Cox: running on a wide array of hardware of all shapes and sizes, this means we can even reach down to stuff that is tiny and more convenient to run at the edge. So we don't necessarily need to ask customers for four-post racks and 220V power. We know that the edge needs an edge-sized solution. So since we just run in software, we can certainly offer that as well.

00:14:51.620 –> 00:15:09.969
Philip Sellers: Yeah, I love the callout here for the Lenovo HX1021. It's a little baby server. I mean, it looks almost like a blade server, if you're familiar with what blades looked like when they were first introduced, you know, a decade, 12-15 years ago.

00:15:10.140 –> 00:15:27.850
Philip Sellers: But it's a full system, and it is purpose-built for the edge. So it's a really cool form factor that I'm a huge fan of. I went ahead and just kinda pulled it up here. I don't think there's a photo. Oh, yeah, there is. So if you're watching the video on YouTube, you can kinda see a photo of it. But

00:15:27.960 –> 00:15:32.889
Ben Rogers: It's definitely a differentiated form factor in my mind.

00:15:32.980 –> 00:15:51.630
Jirah Cox: In some ways, it's like the guts of a really beefy laptop, right? So, yeah, a couple of extra SSDs, a couple of extra NIC options. You know, you could hang it on a wall. They show them with these things all stood up like books, with bookends literally holding them together, so it looks like a book.

00:15:51.650 –> 00:16:02.690
Philip Sellers: Honestly, you know, I was up at Lenovo headquarters. They've got a petting zoo where you can touch all these things at their headquarters.

00:16:02.710 –> 00:16:07.460
Philip Sellers: It's about the size of a full-size keyboard. That's about the size we're talking here.

00:16:07.730 –> 00:16:25.969
Philip Sellers: Yeah. And that's the power of software, right? I mean, you point that out, Jirah. It's able to adapt to the use case. And, you know, manufacturing has been mentioned. I mean, quality control, all sorts of different places that you can apply the workload.

00:16:26.250 –> 00:16:56.160
Philip Sellers: You know, as we go through the rest of this list, it's talking about workload resiliency, multi-workload support. That's all table stakes for you guys. HCI gives you that great substrate to be able to run any sort of workload, including containers. And that's where we come to a conversation about Kubernetes, which may be a little bit foreign to people listening. We've talked about it a couple of times here on the podcast, but

00:16:56.320 –> 00:17:03.880
Philip Sellers: I mean, I guess it doesn't hurt to kind of take a step back and talk. What is Kubernetes?

00:17:04.079 –> 00:17:06.860
Philip Sellers: What is that technology all about?

00:17:07.400 –> 00:17:09.320
Jirah Cox: I mean, yeah, that's part of it.

00:17:09.329 –> 00:17:32.870
Jirah Cox: Yeah, at its heart, it's about scheduling your containerized workloads, right? And defining availability requirements for them, to say, I need 15 of this kind of container running, I need 12 of that kind of container running. Or, when I have a new version of the container, how do I roll that out into production, either all at once or drained in over time, but ultimately get back to what I've declared my containerized infrastructure needs to look like.
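[Editor's note: for listeners newer to Kubernetes, the declare-then-reconcile idea Jirah describes can be illustrated with a toy loop. This is not Kubernetes code; real Kubernetes does this with Deployments and controllers. The dictionaries and function here are invented stand-ins.]

```python
# Illustrative only: you declare how many replicas of each container you
# want, and a reconciler computes the actions that drive actual state
# toward that desired state.

def reconcile(desired, actual):
    """Return the scaling actions needed to reach the declared state."""
    actions = []
    for name, want in desired.items():
        have = actual.get(name, 0)
        if have < want:
            actions.append((name, "start", want - have))
        elif have > want:
            actions.append((name, "stop", have - want))
    return actions

desired = {"frontend": 15, "worker": 12}   # what I declared
actual = {"frontend": 13, "worker": 14}    # what is running right now
plan = reconcile(desired, actual)          # start 2 frontends, stop 2 workers
```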

00:17:32.920 –> 00:17:43.810
Jirah Cox: And that, of course, has huge impacts on, you know, edge workloads, on AI/ML as well. I talked to a manufacturing customer last week where they're looking to roll out their,

00:17:44.200 –> 00:17:55.040
Jirah Cox: you know, control systems on containerized platforms out at the edge. So what used to be a physical machine then became a VM, and now it's become a container at the edge as well.

00:17:55.100 –> 00:18:03.489
Jirah Cox: You know, so kind of right-sizing what the workload really needs, but not really changing the availability model for it. Right? So

00:18:04.240 –> 00:18:19.640
Jirah Cox: with this use case for edge, for AI and ML, what that really means is we can let our customers run and deploy AI workloads really whatever way they need to, right for the business and for the kind of workload. So you can run AI in a VM,

00:18:19.700 –> 00:18:34.680
Jirah Cox: and we do that with, like, the PyTorch framework. You can run it in a pod, and we do that with, like, Kubeflow. So we can match either kind of availability model, the VM or the container. Both run great, and it's more about what fits your application better.
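[Editor's note: a quick, framework-agnostic illustration for readers. The inference step itself is the same code whether it lands in a VM (say, with PyTorch installed) or in a pod scheduled by something like Kubeflow; the deployment choice sits outside the code. The toy "model" below is an invented stand-in, not PyTorch.]

```python
# Illustrative only: score a feature vector against per-class weights
# and return the best label. Where this runs (VM or pod) is an
# infrastructure decision, not a code change.

def infer(weights, features):
    """Score each class as a dot product and return the best label."""
    scores = {
        label: sum(w * f for w, f in zip(ws, features))
        for label, ws in weights.items()
    }
    return max(scores, key=scores.get)

# Hypothetical retail model: two classes over two shelf features.
model = {"shelf-empty": [1.0, -0.5], "shelf-stocked": [-0.2, 0.9]}
```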

00:18:35.900 –> 00:18:45.140
Philip Sellers: And it also points out here, you're giving customers choice. They don't have to standardize on your Kubernetes engine, NKE, but you've also got support here for others.

00:18:45.370 –> 00:19:00.380
Jirah Cox: Totally. Yeah, I mean, the actual business of running those containers honestly needs to align more closely with, like, your software management and your control plane. That can be us. It can be something else that's not us. And either way, we're a great platform to run the actual workload itself.

00:19:00.670 –> 00:19:22.279
Philip Sellers: Yeah, it's fantastic. Zero-touch deployment, remote deployment. I mean, this is a huge one, and it's one we run workshops on, you know. How do you Foundation? How do you create those clusters? And, Harvey, I'm gonna throw it to you. I mean, how much value does that zero-touch deployment add for a customer experience?

00:19:22.940 –> 00:19:23.920
I mean

00:19:24.270 –> 00:19:29.650
Harvey Green III: A lot, and I'm sure that comes as no surprise to you. You know, being able,

00:19:29.660 –> 00:19:40.079
Harvey Green III: and I guess in technology, period, you know, we've gone from the days of everything having to be sneakernet, where, you know, you deploy by actually carrying it out there.

00:19:40.110 –> 00:20:06.030
Harvey Green III: You talked earlier about the Lenovo HX and its size. You don't wanna, you know, send somebody halfway across the country carrying one of those to go deploy it and put it in. You wanna be able to find a way to deploy it remotely, be able to do that effectively, consistently, and actually have a method where you can knock out a bunch of them from one place. Not just,

00:20:06.260 –> 00:20:09.209
Harvey Green III: you gotta travel to 50 places to put 50 of them in.

00:20:09.310 –> 00:20:36.380
Philip Sellers: Yeah, you and I have both done that. I mean, in a previous role, I had to travel. We got a lot of status out of that travel, doing it the old-school way. And now maybe you can use remote hands, someone, you know, local to where you're trying to do this deployment. They can do it because we've simplified the deployment. You know, Nutanix has given us a much easier model that can be carried out by

00:20:36.430 –> 00:20:40.969
Philip Sellers: less-skilled staff, you know, generalists or contractors.

00:20:40.980 –> 00:20:49.850
Harvey Green III: Again, the blue to the blue, the green to the green, the red to the red, and send me a message to let me know it's there, so I can check to make sure it's done.

00:20:49.900 –> 00:20:52.820
Ben Rogers: Also, and what also makes this

00:20:52.860 –> 00:21:00.199
Ben Rogers: work is consistency. You know that everything's being done the same way, every single time, every single site.

00:21:00.350 –> 00:21:25.630
Ben Rogers: Harvey, you know that I was a big proponent of, man, give me a cookie cutter. If I can get my model into a cookie cutter, then I can stamp that thing out as many times as you want. So you wanna grow big? We can stamp big. To me, when I see the zero-touch deployment, I appreciate the automation, I appreciate all those things, but what I appreciate more than anything is the consistency, that every time it's done, it's done with the

00:21:25.670 –> 00:21:35.400
Ben Rogers: same level of action, so that as a manager or C-level, I know that all the sites are gonna look very similar to each other from an architectural standpoint.

00:21:36.110 –> 00:21:49.629
Philip Sellers: Yeah. And it's not just day one, right? I mean, it also applies to, you know, day 2, day 90, having to grow a cluster, having to increase resources, and to do that with zero downtime.

00:21:50.050 –> 00:22:06.380
Philip Sellers: No need to take a service, a cluster, out of service and make changes to it. You can just add a node. That linear scale is a huge value proposition for us here.

00:22:06.800 –> 00:22:25.780
Ben Rogers: Now, I am gonna throw a bone out to my conversation with Pi. We're talking about AI and ML and our clusters, but really you're still talking about the Nutanix story. I mean, not to downplay the AI/ML conversation. But again, when I first started talking to Pi, I was talking about artificial intelligence. I was really like, you know, I'm scared to talk about this,

00:22:25.780 –> 00:22:43.329
Ben Rogers: and he reassured me that it's just like any other workload. So for Nutanix customers, or future Nutanix customers, again, we're treating AI and ML workloads just like we would databases, VDI, any other. All the rules still apply to these

00:22:43.330 –> 00:22:55.869
Ben Rogers: workloads, even though they're doing, you know, artificial intelligence in the application set. So that gave me some comfort. And as I'm going through this list, his words are ringing out to me: it's just another workload.

00:22:56.620 –> 00:22:57.460
Philip Sellers: Yeah.

00:22:57.710 –> 00:23:26.529
Philip Sellers: Yeah. And I think that's important to understand. It's a different kind of workload, but it is still just a workload, in the sense that the infrastructure underneath is still standardized. Yeah, there are a couple of places where you're doing interesting things, like, you know, support for storage sharing, Kubernetes containers, persistent storage, those kinds of things, right? Jirah, I mean, there are definitely places where you're also adding value to that story.

00:23:27.590 –> 00:23:31.470
Jirah Cox: Oh, 100%. Right? So the

00:23:31.820 –> 00:23:48.760
Jirah Cox: Absolutely. The past ran on physical servers, then moved to VMs. Tomorrow we're gonna run on containers, and parallel to all of this, the mainframe is still not really dead. Right? So, you know, shifts happen over a long, long time in the industry. And

00:23:48.790 –> 00:23:59.000
Jirah Cox: the answer is to be the platform that can run more than one thing at a time. Right? So, you know, for some customers, I typically tend to set a high-level filter

00:23:59.050 –> 00:24:00.930
Jirah Cox: on the conversation around

00:24:01.040 –> 00:24:13.679
Jirah Cox: the apps that you develop in-house have the most chance in the next 3 to 5 years of running in containers, because you control the source code. You have autonomy there, right? The apps that you purchase, or that are purchased and given to you to run, have

00:24:13.850 –> 00:24:20.619
Jirah Cox: the least chance. Those things, of course, are probably gonna stay in VMs for quite a long time. The support model might even dictate that. So,

00:24:21.070 –> 00:24:42.880
Jirah Cox: given that there's some kind of blend here that's not 0/100 or 100/0 for most customers, how do we solve this? Well, you'd probably knock that out by having infrastructure that can do both at once. Because you don't want a silo for just your containers and a silo for just your VMs, and then, oh, hope I got the blend right. This can just be one platform that runs the entire business. And as that

00:24:43.200 –> 00:24:45.110
Jirah Cox: ratio changes over time, right?

00:24:45.160 –> 00:24:49.549
Jirah Cox: Probably, hopefully, growing more on the container side, shrinking on the VM side, but maybe not.

00:24:49.590 –> 00:24:51.929
Jirah Cox: Then, of course, the platform adapts there along with you.

00:24:53.670 –> 00:25:04.259
Philip Sellers: So there's a great visual here, and I know the people listening can't see this, so definitely go out and take a look at the blog post. But

00:25:04.290 –> 00:25:16.980
Philip Sellers: yeah, we will talk through this just a little bit. You know, the model here kind of presented is: train in the cloud, fine-tune it, then run it on premises. Why that approach?

00:25:17.330 –> 00:25:46.830
Jirah Cox: So usually, the AI training stage is where most of the horsepower is needed. That's why certain, you know, GPU stocks are where they are today, because that's a very GPU-heavy activity, the initial training and development of the model. Some of these models even come, let's say, pre-trained, or they're at the 90% mark when you install them. And then, you know, there's a couple of stages of refinement, and then also execution. So maybe there's

00:25:46.830 –> 00:25:55.550
Jirah Cox: tailoring it to maybe your customer data, and then actual execution for doing the actual AI work. What are we gonna be generating here? Are we

00:25:55.550 –> 00:26:12.490
Jirah Cox: sampling or inferring customer sentiment over a voice call? Are we, you know, writing better KB articles? Are we supporting a chatbot to get better answers there? Are we watching, you know, we talked about it here in this conversation, watching camera output to say, let's infer some intent here and dwell time

00:26:12.600 –> 00:26:15.280
Jirah Cox: in a space, whether that's, you know,

00:26:15.300 –> 00:26:24.130
Jirah Cox: helping out with self-checkout, or people in aisles, or traffic on a highway, or whatever. So different kinds of use cases for different kinds of

00:26:24.210 –> 00:26:31.829
Jirah Cox: hardware. But commonly, training at the edge is not the right use case. Usually you train in the data center, train in the cloud,

00:26:31.930 –> 00:26:41.509
Jirah Cox: refine in the data center, commonly next to your actual data, and then run at the edge, next to where all the fire-hose data feeds are: from the cameras, from the sensors, from the other kinds of

00:26:41.540 –> 00:26:42.830
Jirah Cox: inputs there.
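[Editor's note: the staged flow Jirah outlines, train in the cloud, refine in the data center, infer at the edge, can be caricatured as three functions. Everything below is an invented stand-in for illustration; real stages would use a framework like PyTorch and real model weights, not a dict.]

```python
# Illustrative only: the three stages live in three places, and only the
# last, lightweight one runs next to the edge data feeds.

def train_base(corpus):
    """Cloud stage: the GPU-heavy training pass over the biggest dataset."""
    return {"vocab": set(corpus), "tuned": False}

def fine_tune(model, customer_data):
    """Data-center stage: tailor the pre-trained model to your own data."""
    model = dict(model, vocab=model["vocab"] | set(customer_data))
    model["tuned"] = True
    return model

def run_at_edge(model, event):
    """Edge stage: cheap inference right next to the sensor feed."""
    return event in model["vocab"]

model = train_base(["generic", "events"])       # in the cloud
model = fine_tune(model, ["our-store-events"])  # in the data center
seen = run_at_edge(model, "our-store-events")   # at the edge
```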

00:26:43.550 –> 00:26:52.409
Philip Sellers: So make it smart in the cloud with the biggest data, and then apply it at the edge, and try to get the best insights out of the edge.

00:26:52.980 –> 00:26:53.880
Jirah Cox: 100%.

00:26:55.080 –> 00:27:01.939
Philip Sellers: So we were talking about that retail example. The blog post goes on to talk about that. What,

00:27:02.410 –> 00:27:04.309
Philip Sellers: what can we learn from this example?

00:27:06.880 –> 00:27:25.219
Jirah Cox: That's the sound of me reading only slightly ahead of Philip here. Happy end of Monday to everyone out there. No, this calls out a great use case here for

00:27:25.510 –> 00:27:55.410
Jirah Cox: this customer. This is an example of a retailer with, say, you know, 2,000 stores. It talks about the reality that they're maybe not all identical. So what this calls out is a chunk of them, let's say 700, are way larger, higher revenue. They're probably more tier-one stores. You might have more that are also tier two. But this means that your deployment on Nutanix at the edge can actually look like the real world, where they're not all the same, but managing them

00:27:55.410 –> 00:28:20.139
Jirah Cox: does not mean living in 2 camps, right? I don't have team A and team B, or silo A and silo B. This is just all Nutanix, whether it's going to run containers, whether it's going to run VMs, whether it's gonna run both, have GPUs, not have GPUs, be running VMs, be running AI workloads. Right? All of that standardization of how do I deploy it rapidly at the edge, how do I support it, how do I enable my team to deliver it to the business and productize it,

00:28:20.190 –> 00:28:22.489
Jirah Cox: all gets standardized, all gets repeatable.

00:28:23.750 –> 00:28:34.909
Harvey Green III: Yeah, ultimately it kind of leans back to what Ben was talking about earlier. You have the infrastructure set up, you manage the infrastructure, and you let the workloads be the workloads. It's just another workload.

00:28:37.330 –> 00:28:57.009
Philip Sellers: And it's just another data set at the end of the day. You know, we talk about the power of the Nutanix platform, and it really is around data portability. It's around being able to move that data easily between systems and get the most value out of it. And again, that rings true here in this conversation.

00:28:57.010 –> 00:29:08.619
Philip Sellers: You know, you have the same data services underlying. And so it's easy to get that data back into a centralized place and make use of it there as well. Yeah, no matter what you want to run, right, VMs or containers, you know.

00:29:08.860 –> 00:29:33.180
Jirah Cox: It’s it’s not an either or choice right? It’s a it’s a both, and it’s a what’s right for you, right? It’s a you know. When the waiter says, you know, crimber lay or bread pudding for dessert. Right? You know it’s it’s there’s not. Right or wrong, it’s personal purpose. Can’t see this. And I definitely invite them to go look at this on the web page. But if you look at that picture there, and you look kind of the stack

00:29:33.580 –> 00:30:01.369
Ben Rogers: Nutanix AOS at the base. I mean, that's the base of the platform for anything you're gonna do with Nutanix. With AOS, that's the base of this platform. So, not to, you know, drive 5 words into the ground, but again, our basic offerings are what's covering us being able to do this. So if you're running a Nutanix load, you already know the goodness of running AI in a Nutanix world. If you're looking at Nutanix, we're AI enabled and ready to jump in that arena with you securely.

00:30:01.370 –> 00:30:23.490
Ben Rogers: And you know, man, we're looking at putting some of these tool sets in the cloud. One of the conversations I had, not to get too far futuristic on us: I mean, our CVM is building more intelligence into it, and the CVM in certain cases in the cloud is being repurposed, because it no longer has to talk to the hardware like it used to. So security,

00:30:23.530 –> 00:30:38.439
Ben Rogers: data governance, you know, man, looking myopically at the data of what's going on, are all things that we're building inside of our CVM. So, more goodness to come. But again, it just fits into what we're doing today and where our roadmap is going.

00:30:39.660 –> 00:30:41.899
Philip Sellers: Yeah, absolutely. I mean.

00:30:42.090 –> 00:30:59.200
Philip Sellers: the integration points, the way that you leverage the platform, is key here. You've got a great foundation from a storage perspective, but you've extended upon that with things like Files and Objects. Now you've got integration points to Kubernetes for containers and persistent storage.

00:30:59.350 –> 00:31:13.809
Philip Sellers: And, oh yeah, Ben, you were talking about that consistency of deployment for the Nutanix cluster. That's what Kubernetes is all about. That's what containers are all about: that consistently deployed application running wherever you need to run it,

00:31:13.920 –> 00:31:18.430
Philip Sellers: locally, at the edge, in the cloud. All of those things

00:31:18.700 –> 00:31:25.589
Philip Sellers: just lead back to a better, more consistent operating environment for your customer.

00:31:26.010 –> 00:31:34.880
Philip Sellers: And again, I know we talked about a lot of different use cases here, but in this article it's specifically talking about generative AI,

00:31:35.250 –> 00:31:50.959
Philip Sellers: which again takes all that unstructured data, indexes it, makes use of it, and then turns it into a usable format. And so I do kind of wanna hit on that, I guess, you know, as a primer for

00:31:51.360 –> 00:32:05.409
Philip Sellers: AI in general. Generative AI is something different, right? And so I'll throw it to you, Jirah, hopefully not a curveball, but I mean, the generative AI thing is a little different. You know, this is the ChatGPT, the

00:32:05.540 –> 00:32:13.030
Philip Sellers: the things that do work for you on your behalf. What's a little different there?

00:32:13.090 –> 00:32:38.989
Jirah Cox: Right. So to call back to something Ben said at the beginning: if you're listening to this and you're not an AI expert, don't worry, we're not either. Right? So you're in good company there. And, you know, the people that are AI experts work at the AI startups, right? And everyone else is just running kinda what they build. But in general, yeah, like, you know, is Google intelligent? Absolutely, right? And it's just matching what you typed in with, like, a search query result.

00:32:39.810 –> 00:33:04.940
Jirah Cox: This new crop of stuff, right, in the last 12 months or so, it's all about creating something that didn't exist before, right? Whether that's in text, whether that's an understanding or a linkage or a connection, or even visuals. Right? So think about, of course, ChatGPT is the easy example, DALL-E, and all the other things that can give you either text or an image, or even audio. That's one of my new favorite hobbies

00:33:04.940 –> 00:33:11.860
Jirah Cox: when I go YouTube dumpster diving now: it's this AI-generated pop music, right? Do you want to hear

00:33:11.880 –> 00:33:19.659
Jirah Cox: Johnny Cash singing, you know,

00:33:20.200 –> 00:33:26.100
Jirah Cox: Taylor Swift songs? Because, like, it's actually amazing, right? So there's a whole channel that just does

00:33:26.590 –> 00:33:31.930
Jirah Cox: AI Johnny Cash singing Taylor Swift songs. Right? So all these things about creating things that didn't used to exist,

00:33:31.990 –> 00:33:43.469
Jirah Cox: that's all what this Gen AI category is for, right? So whether it's, you know, text, audio, image, you know, full-on movie, that's what's all in this category.

00:33:43.810 –> 00:34:05.540
Philip Sellers: Yeah. I mean, as funny as it is, my daughter came in with some app that's called face swap, or something like that, and, you know, it moves your face or someone else's face onto a different photo. And now you can just feed in a few sentences describing what you want, and as output you've got a photo of Harvey riding a unicorn.

00:34:05.580 –> 00:34:16.140
Jirah Cox: I know, out in space. So yeah, I'm starting to turn into the old curmudgeon, like.

00:34:42.679 –> 00:34:58.860
Jirah Cox: Get those people answers faster, right? The customer experience matters more than anything these days, right? So getting your customers to a better experience, a better answer, faster really gives your company a competitive edge. Right? So what this does is help our customers, right, the people that we at XenTegra talk to,

00:34:58.970 –> 00:35:16.909
Jirah Cox: go and talk to their line-of-business owners and say, we can give you an AI platform to get there faster, right? That doesn't just involve, okay, go to public cloud, rent forever, and if your edge site loses its Internet connectivity, it's SOL. We're like, hey, there's a new way to do AI here that actually can work disconnected,

00:35:16.950 –> 00:35:18.450
Jirah Cox: is easily manageable.

00:35:18.520 –> 00:35:28.010
Jirah Cox: You can run it in the cloud, but you probably would want to run it at the edge, actually, for faster answers, without needing a small army of folks to manage all this.

00:35:28.240 –> 00:35:44.140
Philip Sellers: Yeah, I mean, from a prior job, this has been probably 9 years ago, I found it really interesting that we could put cameras in a site, and you could count the number of people, the foot traffic, coming in and out of a location. As a retail

00:35:44.280 –> 00:35:50.930
Philip Sellers: person, you want to know how much foot traffic you have in your sites. And it's only gotten better since then. So now

00:35:51.620 –> 00:36:16.340
Philip Sellers: I know there are stores that are looking at and analyzing their layout. Where do they have hotspots? Where do people stop? What are they attracted to? And all of that data goes back into designing better experiences that drive better outcomes for each of those businesses. So it's a huge upside for companies to start looking into ways

00:36:16.350 –> 00:36:30.790
Philip Sellers: to leverage AI and the data that they have. You know, they may already have that security camera data. They probably already have lots and lots of cameras in their locations. Now you can actually put that to valuable use.
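The kind of analysis Philip describes, turning raw camera detections into foot-traffic counts and layout hotspots, reduces in its simplest form to aggregating timestamped detection events. A minimal, purely illustrative sketch (the detection data and zone names are hypothetical; a real pipeline would get these events from an object-detection model running at the edge):

```python
from collections import Counter

# Hypothetical timestamped detections from a store camera feed:
# (hour_of_day, zone) pairs emitted by an edge object detector.
detections = [
    (9, "entrance"), (9, "entrance"), (10, "aisle-3"),
    (10, "aisle-3"), (10, "aisle-3"), (11, "checkout"),
]

# Foot traffic per hour: how busy is the store across the day?
traffic_by_hour = Counter(hour for hour, _ in detections)

# Hotspots: which zones accumulate the most dwell events?
hotspots = Counter(zone for _, zone in detections)

print(traffic_by_hour[10])       # -> 3
print(hotspots.most_common(1))   # -> [('aisle-3', 3)]
```

The aggregates are tiny compared to the raw video, which is why only summaries like these need to travel back to the central data platform.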

00:36:31.110 –> 00:36:55.130
Ben Rogers: You know, they went from "how much foot traffic are we having every day" to "why is this guy in here buying beef jerky 3 times a day?" For us on the sales side, you know, it's helping us tremendously, cause they're now doing AI and they're looking at, you know, what are our customers buying, what are the trends in our product portfolio. I mean, it's really helping us as a company go, okay,

00:36:55.130 –> 00:37:18.860
Ben Rogers: what products are really landing with our customers, and what, as a sales team, should we be out there talking about? You know, sales gets to be a dirty little word, but some of the things that they're doing are really allowing our, I mean, allowing our company to address our customers' needs. And so, I mean, AI is coming everywhere, but in different verticals it's used in different ways. I've already seen it affecting my vertical,

00:37:18.860 –> 00:37:27.130
Ben Rogers: and some of it’s cool. Man. I look at it. That’s pretty killer now. There’s the whole dark side, and we won’t go there. But there’s always 2 sides to every coin. My friend.

00:37:27.260 –> 00:37:52.399
Philip Sellers: Yeah. And I may have talked about it on this podcast, but I think it applies here. There's a fast food retailer who transitioned during COVID to using an app for orders, and the outcome for them was being able to have that data: know more personally what people buy more often, target things, you know, to drive additional sales, but also

00:37:52.400 –> 00:38:05.929
Philip Sellers: know where they need additional capacity. That drives, you know, more drive-through capacity, more people working at what time of day, you know, staffing. All of those kinds of things is the outfall of being able to have that data.

00:38:06.040 –> 00:38:15.949
Philip Sellers: I guess the underlying point there is data is king. That is the end-all, be-all for us: data. And, you know, the

00:38:16.130 –> 00:38:17.709
Philip Sellers: the businesses that are gonna

00:38:18.000 –> 00:38:24.469
Philip Sellers: succeed in the future. They’re the ones that know how to leverage that data and differentiate themselves using it.

00:38:26.560 –> 00:38:35.289
Philip Sellers: Oh, guys, I think this is a great place for us to stop. I really appreciate the conversation. It's always great to have a chat with Ben,

00:38:35.680 –> 00:38:45.809
Philip Sellers: with Jirah, and with Harvey. We appreciate you guys joining us today. Anything you want to share before we wrap up? Last words?

00:38:49.110 –> 00:38:57.110
Ben Rogers: I appreciate you guys letting me be here. Again, I know I missed a few, and thank you for having me back. Great conversation. Thank you for having me.

00:38:57.510 –> 00:38:59.419
Philip Sellers: Yeah, always great to have you, Ben.

00:38:59.820 –> 00:39:14.799
Jirah Cox: This is probably a poor reflection on me as a podcaster, but no, I can't think of anything to plug. So everyone have a great week. Thank you guys so much for joining us, and we'll see you on the next Nutanix Weekly.

00:39:15.060 –> 00:39:16.829
Have a great afternoon.