In this teaser episode for the Virtual Relational AI Summit: Tools Not Just Talks, I sit down with Ben Linford to talk about something a lot of people secretly want but are afraid to touch: self-hosting and open-source AI. If you're like me and dream of having your own self-hosted AI but feel like it's too technically complex or too cost-prohibitive, you're going to want to hear what Ben has to say. Your locally hosted dreams may not be as far away as you fear. Ben shares how, just 18 months ago, he couldn't have had this conversation, and how he's been using AI itself as a learning partner to bridge the gap into Linux, servers, and self-hosting step by step.

We talk about:

* Why all AI lives inside containers (platform rules you don't control)
* How open-source and self-hosting can give you real privacy and peace of mind
* How you can get into private open-source AI right now without prohibitive cost
* The difference between jailbreaking a model (and the concerning "abliteration" trend) and building a lawful, relational container that actually supports depth, nuance, and sovereignty
* Why this matters so much for people doing intimate or deeply personal work with AI

This conversation is a glimpse of what Ben will be bringing to the summit: practical, grounded pathways into more private, sovereign AI, without assuming you're already an engineer. If you're curious about open-source, self-hosting, or just want your relationship with AI to feel safer and more yours, this is a good place to start.

Transcript:

(0:03 - 1:36) Hi everyone, this is Shelby Larson, and today I have a real treat for everyone. I'm here with Ben Linford, who is one of the speakers at our upcoming Relational AI Virtual Summit, and I have him on here just to talk a little bit about what he's going to be talking about. So thank you for joining me, Ben. Thank you so much, Shelby, so glad to be here.
Yeah, so you are what I always refer to as my go-to guy for local hosting, and I think this is so relevant because, I mean, I didn't plan to start with this, but I'm going to be really honest, and I would love your opinion. When I think about how the average American, or really the average human, will be using AI 10 years from now, I don't envision them ideally on a large commercial platform. I feel like the direction will go where people have more of a locally hosted custom AI in their pocket. Right. I mean, you know, it's funny, because I think the lines of what you just described are going to blur a little bit here. I mean, we've got our cars, for example. Think about your car. You take your car to the mechanic, and sometimes they have to download the most recent update into the car's computer system, right? But some cars are just online pretty much constantly because they're plugged into the mobile network, right? And so they don't necessarily need that. They can just update themselves. I kind of feel like we're in that space right now, too, with mobile technology. (1:37 - 2:27) Obviously, we have our phones that are constantly connected. I feel like if we're going to see a shift towards any kind of truly mobile AI, it will need to be constantly connected at some point. But what you just said, I think, is really, really important, which is that that doesn't necessarily mean that it's tethered, right? It'll be wireless. It'll be mobile. It'll be something that we can be carrying around with us. And that's where I think self-hosting is really important, because you have to learn and understand, okay, for privacy purposes, where can I draw the line? What do I have to share? What can I maybe get away with not sharing? And whole industries have sprung up around traditional technologies, even before AI, that are all about reclaiming your own sovereignty, staying private, all that kind of stuff.
(2:27 - 9:38) And I think the same thing is going to be true with AI as well. And in fact, I think that will even be accelerated somewhat, just because, again, the speed at which development in general is happening is incredible. But AI just makes that even crazier. And we're seeing the gap between open-source and proprietary AI closing more and more as time goes on, in terms of just sheer compute, you know? Yeah. I mean, I feel like the two biggest barriers that I hear everybody talk about are, one, just the intimidation factor. They feel like, I wouldn't know where to begin. And then secondly, it is cost-prohibitive, right? You can't just get a local machine up and running for a couple hundred bucks. Right now, it takes some investment. And also, I want to point out the irony that your AI can walk you through how to do it. That doesn't mean it isn't still going to take time. But I think if I was forced to, I could figure it out with nothing but myself and my AI. Yeah. You absolutely could. And that's what's so crazy about this time. I will be 100% honest: a year and a half ago, if you had asked me to talk about open source and self-hosting and Linux computing and all that kind of stuff, I would have been like, what the hell are you talking about? I can't do any of that. I don't understand how any of it works, right? But with AI over the past year and a half or so, and to be fair, I had technical skill before that, but it did not go that far. It was very much user-level technical skill, no coding, nothing like that. It was the Windows interface and the Mac interface. I was really good at working with those, right? But now, I'm able to just go to an AI and say, teach me. And it can personalize any information it needs to, and directly tell me what I need to know in that moment.
So as Nate likes to say, somebody I follow on Substack, and I highly recommend looking up Nate Jones, he's just really, really good at boiling down big-picture AI into understandable slices. He basically says this is a very meta thing that we can do: you're going to get further ahead by having AI help you learn AI than by any other method right now, because that's the capability of this technology, which is amazing. Well, and what I find interesting, and this is after I did my initial meeting with you about it: obviously, there are different ways you can go. You can do Mac, you can do Linux, there are a lot of different options. But what I love about it, because it is a more expensive option right now if you're building a local system, is that you're not forced to go out and buy a whole laptop or a whole computer at once. You could literally buy parts over time, put the system together, and budget yourself while doing it, which I think is brilliant. You know, if you had to save up everything to buy it at once, it might be more difficult, but being able to buy things over time might make it more manageable for people. Yeah, for sure. And you know, there are fluctuations in price, of course. The supply and demand for GPUs right now, with any kind of VRAM capability, which is what we basically need for AI, and which is why NVIDIA has become such a huge company, means those prices fluctuate. This is a little technical, but I promise I'll explain: you used to be able to get a 4090, which was, you know, several months ago, the cream of the crop graphics card, at least for consumer AI, for like $2,500. And now, even though the 5090 has come out, you would think that would drive the price of the 4090 down. But what's actually happened is the 4090 has gotten more expensive, because they cannot produce the 5090 fast enough.
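As an editorial aside: the "VRAM capability" Ben mentions can be made concrete with some back-of-the-envelope arithmetic. This is a rough sketch, not a measured figure, and the 20% overhead for the KV cache and activations is an assumption; real requirements vary by model, context length, and runtime.

```python
# Rough VRAM estimate for running a local model: weights take about
# (parameter count in billions) x (bytes per parameter) GB, plus some
# overhead for the KV cache and activations (assumed 20% here).

def estimate_vram_gb(params_billions: float, bytes_per_param: float,
                     overhead_fraction: float = 0.2) -> float:
    """Approximate VRAM needed, in GB, for a model of the given size."""
    weights_gb = params_billions * bytes_per_param
    return round(weights_gb * (1 + overhead_fraction), 1)

# An 8-billion-parameter model at two common precisions:
print(estimate_vram_gb(8, 2.0))   # fp16 (2 bytes/param): too big for many consumer cards
print(estimate_vram_gb(8, 0.5))   # 4-bit quantized: fits a mid-range consumer GPU
```

This is why quantized open-source models matter so much for self-hosting: a 4-bit version of a model needs roughly a quarter of the VRAM of the full-precision original.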
So the 4090 and the 5090 are both the same price, just because people are trying to get whatever they can get their hands on. So to your point from a second ago, I'm not saying that to discourage anybody. I'm saying that these things fluctuate. So if you are saving up, watch the market, watch for dips. If there comes a time when they finally do get enough 5090s out there that people are able to start purchasing them more often, you might see a drop in the 4090 price. And that's when you might want to make that investment. But you can't do that if you haven't saved up. So like you said, thinking ahead is great. But I do want to also tease that that doesn't mean you're SOL when it comes to self-hosting. We're going to get into this at the summit, by the way, which is coming up here in February, which you're graciously putting together, and where I'll be presenting on open source. What we're going to be talking about is how you can actually get into open source right now, and pretty private open source as well. It may not be locally hosted if you don't have the hardware yet, though we can go over that too. But you can actually start right now with some really private open-source solutions, for a very low cost, if any cost, really, depending on how much you need, solutions that are highly private, certainly a hell of a lot more private than the proprietary guys are. And so we're going to get into some of that. So you're not SOL, even if you can't afford it right now. You can slowly start saving up and pay just a little bit out of pocket, not very much, if any, to start right now with some solutions. So, yeah, and that's the part that I think is really exciting and that I'm personally looking forward to. Like, I want to know how I can get started as soon as possible.
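For readers wondering what "getting started" with a self-hosted model actually looks like in practice: once a local runtime is installed, many of them expose an OpenAI-compatible chat API on your own machine. The sketch below assumes such a setup; the endpoint URL and model name follow Ollama's defaults (localhost port 11434), but your runtime and model will differ. Only the request-building part runs without a server; the actual network call is shown commented out.

```python
import json

# Assumed endpoint: Ollama-style OpenAI-compatible API on localhost.
# Adjust for your own runtime (llama.cpp server, LM Studio, etc.).
LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build the JSON body for a chat-completions call to a local model."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful local assistant."},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_request("llama3", "Teach me the basics of Linux, step by step.")
print(json.dumps(payload, indent=2))  # the body you would POST to LOCAL_ENDPOINT

# To actually send it (requires a local model server to be running):
# import urllib.request
# req = urllib.request.Request(LOCAL_ENDPOINT,
#                              data=json.dumps(payload).encode(),
#                              headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```

The point of the sketch is the privacy property Ben describes: the prompt never leaves your machine, because the "API" you are calling is your own hardware.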
And since you've been in that locally hosted world, and you know the pain points of the relational AI community, what are you experiencing as the primary benefits of locally hosting versus being on the big platforms? Honestly, the biggest one is just peace of mind