0.000000 2.800000 Tech It From Me is an independent and solo-produced podcast. 2.800000 7.440000 Everyone's talking about OpenAI, Anthropic, and Google DeepMind, 7.440000 11.280000 but while the headlines are dominated by billion-dollar labs, 11.280000 14.320000 there's a quieter movement building in the background. 14.320000 17.440000 Open-weight and open-source AI. 17.440000 19.840000 It's fast-moving, it's innovative, 19.840000 23.920000 and it might just be the biggest wildcard in the entire AI race. 23.920000 27.600000 Welcome to the Tech It From Me podcast, I'm Mike Madole. 28.480000 30.880000 Today, we're not talking about the closed-door labs 30.880000 34.320000 with their secret-sauce models and mystery training data. 34.320000 36.720000 We're talking about the other side of AI. 36.720000 41.280000 The stuff you can download, run, and even tinker with yourself. 41.280000 44.720000 We're talking about projects like Mistral, 44.720000 47.760000 Meta's LLaMA, and all the community-driven tools 47.760000 50.320000 that anyone from a startup founder 50.320000 52.720000 to a curious teenager can get their hands on. 53.520000 59.120000 The question is, is open-source AI or open-weight AI 59.120000 62.560000 about to disrupt the big AI status quo, 62.560000 64.480000 or will it always be playing catch-up? 64.480000 66.640000 This is Tech It From Me. 66.640000 67.840000 Let's go. 67.840000 70.320000 This is the Tech It From Me podcast. 70.320000 74.480000 Now, before we can talk about whether open-source 74.480000 78.640000 or open-weight AI can compete with the big players, 78.640000 81.440000 we need to be pretty clear about what we actually mean 81.440000 87.360000 by open-source, because in AI the term gets used pretty loosely. 87.360000 91.280000 In traditional software, open-source is simple. 91.280000 96.320000 You get the code, you can run it, change it, redistribute it, 96.320000 100.000000 even sell it, all under an OSI-approved license. 100.000000 105.280000 Linux, Apache, WordPress, that's true open-source. 105.280000 107.920000 AI, though, is trickier. 108.640000 112.400000 In most cases, when people say open-source AI, 112.400000 115.120000 they really mean open-weight AI. 115.120000 120.160000 That's when the creators release the actual trained parameters, 120.160000 123.280000 the brain of the model, so you can run it yourself. 123.280000 127.920000 But the license might still have rules about how you can use it. 127.920000 130.560000 Take Meta's Llama 3. 130.560000 134.960000 Yeah, you can download it, run it locally, fine-tune it, 134.960000 136.800000 and build apps on top of it. 136.800000 139.440000 But it's not fully open in the purest sense. 139.440000 144.240000 Meta's license specifically blocks commercial use for companies 144.240000 147.680000 with over 700 million monthly active users. 147.680000 151.120000 In other words, sure, you can use this, 151.120000 155.680000 unless you're Google, Microsoft, or Apple. 155.680000 162.480000 Then there's Mistral, a Paris-based AI company with fewer than 50 employees 162.480000 167.440000 that's making waves because they release models with minimal restrictions. 167.440000 173.280000 They've said publicly they believe in open AI as a competitive equalizer, 173.280000 177.680000 and their models are already being baked into all kinds of tools.
177.680000 182.960000 And you can't ignore the grassroots movement around projects like Ollama, 182.960000 188.960000 which make it as easy as downloading an app to run a large language model on your own laptop. 189.600000 195.920000 No cloud API keys, no recurring fees, just your own hardware and the model. 195.920000 201.920000 Now, compare that to the closed-model world: OpenAI's GPT-4o, 201.920000 207.120000 Anthropic's Claude 3, or Google's Gemini Ultra. 207.120000 210.880000 You can use them, but only through their platform, 210.880000 213.840000 and you'll never see their weights or training data. 213.840000 216.880000 The companies say that's for safety reasons, 216.880000 222.000000 but it also keeps them firmly in control of pricing, updates, and capabilities. 222.000000 226.800000 So, when you hear open-source AI, it's worth asking, 226.800000 234.160000 do they mean truly open, like Linux, or open-weight with a few legal speed bumps? 234.160000 237.760000 Because in this race, the details really do matter. 237.760000 243.760000 So, why is open-weight and open-source AI suddenly so hot right now? 244.720000 252.160000 I'll give you four reasons. First, cost. Every API call to a closed model adds up. 252.160000 258.080000 If you're a startup running GPT-4o, your monthly bill could rival your rent. 258.080000 262.800000 With open models, you can run them locally on a decent GPU, 262.800000 270.480000 no per-query fees, no unexpected invoices. Second, the speed of innovation. 271.840000 276.800000 Closed-model companies release updates every few months. In the open-source world, 276.800000 282.560000 improvements can happen daily. Developers across the globe are fine-tuning models, 282.560000 287.360000 adding features, fixing bugs, and sharing those improvements instantly. 287.360000 290.640000 Third, customization. 290.640000 296.640000 Closed models are like renting an apartment. You can decorate, but you can't knock down a wall. 296.640000 301.200000 Open models are like owning the building. You can rip out the kitchen, 301.200000 307.200000 add a new floor, turn it into a coffee shop. In industries like healthcare or law, 307.200000 312.800000 that flexibility is truly gold. And fourth, privacy and control. 312.800000 318.800000 For some organizations, especially in finance, defense, or government, 318.800000 324.240000 sending sensitive data to a third-party API is truly a non-starter. 324.240000 328.720000 Running a model in-house keeps everything in your own environment. 329.520000 335.360000 It's kind of like the Linux moment for AI. Linux never killed Windows, 335.360000 339.440000 but it became the backbone of the internet and countless embedded systems. 339.440000 345.760000 Open models could follow that same trajectory, not replacing the giants entirely, 345.760000 350.800000 but powering an enormous share of the ecosystem in the background.
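To make the earlier "just your own hardware and the model" point about Ollama concrete, here's a minimal sketch of what local inference can look like. It assumes Ollama is installed, its local server is running on the default port 11434, and a model such as llama3 has already been pulled; the endpoint path and field names follow Ollama's published REST API, but treat the details as illustrative rather than definitive.

# Minimal local-inference sketch: ask an Ollama server on this machine for a completion.
# Assumes `ollama serve` is running locally and the llama3 model has been pulled beforehand.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    # Ollama exposes a plain HTTP API on localhost; no cloud account or API key is involved.
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.loads(response.read().decode("utf-8"))
    return body["response"]  # the model's generated text

if __name__ == "__main__":
    print(ask_local_model("In one sentence, what is an open-weight model?"))

Nothing in this sketch leaves the machine, and swapping in a different local model is just a matter of changing the model name.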
350.800000 356.560000 Now, of course, there are some pretty big hurdles. The first is obviously compute. 357.360000 363.840000 Training a frontier model, something that will compete with GPT-4, now GPT-5, 363.840000 371.760000 or Claude 3.5, costs tens or even hundreds of millions in GPUs, energy, and talent. 371.760000 378.720000 Most open projects don't have that budget, so they focus on smaller, more efficient models. 378.720000 384.400000 Then there's the quality gap. Open models are improving quickly, 384.960000 391.920000 but if you're doing complex reasoning, nuanced conversation, or highly factual answers, 391.920000 398.400000 the top closed models still have the edge. There's also security and misuse. 398.400000 405.520000 An open model can be fine-tuned for malicious purposes, generating deepfakes, 405.520000 409.200000 creating phishing emails, or even writing code for malware. 409.920000 417.040000 With closed models, companies can filter or monitor usage; with open ones, the responsibility falls 417.040000 425.520000 entirely on the user. And finally, corporate capture. Even open projects often rely on funding 425.520000 432.000000 from Big Tech or venture capital. If the money runs out or the investors want to pivot, 432.000000 438.560000 that openness could vanish overnight. The AI world has already seen open initiatives 438.560000 444.800000 quietly close up shop once there's a profit motive. So it begs the question: 444.800000 450.640000 can open-weight and open-source AI really stand toe-to-toe with the closed giants? 450.640000 458.160000 Personally, I see three potential futures. Number one, domination in niches. 458.160000 466.000000 Open models become the default for specialized industries where customization, privacy, 466.000000 472.560000 and cost control matter more than cutting-edge performance. Think legal document review, 472.560000 477.200000 customer service bots, or industry-specific research tools. 477.200000 486.640000 Number two is a permanent second place. Frontier models, the absolute best, stay closed forever. 486.640000 492.560000 The cost, the talent pool, and the safety concerns keep them behind company walls. 493.520000 498.320000 Open models keep up in some areas but never quite match the bleeding edge. 498.320000 504.320000 And then number three, the hybrid era. This is the one I think we're headed towards. 504.320000 508.640000 Businesses will use open models for everyday automation, 508.640000 515.760000 internal tools, and cost-sensitive workloads, and then call on closed models for the moments 515.760000 521.520000 where they need maximum intelligence, reasoning, or multimodal capability. 522.480000 525.760000 And make no mistake, the big players already know this. 525.760000 531.360000 Meta isn't releasing Llama just to be nice. They're playing a strategic game, 531.360000 537.440000 keeping themselves in the AI conversation, building developer loyalty, and putting competitive 537.440000 544.560000 pressure on OpenAI and Google. Right now, open-weight AI is still the underdog. 544.560000 548.720000 But history is full of underdogs that changed the rules of the game. 549.600000 556.880000 Linux didn't kill Windows. Android didn't kill iOS. But both reshaped their industries in ways 556.880000 562.880000 that forced the big players to adapt. Open models could do exactly the same, 562.880000 570.080000 even if they never hold the performance crown. The real question isn't if open-weight AI will compete. 570.080000 574.880000 It's how it will force the closed giants to change their game. 576.240000 579.600000 Tech It From Me is an independent and solo-produced podcast.