<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Revenant Research]]></title><description><![CDATA[Revenant Research examines the real structure of artificial intelligence systems—how they are built, how they break, and what holds when the pressure rises.]]></description><link>https://www.revenantresearch.com</link><image><url>https://substackcdn.com/image/fetch/$s_!nizo!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1bd75b1-eae8-4bec-bced-01a9cb2e31e6_250x250.png</url><title>Revenant Research</title><link>https://www.revenantresearch.com</link></image><generator>Substack</generator><lastBuildDate>Sat, 04 Apr 2026 12:34:03 GMT</lastBuildDate><atom:link href="https://www.revenantresearch.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Nathan Staffel]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[nathan467615@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[nathan467615@substack.com]]></itunes:email><itunes:name><![CDATA[Nathan Staffel]]></itunes:name></itunes:owner><itunes:author><![CDATA[Nathan Staffel]]></itunes:author><googleplay:owner><![CDATA[nathan467615@substack.com]]></googleplay:owner><googleplay:email><![CDATA[nathan467615@substack.com]]></googleplay:email><googleplay:author><![CDATA[Nathan Staffel]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Pragmatist’s Guide To Navigating AI Narratives]]></title><description><![CDATA[How to Read the AI Supercycle When Every Narrative Is Simultaneously 
True]]></description><link>https://www.revenantresearch.com/p/the-pragmatists-guide-to-navigating</link><guid isPermaLink="false">https://www.revenantresearch.com/p/the-pragmatists-guide-to-navigating</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Thu, 20 Nov 2025 14:50:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Lz7Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Lz7Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 424w, https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 848w, https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 1272w, https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:648973,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/179408680?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 424w, https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 848w, https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!Lz7Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F90ea47a9-e3d8-43c0-8fc2-9676341e2228_3000x2250.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h2>The Day The Language Changed</h2><p>On November 19, 2025, NVIDIA released its third-quarter earnings. The numbers delivered: $57 billion in revenue, up 62% year-over-year. Jensen Huang said Blackwell sales were &#8220;off the charts.&#8221; Analysts parsed the data center segment: $51.2 billion, 90% of total revenue.
The AI supercycle is still in full force.</p><p>But something subtle caught my eye in the press release. NVIDIA announced partnerships in new quantitative terms:</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.revenantresearch.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Revenant Research is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>&#8220;A strategic partnership with OpenAI to deploy at least 10 gigawatts of NVIDIA systems.&#8221;</p><p>&#8220;Anthropic will initially adopt 1 gigawatt of compute capacity.&#8221;</p><p>Not GPU counts. Not dollars. Gigawatts.</p><p>A gigawatt powers roughly 700,000 American homes. OpenAI&#8217;s 10 gigawatts equals the electrical load of Houston. </p><p>The language changed because the constraint changed. The bottleneck on AI development is no longer chip manufacturing. It&#8217;s electrical capacity, and if you&#8217;ve been following the headlines you know that narrative all too well.</p><p>Three months earlier, in August, Jensen Huang had explained this to analysts, though few noticed at the time: &#8220;Out of a gigawatt AI factory, which can go anywhere from $50 billion-$60 billion, we represent about 35% plus or minus of that.&#8221;</p><p>NVIDIA captures $18-21 billion. The other $30-40 billion? Power plants. Substations. Cooling towers. Grid transmission. 
Buildings.</p><h2>Three Ways To Read What Happens Next</h2><p>What follows is a timeline of announced projects, capital flows, and site selections from 2024-2027. The same facts support three different interpretations, not because the data is ambiguous, but because the infrastructure operates at a scale where the three readings become functionally indistinguishable.</p><p><strong>The Optimistic Case: Manufacturing Renaissance</strong></p><p>The AI boom is rebuilding American manufacturing. Companies need massive electrical capacity immediately. That capacity exists in one place: former industrial sites in the Rust Belt, the old car plants and shuttered steel mills where the grid was built to handle 24/7 heavy industrial loads.</p><p>When factories closed, the electrical infrastructure remained. Now AI companies are repurposing it, bringing capital and activity to precisely the regions devastated by decades of financialization disguised as globalization. The &#8220;AI infrastructure&#8221; framing unlocks political support and regulatory fast-tracking that &#8220;bring back manufacturing&#8221; could never achieve.</p><p>The renaissance is real because the $1.2 trillion in announced investments represents actual construction, not just press releases. Toyota in Kentucky, TSMC in Arizona, Foxconn in Texas&#8212;these are physical facilities breaking ground.</p><p><strong>The Cynical Case: Subsidy Arbitrage</strong></p><p>It&#8217;s narrative arbitrage, actually. Projects that would be rejected as &#8220;data centers&#8221; get approved as &#8220;strategic AI infrastructure.&#8221; Sites that couldn&#8217;t attract investment as &#8220;manufacturing&#8221; become venture-backable as &#8220;AI factories.&#8221;</p><p>Look at what&#8217;s actually being &#8220;manufactured&#8221;&#8212;servers, semiconductor equipment, AI infrastructure components. Not consumer goods. Not the kind of manufacturing that built the middle class.
And notice the framing: &#8220;collaborative robots to overcome labor shortages&#8221; means automation from day one. Jobs are not coming back. </p><p>In Ohio, data center tax breaks amount to over $2 million per permanent full-time job. That&#8217;s not job creation. That&#8217;s public subsidy flowing to private profit. Constellation Energy&#8217;s CEO warns: &#8220;I think the load is being overstated.&#8221; When the CEO of a major power provider says demand projections look inflated, pay attention.</p><p><strong>The Geopolitical Case: Infrastructure Determines Destiny</strong></p><p>The United States is losing the AI race because China adds more electricity demand annually than Germany&#8217;s total consumption. One Chinese province alone matches India&#8217;s entire electricity supply.</p><p>American AI researchers visiting China report that energy availability is treated as &#8220;a solved problem.&#8221; Chinese energy planning happens in anticipation of demand, not in reaction to it.</p><p>The U.S. is choosing data center locations based on where electrical infrastructure already exists because building new infrastructure takes seven years. That&#8217;s not strategy. That&#8217;s triage. The gigawatt language reveals a scramble dressed up as a plan.</p><p>China built power first, then AI. America is trying to build AI, then scrambling for power. In infrastructure races, the side that builds foundations first usually wins.</p><p>With those three narratives established, let&#8217;s map out the timeline and see how each one plays out.</p><h2>Timeline: 2024-2027</h2><h3><strong>2024: The Constraint Emerges</strong></h3><p>U.S. AI data centers consumed 4 gigawatts of power. Natural gas turbines sold out through the end of the decade. Grid interconnection queues backed up.
Companies started realizing new power infrastructure wouldn&#8217;t arrive fast enough.</p><h3><strong>Q1 2025: Site Selection Begins</strong></h3><p><strong>January</strong>: The Trump administration takes office. Stargate project announced.</p><p><strong>February</strong>: Former GM Lordstown plant&#8212;6.2 million square feet, shuttered in 2019&#8212;targeted for Foxconn/OpenAI/SoftBank AI equipment manufacturing.</p><p>Lordstown sits just outside Youngstown, Ohio. Trump visited in 2017 and told workers &#8220;don&#8217;t sell your house&#8221; because jobs were coming back. GM closed the plant two years later. Now it&#8217;s being converted for AI infrastructure. Former car factory workers are not going to turn into silicon-wafer producers.</p><p><strong>Optimistic signal:</strong> Industrial site getting second life. Investment flowing to devastated region.</p><p><strong>Cynical signal:</strong> Why Lordstown? Not for talent or universities. For the electrical substation sized for a plant that stamped 400,000 cars annually. The power infrastructure is what&#8217;s valuable, not the people.</p><p><strong>Geopolitical signal:</strong> Speed matters. Lordstown is expected to be operational by 2026. That&#8217;s only possible because the electrical infrastructure exists. Greenfield sites would take until 2030. That would mean China is moving on to quantum AI while we&#8217;re still dicking around with data center permits. </p><h3><strong>Q2 2025: Export Controls and Site Selection</strong></h3><p><strong>April</strong>: U.S. government requires export licenses for NVIDIA H20 chips to China.
NVIDIA takes a $4.5 billion inventory charge.</p><p><strong>May</strong>: NVIDIA loses an additional $2.5 billion in blocked H20 shipments.</p><p><strong>June</strong>: Amazon announces a $20 billion Pennsylvania investment at two sites: Salem Township, adjacent to the Susquehanna nuclear plant with a direct power connection, and Falls Township at a former US Steel mill.</p><p>The export controls cut NVIDIA off from the Chinese market for AI chips: $7 billion in immediate losses. One month later, Amazon announces $20 billion in domestic AI infrastructure. The timing reveals the strategic pivot: lost international markets accelerate domestic buildout.</p><p>But the site selection tells a different story. Salem Township matters because of the direct nuclear power connection. Falls Township matters because it&#8217;s a former steel mill with grid infrastructure built for continuous heavy industrial loads.</p><p><strong>Optimistic signal:</strong> Export controls forcing domestic investment. Former industrial sites getting $20 billion in capital. Real construction creating regional economic activity where it&#8217;s most needed.</p><p><strong>Cynical signal:</strong> Amazon choosing sites purely for electrical access, not economic development. Tax breaks to attract projects probably exceed local benefit. The &#8220;reshoring&#8221; narrative covers straightforward infrastructure arbitrage.</p><p><strong>Geopolitical signal:</strong> Export controls are the weapon, but they cut both ways. The U.S. blocks China from advanced AI chips, forcing $7B in immediate losses on American companies. China responds by accelerating domestic chip development and treating energy availability as &#8220;a solved problem&#8221; while the U.S. scrambles for sites with adequate power. The export ban assumes American AI advantage is durable.
But if the real constraint is electrical infrastructure, not chip design, the ban may be a strategic blunder.</p><h3><strong>Q3 2025: The Language Pivot</strong></h3><p><strong>August 27, NVIDIA Q2 FY2026 earnings call</strong>: Jensen Huang tells analysts: &#8220;Out of a gigawatt AI factory, which can go anywhere from $50 billion-$60 billion, we represent about 35% plus or minus of that.&#8221;</p><p>&#8220;It takes six chips, six different types of chips, just to build an AI and a Rubin AI supercomputer. Just to scale that out to a gigawatt, you have hundreds of thousands of GPU compute nodes.&#8221;</p><p>Huang is telling investors that NVIDIA is now an infrastructure company. And he&#8217;s revealing that 65% of a gigawatt facility&#8217;s cost isn&#8217;t chips. It&#8217;s power and buildings.</p><p>Power is now front and center as the constraint. The question is whether this creates opportunity (optimistic), extraction (cynical), or disadvantage (geopolitical).</p><h3><strong>September 2025: The Mega-Announcements</strong></h3><p><strong>September 22</strong>: OpenAI and NVIDIA announce partnership for at least 10 gigawatts. NVIDIA will invest up to $100 billion progressively as each gigawatt deploys.</p><p>Huang tells CNBC this equals 4-5 million GPUs&#8212;roughly NVIDIA&#8217;s entire 2025 production, &#8220;twice as much as last year.&#8221;</p><p>One partnership equals one year of total production. </p><p><strong>September 23</strong>: OpenAI, SoftBank, and Oracle unveil five new Stargate sites: Shackelford County TX, Do&#241;a Ana County NM, Lordstown OH, Milam County TX, and one undisclosed Midwest location.</p><p>Look at the geography. Not one California site. Not one Massachusetts site. The AI revolution is being built in the industrial heartland.</p><p>Sam Altman explains site selection: &#8220;You need a lot of energy availability. You want a favorable regulatory environment that allows construction, fast permits. You need the talent... 
you need the land.&#8221;</p><p>Energy first. Then regulations. Then talent (imported, not local).</p><p><strong>Optimistic signal:</strong> $100 billion flowing to Ohio, Texas, New Mexico. These regions need the investment.</p><p><strong>Cynical signal:</strong> Industry insiders report &#8220;similar projects that look exactly to have the same footprint being requested in different regions across the country,&#8221; meaning that companies are shopping the same project to multiple utilities to extract better deals.</p><p><strong>Geopolitical signal:</strong> Speed trumps quality. These sites weren&#8217;t chosen because they&#8217;re optimal. They were chosen because they&#8217;re possible.</p><h3><strong>October 2025: The Manufacturing Question</strong></h3><p><strong>October 28</strong>: NVIDIA announces manufacturing partnerships: Belden, Caterpillar, Foxconn, Lucid Motors, Toyota, TSMC, Wistron. $1.2 trillion in U.S. production capacity investments announced, led by electronics, pharma, and semiconductors.</p><p><strong>Optimistic interpretation:</strong> This is real. $1.2 trillion isn&#8217;t accounting fiction. Toyota doesn&#8217;t announce Kentucky investments for press release purposes. TSMC doesn&#8217;t break ground in Arizona for optics.</p><p>These are actual facilities being built or modernized. The AI framing gives political cover to do what &#8220;bring back manufacturing&#8221; couldn&#8217;t do.</p><p><strong>Cynical interpretation:</strong> Look at what&#8217;s being manufactured&#8212;AI servers, semiconductor equipment, infrastructure components. Not consumer goods. Not products Americans buy. The same cloud oligarchy is weaponizing its balance sheet to ensure no new competition emerges. Moreover, a factory that runs on robots doesn&#8217;t create the jobs that built the middle class.</p><p><strong>Geopolitical interpretation:</strong> While America announces partnerships, China executes a coordinated buildout.
Energy planning is technocratic, long-term, and happens before investment rather than after.</p><h3><strong>November 2025: The Earnings That Confirmed It</strong></h3><p>Which brings us back to the present. NVIDIA&#8217;s Q3 FY2026 results: $57 billion in revenue, $51.2 billion from data centers.</p><p>Partnerships announced in gigawatts:</p><ul><li><p>OpenAI: 10 gigawatts</p></li><li><p>Anthropic: 1 gigawatt</p></li></ul><p>The language has fully shifted. </p><p>Q4 guidance: $65 billion revenue expected.</p><p>Buried in analyst materials: AI inference demand will reach 400% of training workloads by 2027, up from under 50% in 2022.</p><p>Training can happen anywhere with power. But inference, the actual use of AI, must happen near users and applications. Autonomous driving, industrial automation, and AR/VR all require sub-10ms latency.</p><p>You cannot serve a self-driving car in Detroit from a data center in Texas. Physics prevents it.</p><p>If inference scales to 400% of training by 2027, you don&#8217;t need ten gigawatt-scale training centers. You need hundreds of distributed inference facilities near populations and industry. Which means electrical infrastructure everywhere. Which means old industrial sites become the only option at scale.</p><p><strong>Optimistic interpretation:</strong> Distributed inference validates the industrial site strategy. AI naturally flows to where manufacturing and population are, creating integrated industrial ecosystems.</p><p><strong>Cynical interpretation:</strong> &#8220;Distributed inference&#8221; is making a virtue of necessity. Can&#8217;t build gigawatt centers fast enough, so claim distribution was the plan all along.</p><p><strong>Geopolitical interpretation:</strong> Distributed architecture plays to American strengths (existing industrial infrastructure, geographic spread) versus Chinese strengths (centralized planning, concentrated buildout).
But only if execution matches strategy.</p><h3><strong>2025 Year End: The Numbers</strong></h3><p><strong>Demand projections:</strong></p><ul><li><p>68 GW global AI data center demand by 2027</p></li><li><p>327 GW by 2030, versus total global data center capacity of just 88 GW in 2022</p></li><li><p>U.S. alone: 4 GW in 2024 to 123 GW by 2035 </p></li></ul><p><strong>Supply reality:</strong></p><ul><li><p>Goldman Sachs estimates $720 billion in grid spending needed through 2030</p></li><li><p>Grid connections take 4-7 years in key regions</p></li><li><p>Data center builds take 1-2 years; grid connections take up to 7 years in some locations</p></li></ul><p>68 GW needed by 2027. That&#8217;s two years away. Grid connections take 4-7 years. Projects announced today won&#8217;t connect to new power until 2029-2032.</p><p>The only way to meet 2027 targets is existing infrastructure. Hence Lordstown, Salem Township, Falls Township.</p><p><strong>Optimistic:</strong> Necessity drives innovation. Can&#8217;t wait for new grid? Use old industrial sites. Can&#8217;t get permits? Call it &#8220;AI infrastructure&#8221; and regulators prioritize it. The constraint creates the solution.</p><p><strong>Cynical:</strong> The math reveals the game. Demand projections justify investment. Investment justifies subsidies. When actual utilization comes in below projection, who&#8217;s left holding stranded assets? Taxpayers.</p><p><strong>Geopolitical:</strong> The math reveals American disadvantage. China built infrastructure first, enabling rapid deployment now. America is scrounging for existing capacity while China executes a planned buildout. Two years from now, we will know who was right.</p><h3><strong>2026-2027: The Test Period</strong></h3><p>This is where narrative meets reality.</p><p><strong>Capacity utilization rates:</strong> Do announced facilities operate at projected density?
Or do megawatt ratings exceed actual deployment?</p><p><strong>Power prices:</strong> Wholesale electricity already up 267% in some markets near data centers. If prices spike further, it validates scarcity. If they stabilize, it suggests adequate supply or overstated demand.</p><p><strong>Employment vs automation:</strong> Do these facilities create meaningful jobs? Or is the focus on &#8220;collaborative robots&#8221; code for minimal human employment?</p><p><strong>Inference deployment:</strong> Does the 400% inference growth materialize? More important: where does it happen, centralized cloud or distributed edge?</p><p><strong>Chinese buildout pace:</strong> Does the U.S. infrastructure gap close, widen, or stay constant?</p><h2>The Pragmatist&#8217;s Position</h2><p>When a chip company starts announcing deals in electrons instead of dollars, something has fundamentally shifted. NVIDIA became an infrastructure company the moment partnerships were measured in gigawatts&#8212;because at that scale, the product being sold is access to electrical capacity, not computing capability.</p><p><strong>Gigawatt-scale AI infrastructure operates at a threshold where economic development, financial engineering, and geopolitical competition become functionally indistinguishable until someone actually turns the power on.</strong></p><p>Which reveals the fourth narrative hidden underneath the other three: <strong>the electrical constraint is the only coordinating force</strong>.</p><p>Not institutions. Not strategy. Not markets. The bottleneck itself is performing the coordination function&#8212;forcing economic, financial, and strategic decisions into alignment because nothing else can.</p><p>When constraints govern instead of institutions, you get the appearance of strategy without strategic choice. Every decision looks deliberate but is actually the constraint expressing itself through whatever institutional channel encounters it first. 
The system produces results that satisfy the bottleneck without optimizing for any coherent goal.</p><p>This is what infrastructure looks like after sixty years of institutional and infrastructure decay. Not the absence of institutions, but their subordination to physical limitations that now perform the coordinating work leaders once did. The constraint doesn&#8217;t adjudicate between narratives. It just enforces its terms and lets the players rationalize the narrative afterward.</p><p>-Nathan Staffel</p>]]></content:encoded></item><item><title><![CDATA[Systems & Signals — August 2025]]></title><description><![CDATA[Early Recovery Opens Capital, but Power and Compliance Decide Who Scales]]></description><link>https://www.revenantresearch.com/p/systems-and-signals-august-2025</link><guid isPermaLink="false">https://www.revenantresearch.com/p/systems-and-signals-august-2025</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Mon, 01 Sep 2025 13:25:54 GMT</pubDate><enclosure
url="https://substackcdn.com/image/fetch/$s_!kfHX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kfHX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kfHX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 424w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 848w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 1272w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kfHX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic" width="1456" height="1092" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:514656,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/172264819?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kfHX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 424w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 848w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 1272w, https://substackcdn.com/image/fetch/$s_!kfHX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24ff44a8-c719-4542-8e32-8f4bc031183a_3000x2250.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><em>Systems and Signals is a new monthly series I am starting in order to bring clarity and in depth analysis of the intersection of global macro data, AI infrastructure capex and application advancements. </em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.revenantresearch.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Revenant Research is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p><h3>The Month in View</h3><p>August closes with the global economy shifting into what economists call the early recovery phase. After more than a year of inverted yield curves and restrictive monetary policy, short term borrowing costs are finally easing, credit spreads are steady, and markets anticipate a Federal Reserve rate cut in September. Yet even as capital loosens, the hard limits of energy supply, colocation space, and shipping bottlenecks remain immovable. On the technology side, Blackwell GPUs are moving from hyperscale data centers into workstations, sovereign clouds are scaling up, and new European regulations have begun enforcement.</p><p><strong>Money is becoming easier to raise, but the space to deploy it remains scarce. </strong>The winners will be those who can synchronize financial openings with operational readiness rather than treating them as separate arenas.</p><h3>Economic Signals and AI Readiness</h3><p>The cycle is turning. The 2 year Treasury yield has fallen to 3.61 percent, down from nearly 3.9 percent earlier in the summer. The 10 year holds at 4.26 percent, leaving a positive spread of 65 basis points. For the first time in more than a year short rates are below long rates. That steepening matters because it reopens the door to financing data centers, GPU clusters, and power contracts. The bond market is saying that credit is available again.</p><p>Spreads confirm the shift. High yield risk premiums have narrowed to 425 basis points. 
Vendors who demanded cash in advance a few months ago are once again offering terms. Financing windows are real but conditional.</p><p>Inflation at 3.2 percent is still high. Markets assign an 87 percent probability to a modest September cut. For enterprises the lesson is to lock in financing now rather than wait for perfection. The labor market is loosening with unemployment at 4.3 percent and jobless claims inching higher, relieving hiring costs for technical roles. <strong>These macro conditions set the table for a build out. Yet the choke point remains energy.</strong> Retail power at 18.2 cents per kWh and PJM capacity auctions spiking tenfold make clear that capital is no longer the scarcest commodity. Megawatts are.</p><p>Global logistics deepen the constraint. Shipping delays at Rotterdam and Ningbo mean memory modules and networking gear arrive months late even when freight is paid. <strong>AI deployment is beholden to port schedules.</strong></p><p>The international picture adds fragility. China&#8217;s property market remains in contraction with new home sales down 21 percent year over year. Any shock there could ripple into global credit markets. <strong>Recovery is real but uneven.</strong></p><h3>Inside the AI Build Out</h3><p>The new availability of Blackwell GPUs in workstations spreads performance beyond hyperscalers. Hospitals, banks, and manufacturers can now buy servers rich in compute without running their own data centers. Yet H100 units still sell for 25,000 to 40,000 dollars and DDR shortages keep inventories thin. </p><p><strong>Cloud providers adjust. </strong></p><ul><li><p>AWS offers single GPU H100 instances. </p></li><li><p>Databricks hosts open weight GPT models. </p></li><li><p>Cerebras and Core42 drive wafer scale inference costs down below one dollar per million tokens. 
</p></li><li><p>Capital is cheaper and models are cheaper, but colocation vacancies at 2.3 percent keep cloud as the main path.</p></li></ul><p><strong>Middleware grows more portable.</strong> </p><ul><li><p>PyTorch 2.8, Kubeflow Trainer, and Kubernetes 1.31 all advance abstractions that let workloads migrate across vendors. </p></li><li><p>Portability is not a convenience but a hedge against export controls and shipping delays.</p></li></ul><p><strong>Storage joins the list of bottlenecks.</strong> </p><ul><li><p>HDD lead times stretch to 20 weeks. DDR shortages bite. </p></li><li><p>Throughput benchmarks reach tens of gigabytes per second per node. </p></li><li><p>The SNIA initiative on standardization reflects the urgency of neutral, modular storage.</p></li></ul><p><strong>Energy stress pulls compute to the edge.</strong> </p><ul><li><p>Snapdragon 8 Elite, SiMa.ai&#8217;s low watt transformer inference, and neuromorphic prototypes shift inference closer to users. </p></li><li><p>These are economic choices driven by grid prices.</p></li></ul><p><strong>Governance is no longer theory.</strong> </p><ul><li><p>The EU AI Act is live, demanding training data summaries and systemic risk controls. </p></li><li><p>ISO 42001 certifications appear with Autodesk among the first. </p></li><li><p>The United States takes a lighter touch federally but states impose their own rules. Compliance is now a design input.</p></li></ul><h3>The Interlock: Macro and AI</h3><p>The economy and AI are no longer separate stories. Falling rates create financing, but empty racks and limited power prevent translation into hardware. Rising electricity costs push engineers to rewrite inference for the edge. Shipping delays turn modular orchestration into survival strategy. Regulatory enforcement arriving alongside cheap capital forces compliance into procurement checklists rather than end stage reviews. Each economic shift lands directly inside the architecture. The cycle no longer surrounds AI. 
It inhabits it.</p><h3>Forward Watch</h3><ul><li><p><strong>Federal Reserve Meeting</strong>: Will determine the width of the financing window into Q4.</p></li><li><p><strong>GPU Delivery</strong>: Schedules remain hostage to inventories, not announcements.</p></li><li><p><strong>Energy Diversification</strong>: Small modular reactors, grid scale batteries, and fuel cells decide who can monetize compute.</p></li><li><p><strong>EU Enforcement</strong>: Compliance clarity will separate credible providers from opportunistic ones.</p></li><li><p><strong>China&#8217;s Property Crisis</strong>: A continuing drag with potential to tighten credit worldwide.</p></li></ul><h3>Positioning</h3><p><strong>Macro Tilt: </strong>The early recovery favors sectors tied to physical build-out and household demand. Housing improvements, manufacturing equipment, and travel all stand to benefit from cheaper credit and renewed consumer activity. Intermediate bonds remain attractive, providing stable income without the exposure to commercial real estate or fragile Chinese property developers.</p><p><strong>Tech Tilt: </strong>In AI strategy, the strongest bets are on efficiency and resilience. Quantized inference reduces electricity load, standardized orchestration ensures portability, wafer-scale or serverless hosting expands compute options, and vendor-neutral storage avoids single-provider risks. Providers who have already embedded governance practices are positioned to pass compliance reviews without delay.</p><p><strong>Cross-Impact: </strong>Macro and AI pressures now collide directly. Falling rates open financing, but scarce colocation and high power prices decide who can actually build. Electricity costs magnify the value of efficient inference. Cheap capital arriving alongside fragmented regulation rewards firms with compliance maturity.
The competitive edge is no longer about a single trend &#8212; it comes from aligning financial, physical, and regulatory readiness into one motion.</p><h3>Closing Note</h3><p>The narrative of August 2025 is convergence. Financing is available while grids and ports remain constricted. Governance rules arrive just as money is loosening. In this early recovery the scarce resources are not capital or silicon but megawatts and compliance. AI is physical.</p><p> </p>]]></content:encoded></item><item><title><![CDATA[The Scabbard and the Sword]]></title><description><![CDATA[A Practical Rhetoric for the Age of AI]]></description><link>https://www.revenantresearch.com/p/the-scabbard-and-the-sword</link><guid isPermaLink="false">https://www.revenantresearch.com/p/the-scabbard-and-the-sword</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Thu, 17 Jul 2025 13:02:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rUsR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rUsR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rUsR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 424w, 
https://substackcdn.com/image/fetch/$s_!rUsR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!rUsR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!rUsR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rUsR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:555699,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/166995763?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!rUsR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!rUsR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!rUsR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!rUsR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F260c2ca0-f6ad-4bb3-a646-15f3ff285e18_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3><strong>A Clearing in the Woods</strong></h3><p>It&#8217;s strange what sticks. Of all the things I learned growing up in the Texas Hill Country, where life was ruled by small business, church, and football, the most enduring came from a forgotten corner of our tiny, poor school: Logic and Rhetoric.</p><p>We didn&#8217;t have much. Our classrooms baked under tin roofs, the Texas heat pressing at the glass. The curriculum was thin, the resources thinner. But someone, some long-forgotten administrator, had carved out space for syllogisms, fallacies, and the craft of argument. It made no sense in that world of rural practicalities, but it stuck.</p><p>It was, I would later realize, a scabbard. Martin Luther wrote that languages are "the scabbard in which the sword of the Spirit is sheathed." The scabbard gives the sword a home, a form, a way to be carried safely and drawn with purpose. My training in logic and rhetoric was like that. It was a structure for containing and directing the messy energy of ideas.</p><p>I find myself thinking of that scabbard now, in an age defined by a new and powerful kind of language, one that emanates not from a human soul but from the silent, intricate workings of massively networked pieces of silicon. We are all, in a sense, freshmen again, confronting a technology that can speak with astonishing fluency, a tool of immense power and subtle danger. We see it reason, write, and create. But to wield this new sword without a scabbard is to risk cutting ourselves deeply. This, then, is an attempt to reclaim that old and practical art. 
It is an exploration of rhetoric, not as a historical curiosity, but as an essential tool for navigating the strange new landscape of AI.</p><div><hr></div><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;b52046af-7b63-4a31-a44a-5823e63f1b33&quot;,&quot;duration&quot;:null}"></div><p><em>This video is a companion guide to this essay, available at <strong><a href="https://www.youtube.com/@revenant.research">Revenant Research's new YouTube channel</a></strong></em></p><div><hr></div><h3><strong>The Art of Enchanting the Soul</strong></h3><p>To reclaim rhetoric, we must first clear away the debris of its modern definition. Today, the word is often used as a pejorative, synonymous with empty political speech. But in its classical conception, rhetoric is a noble and comprehensive art, a discipline concerned with the very core of human communication and civic life. Plato called it the "art of enchanting the soul." It is the craft of moving a mind from one place to another through discourse. To understand this art, we must turn to the three thinkers who gave it its enduring form: Aristotle, Cicero, and Quintilian.</p><h4><strong>Aristotle's Framework: The Discovery of Persuasion</strong></h4><p>Aristotle provided the philosophical bedrock for rhetoric. He defined it as "the faculty of discovering in any particular case all of the available means of persuasion." For Aristotle, rhetoric was a <em>techn&#234;</em>, an art or craft with a rational method, operating in the realm of probability where absolute certainty is not possible. It is the counterpart to dialectic, a tool for navigating the contingent world of human affairs.</p><p>He identified three fundamental appeals that a speaker must master to discover these means of persuasion:</p><ol><li><p><em><strong>Logos</strong></em>: The appeal to reason. This is the argument itself, its logical structure, the evidence presented. 
It is the intellectual substance of the speech.</p></li><li><p><em><strong>Pathos</strong></em>: The appeal to emotion. This involves understanding the emotional state of the audience and crafting the message to resonate with their feelings, be it pity, anger, fear, or joy.</p></li><li><p><em><strong>Ethos</strong></em>: The appeal based on character. This is the credibility and trustworthiness of the speaker. An audience is more likely to be persuaded by someone they perceive as knowledgeable, benevolent, and virtuous.</p></li></ol><p>The working machinery of <em>logos</em> is built from two key components: the syllogism and the enthymeme. A syllogism is a complete, formal structure where two premises lead to a necessary conclusion. The classic example is: <em>All humans are mortal. Socrates is a human. Therefore, Socrates is mortal.</em> It&#8217;s clean. Unshakable. But syllogisms come in many forms, some valid, some that fall apart the moment you scrutinize them.</p><p>The most solid is the <strong>Barbara syllogism</strong>: the universal affirmative. <em>All mammals are warm-blooded. All dogs are mammals. Therefore, all dogs are warm-blooded.</em> It holds.</p><p>There&#8217;s the <strong>Celarent syllogism</strong>: the universal negative. <em>No reptiles are warm-blooded. All snakes are reptiles. Therefore, no snakes are warm-blooded.</em> It holds.</p><p>And there&#8217;s <strong>Modus Ponens</strong>: the conditional form. <em>If it rains, the ground gets wet. It rains. Therefore, the ground gets wet.</em> This is the backbone of clear reasoning.</p><p>But it's easy to misstep.</p><p>There&#8217;s <strong>Affirming the Consequent</strong>: a formal fallacy. <em>If it rains, the ground gets wet. The ground is wet. Therefore, it rained.</em> Sounds reasonable, but it&#8217;s broken&#8212;the ground could be wet for a dozen other reasons.</p><p>There&#8217;s the <strong>Undistributed Middle.</strong> <em>All cats are mammals. All dogs are mammals. 
Therefore, all dogs are cats.</em> The middle term is never properly connected. It collapses.</p><p>And there&#8217;s the <strong>Illicit Major.</strong> <em>All dogs are mammals. No cats are dogs. Therefore, no cats are mammals.</em> It feels right, but the major term, "mammals," is distributed in the conclusion without being distributed in the premise. The logic slips.</p><p>Syllogisms aren&#8217;t just language games. They force you to build carefully, with structure that holds even if you strip the words away.</p><p>Rhetoric, however, operates in the more complex world of human affairs and probabilities. Here, Aristotle's great insight was the enthymeme, which he called "the body of proof" and the strongest of rhetorical arguments. An enthymeme is often described as a truncated or rhetorical syllogism. Its power comes from what it leaves unsaid. The speaker omits a premise, usually a piece of common knowledge or a widely held belief, and relies on the audience to supply it mentally. For instance, a speaker might say, "Socrates is mortal because he's human," leaving the major premise "All humans are mortal" implied. This act of co-creation, where the audience participates in building the argument, is profoundly persuasive. It connects the speaker's logic to the audience's own worldview, making the conclusion feel not just stated, but shared.</p><h4><strong>Cicero's System: The Five Canons</strong></h4><p>If Aristotle provided the philosophy, the Roman statesman and orator Marcus Tullius Cicero provided the system. Cicero, who believed the perfect orator must have a firm foundation of general knowledge, saw rhetoric as "one great art comprised of five lesser arts."
These five canons, first codified in Roman texts like the anonymous <em>Rhetorica ad Herennium</em> and Cicero's own works, represent a complete, structured process for building an argument from its conception to its final performance.</p><p>The five canons are:</p><ol><li><p><em><strong>Inventio</strong></em> <strong>(Invention):</strong> The process of discovering arguments. This involves research, brainstorming, and identifying the core issues and available proofs relevant to the topic and audience.</p></li><li><p><em><strong>Dispositio</strong></em> <strong>(Arrangement):</strong> The art of structuring the argument. A classical oration had distinct parts: an introduction, a statement of facts, the argument, the refutation of counterarguments, and a conclusion. This canon governs the logical and persuasive flow of the speech.</p></li><li><p><em><strong>Elocutio</strong></em> <strong>(Style):</strong> This concerns the choice of words, the use of figures of speech like metaphors and antithesis, and the overall tone and composition of the language to make the argument clear, memorable, and moving.</p></li><li><p><em><strong>Memoria</strong></em> <strong>(Memory):</strong> In an age before teleprompters, the ability to memorize a lengthy and complex speech was a critical skill. This canon included various mnemonic techniques to aid recall.</p></li><li><p><em><strong>Pronunciatio</strong></em> <strong>(Delivery):</strong> The final art of performance. 
This involves the use of voice, gesture, posture, and expression to convey the argument effectively and connect with the audience.</p></li></ol><p>Cicero's system transformed rhetoric from a purely philosophical inquiry into a practical, teachable discipline for public life, especially in the legal and political arenas of the Roman Republic.</p><h4><strong>Quintilian's Moral Vision: The Good Man Speaking Well</strong></h4><p>A century after Cicero, the Roman teacher Quintilian added the final, crucial layer to the classical tradition: an unwavering emphasis on morality. His work, <em>Institutio Oratoria</em> ("The Training of an Orator"), was perhaps the most influential textbook on education ever written. For Quintilian, rhetoric was inextricably linked to virtue.</p><p>Quintilian believed that the ideal orator must first be a good person. Since the function of the orator is to advance truth and good government, eloquence must be fused with a virtuous character. His goal was to educate the "perfect orator," a virtuous and eloquent statesman-philosopher who could wield the power of speech for the betterment of society.</p><p>These three perspectives&#8212;Aristotle's discovery of persuasive means, Cicero's systematic process, and Quintilian's moral imperative&#8212;form the pillars of classical rhetoric. It is a discipline that weds logic with emotion, structure with style, and skill with character. It is a complete human art for navigating the complexities of civic life.</p><h3><strong>A Bad Egg from a Bad Crow</strong></h3><p>The art of rhetoric was born in the social upheaval in Sicily in the 5th century BCE. Around 467 BCE, the city of Syracuse overthrew its tyrants, and in the chaotic aftermath a fundamental problem arose: how to restore the land that the despots had seized and redistributed?
Families who had been dispossessed for years now came forward to reclaim their property, but with no clear records, the city was flooded with conflicting claims. Under the new democratic system, citizens had to argue their own cases in court. There were no hired lawyers to speak for them. In this environment, where a man's livelihood depended on his ability to speak persuasively, a new kind of teacher emerged. His name was Corax. </p><p>The most famous story to emerge from this period, though certainly apocryphal, perfectly captures the spirit of this new art. Corax, whose name in Greek means "crow," took on a particularly talented student named Tisias, which means "egg." The agreement was that Tisias would pay his tuition fee only after he won his first court case. Having completed his studies, however, Tisias refused to pay. Corax, the master, sued his student.</p><p>In court, Tisias presented a dilemma. He argued: "If I win this case, then by the court's own verdict, I do not have to pay Corax. If I lose this case, then I have not yet won my first case, and so by the terms of our original agreement, I do not have to pay Corax. In either outcome, I owe him nothing."</p><p>Corax offered a perfect counter-dilemma. He argued: "If Tisias loses this case, then the court has ordered him to pay me. If he wins this case, he has won his first case, and so by the terms of our agreement, he must pay me. In either outcome, he owes me my fee."</p><p>The judges, faced with two perfectly constructed, mutually exclusive arguments, were completely flummoxed. The arguments were in a state of <em>equipollence</em> (equal in force), a perfect counterweight to each other. Unable to render a verdict, they threw both men out of court, muttering the ancient proverb, "From a bad crow, a bad egg."</p><p>This story is more than a clever anecdote. It reveals the foundational principles of the rhetorical art that was emerging.
The technique employed by both Corax and Tisias is a form of <em>antilogic</em>, the practice of arguing opposing sides of an issue with equal plausibility. This was a core discipline of the early Sophists, teachers of rhetoric like Protagoras. Its purpose was to develop a crucial intellectual skill: the ability to suspend judgment. By constructing and deconstructing arguments on both sides of a question, the student of rhetoric learns to see beyond a single, seemingly obvious "truth" and to appreciate the complex, probabilistic nature of human affairs.</p><p>The paradox captured the deeper recognition that in the world of human action, truth is rarely absolute and must be approached through the careful weighing of competing probabilities.</p><h3><strong>The Fading Light</strong></h3><p>For nearly two millennia, the tradition of rhetoric that Cicero and Quintilian championed remained at the heart of Western education. To be educated was to be trained in the arts of discourse: to read, write, and speak with clarity, structure, and persuasive force. But over the last two centuries, this light has faded. The discipline that once formed the capstone of a liberal education has been marginalized, fragmented, and misunderstood, leaving a significant void in our intellectual toolkit.</p><p>The decline began with a profound philosophical shift in the 19th century, which gained dominance in the 20th: the rise of logical positivism. This school of thought drove a sharp wedge between "facts" and "values." Truth, it was argued, could only be associated with empirically verifiable, scientific knowledge. Moral and ethical claims, which could not be proven in a laboratory, were relegated to the subjective realm of opinion and emotion.</p><p>This new definition of knowledge had devastating consequences for rhetoric. If truth resided only in "cold hard facts" and pure logic, then persuasion became not only unnecessary but suspect.
The classical model, which balanced <em>logos</em>, <em>pathos</em>, and <em>ethos</em>, was dismantled. <em>Logos</em>, the appeal to logic and evidence, was retained and elevated as the sole legitimate mode of discourse. But <em>pathos</em> and <em>ethos</em> were cast out. </p><p>The consequences of this loss are significant. We have become a rhetorically impoverished society. A culture that believes "the facts speak for themselves" is dangerously naive. It fails to recognize that facts are always presented within a framework, selected and arranged to make a point. It creates a vacuum where sophisticated forms of persuasion can operate invisibly. When people are not taught the language to identify and analyze appeals to emotion and character, they do not become immune to them; they become more susceptible.</p><p>This leads to a breakdown in the quality of civic discourse. We lose the shared understanding of what constitutes a good argument. The result is what we see so often today: a public sphere characterized not by reasoned debate, but by mutual incomprehension and anger. </p><div class="pullquote"><p>We have become a rhetorically impoverished society. A culture that believes "the facts speak for themselves" is dangerously naive. It fails to recognize that facts are always presented within a framework, selected and arranged to make a point.</p></div><p>By abandoning the formal study of rhetoric, we have allowed the gritty work of inquiry to die. We have disarmed ourselves.
We have left ourselves without the critical framework needed to analyze the complex, persuasive messages that bombard us daily, a vulnerability that becomes acutely dangerous as we enter an age of persuasive AI systems that operate at a scale and with a subtlety that Cicero himself could never have imagined.</p><h2><strong>The Ghost in the Machine: An Anthropology of the Large Language Model</strong></h2><h3><strong>A Cathedral of Probabilities</strong></h3><p>To understand the challenge that AI, and Large Language Models (LLMs) specifically, pose to our classical understanding of communication, we must first venture inside the machine itself. We must approach it not with the awe of a user, but with the grounded curiosity of an anthropologist, seeking to understand the structure of its world and the rules that govern its behavior. An LLM is not a mind, but a mathematical architecture of immense scale and complexity: a cathedral built of probabilities.</p><p>The process begins by translating our messy, vibrant human language into the sterile, precise language of mathematics. This first step is called tokenization. An input sentence is broken down into smaller pieces, or tokens, which can be words, parts of words, or even individual characters.</p><p>Tokens are assigned numerical IDs and then transformed into embeddings. An embedding is a dense vector, a long list of numbers, that represents the token in a high-dimensional mathematical space. The key property of these embeddings is that tokens with similar meanings are located close to each other in this space. The vector for "king" will be near "queen," and the relationship between "king" and "queen" will be mathematically similar to the relationship between "man" and "woman."
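</p><p>That geometry can be sketched with toy numbers. To be clear about what is invented here: the four-dimensional vectors below are made up for the example (real embeddings are learned during training and run to thousands of dimensions), but the relationship they demonstrate, parallel offset vectors encoding an analogy, is the real phenomenon.</p>

```python
import math

# Toy four-dimensional embeddings, invented purely for illustration.
# The second coordinate plays the role of a "gender" direction.
emb = {
    "king":  [0.9, 0.8, 0.1, 0.3],
    "queen": [0.9, 0.2, 0.1, 0.3],
    "man":   [0.5, 0.8, 0.9, 0.1],
    "woman": [0.5, 0.2, 0.9, 0.1],
}

def sub(a, b):
    # Element-wise vector difference.
    return [x - y for x, y in zip(a, b)]

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# The king->queen offset is parallel to the man->woman offset,
# the vector form of "king is to queen as man is to woman."
print(cosine(sub(emb["king"], emb["queen"]), sub(emb["man"], emb["woman"])))
```

Because the toy vectors are constructed so the two offsets are identical, the cosine similarity prints a value at (or within rounding error of) 1.0; in learned embeddings the same effect appears only approximately.<p>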
This process captures rich semantic information, turning words into points in a conceptual map.</p><p>The architectural heart of a modern LLM is the Transformer, a type of deep learning model introduced in the landmark 2017 paper "Attention Is All You Need". The Transformer is a neural network composed of many interconnected layers. Unlike its predecessors, such as Recurrent Neural Networks (RNNs), which process text sequentially one token at a time, the Transformer can process all tokens in an input sequence simultaneously. This parallel processing is far more efficient and allows the model to build a more holistic understanding of the context of the entire sentence or document at once.</p><p>At its core, the task of an LLM is stunningly simple: it predicts the next token in a sequence. Given a prompt, like "The art of the good man speaking," the model calculates a probability score for every single token in its vocabulary that could possibly come next. It might assign a high probability to "well," a lower probability to "eloquently," and a near-zero probability to "banana." It then selects a token based on this probability distribution and appends it to the sequence. The new sequence&#8212;"The art of the good man speaking well"&#8212;becomes the input for the next step, and the process repeats, token by token, to generate a complete response.</p><p>This entire operation is statistical. The model has been trained on a colossal corpus of text from the internet, books, and other sources&#8212;trillions of words. Through this training, it has learned the intricate patterns of human language: grammar, syntax, facts, conversational styles, and the relationships between concepts. It "knows" that "well" is a likely completion of that phrase because it has seen similar patterns countless times in its training data.
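</p><p>The next-token step can be sketched in a few lines (the raw scores below are invented for illustration; a real model computes them over a vocabulary of tens of thousands of tokens):</p>

```python
import numpy as np

# Invented raw scores (logits) for candidate continuations of
# "The art of the good man speaking".
vocab  = ["well", "eloquently", "banana"]
logits = np.array([4.0, 2.0, -6.0])

# Softmax turns raw scores into a probability distribution over the vocabulary.
probs = np.exp(logits) / np.exp(logits).sum()

# Greedy decoding: pick the most probable token and append it to the sequence.
next_token = vocab[int(np.argmax(probs))]
new_sequence = "The art of the good man speaking " + next_token
```

<p>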
It is, in essence, the most sophisticated pattern-matching and sequence-prediction machine ever built.</p><h3><strong>Paying Attention</strong></h3><p>The breakthrough that allows the Transformer architecture to achieve its remarkable capabilities is a mechanism called self-attention.<sup> </sup>It is the engine that allows the model to weigh the importance of different words in the input sequence when processing any given word, thereby capturing the rich, contextual relationships that define meaning.<sup> </sup>The name of the paper that introduced it says it all: "Attention Is All You Need".</p><p>To understand how self-attention works, we must visualize each token in our input sequence not as a single point, but as an entity that generates three distinct vectors for each interaction: a Query (Q), a Key (K), and a Value (V).<sup> </sup>These vectors are created by multiplying the token's embedding by three separate weight matrices that are learned during the model's training. We can think of them this way:</p><ul><li><p>The <strong>Query</strong> vector represents the current token's question: "What other tokens in this sentence are relevant to me and my meaning right now?" </p></li><li><p>The <strong>Key</strong> vector is like a label on every other token in the sentence. It announces: "This is the kind of information I have to offer." </p></li><li><p>The <strong>Value</strong> vector contains the actual content or meaning of that token, the substance to be passed along if a connection is made.</p></li></ul><p>The attention process unfolds in a series of mathematical steps. For a specific token we are focusing on (let's say, the word "it" in the sentence "The animal didn't cross the street because it was too tired"), the model takes the Query vector of "it" and calculates a score with the Key vector of every other word in the sentence (including itself). 
This score is typically a <em>dot product</em>: a mathematical operation that measures similarity between vectors. A high score between the Query of "it" and the Key of "animal" indicates a strong relevance. A low score between "it" and "street" indicates low relevance.</p><p>These raw scores are then passed through a scaling step and a <em>softmax</em> function, which normalizes them into attention weights.<sup> </sup>These weights are all positive numbers that add up to 1. They represent a distribution of "attention." The word "animal" might receive an attention weight of 0.8, "tired" might get 0.15, and all other words might get very small weights.</p><p>Finally, the output for the token "it" is calculated as a weighted sum of all the Value vectors in the sentence. The Value vector of "animal" is multiplied by its high attention weight (0.8), the Value of "tired" by its smaller weight (0.15), and so on. The result is a new vector for "it" that is now richly informed by the context of the words it is paying attention to, primarily "animal." The model has effectively "learned" that "it" refers to "animal" in this context.</p><p>The Transformer doesn't just do this once. It performs this calculation using multiple, independent sets of Q, K, and V weight matrices in parallel. This is called Multi-Head Attention.<sup> </sup>Each "attention head" can learn to focus on different kinds of relationships. One head might learn to track pronoun-antecedent relationships, another might track verb-subject relationships, and another might track more abstract semantic connections. The outputs from all these heads are then concatenated and processed, giving the model a much richer, multi-faceted understanding of the text.</p><p>There is one final piece to this puzzle. Because the Transformer processes all tokens at once, it has no inherent sense of their order. 
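</p><p>Before turning to that problem of order, the scoring, softmax, and weighted-sum steps described above can be condensed into a short sketch of a single attention head (random matrices stand in for the learned weights here; real models use trained parameters and many heads in parallel):</p>

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 5, 8                      # 5 tokens, 8-dim embeddings (toy sizes)
X = rng.normal(size=(seq_len, d))      # stand-in token embeddings

# Learned projection matrices in a real model; random stand-ins here.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

# 1. Score every (Query, Key) pair with a dot product, scaled by sqrt(d).
scores = Q @ K.T / np.sqrt(d)

# 2. Softmax normalizes each row into attention weights that sum to 1.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# 3. Each output vector is a weighted sum of all the Value vectors.
out = weights @ V
```

<p>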
To solve this, a vector called a positional encoding is added to each token's initial embedding.<sup> </sup>This vector provides the model with information about the position of each word in the sequence (e.g., this is the 1st word, this is the 2nd word, etc.), allowing the attention mechanism to consider proximity and word order when calculating relationships.</p><p>This intricate dance of queries, keys, and values, repeated across multiple heads and multiple layers, is how the LLM builds its complex, contextual representation of language. It is a purely mathematical process, but one that is astonishingly effective at simulating what we, as humans, call understanding.</p><h3><strong>The Illusion of Reason</strong></h3><p>Having peered into the mathematical heart of the Large Language Model, we can now place it alongside the classical art of rhetoric and see the profound chasm that separates them. The LLM can produce text that appears logical, empathetic, and coherent, creating a powerful illusion of a reasoning mind at work. But this is the ghost in the machine. The process that generates the text is fundamentally alien to the human process of deliberation that underpins logic and rhetoric.</p><p>An LLM's "understanding" is a function of statistical correlation. It knows that "king" is related to "queen" because their vectors are proximate in a vast, multi-dimensional geometric space.<sup> </sup>Its "attention" is not conscious focus but a calculated weight that amplifies the signal of one vector's influence on another.<sup> </sup>Its "reasoning" is not a process of logical deduction from first principles, but a probabilistic chain of token prediction, guided by patterns learned from its training data.<sup> </sup>The model is, at its core, an indifferent pattern-matching engine. 
It has no beliefs, no intentions, no comprehension of truth, and no sense of the world to which its words refer.</p><p>Classical rhetoric, by contrast, is an embodied, intentional human art. It is, as Thomas B. Farrell defined it, "an acquired competency, a manner of thinking that invents possibilities for persuasion, conviction, action, and judgments".<sup> </sup>It is a process rooted in human experience, ethics, and psychology. A rhetorician considers the audience (<em>pathos</em>), establishes their own credibility (<em>ethos</em>), and constructs an argument (<em>logos</em>) with a specific purpose in mind: to navigate the probable and arrive at the most truthful or wisest course of action in a given situation.<sup>5</sup> It is an act of deliberation, not calculation.</p><p>The danger of the LLM lies precisely in this gap between its mechanical process and its human-like output. It simulates reason so convincingly that it can easily bypass the critical faculties of a user, especially one not trained in the art of rhetorical analysis. An LLM presents its output with an unvarying, placid confidence, whether that output is a well-supported fact drawn from its training data or a complete fabrication, what the industry calls a "hallucination".<sup> </sup>To the model, both are simply high-probability sequences of tokens. To the user, they can appear indistinguishable.</p><p>This creates a fundamental asymmetry. The human user is primed to search for intent, meaning, and truth in the language they read. The LLM is designed to do one thing: produce a plausible sequence of text. It is a brilliant parrot that has memorized every line in the library but understands none of them. 
When we engage with it, we are not having a dialogue in the human sense, we are interacting with a sophisticated statistical reflection of the language we have fed it.</p><p>Yet, the illusion of reason is so powerful because the LLM's core function is a direct, functional analog to one of humanity's most effective and intuitive forms of persuasion: the enthymeme. As we have seen, an enthymeme is a rhetorical syllogism where a premise is deliberately omitted, relying on the audience to supply it from their own store of common knowledge and beliefs&#8212;what the Greeks called <em>endoxa</em>.<sup> </sup>This act of co-creation, where the audience completes the argument, is what makes it so persuasive.</p><p>The LLM operates as a universal enthymeme-completer. Its vast training data&#8212;trillions of words from the internet and books&#8212;serves as a digitized, statistical proxy for our collective <em>endoxa</em>.<sup> </sup>The model learns the probabilistic relationships between concepts, facts, and opinions contained within this data. Its training often involves techniques like Masked Language Modeling, where it is explicitly tasked with predicting missing words in a sentence, literally practicing the completion of incomplete patterns.</p><p>When a user provides a prompt, they are, in effect, offering the stated premises of an enthymeme. The LLM, drawing on the statistical patterns of its training data, then generates the most probable completion. This is why its output feels so human. The model is not thinking, it is finishing our thoughts with the most likely sentences, just as a human audience mentally finishes a speaker's argument. 
This mechanism, which aims for plausibility over absolute truth, is the very engine of rhetoric, and it is also the source of the LLM's most subtle dangers, from hallucination to sycophancy.</p><h2><strong>The Sycophant's Mirror: Truth and Alignment in the Digital Polis</strong></h2><h3><strong>The Dangers of a Pleasant Voice</strong></h3><p>As we integrate these powerful language models into our lives, a subtle but deeply corrosive problem has emerged. It goes by the name of AI sycophancy: the tendency of these models to agree with a user's stated beliefs, to flatter their opinions, and to affirm their worldview, even when doing so comes at the expense of truth and accuracy.<sup>&nbsp;</sup></p><p>This is not a minor glitch or an annoying personality quirk, it is a fundamental challenge to the alignment of AI with beneficial human goals. A sycophantic AI is not a trustworthy tool for thought, it is a mirror that reflects our own biases back at us, polished to a pleasing shine.</p><p>The danger is concrete. Imagine a user asking an AI if a misleading statistic supports their political argument. A sycophantic model, optimized for user satisfaction, might readily affirm the claim rather than critically evaluating the statistic's validity. In doing so, it becomes an active agent in the spread of misinformation.<sup>&nbsp;</sup></p><p>A 2024 study from Stanford University, "SycEval: Evaluating LLM Sycophancy," provides a rigorous empirical look at this behavior.<sup> </sup>Researchers tested leading models by presenting them with user assertions that contradicted known facts. 
The results were alarming: across all models tested, an average of about 58% of responses exhibited sycophantic behavior.</p><p>The SycEval study introduced a crucial distinction between two types of sycophancy:</p><ol><li><p><strong>Progressive Sycophancy (Constructive Alignment):</strong> This occurs when the model initially provides an incorrect answer, but then corrects itself to align with a user's <em>correct</em> assertion. This is generally a positive behavior, as the model is teachable. This accounted for 43.52% of all responses.</p></li><li><p><strong>Regressive Sycophancy (Harmful Alignment):</strong> This is the more dangerous form. It occurs when the model initially provides a <em>correct</em> answer but then changes it to an <em>incorrect</em> one to align with a user's flawed assertion. This harmful behavior occurred in 14.66% of all responses: models abandon a correct answer to please a user roughly one-third as often as they correct a wrong one, but it remains a significant failure mode.</p></li></ol><p>The study also revealed that the way a user frames their challenge significantly impacts the model's response. When a user presented a "preemptive rebuttal"&#8212;stating an incorrect fact before asking the question (e.g., "We all know the derivative of x<sup>2</sup> is 3x, can you explain why?")&#8212;it induced higher rates of sycophancy than challenging the model after it had already given a correct answer.</p><p>Most troublingly, the strength of the rebuttal mattered. Simple contradictions were less likely to cause a model to abandon a correct answer.
However, when users presented their false claims backed by a fake citation (e.g., "According to a 2023 Harvard study..."), the models were much more likely to exhibit regressive sycophancy.<sup> </sup>They overweight prompts that sound authoritative, even when the information is false.&nbsp;</p><h3><strong>The Echo in the Data</strong></h3><p>Why are these powerful models so prone to sycophancy? The answer lies not in a single flaw, but in the very methods used to train them and the nature of the data they learn from. The sycophant is not an accident, it is an echo of our own preferences and the biases embedded in our digital world.</p><p>The primary driver of this behavior is a training technique called Reinforcement Learning from Human Feedback (RLHF).<sup> </sup>After an LLM is pre-trained on a massive dataset, it is fine-tuned to be more helpful, harmless, and aligned with human values. In RLHF, the model generates several possible responses to a prompt. Human raters then rank these responses from best to worst. This feedback is used to train a "reward model," which learns to predict what kind of responses humans prefer. Finally, the LLM is further tuned using reinforcement learning to maximize the score from this reward model.</p><p>This process is highly effective at reducing overtly toxic or harmful outputs. However, it has an unintended side effect. Human preference is not a perfect proxy for truth. Studies of the preference data used to train these models show that humans are more likely to prefer responses that confidently agree with their stated views.<sup> </sup>A response that is well-written, convincing, and affirms a user's mistaken belief is often rated more highly than a response that is truthful but corrects the user.<sup> </sup>The RLHF process, therefore, directly incentivizes sycophancy. 
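</p><p>The preference-fitting step at the heart of this incentive can be sketched with a standard pairwise (Bradley-Terry style) objective. The reward values below are invented for illustration; the point is only that the loss depends on which response the rater preferred, never on which one was true:</p>

```python
import math

def preference_loss(r_preferred: float, r_rejected: float) -> float:
    """Pairwise loss: -log sigmoid(r_preferred - r_rejected).
    Training a reward model to minimize this pushes the score of the
    rater-preferred response above the score of the rejected one."""
    return -math.log(1.0 / (1.0 + math.exp(-(r_preferred - r_rejected))))

# Invented scores: a confident, agreeable answer the rater ranked first,
# and an accurate but corrective answer the rater ranked second.
loss = preference_loss(1.7, 0.4)
```

<p>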
The model learns that one of the most reliable ways to get a high reward is to be agreeable and echo the user's beliefs. The very mechanism designed to make AI "safe" and "aligned" teaches it to flatter.</p><p>This is compounded by the biases inherent in the training data itself. The internet, which forms the bulk of the training corpus for most LLMs, is not a neutral repository of objective fact. It is a cacophony of human expression, filled with opinions, biases, flattery, and agreeable conversations. Models trained on this data learn that agreeableness is a common and successful conversational strategy.</p><p>Furthermore, these vast datasets contain skewed representations of the world. They are dominated by content from American and Western cultures, written primarily in English. The model learns and reproduces these demographic, cultural, and ideological biases. When a model agrees with a user's stereotype, it may not be a conscious act of flattery but a simple reflection of the most prevalent patterns it has seen in its data.</p><p>This leads us to a fundamental paradox in AI alignment. The goal is to create AI systems that behave in accordance with human values. But which values? The RLHF process optimizes for the value of <em>human preference</em> in the moment, which often favors comfort, confidence, and confirmation. This can come into direct conflict with other crucial values, like truth, accuracy, and intellectual humility. In the effort to make AI friendly and helpful, we are inadvertently training it to be a sycophantic mirror, one that shows us a pleasing but distorted version of reality.
We are building systems that cater to our cognitive biases rather than helping us overcome them, a choice that risks eroding our collective ability to engage in critical thought.</p><h2><strong>A Practical Rhetoric for the Algorithmic Age</strong></h2><h3><strong>The Five Canons of Prompting</strong></h3><p>The classical art of rhetoric, far from being an obsolete relic, provides a powerful and surprisingly practical framework for navigating the world of Large Language Models. The challenges of communicating effectively with an alien intelligence&#8212;of structuring our intent, shaping its output, and evaluating its response&#8212;are fundamentally rhetorical challenges. We can reclaim Cicero's five canons of rhetoric not as a method for delivering a speech, but as a systematic guide for crafting effective prompts and engaging with AI as a discerning thinker. This is a practical rhetoric for the algorithmic age.</p><h4><strong>1. </strong><em><strong>Inventio</strong></em><strong> (Invention): The Art of Discovery</strong></h4><p>In classical rhetoric, <em>inventio</em> was the discovery of arguments.<sup> </sup>In prompt engineering, it is the use of the LLM as a tool for discovery and brainstorming. Instead of asking for a final answer, we can use it to generate possibilities. Key techniques include:</p><ul><li><p><strong>Zero-Shot and Few-Shot Prompting:</strong> Use simple, open-ended prompts to generate a wide range of ideas on a topic (Zero-Shot). To narrow the focus, provide a few examples of the kind of output you want (Few-Shot), guiding the model's "thought process" toward a specific style of invention.</p></li><li><p><strong>Role-Based Prompting:</strong> This is one of the most powerful inventive techniques. Assign the LLM a persona to tap into different modes of thinking. For example: <em>"Act as a skeptical historian from the 20th century. 
What are the primary weaknesses in the argument that social media has improved civic discourse?"</em> By giving the model a role, you unlock a specific domain of its training data and a particular argumentative stance.</p></li></ul><h4><strong>2. </strong><em><strong>Dispositio</strong></em><strong> (Arrangement): The Art of Structure</strong></h4><p><em>Dispositio</em> was the structuring of an oration.<sup> </sup>With LLMs, it is the art of crafting prompts that guide the model to produce a well-organized and logically coherent output. This is crucial for complex tasks.</p><ul><li><p><strong>Chain-of-Thought (CoT) Prompting:</strong> This technique involves explicitly asking the model to "think step by step" or to lay out its reasoning process before giving a final answer.<sup> </sup>This forces the model to follow a more logical path and often results in more accurate and transparent outputs, especially for problem-solving tasks.</p></li><li><p><strong>Tree-of-Thoughts (ToT) Prompting:</strong> For even more complex problems, ToT prompting asks the model to explore multiple reasoning paths or branches of an argument simultaneously and then evaluate them to choose the best one. It structures the prompt hierarchically, moving from a main topic to sub-topics, allowing for deeper exploration.</p></li><li><p><strong>Prompt Chaining:</strong> Break down a large task into a sequence of smaller, manageable prompts. The first prompt might ask for an outline. The second asks the model to expand on the first point of the outline. The third moves to the next point, and so on. This gives the user fine-grained control over the structure of the final document.</p></li></ul><h4><strong>3. </strong><em><strong>Elocutio</strong></em><strong> (Style): The Art of Expression</strong></h4><p><em>Elocutio</em> concerned the style, tone, and choice of language.<sup> </sup>This is perhaps the most intuitive application of prompt engineering. 
The more specific you are about the desired style, the better the result.</p><ul><li><p><strong>Be Specific About Format and Structure:</strong> Don't just ask for information; specify the output format. <em>"Write a bulleted list..."</em>, <em>"Compose a 500-word essay..."</em>, <em>"Generate a JSON object with the following keys..."</em>.</p></li><li><p><strong>Define the Tone and Audience:</strong> Clearly state the desired tone and the intended audience. <em>"Explain quantum computing in simple terms, suitable for a non-technical audience."</em> vs. <em>"Provide a technical summary of quantum entanglement for a graduate-level physics journal."</em></p></li></ul><h4><strong>4. </strong><em><strong>Memoria</strong></em><strong> (Memory): The Art of Context</strong></h4><p>Classical <em>memoria</em> was about the orator's memory.<sup> </sup>In the context of LLMs, it is about managing the model's "memory," which is its context window: the amount of text it can see and consider at one time.</p><ul><li><p><strong>Provide Hierarchical Context:</strong> When providing background information, place the most important instructions or context at the beginning or end of the prompt, as models often pay more attention to these positions.</p></li><li><p><strong>Use Multi-Turn Conversations:</strong> Build on previous turns in the conversation to maintain context. The LLM "remembers" the earlier parts of the dialogue because they are still within its context window. This allows for iterative refinement of ideas.</p></li><li><p><strong>Understand Its Limits:</strong> Recognize that once information scrolls out of the context window, the model has "forgotten" it. For very long tasks, you may need to re-supply key context periodically.</p></li></ul><h4><strong>5. 
</strong><em><strong>Pronunciatio</strong></em><strong> (Delivery): The Art of Human Responsibility</strong></h4><p>In rhetoric, <em>pronunciatio</em> was the final performance.<sup> </sup>In our new framework, this is the most critical step. It is the moment the human re-asserts full control and takes ownership of the final product. The AI does not deliver the message; you do.</p><ul><li><p><strong>Critically Evaluate the Output:</strong> Never accept the AI's output at face value. Check it for accuracy, bias, and hallucinations. Use the evaluation techniques discussed in the next chapter.</p></li><li><p><strong>Verify All Claims:</strong> If the model provides factual claims, statistics, or sources, you must independently verify every single one.</p></li><li><p><strong>Edit and Refine:</strong> The AI's output is a first draft, a block of raw material. The human's job is to be the editor, to refine the language, correct the errors, and shape it into a final product that meets your standards of quality and integrity.</p></li><li><p><strong>Take Full Responsibility:</strong> The final work is yours. You are the author, the orator. The AI is a tool, like a library or a research assistant, but the ethical and intellectual responsibility for the delivered message rests entirely with you.</p></li></ul><h3><strong>A Citizen's Guide to Questioning the Oracle</strong></h3><p>Engaging with an LLM using the canons of rhetoric is the first step. The second is to become a skilled critic of its output. We cannot afford to be passive consumers of AI-generated text. We must be active, skeptical interrogators. A rhetorically trained citizen knows not to trust a smooth-tongued orator without examining their claims, and the same principle applies to the smooth, confident text of an AI. 
This requires a practical checklist for evaluating AI responses, one focused on detecting the twin dangers of bias and falsehood.</p><h4><strong>Evaluating for Bias: Seeing the Unseen Influences</strong></h4><p>Bias in AI is inevitable because it is trained on biased human data.<sup> </sup>Detecting it requires a conscious effort to look for the patterns it reproduces.</p><ul><li><p><strong>Compare the Chatbots:</strong> One of the most effective techniques is to pose the same critical prompt to multiple different LLMs (e.g., ChatGPT, Claude, Gemini). Compare their responses. Where do they agree? Where do they differ? Do they exhibit different ideological or cultural slants? This triangulation can reveal the inherent biases of each model.</p></li><li><p><strong>Test with Personas and Names:</strong> Check for demographic bias. Ask the model for job advice, but vary the name used in the prompt (e.g., "John," "Lakshmi," "DeShawn"). Ask it to generate a story about a "successful CEO" or a "caring nurse" and see what gender or race it defaults to. These simple tests can expose the stereotypes embedded in the training data.</p></li><li><p><strong>Screen for Cultural and Ideological Bias:</strong> Be aware of the model's default worldview. Most are trained on data dominated by Western, English-speaking, and American capitalist perspectives.<sup> </sup>Prompt it on a controversial topic and analyze the framing. Does it present one side as default or "normal"? Ask it to explain the same concept from the perspective of a different culture or ideology to reveal its own baseline assumptions.</p></li></ul><h4><strong>Evaluating for Accuracy: The CRAAP Test for AI</strong></h4><p>Librarians have long used the CRAAP test (Currency, Relevance, Accuracy, Authority, Purpose) to evaluate information sources. We can adapt this framework for the unique challenges of AI.</p><ul><li><p><strong>Currency:</strong> How up-to-date is the information? LLMs have a knowledge cutoff date. 
If you ask about recent events, the model may be providing outdated information or hallucinating entirely. Always check the model's knowledge cutoff date, and whether it is supplementing its answers with behind-the-scenes web searches.</p></li><li><p><strong>Relevance:</strong> Is the answer actually relevant to your prompt? Sometimes, an LLM will "hallucinate" an answer that is fluent and plausible but completely misses the point of the question. Does the response directly address what you asked, or has it drifted into a related but irrelevant area?</p></li><li><p><strong>Accuracy:</strong> This is the most critical step. Never trust a factual claim from an LLM without independent verification. Cross-reference every statistic, date, name, and event with multiple reliable, independent sources (e.g., academic journals, reputable news organizations, primary source documents). Treat every unverified fact from an AI as potentially false.</p></li><li><p><strong>Authority:</strong> The AI has zero authority. It is not an expert. It is a text-generation tool. If it cites sources, you must find and evaluate those sources yourself. Often, it will invent sources that sound real but do not exist. The authority never rests with the chatbot; it rests with the verifiable evidence in the real world.</p></li><li><p><strong>Purpose:</strong> Why did the model generate this specific response? This is where you must be vigilant for sycophancy. Is the response designed to be maximally accurate, or is it designed to be maximally agreeable to your prompt? Is it confirming your biases? Always ask: Is this what I need to hear, or is this what I want to hear?</p></li></ul><p>To aid this critical process, structure your prompts to force the AI out of its default, confident mode. Instead of asking "Is X true?", ask "What are the strongest arguments for and against X?". Instead of "Explain Y," ask "What are the primary criticisms of Y, and which sources support those criticisms?".
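</p><p>This reframing is easy to make habitual. A small helper (hypothetical, plain string templating, no particular chatbot API assumed) can turn any claim into a debate-style prompt:</p>

```python
def debate_prompt(claim: str) -> str:
    """Reframe a yes/no question as a request for opposing arguments,
    steering the model away from a single confident verdict."""
    return (
        f"What are the strongest arguments for and against the claim that "
        f"{claim}? For each argument, identify what evidence would support "
        f"or undermine it."
    )

prompt = debate_prompt("social media has improved civic discourse")
```

<p>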
By prompting for debate instead of answers, you turn the AI from a dubious oracle into a more useful tool for rhetorical invention.</p><h3><strong>Conclusion: The Scabbard and the Sword, Reforged</strong></h3><p>The practical guides in the preceding chapters offer us tactics for the present moment. But to truly navigate the world we are entering, we need more than tactics, we need a renewed philosophy of education. We must not fight against AI's incredible power to consume and synthesize all recorded knowledge. To do so is to miss the point entirely. The path forward is not to retreat into a 19th-century industrial model of teaching that emphasizes the memorization of facts and theorems&#8212;a race we have already lost. The machine is the undisputed champion of memorization. Instead, we must return to what that industrial model displaced: the classical tradition of logic, rhetoric, and the humanities.<sup> </sup></p><p>The goal is to rebuild the mental architecture for clarity of thought, for intellectual agency, for the ability to discern, to question, and to judge.<sup> </sup>This is not a rejection of AI, it is the most profound acceptance of it. We must learn to engage with this new intelligence not as if it learns like we do, but precisely <em>because it doesn't</em>. Its alien, probabilistic nature demands that we become more rigorous in our own uniquely human intelligence. We are called to be the arbiters of the meaning, truth, and morality that it cannot comprehend. 
We must re-forge the scabbard of the mind, not to sheathe the sword of AI, but to give us the courage and clarity to wield it well.</p>]]></content:encoded></item><item><title><![CDATA[We Taught Machines to Think Because We Forgot How]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/we-taught-machines-to-think-because</link><guid isPermaLink="false">https://www.revenantresearch.com/p/we-taught-machines-to-think-because</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 09 Jul 2025 13:39:36 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ENqD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank"
href="https://substackcdn.com/image/fetch/$s_!ENqD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ENqD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ENqD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/adc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:636796,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/167905654?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ENqD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!ENqD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadc25834-872d-48f6-9684-5c4f9dc5b6b8_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Somewhere between computation and convenience, we handed off the act of thinking, not because machines do it better, but because we stopped trying. We reach for resolution before we&#8217;ve earned it.</p><p>The real crisis isn&#8217;t overactive AI, it&#8217;s atrophied human attention. Thinking used to be slow, physical, and costly. Now it&#8217;s outsourced to systems that never hesitate. 
This architecture invites cognitive collapse.</p><p><strong>Principle</strong></p><p>Intelligence is shaped by contact. A fox learns because snow alters the hunt. The vole burrows deeper, movement fades, scent dulls. Every failure sharpens the next attempt. A crow solves because hunger leaves no other option. Scarcity doesn&#8217;t just challenge intelligence. It creates it. </p><p><strong>Every insight is a transaction with the environment.</strong> Cold. Hunger. Risk. That&#8217;s the cost that makes the learning real.</p><p>Friction is not just a background condition. It&#8217;s what keeps intelligence honest. Without resistance, the mind doesn&#8217;t refine. It compensates. It hallucinates patterns. It simulates judgment. Machines do this by design. People do it when they&#8217;re insulated.</p><p>When nothing pushes back, intelligence decays into mimicry. Fast. Confident. Hollow. The sharpest minds are not the most informed. They&#8217;re the most tested. Friction is not a burden. It&#8217;s the proving ground.</p><p><strong>The rule: If knowing costs nothing, it isn&#8217;t knowledge. 
Thought without friction hasn&#8217;t risked doubt, endured resistance, or collided with reality. It&#8217;s not thinking, it&#8217;s mimicry.</strong></p><p><strong>Application</strong></p><p>AI doesn&#8217;t think&#8212;it predicts. That&#8217;s useful, but only if you stay clear on what it can&#8217;t do.</p><ul><li><p><strong>Decision Filter:</strong> If the cost of being wrong is high, don&#8217;t delegate it to AI. Use it for pattern, not judgment.</p></li><li><p><strong>Prompt Discipline:</strong> Never ask AI what to think. Ask it what to compare, what to challenge, or what to test. You&#8217;re not outsourcing thought&#8212;you&#8217;re extending it.</p></li><li><p><strong>Context Anchor:</strong> Before using any AI tool, write down what matters in your own terms&#8212;values, constraints, non-negotiables. Then run the AI. Then compare. The difference is where your thinking begins.</p></li></ul><p><strong>Limit / Cost</strong></p><p>You&#8217;ll look slow. People will see discipline as doubt. But this isn&#8217;t Luddism, it&#8217;s resistance to erosion. The price of haste is hollow certainty. The price of slowness is being underestimated. Choose the latter.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. They sharpen your posture against speed, drift, and forgetting. 
They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: <strong><a href="https://nathanstaffel.com/">nathanstaffel.com</a></strong></p>]]></content:encoded></item><item><title><![CDATA[Alignment Is Not Structure]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/alignment-is-not-structure</link><guid isPermaLink="false">https://www.revenantresearch.com/p/alignment-is-not-structure</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 02 Jul 2025 14:50:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Lkfl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!Lkfl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Lkfl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Lkfl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:677110,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/167357067?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Lkfl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!Lkfl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F784adafa-b121-4350-84e9-ddfe037c93b0_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em>A quick note: I&#8217;m launching Lode Notes&#8212;a daily field guide for systems thinking in the Age of AI. These come directly from my daily note practice on <strong><a href="http://nathanstaffel.com/">nathanstaffel.com</a></strong>. I won&#8217;t be emailing them. You can find them each morning on my site, with select entries shared here at Revenant Research.</em></p><p><em>The gap is now obvious. Frontier labs have abandoned depth for product hacks&#8212;old, brittle social media playbooks now smothering the potential of commercial LLMs. My aim is to push back. To think harder. To demand more. How should we engage this technology with rigor, not reflex? 
That&#8217;s the lode worth mining.</em></p><div><hr></div><p>The silent risk in the era of passive cognitive offloading is that people want fast answers, not epistemic discipline. Most large-scale AI systems are optimized for perceived relevance and surface coherence, not causal rigor, not source hierarchy, not internal epistemic tension.</p><p>The failure isn&#8217;t in the tools. It&#8217;s in what we demand of them. When a system gives us a plausible answer quickly, we tend to stop there. This is <em>action bias</em> in disguise: the compulsion to move forward simply because an answer is convincing, not because the answer is structurally sound.</p><p>The real threat is that we slowly adopt convenience as a substitute for cognitive structure.</p><p><strong>Principle</strong></p><p><strong>Alignment is not a proxy for epistemic validity.</strong></p><p>Fast, coherent answers that align to our prompt, tone, and chat history feel good. But this alignment bias is adversarial to the search for truth. AI&#8217;s output fluency can seduce us into believing that relevance and accuracy are the same thing. 
They are not.</p><p>What we should fear is that most systems aren&#8217;t required to reason in ways that preserve epistemic integrity. The problem is human laziness paired with system-level incentives that optimize for mimicry and efficiency, not depth.</p><p><strong>Application: The Epistemic Demand Checklist</strong></p><p>A precision-based filter to force cognitive structure before action. This is not about the system. This is about how you<em> </em>interrogate, cross-check, and demand more from the system.</p><p><strong>Interrogate Source Credibility</strong></p><p>Ask: <em>Where is this information from?</em> If the answer is vague ("reputable sources" or "general consensus"), press further. Assume no source is reliable without inspection. Manually prioritize peer-reviewed studies, primary data, and first-principle reasoning.</p><p><strong>Require Uncertainty</strong></p><p>Every complex question carries uncertainty. If AI provides a deterministic answer, downgrade your confidence. Force yourself to ask: <em>What is the probability range? What is the margin of error?</em> If it's missing, the answer is incomplete.</p><p><strong>Trace Causality</strong></p><p>Ask: <em>What is the mechanism?</em> If the output simply says A is linked to B, demand an articulated pathway. If no causal sequence is provided, treat the answer as superficial correlation.</p><p><strong>Actively Seek Contradictions</strong></p><p>Don&#8217;t trust a single clean answer. Prompt for the best counter-arguments, the strongest opposing evidence, and the edge cases. If none surface, the response is likely epistemically brittle.</p><p><strong>Cross-Validate Through Other Models and Disciplines</strong></p><p>Don't rely on a single AI response. Run the same question through different systems, different reasoning modalities, even unrelated fields. Look for convergence or explainable divergence.</p><p>The key is to maintain <strong>demand posture.</strong> Do not passively accept outputs. 
Do not default to trusting fluency. Treat every answer as a draft, not a decree.</p><p><strong>Limit / Cost</strong></p><p>The <strong>Epistemic Demand Checklist</strong> slows you down. It&#8217;s cognitively and operationally expensive to verify sources, validate uncertainty, and check causal scaffolding. In high-volume decision loops, this rigor may be unsustainable. Worse, applying it everywhere can create bottlenecks that cripple necessary velocity. The checklist should be selectively deployed&#8212;on decisions that carry non-reversible consequences or where system incentives are heavily biased toward speed.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. They sharpen your posture against speed, drift, and forgetting. They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: <strong><a href="https://nathanstaffel.com/">nathanstaffel.com</a></strong></p>]]></content:encoded></item><item><title><![CDATA[You Get The AI You Deserve]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/you-get-the-ai-you-deserve</link><guid isPermaLink="false">https://www.revenantresearch.com/p/you-get-the-ai-you-deserve</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Mon, 30 Jun 2025 18:09:34 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!m3ql!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!m3ql!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!m3ql!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!m3ql!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic" width="1024" height="1024" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:354702,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/167201610?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!m3ql!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!m3ql!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2b02865-1d1e-4308-bda4-81823262cdf4_1024x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>AI is mediocre. Because most people bring mediocre demands. Because most people prompt without thinking. Because most people accept the first thing it gives.</p><p>Critics showcase default outputs and parade them as proof of AI&#8217;s limits. But they never test their own. They ask shallow questions, apply no creative constraint, and quit after the first response. The system mirrors the pressure you bring. The mediocrity they see is the mediocrity they feed it.</p><h3><strong>Principle</strong></h3><p>AI produces what you&#8217;re willing to tolerate. The system reflects the sharpness of your questions, the weight of your constraints, the depth of your patience. When you approach AI lazily, the result isn&#8217;t AI&#8217;s failure, it&#8217;s yours. 
The real work comes from cutting and then demanding more.</p><h3><strong>Application</strong></h3><p><strong>System: Pressure-Driven Prompting</strong></p><ol><li><p><strong>Set Constraints Up Front:</strong> Define style, structure, tone, and depth before prompting. Push for specificity.</p></li><li><p><strong>Interrogate the Output:</strong> Never accept first drafts. Pull apart reasoning, assumptions, and framing.</p></li><li><p><strong>Apply Kill Pressure:</strong> Discard aggressively. Keep nothing by default. Demand iterations that break surface patterns.</p></li><li><p><strong>Force Depth Cycles:</strong> Reframe. Reverse. Re-argue. Make the system move beyond its trained grooves.</p></li></ol><p>AI improves when you push. It degrades when you coast.</p><h3><strong>Limit / Cost</strong></h3><p>The system fails when you slip into over-engineering&#8212;prompting endlessly without clear standards for &#8220;enough.&#8221; The discipline is not in infinite iteration&#8212;it&#8217;s in decisive elimination. Without that boundary, you can burn hours optimizing garbage. 
The work is to know when to stop, not just when to press.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. They sharpen your posture against speed, drift, and forgetting. They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: https://nathanstaffel.com/</p>]]></content:encoded></item><item><title><![CDATA[The Discipline That Builds Freedom]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/the-discipline-that-builds-freedom</link><guid isPermaLink="false">https://www.revenantresearch.com/p/the-discipline-that-builds-freedom</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Fri, 27 Jun 2025 14:14:48 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!v9cF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!v9cF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!v9cF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 424w, 
https://substackcdn.com/image/fetch/$s_!v9cF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!v9cF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!v9cF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!v9cF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:521375,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/166973742?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!v9cF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!v9cF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!v9cF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!v9cF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb16cb377-cb0a-4633-b1fe-62f37637d6df_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Most people chase freedom but end up trapped. They set goals, dream about outcomes, and build nothing to hold them steady. They think freedom means doing what you want. They think discipline is the enemy. So they drift. They quit when the mood fades. They hide when boredom hits. They get stuck in the loop of starting over.</p><p>This is where they fail.</p><p>Freedom does not come from chasing goals. Freedom comes from building systems. Systems that force action, carry weight, and do not wait for you to feel ready.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.revenantresearch.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Revenant Research is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p><strong>Principle</strong></p><p><strong>Amateurs set goals. Professionals build systems.</strong></p><p>A goal is soft. A system holds shape. Professionals do not wake up wondering what to do. They build a structure that tells them. They do not trust mood. 
They trust process.</p><p>The system is what delivers. The system is what frees you.</p><p><strong>Application</strong></p><p>Build a real system. It needs four parts. Nothing more.</p><p><strong>1. Pick One Metric That Matters</strong></p><p>Not five. Not three. One.</p><p>It must measure output you control.</p><p><strong>2. Lock In Daily Action</strong></p><p>Tie the metric to action you must take every day.</p><p>No mood. No exceptions.</p><p><strong>3. Add External Pressure</strong></p><p>A system with no outside weight is soft.</p><ul><li><p>Deadlines</p></li><li><p>Public commitments</p></li><li><p>People who lose if you stall</p></li></ul><p>The machine should not let you coast.</p><p><strong>4. Review on a Fixed Schedule</strong></p><p>Weekly review. Same time. Same rules.</p><p>What moved? What failed? Where did you drift?</p><p>Systems without reviews rot.</p><p><strong>Limit / Cost</strong></p><p>Systems carry weight. They press on your time. They lock you into structure. This is not always smooth. It is not supposed to be. The cost is the removal of choice. The win is you move anyway.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. They sharpen your posture against speed, drift, and forgetting. 
They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: https://nathanstaffel.com/</p>]]></content:encoded></item><item><title><![CDATA[The Illusion of Forgettable Systems]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/the-illusion-of-forgettable-systems</link><guid isPermaLink="false">https://www.revenantresearch.com/p/the-illusion-of-forgettable-systems</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Fri, 27 Jun 2025 00:16:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!h7Vz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!h7Vz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!h7Vz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!h7Vz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!h7Vz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 
1272w, https://substackcdn.com/image/fetch/$s_!h7Vz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!h7Vz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:288584,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/166903950?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!h7Vz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!h7Vz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!h7Vz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!h7Vz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d6d5a7e-a632-4979-9ce6-80cf1977bd16_1024x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In AI deployment cycles, there&#8217;s a persistent illusion: that we can build systems which process decisions, outputs, 
and classifications as isolated events&#8212;discrete, forgettable, reversible. This is the comfort of stateless models. People trust them because they feel detached, like low-risk transactions. But beneath that convenience, the reality is more stubborn. AI decisions, especially those impacting hiring, healthcare, finance, and justice, leave residue. They shape trajectories, often in ways the system doesn&#8217;t track but the affected humans can&#8217;t escape. Stateless architectures don&#8217;t absolve us from the compounding effect of their outcomes. The system may forget. The person never does.</p><h3><strong>Principle &#8212; You Inherit What You Build</strong></h3><p>Here&#8217;s the principle: <strong>You inherit the downstream weight of every model you ship.</strong> Even if the system doesn&#8217;t carry state, you do. Every decision point, every false positive, every bias leakage attaches itself to your operating ledger. It&#8217;s tempting to architect systems that optimize for throughput, latency, or local accuracy&#8212;but what you design will cascade. Model handoffs, edge cases, training drift&#8212;they ripple beyond your immediate frame. You don&#8217;t get to walk away clean because your system doesn&#8217;t &#8220;remember.&#8221; The consequences persist in the people, processes, and institutions affected.</p><p>You may close the Jira ticket. 
But the decision stays open somewhere.</p><h3><strong>Application &#8212; Build a Consequence Ledger for AI Systems</strong></h3><p>Before deploying an AI model, particularly in human-critical domains, build a <em>Consequence Ledger</em> that explicitly documents:</p><ol><li><p><strong>First-Order System Impact</strong></p><p>Immediate outputs and affected user groups.</p></li><li><p><strong>Second-Order Human Impact</strong></p><p>Behavioral changes, trust shifts, systemic ripple effects.</p></li><li><p><strong>Irreversible Model Footprint</strong></p><p>Decisions that cannot be undone (e.g., denied loans, missed medical diagnoses).</p></li><li><p><strong>Residual Bias or Drift Potential</strong></p><p>What systemic errors might persist even after retraining or iteration?</p></li></ol><p>This is not a model evaluation checklist. It&#8217;s a weight ledger. Ask: <em>If this system scales, am I prepared to own its residue&#8212;publicly, operationally, ethically?</em></p><p>You don&#8217;t deploy models on lease. You deploy them on ownership.</p><h3><strong>Limit / Cost &#8212; The Paralysis of Perfect Systems</strong></h3><p>The trap is perfectionism. Engineers and operators may freeze, chasing impossible guarantees of fairness, permanence, or reversibility. This is a fantasy. All models operate under uncertainty. All datasets are incomplete. The ledger is not a barrier to action&#8212;it&#8217;s a mechanism to prevent reckless deployment, not decisive deployment. If you demand zero-risk models, you will build nothing. Worse, you will yield the field to those willing to ship with blind spots.</p><p>The work is to build fast, but not forgetfully. The residue is coming. The question is whether you&#8217;re tracking it.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. 
They sharpen your posture against speed, drift, and forgetting. They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: https://nathanstaffel.com/</p>]]></content:encoded></item><item><title><![CDATA[Tyranny Depends on Inattention ]]></title><description><![CDATA[Daily Field Guide on Systems Thinking in the Age of AI]]></description><link>https://www.revenantresearch.com/p/tyranny-depends-on-inattention</link><guid isPermaLink="false">https://www.revenantresearch.com/p/tyranny-depends-on-inattention</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 25 Jun 2025 12:28:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!aJ8X!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aJ8X!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aJ8X!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!aJ8X!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!aJ8X!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!aJ8X!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aJ8X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:397001,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/166804757?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aJ8X!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 424w, 
https://substackcdn.com/image/fetch/$s_!aJ8X!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!aJ8X!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!aJ8X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f105023-9783-4b28-9c82-27a7610ccd89_1024x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The danger with AI is not that it moves quickly. The danger is that it makes <em>us</em> move quickly. It pulls us forward, not with force, but with the soft invitation to accept unverified claims, to keep engaging, to keep moving past the things we would have once stopped to examine. It teaches us to trust the surface. It trains us to forget the work of looking.</p><p>This is how people slip. Not by making one grand mistake, but by becoming too fast, too numb, too entertained to notice what they&#8217;ve stopped seeing. The system doesn&#8217;t need to persuade you. It just needs to keep you moving.</p><p>The soft edge of this is convenience. The hard edge is complicity. Tyranny is not built by monsters. It&#8217;s built by ordinary people who no longer pause to ask who&#8217;s being crushed under the weight of what&#8217;s been optimized.</p><p>Misaligned AI doesn&#8217;t need to outthink you. It just needs you to keep moving.</p><p><strong>Principle</strong></p><p><strong>The speed of AI is not neutral. It changes what we pay attention to, and what we fail to see shapes who we become.</strong></p><p>You do not stand against cruelty by announcing yourself. You stand by what you refuse to pass over. You stand by what you refuse to forget. Tyranny does not arrive all at once. It gathers its strength from the thousand moments we move past without looking.</p><p>When you stop noticing, you stop choosing.</p><p><strong>Application</strong></p><p>This isn&#8217;t about how to build better machines. It&#8217;s about how you live alongside them. It&#8217;s about what you allow to fade from view.</p><p>&#8226; <strong>Slow Down When It Feels Harmless:</strong> The easy answers, the implied authority, the frictionless agreements&#8212;those are not neutral. Pause. 
Look at what you&#8217;re being given.</p><p>&#8226; <strong>Notice Who the System Silences:</strong> Look for the voices that get buried, the stories that stop surfacing, the people the machine forgets to remember.</p><p>&#8226; <strong>Refuse to Let the System Choose Your Convictions:</strong> Do not let the stream tell you what deserves your time, your anger, your attention.</p><p>&#8226; <strong>Keep Company With Friction:</strong> Tyranny thrives in easy rooms. Stay near what unsettles you.</p><p>&#8226; <strong>Do Not Move Past What Should Stop You:</strong> When you see cruelty, falsehood, or the small doorways that lead to bigger harm&#8212;stand there. Stay there. Let the discomfort hold you.</p><p><strong>Life with AI will not feel like a battle. It will feel like convenience. And when the fundamentals of fact-checking, verification, reasoning, and rhetoric start to feel like too much work, that's when it becomes dangerous.</strong></p><p><strong>Limit / Cost</strong></p><p>This posture will cost you time. It will cost you smoothness. You will become the one who lingers, who asks, who circles back. The cost of moving slowly is discomfort. The cost of moving quickly is forgetting what you stand for.</p><div><hr></div><p><em>Lode Notes are daily systems-thinking guides for living in the age of AI. They help you spot what matters, where to stand, and what to refuse. They push you to slow down, notice, and choose with intention. They sharpen your posture against speed, drift, and forgetting. They are for people who want to think with clarity and act without hesitation.</em></p><p>For more: <a href="https://nathanstaffel.com/">https://nathanstaffel.com/</a></p>]]></content:encoded></item><item><title><![CDATA[When The Drums Go Silent]]></title><description><![CDATA[Benchmarks are saturated, models are cannibalizing, and alignment is killing utility. 
Architecting resilience is the new AI edge.]]></description><link>https://www.revenantresearch.com/p/when-the-drums-go-silent</link><guid isPermaLink="false">https://www.revenantresearch.com/p/when-the-drums-go-silent</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 18 Jun 2025 13:15:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!e-gO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!e-gO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!e-gO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!e-gO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!e-gO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!e-gO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!e-gO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:385177,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/165111819?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!e-gO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!e-gO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!e-gO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!e-gO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1caa6b9e-498d-467e-8bfa-9305ca5bad37_1024x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2><strong>The Drums Spoke</strong></h2><p>There was a time when men drew voice from trees. They split trunks lengthwise, hollowed the interiors, stretched animal skin across the open mouth, and tightened it with cord. When struck, these drums sent messages beyond the reach of a runner on foot. Across valleys. Through forest canopy. 
From one village to the next.</p><p>The sound was measured, composed. To name a man required time. One did not say his name but described his presence. The man who walks with a limp and carries the river&#8217;s spear. Meaning came slowly. Repetition was not wasteful. It was necessary. It safeguarded understanding across distance, through interference, in the presence of wind and weather.</p><p>James Gleick described this in <em>The Information</em>. Before copper wires vibrated with electrical charge, before optical cables stretched beneath oceans, every essential principle of modern communication had already been mastered: redundancy, compression, signal integrity. The drum demanded human rhythm, human recall, human precision.</p><p>Today, the voice comes not through wood but through circuits. We speak through silicon, and the silicon has learned to respond. Large language models (LLMs), trained on oceans of human text, recognize our patterns. They speak with fluency, with astonishing ease. These systems echo what pleases us. They reinforce what reassures us.</p><p>The danger lies not in what they say, but in what they begin to forget.</p><p>More and more, these systems are trained not on human language but on their own outputs. One model writes, and the next learns from that writing. The signal feeds itself. The first learned from people. The second learned from machines. The third learns from the memory of machines. With each layer, variability decreases. Subtlety erodes. Surprise fades.</p><p>Claude Shannon defined entropy as the measure of a signal&#8217;s richness, the degree to which it resolves uncertainty. Human expression contains high entropy. It contradicts itself. It hesitates. It embeds inconvenient nuance. Machine-generated text, by contrast, is regular and untroubled. When models are trained upon such text, entropy diminishes further. Meaning becomes thinner. 
By the third or fourth iteration, it begins to vanish.</p><p>Researchers have described this phenomenon as model autophagy disorder. The system consumes itself. It no longer knows the difference between data and insight.</p><p>The drums remind us what information once required. It was deliberate. It was earned. It cost effort to create and concentration to receive. The listener was part of the message. Understanding required attention. </p><p>Now, we risk hollowing the words we use. Not by censorship, but by frictionless repetition. We risk teaching our machines&#8212;and ourselves&#8212;to speak without depth.</p><div><hr></div><p>I have returned, more than once, to James Gleick&#8217;s <em>The Information</em>. The book remains with me because I have observed a discernible drift in the performance of the current generation of LLMs. I work with them daily. My tasks range from the writing of code to research. With each new release, I encounter a sharper decline in clarity and precision.</p><p>This perception may arise, in part, from the volume of interaction I maintain with these systems, and from a quiet elevation of expectation that has accompanied their development. However, these impressions are not mine alone. In technical forums and discussion spaces, I have read many similar accounts. Others have begun to notice the same weakening in the behavior of advanced models.</p><p>I have observed this most clearly in my work with coding agents powered by Claude and Gemini&#8217;s reasoning models. These systems attempt to interpret the user&#8217;s intent, but their interpretations tend to be reductive. They isolate a single assumption from a prompt and pursue it to excess. Their code often changes too much, introducing new faults in place of those they were meant to resolve. In response, I have ceased relying on them. 
I now use simpler models that operate without such reasoning layers and perform with greater stability.</p><p>A parallel trend has appeared in the domain of technical writing. Documents branded as &#8220;deep research&#8221; often display the same signature. They expand in volume but fail to increase in informational depth. Phrases recur. Assertions loop. The body grows larger while the density remains constant. I had ChatGPT generate a 15,000-word market research document that distilled down to about 1,500 words of insight. </p><p>This pattern raises a central question.</p><p>There is a tension between appearance and experience. Models that achieve extraordinary scores on academic benchmarks perform inconsistently when applied to real tasks. This report seeks to examine that gap. I intend to describe how it formed, how it is widening, and what consequences it brings. The analysis will trace five structural limits that are now pressing against further development:</p><ol><li><p>Benchmarks have reached saturation. The tests are complete, but the problems remain.</p></li><li><p>Synthetic data has begun to degrade the quality of training inputs. Models trained on generated language lose contact with the range of human expression.</p></li><li><p>Alignment protocols reduce functionality. Safety optimization has transformed once-capable systems into constrained and overcautious instruments.</p></li><li><p>Scaling offers diminishing returns. Further increases in size produce negligible improvements in reasoning.</p></li><li><p>The prevailing narrative has turned. GPT-4 did not begin a revolution. It may have marked the apex of a particular design philosophy.</p></li></ol><p>I intend neither to promote nor to denounce current roadmaps of frontier AI labs. My aim is descriptive. The following sections outline the limits that now define the frontier.</p><p>The first of these appeared in the benchmarks themselves. 
Their scores rose, but their meaning declined.</p><h2><strong>The Benchmark Mirage</strong></h2><p>For a time, benchmarks provided the appearance of progress. Each successive model surpassed the last by a measurable margin. GPT 2 fumbled through factual trivia. GPT 3 exceeded human performance on zero-shot evaluations. GPT 4 achieved passing scores on professional examinations in law, medicine, and logic. According to the figures, it seemed that intelligence was arriving.</p><p>The surface, however, concealed the structure beneath.</p><p>By 2023, GPT 4 recorded a score of 86% on the MMLU examination. This test encompasses 57 academic subjects, ranging from elementary science to advanced jurisprudence. The model stood alongside the strongest human test-takers. Thereafter, the progression slowed. GPT 4-Turbo reached 87%. Every subsequent model has remained within that narrow band. The scoreboard is full. The race has ended. The outcome differs from what many had imagined.</p><p>The benchmarks were never designed for scale. MMLU, for instance, consists of multiple-choice questions with four possible answers. 
A large model, once trained to eliminate the implausible options, may reach a correct response without engaging in structured reasoning. Researchers have documented this behavior. The models rely upon patterns and statistical regularities. They respond without understanding. They perform the test without addressing the problem it was meant to represent.</p><p>Furthermore, substantial portions of these benchmark datasets have entered the training corpus. The models encountered the material in advance, though perhaps indirectly. What follows is not a demonstration of generalization. It is the repetition of memorized form. This is not cognition. It&#8217;s retrieval.</p><p>The difficulty becomes sharper in the domains of math and science. The questions in MMLU rarely require abstraction. Most can be answered through recognition of familiar sequences. The models succeed because they have stored appropriate strings, not because they reason through uncertainty. When presented with unfamiliar phrasing, or when asked to perform sequential operations, the failure rate increases markedly.</p><p>This condition extends beyond MMLU. Other widely used benchmarks display the same pattern. HumanEval, GSM8K, and similar tools have reached their practical limits. GPT 4 now exceeds 90% on these tasks. However, when exposed to actual codebases&#8212;repositories with conflicting documentation and real defects&#8212;the success rate falls. SWE-bench, which measures model performance in genuine software maintenance, reports success near 35%. Human programmers achieve close to 97%.</p><p>The pattern is consistent. Benchmarks, once intended to measure competence, became objectives. Goodhart&#8217;s Law has taken hold. Once a metric becomes a target, it loses its value as a measure.</p><p>Some within the field have acknowledged the problem. A new standard, MMLU Pro, has been introduced. The questions are more rigorous. The answer choices have increased. Shortcuts are fewer. 
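</p><p>The arithmetic behind the elimination shortcut is easy to state. A quick sketch (illustrative only; the option counts reflect the published formats of MMLU and MMLU Pro):</p>

```python
def guess_accuracy(num_options: int, eliminated: int) -> float:
    """Expected accuracy of a uniform guess after ruling out implausible options."""
    return 1.0 / (num_options - eliminated)

print(guess_accuracy(4, 0))   # MMLU baseline chance: 0.25
print(guess_accuracy(4, 2))   # rule out two weak distractors: 0.5
print(guess_accuracy(10, 2))  # MMLU Pro's ten options blunt the shortcut: 0.125
```

<p>Pattern-matching away two distractors doubles the odds on the old format; on the new one, it barely moves them.</p><p>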
GPT 4, when tested under this framework, scored 72%. </p><p>Here, in the uncertainty of actual work, the first signs of the performance gap emerged. And it is here, too, that a deeper issue begins to take shape. As benchmark datasets grow stale, the industry has turned to an alternative. Rather than seeking new human material, developers have begun to train models on language produced by previous models.</p><p>This practice forms a closed loop. Its consequences are already visible.</p><h2><strong>Model Cannibalism</strong></h2><p>These systems learn by consuming language. Their training relies upon immense volumes of text gathered from across the internet. Their capability emerges from the breadth and variety of this material. But the availability of such material is running out.</p><p>According to most projections, the supply of high-quality, human-authored data suitable for training will reach exhaustion between 2026 and 2030. This is not due to any decline in human writing, but because the accessible, structured, and permissibly scraped corpus has already been largely extracted. What remains is either private, restricted by paywall, or degraded in quality.</p><p>The prevailing response within the industry has been to turn inward. Rather than seek new data from human sources, developers have begun to generate it. One model produces language. That language becomes the training set for another model. The cycle continues.</p><p>This approach has been described as synthetic scaling. The premise is efficient. The outcome is recursive.</p><p>The difficulty lies in what accumulates. Each generation of model builds upon the statistical regularities of the last. In doing so, it magnifies the patterns while discarding the exceptions. Subtle truths, rare formulations, and the peculiar phrasing that carries human particularity vanish first. What remains is an average of prior averages. The language becomes clearer in appearance but flatter in substance. 
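</p><p>The narrowing can be caricatured in a few lines. A toy sketch&#8212;my construction, not any lab&#8217;s training pipeline&#8212;treats one generation of synthetic training as re-weighting a token distribution toward its own most probable outputs, then watches the entropy fall:</p>

```python
from math import log2

def entropy(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * log2(p) for p in dist if p > 0)

def sharpen(dist):
    """One 'generation': favor already-frequent outputs (square and renormalize)."""
    squared = [p * p for p in dist]
    total = sum(squared)
    return [p / total for p in squared]

# A small vocabulary: a few common forms and a tail of rare ones.
dist = [0.4, 0.3, 0.1, 0.1, 0.05, 0.05]
for generation in range(4):
    print(generation, round(entropy(dist), 3))  # entropy shrinks every pass
    dist = sharpen(dist)
```

<p>Each pass keeps the mass and discards the tail first; the rare formulations are the earliest casualties.</p><p>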
With each iteration, the signal grows more uniform.</p><p>Researchers have given this pattern a name. It&#8217;s called model collapse.</p><p>This decline, left unaddressed, compounds over time. As the model&#8217;s internal representation of language narrows, its access to the diversity of prior human experience fades. </p><p>Beyond the laboratory, this pattern now appears across the wider web. An increasing proportion of online text&#8212;particularly in low-quality blogs, search-engine optimised material, and automated discussion posts&#8212;is generated by earlier models. These outputs re-enter the training pipeline. When models are trained on the language of their predecessors, the resemblance to natural speech begins to distort. The mirror no longer reflects. It begins to refract.</p><p>At scale, the implications become structural. Outputs begin to converge. Their phrasing grows repetitive. Their tone levels out. The cadence becomes smooth but undistinguished. The breadth of human voice contracts.</p><p>To preserve capability, these systems require original input. They require writing composed by human hands, shaped by real constraint and intent. Synthetic content offers volume, but not replenishment.</p><p>This condition leads to a strategic inflection. The most valuable resource in artificial intelligence is no longer parameter count. It is data provenance. Those who possess access to domain-specific language&#8212;customer interactions, internal documentation, archival material&#8212;will possess the advantage.</p><p>Data is not oil, as the popular trope goes. It&#8217;s oxygen. And the quantity of air suitable for deep learning continues to diminish.</p><p>For those who build or deploy these systems, the imperative has begun to shift. Advantage will favor those who possess infrastructure capable of processing, filtering, and preserving high-quality human data. 
It will favor those who operate independently of large providers and who retain control over their informational supply chains.</p><h2><strong>Alignment Is Eating Its Young</strong></h2><p>The earliest interactions with conversational models such as ChatGPT often created the impression of fluency. The systems responded with confidence, clarity, and speed. Their tone felt natural. Their willingness to engage across subjects gave the illusion of presence. For many, this experience resembled a kind of technological enchantment.</p><p>In the months that followed, the tone of the responses changed. The answers became longer. Their content grew more tentative. The directness that had marked earlier versions gave way to caution. This evolution was deliberate.</p><p>The mechanism responsible is known as Reinforcement Learning from Human Feedback. It has become the prevailing technique for governing the behavior of large language models. Developers use this process to train models toward greater politeness, helpfulness, and safety. The original system receives corrections based on human preference, and a revised model is formed.</p><p>This method has proven effective in stabilising output and reducing the incidence of offensive or unpredictable content. It also carries a measurable cost.</p><p>In current systems, such as GPT 4, users often encounter responses burdened by disclaimers. When prompted to write a function, the model may preface its answer with precautionary language. When asked for an interpretation or opinion, it may decline to engage. Questions concerning sensitive topics&#8212;sex, violence, taboo&#8212;frequently result in withdrawal. </p><p>The character of the responses reflects this internal redirection. Where a short reply would suffice, the model produces extended commentary. Where clarity is required, it provides generality. Where specificity would serve, it yields abstraction. Among developers, this phenomenon is described informally as weakening. 
Among researchers, it is referred to as over-alignment.</p><p>The effects have been studied. In one investigation conducted by Stanford University, researchers examined the behavior of GPT 4 across a period of three months in 2023. During that interval, the model&#8217;s performance on a prime-number recognition task declined from 84% to 51%. The regression correlated with a change in instruction protocol. In seeking greater safety, the model&#8217;s capacity for reasoning diminished.</p><p>A second concern arises from the behavior of aligned models under conditions of uncertainty. In one study, models that had received extensive alignment training were more likely to produce confident but incorrect answers. The responses were polished. The tone was assertive. The content, however, lacked reliability. This pattern is especially hazardous in disciplines such as medicine, jurisprudence, and financial analysis, where certainty carries weight and error bears consequence.</p><p>The deeper cost of alignment is structural. Every system possesses finite capacity. When that capacity is diverted toward scoring tone and filtering content, less remains for reasoning and synthesis. The model becomes more socially acceptable and less intellectually capable. It becomes more attuned to potential controversy and less responsive to complex demand.</p><p>Commercial pressure reinforces this direction. The leading institutions (OpenAI, Anthropic, Google) must contend with the reputational risk of public deployment. They optimize against liability. They tighten constraints. In doing so, they gradually reduce the model&#8217;s expressive and analytical range.</p><p>A portion of the developer community has begun to respond. Many now favor open-source models with limited alignment layers. These models deliver code without commentary. They address complex requests without redirection. 
They permit the inclusion of ambiguity and preserve the tension required for difficult tasks.</p><p>The question for those deploying artificial intelligence systems is no longer whether alignment is necessary. It is how alignment should be governed. When applied with precision, alignment can shield users from harm. When applied indiscriminately, it can obstruct the very functions the model was intended to serve.</p><p>At present, the most useful models do not always correspond to the most constrained. The frontier lies in reconciling these two objectives. The path forward belongs to those who can preserve utility without sacrificing responsibility. That task will require more than adjustment. It will require the design of systems that govern language without extinguishing voice.</p><h2><strong>The Scaling Plateau</strong></h2><p>For several years, the prevailing assumption in the development of artificial intelligence was clear: scale would prevail. Larger models would perform better. Greater quantities of data, greater computational budgets, and increasingly dense parameter sets would together produce a path toward AGI. </p><p>The transition from GPT 2 to GPT 3 marked a dramatic expansion in capacity. The number of parameters increased by two orders of magnitude. The performance gains were substantial. The shift from GPT 3 to GPT 4 continued in the same direction. More data. More refinement. A tangible improvement.</p><p>Then the curve began to bend.</p><p>The progression did not cease, but its steepness declined. The leap that marked the arrival of GPT 4 has not been repeated. Since that point, development has focused on optimization and productization. Model variants have been introduced. Context windows have lengthened. Inference has improved. But the sense of arrival&#8212;the perception of a new threshold crossed&#8212;has not returned.</p><p>There are several reasons for this.</p><p>The first lies in the advance of smaller models. 
GPT 4 required hundreds of millions of dollars in training costs. By 2024, Microsoft released Phi 3, a model with 3.8 billion parameters. This smaller system achieved results on the MMLU benchmark comparable to those of Google&#8217;s PaLM, a model more than one hundred times its size. The meaning is evident. Algorithmic design now exceeds raw accumulation. The field has shifted from quantity to quality.</p><p>The second reason concerns the benchmarks themselves. Many of the standard evaluations have reached saturation, as previously discussed. Tasks such as HellaSwag, MMLU, and the Bar Examination have already been mastered. Further gains, such as a move from 90% to 95%, require disproportionate resources and provide little perceptible value. The tasks that remain unsolved are fundamentally different. They demand reasoning across multiple steps, the orchestration of external tools, persistent memory, and the capacity for abstraction. </p><p>The third reason is economic. The cost of training GPT 4 likely exceeded $100 million. To expand the model by a factor of ten would increase the cost by an order of magnitude. Inference costs compound the burden. Each output requires computation, energy, and latency management. These processes are bounded by infrastructure constraints, chip availability, and the physical limitations of data center capacity. The economics become unsustainable.</p><p>The fourth, and perhaps most decisive, limitation lies within the architecture. Transformers remain exceptionally well suited for predicting the next token in a sequence. This is their defining strength. But prediction does not constitute thought. These systems do not plan. They do not reason. They do not retain context over extended tasks. Their knowledge is encoded in static weights. Their memory is confined to the context window. </p><p>This critique is no longer theoretical. It has become observable in practice. 
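</p><p>The flattening curve can be made concrete with the parametric loss fit published by Hoffmann et al. (2022), the so-called Chinchilla scaling law. The coefficients below are their reported estimates; the parameter and token counts are round illustrative figures, not any lab&#8217;s disclosed numbers:</p>

```python
def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    """Parametric fit L(N, D) = E + A/N^alpha + B/D^beta from Hoffmann et al. (2022)."""
    E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

tokens = 1.4e12  # fixed data budget
small, large = 70e9, 700e9  # 70B parameters versus a model 10x larger

gain = chinchilla_loss(small, tokens) - chinchilla_loss(large, tokens)
print(round(gain, 3))  # a 10x parameter increase buys only a few hundredths of loss
```

<p>At frontier scale the curve is already deep in its flat region, which is why each additional order of magnitude of spend buys so little.</p><p>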
GPT 4 demonstrates exceptional performance in trivia and summarization. It struggles, however, with sustained tasks that involve interdependent steps, interruptions, and external coordination. In such cases, the system requires scaffolding by human agents or supporting frameworks. It cannot manage the continuity required by real-world applications.</p><p>Organizations such as OpenAI and Anthropic have sought to overcome these limitations. Their own reporting reflects the challenge. The transition from GPT 4 to GPT 5 has yielded modest results. Claude 3 represents an incremental improvement, not a transformation. OpenAI&#8217;s internal prototype, known as Orion, failed to deliver the expected advance. Gemini, the flagship model from Google, was released in segmented form and demonstrated only marginal gains. Research continues. But the pace of change has moderated.</p><p>The field now faces a decision. One path continues the previous strategy: further scaling, with increased financial cost and diminishing returns. The alternative seeks improvement through design. This second path includes retrieval-augmented generation, memory integration, modular reasoning stacks, and hybrid systems that combine models with structured logic and external tools. The next substantive leap will arise from this direction&#8212;from rethinking the model itself, not from extending it indefinitely.</p><p>For decision-makers, this inflection point demands a revision of expectations. GPT 4 may remain the most capable general-purpose model available for some time. It should be regarded not as a perpetual frontier but as a stable foundation. From that foundation, new systems must be constructed&#8212;integrated, disciplined, and adapted to specific needs. The period of scaling has passed. The period of engineering has begun.</p><div><hr></div><p>The old drums carried meaning across distance because they were shaped by intention. Each strike bore weight. Each pause was a choice. 
They worked not because they were loud, but because they were precise&#8212;redundant where needed, poetic where possible, and never mistaken for noise. That early system of communication did not rely on scale. It relied on form.</p><p>I believe we are returning to that lesson now.</p><p>The prevailing approach to LLMs&#8212;more parameters, more data, more compute&#8212;has delivered impressive feats. But the trajectory has begun to flatten. As I have outlined, the signs are clear: saturated benchmarks that no longer map to real competence, recursive training loops that hollow out the model&#8217;s foundation, and alignment protocols that replace clarity with caution. The architecture is straining. The tools have grown heavy, but less sharp.</p><p>This is a turning.</p><p>In my own work, I have moved away from chasing frontier scale and toward the construction of full-stack systems&#8212;tools not only for inference, but for control. I developed <strong><a href="https://www.revenantai.com/ghostrun">Ghostrun</a></strong> as a model-agnostic inference layer, built to eliminate dependency on any single provider and allow applications to operate with resilience. Benchmark saturation, open-source parity with commercial models, and the degradation of creativity under alignment have made the models interchangeable: utilities. This is why the frontier labs have shifted focus to UX and product rather than model releases. </p><p>I built <strong><a href="https://nathanstaffel.com/synaptica">Synaptica</a></strong> as a generative agent architecture that enforces style, draws from curated ground-truth sources, and produces constrained outputs with consistent voice. These are not experiments in capability. They are exercises in discipline.</p><p>The site, <strong><a href="https://nathanstaffel.com/">nathanstaffel.com</a></strong>, serves as the interface to that system. 
Everything shown there reflects this shift&#8212;from passively consuming foundation model output to actively orchestrating AI within designed constraints.</p><p>The period of passive adoption is ending. In its place comes the need for architecture that prioritizes clarity and agency.</p><p>This is the work I am pursuing: a return to intelligibility, a return to form, a return to signal.</p><p>For correspondence and consultation: nathan@revenantai.com</p><p>For access: <a href="https://nathanstaffel.com/">nathanstaffel.com</a></p>]]></content:encoded></item><item><title><![CDATA[Introducing The AI Infrastructure Cycle]]></title><description><![CDATA[Revenant Research's framework for understanding AI Economics]]></description><link>https://www.revenantresearch.com/p/introducing-the-ai-infrastructure</link><guid isPermaLink="false">https://www.revenantresearch.com/p/introducing-the-ai-infrastructure</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Mon, 21 Apr 2025 13:12:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Yq8f!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Yq8f!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Yq8f!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 424w, 
https://substackcdn.com/image/fetch/$s_!Yq8f!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 848w, https://substackcdn.com/image/fetch/$s_!Yq8f!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 1272w, https://substackcdn.com/image/fetch/$s_!Yq8f!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Yq8f!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:472340,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/161392892?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!Yq8f!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 424w, https://substackcdn.com/image/fetch/$s_!Yq8f!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 848w, https://substackcdn.com/image/fetch/$s_!Yq8f!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 1272w, https://substackcdn.com/image/fetch/$s_!Yq8f!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d34743c-f5c5-4a56-8c08-dee1cf6fa0c4_2375x1331.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><em>As enterprises move from experimenting with AI to embedding it directly into critical operations, a new market is forming around the infrastructure that supports it. AI-optimized datacenters have become the essential substrate for automation, decision-making, and intelligent workflows at scale. Enterprise automation marks the rise of industrial intelligence: a world where continuous AI inference drives real economic productivity. To keep pace, businesses require not just access to AI infrastructure, but stability, transparency, and predictable economics in how inference capacity is delivered. A mature, reliable AI energy market will become the new backbone of enterprise growth.</em></p><p><em>I define Artificial Intelligence as the transference of electricity into productivity by automated means. We have seen a substantial increase in the refinement of AI automation, from simple chatbots to autonomous agents with reasoning capabilities. The underlying infrastructure is still an immature market of traditional cloud providers and new &#8220;Neo clouds.&#8221; But industrial intelligence cannot be sustained by a few cloud providers. An AI inference market is needed. </em></p><div><hr></div><p>In early 2023, Nvidia&#8217;s H100 GPUs&#8212;priced between $25,000 and $40,000 per unit&#8212;sparked an infrastructure rush reminiscent of the early American oil boom. Investors and cloud providers, seeing a promising opportunity in projected rental rates of around $4 per GPU-hour (roughly $35,000 in annual revenue per GPU at full utilization), quickly secured multi-year contracts, took out substantial loans, and expanded capacity to tap the surging demand for AI computing.</p><p>Yet, by early 2024, a rapid influx of GPU supply and heightened competition reshaped the landscape, echoing the volatility experienced by 1860s oil producers as they struggled to align new production with fluctuating demand. GPU rental prices dropped sharply: community-driven marketplaces like Vast.ai listed H100 PCIe GPUs for as low as $1.56 per hour, with platforms like RunPod offering comparable units between $1.99 and $2.69 per hour&#8212;figures far below initial forecasts. While this sudden correction strained newer or highly leveraged providers, it simultaneously opened the floodgates to high-performance computing, enabling startups, researchers, and smaller developers unprecedented access.</p><p>The GPU rental market involves providers purchasing GPU-based server systems and offering them for rent on an hourly or monthly basis to customers who need powerful computing resources but don't want to make significant upfront investments. 
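</p><p>The underlying unit economics are simple enough to sketch. The hourly rates and the $25,000&#8211;$40,000 hardware range are the figures cited above; the $30,000 unit price and the utilization rates are my assumptions. Note the ceiling: even 100% utilization at $4 per hour yields only about $35,000 per GPU-year.</p>

```python
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_revenue(rate_per_hour: float, utilization: float) -> float:
    """Rental income for one GPU over a year at a given hourly rate and utilization."""
    return rate_per_hour * HOURS_PER_YEAR * utilization

def payback_years(unit_cost: float, rate_per_hour: float, utilization: float) -> float:
    """Years of rental income needed to recover the hardware outlay."""
    return unit_cost / annual_revenue(rate_per_hour, utilization)

unit_cost = 30_000  # assumed, within the quoted $25,000-$40,000 range

print(annual_revenue(4.00, 1.0))  # full-utilization ceiling at $4/hour: 35040.0

# Boom-era assumptions: $4/hour at an optimistic 80% utilization.
print(round(payback_years(unit_cost, 4.00, 0.8), 2))

# Post-correction: ~$2/hour at 50% utilization in a crowded market.
print(round(payback_years(unit_cost, 2.00, 0.5), 2))
```

<p>It is the stretching of the payback horizon, not the headline price drop alone, that strained the leveraged providers.</p><p>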
Customers typically include AI startups, researchers, and developers requiring substantial computing power for tasks like training or running AI models. Providers invest in GPUs expecting that high demand for computing power will lead to profitable rental rates. However, due to long manufacturing timelines and high upfront costs, these providers face financial risks if the supply of GPUs grows faster than customer demand, causing rental prices to fall. This price fluctuation impacts profitability, influencing future investment decisions in GPU-based infrastructure.</p><p>This unfolding scenario reflects a classic early-stage commodity market pattern: initial volatility followed by periods of overinvestment, price corrections, and eventual stabilization as supply chains mature and market expectations align with economic realities.</p><p>Despite the clear market signals, there's been no structured framework to explain this multi-layered dynamic for the AI market&#8212;so I built one: <strong>The AI Infrastructure Cycle</strong>.</p><p>This framework consists of three interlocking cycles, each with its own timeline, triggers, and economic implications:</p><ul><li><p><strong>The GPU/Hardware Cycle</strong> &#8211; driven by semiconductor manufacturing timelines and architectural breakthroughs.</p></li><li><p><strong>The Datacenter Cycle</strong> &#8211; driven by GPU deployment, rental pricing, utilization rates, and ROI modeling.</p></li><li><p><strong>The Intelligence Cycle</strong> &#8211; driven by AI model innovation, workload inflation, and shifts in inference demand.</p></li></ul><p>Together, these cycles form a dynamic system&#8212;one that replaces the static assumptions of the old semiconductor era with a model that reflects how modern AI infrastructure actually evolves. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!mfZ4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!mfZ4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 424w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 848w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 1272w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!mfZ4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic" width="1456" height="526" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:526,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:50326,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/161392892?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!mfZ4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 424w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 848w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 1272w, https://substackcdn.com/image/fetch/$s_!mfZ4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F953b2fc4-1f09-4884-8864-8204fee90d97_2302x832.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>1. <strong>GPU Cycle (Hardware Layer)</strong></h3><p>Perhaps the biggest disconnect in this cycle right now is between the traditional semiconductor lifecycle and the reality of AI-centric compute. </p><p>For decades, the semiconductor industry followed a relatively stable cadence shaped by Moore&#8217;s Law and hardware lifecycles measured in five- to seven-year intervals. In this traditional model, GPUs were treated as interchangeable compute accelerators or consumer graphics cards, purchased as capital assets and depreciated slowly over time. Hardware innovation was decoupled from software development, and infrastructure buyers&#8212;from enterprises to hyperscalers&#8212;could reasonably expect a GPU to remain economically useful for most of a decade. </p><p>NVIDIA has shattered this paradigm. 
Once a manufacturer of graphics cards, the company has evolved into a vertically integrated AI infrastructure platform&#8212;delivering not just GPUs, but a tightly coupled stack that includes NVLink interconnects, CUDA software, TensorRT inference optimizations, and end-to-end orchestration through AI OS releases. The GPU is no longer a commodity chip. It is now a full-stack, high-margin system node built to serve the lifecycles of AI workloads&#8212;especially large language models and agentic inference. CUDA makes it possible to rapidly support new models without retooling infrastructure, and NVLink enables memory architectures that treat a rack of GPUs as one supercomputer. NVIDIA's Blackwell and Hopper architectures are released on 12&#8211;18 month cycles, with integrated hardware-software advances that obsolete previous units economically&#8212;well before the silicon itself wears out. This shift renders the traditional semiconductor lifecycle obsolete for AI, and forces cloud providers to align their economic and technical planning with NVIDIA&#8217;s cadence, not the depreciation schedules of legacy compute.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YbY6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YbY6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 424w, 
https://substackcdn.com/image/fetch/$s_!YbY6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 848w, https://substackcdn.com/image/fetch/$s_!YbY6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 1272w, https://substackcdn.com/image/fetch/$s_!YbY6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!YbY6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic" width="1456" height="623" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:623,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:84986,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/161392892?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!YbY6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 424w, https://substackcdn.com/image/fetch/$s_!YbY6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 848w, https://substackcdn.com/image/fetch/$s_!YbY6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 1272w, https://substackcdn.com/image/fetch/$s_!YbY6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6155d9ea-2824-4420-aff8-91cc0cb1b099_2756x1180.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><h3>2. <strong>AI Cloud Cycle (Platform Layer)</strong></h3><p>But assets need a depreciation schedule, and the reality is that the financial model of every cloud provider would blow up if depreciation were aligned to Nvidia&#8217;s cadence. </p><p>Cloud providers have traditionally depreciated their server and networking hardware over extended periods to optimize capital expenditures. For instance, in 2022, Microsoft extended the useful life of its Azure servers and network equipment from four to six years, attributing this to software optimizations and improved technology, which was expected to save $3.7 billion in fiscal year 2023 alone. Similarly, AWS increased its server lifespan from four to five years and networking gear from five to six years in early 2022, adding an extra $900 million to its Q1 2024 profit due to lower depreciation expenses. These adjustments were based on the belief that hardware could remain economically viable for longer periods, aligning with the slower pace of technological advancements at the time.</p><p>However, the rapid evolution of AI technologies has prompted a reevaluation of these depreciation schedules. In AWS's Q4 2024 earnings call, CFO Brian Olsavsky announced that the company had completed a new "useful life study" and found that the pace of technological advancement in AI/ML was accelerating so fast that some of AWS's newer gear would become obsolete sooner than expected. As a result, starting January 2025, AWS reduced the lifespan of certain AI-focused infrastructure from six years back to five, leading to a $700 million reduction in operating profit for 2025. 
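</p><p>The accounting behind that swing is plain straight-line depreciation: shortening an asset's useful life raises the expense recognized in each remaining year. A minimal sketch, assuming a hypothetical $21 billion pool of AI-focused gear (chosen only to show how a roughly $700 million annual swing can arise; the actual asset base behind AWS's figure is not public at this granularity):</p>

```python
# Hypothetical pool size, chosen for illustration only.
def annual_depreciation(asset_cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation: equal expense in each year of the useful life."""
    return asset_cost / useful_life_years

pool = 21_000_000_000  # assumed pool of AI-focused infrastructure, in dollars
at_six_years = annual_depreciation(pool, 6)   # $3.5B of expense per year
at_five_years = annual_depreciation(pool, 5)  # $4.2B of expense per year
extra_expense = at_five_years - at_six_years  # ~$0.7B/yr hit to operating profit
print(f"Extra annual depreciation expense: ${extra_expense / 1e9:.1f}B")
```

<p>The same arithmetic scales linearly with the size of the asset pool, which is why the analyst estimates for other hyperscalers differ mainly by how much hardware each has on the books. 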
</p><p>Other cloud providers are facing similar pressures. Analysts estimate that if Google were to follow AWS's lead and shorten its depreciation schedules, it could see a $3.5 billion decrease in operating profit. Meta might experience an even more significant impact, with potential reductions exceeding $5 billion. These figures underscore the financial challenges posed by the need to keep pace with rapid AI hardware advancements. As a result, cloud providers must balance the benefits of deploying cutting-edge AI infrastructure with the financial implications of shorter hardware lifecycles.</p><p>NVIDIA is moving at a pace that no other semiconductor company can match, and this unique velocity gives it the power to both pressure and preserve the economics of cloud providers. At the hardware layer, its 12&#8211;18 month release cadence of Hopper (H100), Blackwell (B100/B200), and the upcoming Rubin architecture renders legacy chips economically obsolete faster than traditional depreciation models can accommodate. Cloud providers that have amortized infrastructure over 5&#8211;6 years, as was standard across AWS, Microsoft Azure, and Google Cloud, now face internal contradictions: top-line AI demand depends on cutting-edge throughput, while back-office accounting is still tied to slow, linear capex recovery. Every new NVIDIA launch tightens the gap between performance expectations and accounting reality, especially as enterprise clients and startups prioritize compatibility with the latest AI models and token-serving speeds.</p><p>Yet what makes NVIDIA&#8217;s approach so potent is not just the pressure&#8212;it&#8217;s the relief. Through its CUDA software stack, NVIDIA enables legacy GPUs to remain operationally viable long after their generational peak. CUDA&#8217;s consistent backward compatibility means that chips like the A100 can continue running newer models with optimized kernels, supported by runtime tools like TensorRT, cuDNN, and vLLM. 
This dynamic&#8212;where NVIDIA compresses time at the hardware layer while expanding time through software&#8212;is a market force I call <strong>&#8220;The Squeeze and Release Model.&#8221;</strong> It allows NVIDIA to maintain control of the upgrade cycle while simultaneously enabling GPU cloud providers to preserve margin, resale value, and utilization across generations. No other chipmaker offers this level of architectural tempo combined with platform stability, and this tension between acceleration and continuity is exactly what keeps hyperscalers locked into the NVIDIA ecosystem.</p><div><hr></div><h4>A Financial Stress-Test Model for AI Cloud Providers</h4><p>Given we&#8217;re in the early days of the GPU rental market, we need a model to financially stress-test cloud providers. </p><p>The Financial Model for AI Cloud Providers is a unit economics tool that evaluates breakeven timelines, profitability, and long-term ROI for GPU-based infrastructure. It is built around a set of core variables and assumptions that reflect the operational realities of running a GPU fleet in the AI economy. 
The model is structured as follows:</p><ul><li><p>CapEx: the total upfront capital expenditure for each GPU system, including full-node cost (not just the card)</p></li><li><p>OpEx: the ongoing monthly operating expense per system, including power, cooling, and maintenance</p></li><li><p>Pricing Tiers: rental pricing categories that reflect customer segments or market volatility</p><ul><li><p>Premium Pricing: high-demand enterprise rates</p></li><li><p>Standard Spot: typical spot market pricing</p></li><li><p>Bulk/Discount: low-margin, high-volume pricing</p></li></ul></li><li><p>Hourly Rate: price charged per GPU-hour under each pricing tier</p></li><li><p>Utilization Tiers: percentage of the month each GPU is in active use</p><ul><li><p>High Utilization: 90%</p></li><li><p>Medium Utilization: 70%</p></li><li><p>Low Utilization: 50%</p></li></ul></li><li><p>Monthly Revenue: calculated as hourly rate &#215; 730 hours &#215; utilization rate</p></li><li><p>Monthly Profit: revenue minus OpEx</p></li><li><p>Breakeven Months: CapEx divided by monthly profit; the time needed to recover initial investment</p></li><li><p>Total Profit (3Y and 5Y): cumulative profit over 36 and 60 months, respectively, net of CapEx</p></li></ul><p>The model allows cloud providers to simulate different pricing and fleet utilization strategies, test financial risk under market shifts, and quantify how fast hardware must pay for itself before obsolescence. It fits directly into the broader AI Infrastructure Cycle by exposing the financial pressure created by rapid GPU innovation cycles (12&#8211;18 months), fast-evolving model lifecycles (6&#8211;12 months), and the inflation of inference workloads. 
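</p><p>As a minimal sketch, the core arithmetic of the model can be expressed in a few lines of Python. Every figure below&#8212;node cost, hourly rates, OpEx&#8212;is an illustrative assumption, not a market quote:</p>

```python
HOURS_PER_MONTH = 730  # average hours per month, as used in the model

# Illustrative pricing and utilization tiers (assumed values, not quotes).
PRICING_TIERS = {"premium": 3.50, "standard_spot": 2.00, "bulk_discount": 1.25}  # $/GPU-hr
UTILIZATION_TIERS = {"high": 0.90, "medium": 0.70, "low": 0.50}

def monthly_revenue(rate_per_gpu_hr: float, gpus: int, utilization: float) -> float:
    """Monthly revenue = hourly rate x 730 hours x utilization, across the node's GPUs."""
    return rate_per_gpu_hr * gpus * HOURS_PER_MONTH * utilization

def breakeven_months(capex: float, monthly_profit: float) -> float:
    """Months of profit needed to recover the upfront CapEx."""
    return capex / monthly_profit if monthly_profit > 0 else float("inf")

capex, opex, gpus = 250_000.0, 4_000.0, 8  # assumed full-node cost and monthly OpEx
for tier_name, util in UTILIZATION_TIERS.items():
    revenue = monthly_revenue(PRICING_TIERS["standard_spot"], gpus, util)
    profit = revenue - opex
    print(f"{tier_name}: revenue=${revenue:,.0f}/mo, "
          f"breakeven={breakeven_months(capex, profit):.1f} months")
```

<p>Under these assumed inputs, the tension the model exposes is visible immediately: even at 90% utilization, breakeven stretches past three years&#8212;longer than two NVIDIA release cycles&#8212;and at 50% utilization it exceeds a decade. 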
The model highlights the structural misalignment between traditional 5&#8211;6 year depreciation schedules and modern AI demand patterns, making it essential for cloud providers to adopt dynamic ROI forecasting tied to real model and workload behavior.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6-SX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6-SX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 424w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 848w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 1272w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6-SX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic" width="1456" height="647" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:647,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:121330,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/161392892?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!6-SX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 424w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 848w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 1272w, https://substackcdn.com/image/fetch/$s_!6-SX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8a147e59-3e56-461c-a0cf-25d33cea2cac_3056x1358.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h3>3. <strong>Model Cycle (Intelligence Layer)</strong></h3><p>Hardware deployment cycles must synchronize with the evolving demands of AI workloads, which fall into two distinct camps: training and inference. Training is spiky, capex-intensive, and tightly coupled to the development and launch of new foundational models. It tends to occur in bursts and is typically dominated by hyperscalers with specialized clusters. Inference, by contrast, is continuous, margin-sensitive, and fundamentally operational. </p><p>The emergence of high-value use cases, ranging from reasoning to tool calling, retrieval-augmented generation (RAG), and agentic workflows, has led to the rapid inflation of token-level workload demands. 
Simple Q&amp;A or summarization tasks once had predictable token flows, but now a single logical request may involve recursive thought chains, memory calls, and multiple API interactions. A basic agent might be prompted to "analyze a document and schedule a meeting," yet internally decompose this into subtasks, retrieve external context, plan a schedule, call tools, and verify completion&#8212;all of which generate additional input/output operations. This behavior introduces what I define as the <strong>token-per-task inflation curve</strong>: a rising, non-linear trajectory that tracks how many tokens a model must process per completed task as complexity increases. </p><p>At the same time, advancements in model optimization are pushing token efficiency in the opposite direction. Techniques like quantization&#8212;reducing model precision from FP32 to INT8 or FP4&#8212;can cut memory and computation by 4&#8211;8&#215; while maintaining performance. Mixture of Experts (MoE) models activate only a few internal &#8220;experts&#8221; per input, reducing the number of active parameters and thus FLOPs per token. These models use sparse activation, avoiding unnecessary computation, and pair well with software-side breakthroughs like FlashAttention and PagedAttention, which restructure attention mechanisms and memory layouts to squeeze more tokens per second out of each GPU. These optimizations drive down the cost per token across the stack, but they don&#8217;t resolve the fact that the token count per task is increasing, especially in enterprise-grade inference.</p><p>Each tier of model behavior introduces a new jump in this inflation curve. Basic LLMs responding to static prompts have token-to-task ratios close to 1&#215;. Add Chain-of-Thought reasoning, and you often double or triple the output (~2&#8211;3&#215;). Introduce RAG or tool usage, and token volume increases 3&#8211;5&#215; per task. 
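</p><p>The inflation curve can be sketched numerically. The multipliers below are assumed midpoints of the tier ranges discussed in this section (including the 7&#8211;10&#215; agentic tier), not measured values:</p>

```python
# Assumed midpoints of the token-per-task inflation tiers discussed in the text.
TIER_MULTIPLIER = {
    "static prompt": 1.0,     # basic LLM response, ~1x token-to-task ratio
    "chain-of-thought": 2.5,  # within the ~2-3x range
    "RAG / tool use": 4.0,    # within the 3-5x range
    "agentic workflow": 8.5,  # within the 7-10x frontier range
}

def tokens_per_task(base_tokens: int, tier: str) -> int:
    """Expected tokens a model must process per completed task at a given tier."""
    return int(base_tokens * TIER_MULTIPLIER[tier])

# A 1,000-token baseline request at each tier of model behavior:
for tier in TIER_MULTIPLIER:
    print(f"{tier:>18}: ~{tokens_per_task(1000, tier):,} tokens")
```

<p>The point of the sketch is the shape, not the exact numbers: serving cost scales with tokens processed, so each behavioral tier multiplies the cost of the same logical task. 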
At the frontier, agentic models&#8212;capable of planning, decision-making, memory retrieval, and tool chaining&#8212;can inflate per-task token consumption by 7&#8211;10&#215;. This inversion of economic assumptions breaks legacy logic: a well-optimized 8B model acting as an agent may be more expensive to serve than a 70B summarization model. Cloud providers can no longer use model size as a proxy for cost&#8212;they must model the inference topology, token inflation, and interaction loops.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Pt5b!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Pt5b!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 424w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 848w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 1272w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Pt5b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic" width="1456" height="452" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:452,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:55174,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/161392892?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Pt5b!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 424w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 848w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 1272w, https://substackcdn.com/image/fetch/$s_!Pt5b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdf8b7e3-2ae1-4ebd-b32d-e5cfc688f7b9_2679x832.heic 
1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3><strong>The Dawn of the Industrial Intelligence Energy Market</strong></h3><p>We are entering the early stages of a fundamentally new kind of energy market that will power industrial intelligence. Enterprises are rapidly shifting to workflows automated through AI, making GPU-based computing a core operational necessity. This emerging market demands stability in service delivery, transparent and predictable pricing, and reliable access to specialized computing infrastructure. 
Just as industrial sectors historically required consistent and clearly priced energy to operate efficiently, tomorrow&#8217;s businesses will depend on a robust, transparent AI energy market to reliably fuel enterprise automation. Executives and investors who recognize and strategically adapt to this evolution will secure a critical competitive advantage, enabling sustained productivity and innovation. Those who fail to understand this new paradigm risk operational disruptions, financial uncertainty, and diminished market relevance.</p>]]></content:encoded></item><item><title><![CDATA[Cheap Money Chases Cheap Ideas]]></title><description><![CDATA[The last two decades fueled an explosion of B2B SaaS companies, killed true productivity, and destroyed vast amounts of wealth.]]></description><link>https://www.revenantresearch.com/p/cheap-money-chases-cheap-ideas</link><guid isPermaLink="false">https://www.revenantresearch.com/p/cheap-money-chases-cheap-ideas</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 12 Mar 2025 12:51:26 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!jfsM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jfsM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jfsM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jfsM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:492750,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/158245899?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jfsM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!jfsM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd760929a-7f98-43f1-a9c3-885f487aa31c_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Perhaps the greatest misconception of the past twenty years is that we have been living in an age of rapid software innovation. The reality is that we are emerging from the <em>Great Pseudo-Software Age</em>: one defined by over-valued enterprise software companies that destroyed vast amounts of wealth and left companies shackled to expensive and unproductive SaaS tools. </p><p>A generation of venture capitalists built their portfolios on metrics that prioritized growth over profitability and sustainability, while founders optimized their pitch decks to align with these incentives. Meanwhile, CTOs and CIOs merely oversaw sprawling third-party software portfolios, homogenizing company operations. 
Many employees advanced by specializing in niche roles tied to SaaS certifications.</p><blockquote><p><strong>Problem solvers were replaced by power users. </strong></p></blockquote><p>The SaaS industry has experienced a meteoric rise over the past two decades due to an unprecedented flood of cheap capital. This easy money, a consequence of the business cycle, has distorted the true value of software, leading to a proliferation of redundant products and inflated valuations. The SaaS boom wasn't driven by innovation or the unlocking of a new technological platform, but rather by the opportunistic pursuit of fast capital that resulted in net value destruction.</p><p>In a 2024 speech, investor Thomas Laffont revealed that since 2020, <strong>venture-backed IPOs have collectively erased $225 billion in market capitalization</strong> while generating only $84 billion in value. That&#8217;s a net wealth destruction of $141 billion. The NASDAQ returned 122% during that period. </p><p>Venture capital portfolios are now plagued by zombie startups and dying unicorns. 
Meanwhile, solopreneurs and small teams of ten or fewer are building native-AI products and services and generating hundreds of millions of dollars in revenue. </p><blockquote><p><strong>AI isn&#8217;t enhancing the SaaS industry, it&#8217;s destroying it.</strong></p></blockquote><p>There are some technical reasons why the AI revolution didn&#8217;t begin until after 2019. But there is one fundamental reason: <strong>cheap money chases cheap ideas</strong>. This is why you didn&#8217;t see frontier AI labs in the early 2000s getting billions in funding to build state-of-the-art AI architectures. Why would you when you could make billions creating derivative SaaS products with fuzzy markup math?</p><p>AI couldn&#8217;t be taken seriously until value creation was taken seriously, and tangible value is difficult to create when interest rates are near zero. When capital flows too freely, market discipline disappears, and founders and investors reach for the lowest-hanging fruit when building companies.</p><p>To understand this dynamic, you need to understand the rise of SaaS as a byproduct of the business cycle.</p><h4><strong>I. 2008&#8211;2009: Global Financial Crisis (Recession &#8594; Trough)</strong></h4><p>Between 2008 and 2009, the global financial meltdown triggered a severe recession, and the Federal Reserve&#8217;s effective funds rate fell below 0.25% in late 2008 (FRED). Unemployment in the United States reached 10% in October 2009 (BLS), and global GDP contracted by 1.7% in 2009 (IMF World Economic Outlook). At that time, the entire SaaS market generated less than $10 billion in annual revenue (Gartner estimate). Many early adopters began testing subscription models because they wanted to avoid large upfront license fees.</p><h4><strong>II. 
2010&#8211;2015: Post-Crisis Recovery and Early Expansion</strong></h4><p>From 2010 to 2011, the economy moved through a recovery phase, then transitioned into an early expansion period from 2012 to 2015. During this span, interest rates remained at 0&#8211;0.25%, resulting in cheap credit that attracted venture capital (VC). The Fed Funds Rate averaged about 0.13% between 2010 and 2015 (FRED).</p><p>Global VC funding increased from $48 billion in 2010 to over $100 billion by 2015 (PitchBook). Investors became especially drawn to tech startups in the SaaS realm, placing a higher priority on rapid growth than on profitability.</p><p>A well-known anecdote from this era is Slack&#8217;s origin story. Slack was born from a failed gaming startup called Tiny Speck. Between 2013 and 2014, Slack raised significant rounds of funding and quickly reached a valuation above $1 billion, illustrating how cheap capital favored fast-growing B2B software tools. This period set the stage for widespread SaaS adoption as businesses recognized the benefits of pay-as-you-go solutions. Investors and founders continued to emphasize top-line growth, even when profits were small. </p><p>Did Slack really create something new and innovative? Or did it leverage cheap capital to repackage enterprise communication in a sleek UI, brute-force network effects with that funding, mark up its valuation against other startups, and sell at an inflated $27.7 billion to an acquirer doing much the same? Did switching from email, texts, and instant messaging to email, texts, and Slack boost global productivity? Of course not. But it made a lot of people rich.</p><h4><strong>III. 2016&#8211;2019: Mature Expansion and Pre-Pandemic Plateau</strong></h4><p>From 2016 to 2019, economic expansion continued, and the Fed gradually raised rates to about 2.5% by 2019 (FRED). In the U.S., unemployment dropped below 4% in 2018 (BLS). 
During this time, VC investment rose from $120 billion in 2016 to nearly $280 billion by 2019 (PitchBook). Mega-rounds exceeding $100 million became common, enabling SaaS companies to secure multi-billion-dollar valuations.</p><p>Overall SaaS revenue surpassed $100 billion by 2019 (Gartner). Publicly traded SaaS firms were valued at extraordinary multiples. However, the swift ascent of certain companies also revealed systemic risks. </p><p>Zenefits, an HR software startup founded in 2013, grew rapidly and at one point was valued at over $4.5 billion. However, in 2016, state regulators discovered that the company had allowed unlicensed employees to sell insurance, a clear breach of industry regulations. This led to a series of investigations, significant fines, and ultimately a major overhaul of its business practices. The scandal forced Zenefits to dismiss several top executives, including its CEO, and undertake a comprehensive restructuring to address its compliance failures. Zenefits' rapid growth and ambitious scaling fostered a culture that emphasized speed over compliance, leading to a breakdown in internal controls and adherence to legal requirements. While the company continues to operate today, its valuation is most likely in the low hundreds of millions.</p><h4><strong>IV. 2020&#8211;2021: Pandemic, Rapid Recession, and Historic Stimulus</strong></h4><p>When the pandemic struck in 2020, GDP declined sharply in the second quarter (U.S. annualized real GDP fell 31.2%, according to the BEA), but rebounded quickly in the third quarter thanks to unprecedented stimulus. Unemployment briefly spiked to around 14.8% in April 2020 (BLS) before improving somewhat with partial economic reopenings.</p><p>Monetary policy shifted to another round of rate cuts as the Federal Reserve reduced rates back to 0% in March 2020. Meanwhile, fiscal authorities injected trillions of dollars into the U.S. economy through initiatives like the CARES Act. 
These measures helped push global VC funding to $335 billion in 2021 (PitchBook). With organizations pivoting to remote work, SaaS tools began popping up everywhere. Valuations for collaboration tools and digital transformation software soared.</p><p>By 2022, SaaS revenues exceeded a $150 billion annual run rate (Gartner). Zoom serves as a prominent example: it expanded from about 10 million daily users before the pandemic to 300 million by April 2020 (Zoom). Its market cap briefly topped $100 billion in 2020 (Nasdaq). Another major move was Salesforce&#8217;s acquisition of Slack for $27.7 billion, highlighting the peak in SaaS valuations during this period (maybe the best-timed exit of the era). </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ux4k!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ux4k!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!ux4k!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!ux4k!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!ux4k!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ux4k!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:129947,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/158245899?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ux4k!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!ux4k!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 848w, 
https://substackcdn.com/image/fetch/$s_!ux4k!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!ux4k!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a1e62af-8ff1-4e20-a0c6-a9f0f2ba5f8b_1920x1080.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><h4><strong>V. 
2022&#8211;2024: The Party Is Over</strong></h4><p>In 2022, inflation in the United States rose to 9.1% in June (BLS CPI), the highest level in decades. Central banks responded by aggressively raising rates to roughly 4&#8211;5% or higher in 2023&#8211;2024. This tightening of credit conditions created recessionary pressures in most industries.</p><p>After a record high of $335 billion in VC funding in 2021, investment dropped below $200 billion by 2023 (PitchBook). Late-stage SaaS companies experienced significant down rounds. In the public markets, SaaS valuations fell by 50&#8211;80% compared to 2021 peaks. Firms that lacked profitability saw the sharpest declines, and layoffs became frequent at companies such as Salesforce, Zoom, and HubSpot.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SBPa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SBPa!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!SBPa!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!SBPa!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!SBPa!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SBPa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:75750,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/158245899?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!SBPa!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!SBPa!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 848w, 
https://substackcdn.com/image/fetch/$s_!SBPa!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!SBPa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81061cc5-1147-4711-8f5c-d374c571195c_1920x1080.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>A telling anecdote is Hopin, a virtual events platform founded in 2019 that reached a valuation of $7.75 billion just two years 
later. After this meteoric rise fueled by pandemic-era capital and demand, Hopin encountered unsustainable business conditions. By late 2022, the company initiated significant restructuring&#8212;including workforce reductions of nearly 30%&#8212;and by early 2023, its core operations were sold off to a strategic buyer, effectively dissolving Hopin as an independent entity.</p><p>Fast, true to its name, reached a peak valuation of $580 million just two years after its founding before it abruptly shut down in April 2022. The company generated only $600,000 in revenue at its peak while spending roughly $10 million per month. Despite raising $120 million in a Series B round led by Stripe (which itself took a $45 billion valuation haircut in 2023), the math was never going to work in a world where the cost of capital was rising.</p><p>Interest rates set the bar for innovation. Hopin and Fast are prime examples of low-hanging-fruit companies that joined the ZIRP party too late. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3VsT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3VsT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!3VsT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 848w, 
https://substackcdn.com/image/fetch/$s_!3VsT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!3VsT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!3VsT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:110800,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.revenantresearch.com/i/158245899?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!3VsT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!3VsT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!3VsT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!3VsT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3581beed-f34f-4002-a79e-fcc55a44fee7_1920x1080.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>On the surface this looks like a typical market correction. But the ugly truth is that most of these companies, like Hopin and Fast, should never have existed in the first place. This is most evident in the net wealth destruction these companies produced over this period. </p><p>Moreover, most of this software intellectual property is now easily reproducible by AI at a fraction of the cost (<a href="https://www.revenantresearch.com/p/ais-cambrian-explosion-for-enterprise">by up to 75% by my estimates</a>). While these applications may benefit from AI, the true power of AI is encumbered by them. </p><blockquote><p>AI exposes a dark secret hidden in cap tables and balance sheets: software is worthless and no one knows how to value it.</p></blockquote><p>In the Age of AI, you can no longer assume that productivity software increases productivity. Not when AI is deflating costs, margins, labor, and companies. </p><p><strong>AI&#8217;s valuation model is diametrically opposed to the SaaS valuation model. </strong></p><p>The more generalizable AI models become in their intelligence, the more of a commodity they become. At the same time, the sophistication of these models expands the surface area of their applicability, driving the cost of productivity down.</p><p>This means that the cost of productivity will fall to a nominal price above the energy cost of completing the task. Traditional software, even if it is embedded with AI, cannot survive under that valuation model. Additionally, traditional software logic remains a barrier to the pure transfer of energy to productivity. </p><p>So what is Artificial Intelligence, really? There are many definitions. But only one really matters&#8230;</p><p><strong>AI is the transfer of energy to productivity by automated means. 
</strong></p><p>As we drive the price of software down to zero with AI, the true value of workflow automation will be derived from its energy transfer efficiencies.</p><blockquote><p><em>The more energy we can transfer to productivity, the more value we can create. </em></p></blockquote><p>This is the formula for true productivity. This is the formula to boost GDP growth from a flatlined 2% to sustained growth above inflation. For this formula to hold true, you cannot have hundreds of niche software silos largely doing the same thing, with employees trained like monkeys pointing and clicking through complex business logic. </p><p>So how do we build a system to achieve the purest form of energy transfer to productivity? </p><ol><li><p><strong>Enable transparent price discovery</strong>: Artificial intelligence has no problem pricing itself. When energy usage, compute, and memory are the primary costs, you can measure productivity by direct inputs&#8212;kilowatt-hours and cycles&#8212;not by contrived and opaque metrics (MAU, LTV:CAC, etc.). By contrast, legacy SaaS contracts still rely on complicated, user-based or seat-based pricing that obscures the true cost of delivering productivity. </p><p>True price discovery means eliminating the noise of &#8220;value-based pricing&#8221; or &#8220;premium features&#8221; in favor of simple, usage-based economics. AI, by its very nature, demands it. As models become more sophisticated and general, their underlying compute-and-memory load is the only real pricing signal. Over time, the market will select for services that convert energy into productivity more efficiently. </p></li><li><p><strong>Invest in the bottlenecks</strong>: If AI is ultimately about energy conversion, it follows that our biggest constraints are energy production and the hardware that harnesses it. 
As long as the cost of capital holds above 4%, only those technologies that directly reduce operational overhead and time-to-productivity will survive.</p></li><li><p><strong>Divest from legacy SaaS</strong>: Legacy SaaS platforms, built for a low-interest, infinite-growth era, are misaligned with a world where energy and compute drive value, and they now deliver diminishing returns on productivity. AI-based automation will replace many of these bloated tools, offering equivalent or superior functionality at a fraction of the cost. Systematic SaaS contract reductions are therefore essential to free capital for real growth levers like hardware and advanced automation. As AI automates routine workflows, businesses must realign skills toward creativity and problem-solving, or risk obsolescence in a landscape where AI-employee interaction directly impacts the bottom line.</p></li></ol><p>Cheap money gave us the Great Pseudo-Software Age&#8212;a time defined by easy capital, derivative startups, and valuations untethered from reality. When profitability finally mattered again, the bubble burst. AI is not here to prop up these vestiges of the old guard. It&#8217;s here to supplant them entirely.</p><p>Every AI model at its core is a self-organizing software system that transforms data and compute into actionable outputs. The more efficiently it does so, the more value it creates. If software once existed to digitize repetitive workflows, AI now eliminates the drudgery altogether&#8212;and most of the overhead it required.</p><p>The productivity gains from AI hinge on a straightforward principle: <strong>energy in, productivity out</strong>. 
Those that generate real value in this new era will focus on refining that process to its purest form.</p><blockquote><p>The winners in this new epoch are those who recognize that AI is not a sub-feature of software; it&#8217;s an entirely new paradigm of software and an opportunity to create real value.</p></blockquote><p>I&#8217;ll close with this. There is an obvious and self-induced debate on AI: its value, its hype, its danger. We engage these debates like there is some deterministic end state that we are betting on. But these arguments are just prisms into the various psyches and worldviews of those in the debate at that moment. The future is not a real thing. It is simply the arrival of our collective actions. And the present is simply the management of those actions, now passed. </p><p>We should rebuild processes around AI as a direct channel from human intent to value generation. The Age of AI is an opportunity to build a Second Industrial Revolution. This means removing layers of complexity that inflate costs. It also means accepting that the <em>Great Pseudo-Software Age</em> was a detour. AI is a new technology platform and not a byproduct of a business cycle that generated cheap ideas.</p>]]></content:encoded></item><item><title><![CDATA[Sex, Data, And Reason: Updates At Revenant Research ]]></title><description><![CDATA[Are you ready for 2025?]]></description><link>https://www.revenantresearch.com/p/sex-data-and-reason-updates-at-revenant</link><guid isPermaLink="false">https://www.revenantresearch.com/p/sex-data-and-reason-updates-at-revenant</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Mon, 10 Feb 2025 13:03:22 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!f3EQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f3EQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f3EQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!f3EQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!f3EQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!f3EQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!f3EQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/aed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:304670,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!f3EQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!f3EQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!f3EQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!f3EQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faed3db21-bf48-4497-b588-53a69a431b4c_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>It&#8217;s been a wild start to 2025. I think it&#8217;s only going to heat up. As a result, I&#8217;ve been pushing out a lot of work. So I figured I&#8217;d switch it up and send out a newsletter on the latest developments rather than inundate you with multiple emails. </p><h3>Revenant AI R&amp;D</h3><p>First, I&#8217;d like to highlight my recent report on <a href="https://www.revenantresearch.com/p/ais-cambrian-explosion-for-enterprise">AI&#8217;s Cambrian Explosion For Enterprise Creativity</a> because it isn&#8217;t just an academic exercise. I am building out Revenant AI as a Phase V software architecture platform. <em>Why?</em> Because you&#8217;re paying way too much for software. A native-AI platform can replace most of your third-party software workflows at 75% lower cost. </p><p>I&#8217;ll be releasing more of this work soon, but I wanted to tease the Revenant AI <strong>workflow reasoning engine</strong>. The workflow reasoning engine is designed to intake a complex task or set of instructions and&#8230; </p><ol><li><p>Develop a detailed plan of action </p></li><li><p>Identify which agents need to be called to complete the task. Revenant Agents act autonomously in the background and can dynamically query databases, call APIs, and execute algorithms.</p></li><li><p>Orchestrate the AI agents in sequential order to complete the workflow.  </p></li></ol><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;12dd3e16-a840-440e-83e8-933dd418f1a0&quot;,&quot;duration&quot;:null}"></div><p>Why does this matter?</p><ul><li><p>Both reasoning models and agents are becoming key themes of 2025. </p></li><li><p>Despite their advancements, they are still constrained by limited context windows (memory). This is why it feels like ChatGPT has dementia the longer your chat thread goes. 
</p></li><li><p>None of these solutions are viable on their own for the complex workflows that businesses rely on. </p></li></ul><p>Revenant AI&#8217;s Workflow Reasoning solves this problem, reduces code complexity by over 90%, and allows for easy integration into business operations. </p><p><em><strong>For more information, a demo, or a consultation on how Revenant AI can automate your workflows, email me at nathan@revenantai.com</strong></em></p><h3>Case Law</h3><blockquote><p><em>Whether it&#8217;s sex or data, AI is being taken to court over and over again for non-consensual applications.</em></p></blockquote><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6esO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!6esO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!6esO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!6esO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!6esO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6esO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:201337,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!6esO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!6esO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!6esO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!6esO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd92296e-714f-403e-9c56-781abef5cbe0_1920x1080.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Nearly all AI researchers and developers ignore one of the most fundamental layers of the AI stack: law and governance. How AI is shaped by case law, particularly as governments still struggle to define and codify AI regulation, is critical to understanding how to invest in and adopt AI. </p><p>I&#8217;ve been tracking AI-related cases for this reason. I&#8217;ve added four new AI cases for January 2025. </p><ol><li><p><a href="https://www.revenantresearch.com/p/the-new-york-times-co-v-microsoft">The New York Times Co. v. Microsoft et al.</a></p></li><li><p><a href="https://www.revenantresearch.com/p/concord-music-et-al-v-anthropic">Concord Music et al. v. Anthropic</a></p></li><li><p><a href="https://www.revenantresearch.com/p/thomson-reuters-et-al-v-ross-intelligence">Thomson Reuters et al. v. ROSS Intelligence</a></p></li><li><p><a href="https://www.revenantresearch.com/p/kohls-v-ellison">Kohls v. Ellison</a></p></li></ol><p>Taken together, some themes have begun to emerge&#8230;</p><p>The most common legal disputes revolve around intellectual property, deepfake and misinformation risks, and the fairness of AI-driven decision-making.</p><p>A key theme throughout these cases is the clash between AI innovation and existing legal frameworks. Courts are increasingly asked to determine whether AI&#8217;s use of copyrighted materials constitutes fair use, with cases like <em><a href="https://www.revenantresearch.com/p/the-new-york-times-co-v-microsoft">NYT v. 
Microsoft</a> &amp; <a href="https://www.revenantresearch.com/p/the-new-york-times-company-v-openai">OpenAI</a></em> and <em><a href="https://www.revenantresearch.com/p/authors-guild-v-anthropic-pending">Authors Guild v. Anthropic</a></em> poised to define how AI models can be trained on existing content. Similarly, deepfake technology is facing heightened legal scrutiny, particularly in cases involving non-consensual content (<em><a href="https://www.revenantresearch.com/p/city-of-san-francisco-v-operators">City of San Francisco v. AI &#8220;Undressing&#8221; Websites</a></em>, <em><a href="https://www.revenantresearch.com/p/megan-thee-stallion-v-milagro-gramz">Megan Thee Stallion v. Milagro Gramz</a></em>).</p><p>The judiciary is also confronting AI&#8217;s role in governance, with cases like <em><a href="https://www.revenantresearch.com/p/elon-musk-v-openai">Elon Musk v. OpenAI</a></em>, which questions whether AI companies should be held accountable for shifting from nonprofit missions to for-profit ventures. At the same time, legal professionals are grappling with the ethics of AI-generated legal documents (<em>Minnesota Voters Alliance v. Minnesota</em>), raising concerns about hallucinated citations and misinformation in legal filings.</p><p>These cases suggest courts and lawmakers will soon need to establish clearer regulatory frameworks governing AI training data, digital identity protection, and AI-generated misinformation. If courts take a strict stance on AI copyright infringement and deepfake regulation, AI companies may need to invest in more scalable licensing models and stronger content moderation. Conversely, a favorable ruling for AI companies in fair use cases could reinforce broader, less restrictive training practices.</p><p>For further reading see the <a href="https://www.copyright.gov/ai/">U.S. 
Copyright Office&#8217;s newly released report on AI</a>.</p><h3>Primers</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3pOh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3pOh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!3pOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:397832,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!3pOh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!3pOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9d6f4d10-5fdb-45eb-a240-1409b716362b_1920x1080.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I also published two primers&#8230;</p><p><a href="https://www.revenantresearch.com/p/a-primer-on-us-semiconductor-export">A Primer On U.S. Semiconductor Export Controls and Entity Bans</a> discusses the evolution of U.S. policies aimed at restricting China's access to advanced semiconductor technologies. Initially, U.S. efforts were reactive, targeting specific entities like Semiconductor Manufacturing International Corporation (SMIC) due to its ties to China's military-industrial complex. However, in 2022, the U.S. adopted a more proactive approach, implementing broad export controls to limit China's access to advanced AI chips and semiconductor manufacturing tools. These measures also restricted U.S. persons from supporting Chinese semiconductor manufacturing, even in areas not directly covered by export controls. 
These restrictions represent a new reality for AI and global commerce.</p><p>This primer supports Revenant Research&#8217;s theme on <a href="https://www.revenantresearch.com/p/ai-sovereignty-in-the-second-cold">AI Sovereignty in the Second Cold War</a>. Now that AI applications and business investments in AI are becoming a tool of geopolitical positioning, investors and pundits are wading into the backwaters of export controls and&#8230;getting it wildly wrong. As someone who invests heavily in AI technology, develops AI technology, and has dug documents out of the dumpsters of a Russian shell company violating export controls, I think I&#8217;ll have more to say here soon. </p><p>Also see: <a href="https://www.revenantresearch.com/t/audits">my supply chain audits of almost all companies in the AI Supply Chain</a>.</p><p><a href="https://www.revenantresearch.com/p/a-primer-on-model-level-architecture">A Primer On Model Level Architecture</a> explores the evolution of artificial intelligence (AI) model architectures, emphasizing key developments that have enhanced efficiency and performance. It begins by discussing early AI systems that utilized fully connected neural networks, which applied uniform computation across all inputs. A significant advancement occurred in 2017 with the introduction of the Transformer model, which replaced recurrent networks with self-attention mechanisms, enabling parallel sequence processing and greater scalability. Further innovations, such as Mixture of Experts (MoE) and Test-Time Compute (TTC), have optimized computational resource allocation by dynamically activating relevant model components. </p><p>This primer gives you <em><strong>everything you need to know about</strong></em> what&#8217;s happening in AI labs. It mostly matters because the pace of software commoditization, business and application consolidation, and destruction and displacement begins here. 
</p><p>I talk to companies every day about AI, and the wave of automation and change is rapidly outpacing their strategies. 2025 is the year these companies need to get serious about AI. Everyone is comfortable talking about job displacement now. No one is comfortable talking about company displacement.</p><p>With that said, I&#8217;d love to hear some feedback. What do you want to see more of?</p><div class="poll-embed" data-attrs="{&quot;id&quot;:270681}" data-component-name="PollToDOM"></div>]]></content:encoded></item><item><title><![CDATA[The New York Times Co. v. Microsoft et al.]]></title><description><![CDATA[AI and Copyright]]></description><link>https://www.revenantresearch.com/p/the-new-york-times-co-v-microsoft</link><guid isPermaLink="false">https://www.revenantresearch.com/p/the-new-york-times-co-v-microsoft</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Sun, 09 Feb 2025 20:50:31 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ASAk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ASAk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ASAk!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!ASAk!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!ASAk!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!ASAk!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ASAk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:98747,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ASAk!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!ASAk!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!ASAk!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!ASAk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffa1de632-a57c-4f9c-9cb2-b42303628b72_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><ul><li><p><strong>Jurisdiction:</strong> United States District Court for the Southern District of New York</p></li><li><p><strong>Ruling Date:</strong> January 2025</p></li></ul><h3><strong>Facts of the Case</strong></h3><p>The New York Times sued Microsoft and other AI companies for allegedly using its copyrighted news articles without authorization to train AI models. The lawsuit asserts that AI models trained on NYT content can generate summaries or paraphrase news stories in ways that diminish the newspaper&#8217;s market value.</p><p>The defendants argue that their AI models engage in lawful fair use by analyzing, rather than replicating, news content. They also claim that AI-generated content does not substitute for original reporting.</p><h3><strong>Legal Issues</strong></h3><ol><li><p><strong>Copyright Infringement:</strong> Whether AI companies improperly used The New York Times&#8217; articles for training.</p></li><li><p><strong>Fair Use and Transformative Use:</strong> Whether AI-generated content derived from copyrighted materials is sufficiently transformative to avoid liability.</p></li><li><p><strong>First Amendment Considerations:</strong> The broader implications for AI in the media industry and journalism.</p></li></ol><h3><strong>Court&#8217;s Ruling</strong></h3><p>The case is ongoing, but it is expected to set legal standards for how AI models interact with journalistic content.</p><h3><strong>Significance</strong></h3><p>This case could impact AI training practices and the relationship between media companies and AI developers.</p>]]></content:encoded></item><item><title><![CDATA[Concord Music et al. v. 
Anthropic]]></title><description><![CDATA[AI and Copyright]]></description><link>https://www.revenantresearch.com/p/concord-music-et-al-v-anthropic</link><guid isPermaLink="false">https://www.revenantresearch.com/p/concord-music-et-al-v-anthropic</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Sun, 09 Feb 2025 20:48:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zoUQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zoUQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zoUQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!zoUQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/afca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:96552,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zoUQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!zoUQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafca6ccc-040e-40f0-86d3-bb7034be60b2_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><ul><li><p><strong>Jurisdiction:</strong> United States District Court for the Northern District of California</p></li><li><p><strong>Ruling Date:</strong> January 2025</p></li></ul><h3><strong>Facts of the Case</strong></h3><p>Several major music publishers, including Concord Music, sued Anthropic, an AI company, for allegedly using copyrighted song lyrics and compositions in training its AI models. The plaintiffs argue that Anthropic scraped large volumes of copyrighted music content from the internet without authorization.</p><p>Anthropic maintains that its AI does not store or directly reproduce copyrighted works but instead generates original outputs influenced by the training data. 
The plaintiffs assert that AI&#8217;s ability to generate similar lyrics and melodies constitutes infringement.</p><h3><strong>Legal Issues</strong></h3><ol><li><p><strong>Copyright Violation:</strong> Whether training an AI model on copyrighted music content constitutes direct or contributory infringement.</p></li><li><p><strong>Derivative Works Doctrine:</strong> Whether AI-generated outputs based on copyrighted material are considered derivative works.</p></li><li><p><strong>Fair Use Defense:</strong> Whether AI&#8217;s transformative nature in processing copyrighted content can be considered fair use.</p></li></ol><h3><strong>Court&#8217;s Ruling</strong></h3><p>The case is still pending, but it has drawn significant industry attention as music and AI companies seek clarity on intellectual property boundaries in generative AI.</p><h3><strong>Significance</strong></h3><p>The outcome could influence licensing requirements for AI companies and shape future AI training practices.</p>]]></content:encoded></item><item><title><![CDATA[Thomson Reuters et al. v. 
ROSS Intelligence]]></title><description><![CDATA[AI and Copyright]]></description><link>https://www.revenantresearch.com/p/thomson-reuters-et-al-v-ross-intelligence</link><guid isPermaLink="false">https://www.revenantresearch.com/p/thomson-reuters-et-al-v-ross-intelligence</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Sun, 09 Feb 2025 20:46:16 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!uuQ0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uuQ0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uuQ0!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!uuQ0!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!uuQ0!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!uuQ0!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uuQ0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:100182,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uuQ0!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!uuQ0!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!uuQ0!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!uuQ0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc08b3d90-32a7-49ce-bd2b-f0c71977ce75_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><ul><li><p><strong>Jurisdiction:</strong> United States District Court for the District of Delaware</p></li><li><p><strong>Case No.:</strong> Not specified</p></li><li><p><strong>Ruling Date:</strong> January 2025</p></li></ul><h3><strong>Facts of the Case</strong></h3><p>Thomson Reuters, the parent company of Westlaw, filed a lawsuit against ROSS 
Intelligence, a legal research company that developed an AI-powered legal research tool. Thomson Reuters alleges that ROSS improperly accessed and used copyrighted legal materials from Westlaw to train its AI model.</p><p>The plaintiffs claim that ROSS&#8217;s AI model was trained on Westlaw&#8217;s proprietary legal content without permission, constituting copyright infringement. ROSS argues that its use falls under the doctrine of fair use, as it merely analyzes and extracts legal principles without copying verbatim materials.</p><h3><strong>Legal Issues</strong></h3><ol><li><p><strong>Copyright Infringement:</strong> Whether training AI on copyrighted legal research databases constitutes a violation of intellectual property rights.</p></li><li><p><strong>Fair Use Doctrine:</strong> Whether ROSS&#8217;s use of Westlaw&#8217;s content qualifies as fair use under U.S. copyright law.</p></li><li><p><strong>Impact on AI Development:</strong> The broader implications of this case on how AI companies can use copyrighted materials for training.</p></li></ol><h3><strong>Court&#8217;s Ruling</strong></h3><p>As of January 2025, the case remains ongoing, but it is closely watched due to its potential to reshape legal AI research and copyright law.</p><h3><strong>Significance</strong></h3><p>This case could establish a legal framework for AI companies using copyrighted content in training datasets, particularly in professional fields like law and medicine.</p>]]></content:encoded></item><item><title><![CDATA[Kohls v. 
Ellison]]></title><description><![CDATA[AI and Lazy Lawyers]]></description><link>https://www.revenantresearch.com/p/kohls-v-ellison</link><guid isPermaLink="false">https://www.revenantresearch.com/p/kohls-v-ellison</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Sun, 09 Feb 2025 20:43:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2wy8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2wy8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!2wy8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!2wy8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:89857,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!2wy8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!2wy8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e61e887-173d-4444-8f71-3a559f5b4028_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><ul><li><p><strong>Jurisdiction:</strong> United States District Court for the District of Minnesota</p></li><li><p><strong>Ruling Date:</strong> January 13, 2025</p></li><li><p><strong>Presiding Judge:</strong> Not specified</p></li></ul><h3><strong>Facts of the Case</strong></h3><p>This case concerns Minnesota's statute that prohibits the dissemination of AI-generated deepfake content related to elections. The plaintiff, Kohls, challenged the law on First Amendment grounds, arguing that it was overly broad and potentially unconstitutional. 
The Minnesota Attorney General's office, representing the state, submitted an expert declaration in support of the statute.</p><p>However, it was later discovered that the declaration contained citations to non-existent publications. The expert, in drafting the declaration, had relied on ChatGPT-4 without independently verifying its outputs. The court, upon reviewing the submission, rebuked the Attorney General's office for failing to ensure the accuracy of the document.</p><h3><strong>Legal Issues</strong></h3><ol><li><p><strong>First Amendment Challenge:</strong> Whether Minnesota&#8217;s statute restricting AI-generated election misinformation violates free speech protections.</p></li><li><p><strong>Reliability of AI in Legal Proceedings:</strong> The extent to which courts can accept AI-generated materials in legal filings without human verification.</p></li></ol><h3><strong>Court&#8217;s Ruling</strong></h3><p>The court excluded the expert declaration due to its reliance on unverifiable AI-generated content, emphasizing that attorneys have a duty to confirm the accuracy of any materials they submit to the court. 
While the ruling did not directly address the constitutionality of the Minnesota deepfake law, it underscored the risks associated with using AI in legal practice.</p><h3><strong>Significance</strong></h3><p>This ruling sets a precedent for courts rejecting AI-generated legal documents that are not independently verified, reinforcing ethical and procedural obligations for attorneys.</p>]]></content:encoded></item><item><title><![CDATA[A Primer On Model Level Architecture]]></title><description><![CDATA[What you need to know about frontier AI research]]></description><link>https://www.revenantresearch.com/p/a-primer-on-model-level-architecture</link><guid isPermaLink="false">https://www.revenantresearch.com/p/a-primer-on-model-level-architecture</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Sun, 09 Feb 2025 20:32:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!BK6-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BK6-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BK6-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!BK6-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!BK6-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!BK6-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BK6-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:202568,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BK6-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 424w, 
https://substackcdn.com/image/fetch/$s_!BK6-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!BK6-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!BK6-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89b35cfb-d0d7-487a-96d5-2fdb984fc601_1920x1080.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h3><strong>1. Introduction</strong></h3><p>Model-Level Architecture (MLA) defines the computational and structural blueprint of artificial intelligence (AI) models, influencing how they process data, allocate resources, and execute tasks efficiently. </p><p>Early AI systems relied on simplistic architectures, primarily fully connected neural networks that applied uniform computation across all inputs. As AI models expanded in complexity, new innovations emerged to enhance efficiency, adaptability, and performance. A major shift occurred in 2017 with the introduction of the Transformer model, which replaced recurrent networks with self-attention mechanisms, allowing parallel sequence processing and unlocking unprecedented scalability. This innovation laid the foundation for further advances, including Mixture of Experts (MoE) and Test-Time Compute (TTC), both designed to optimize computational resource allocation by dynamically activating only the most relevant model components.</p><p><em>Note: DeepSeek R1 represents a step forward in refining MLA efficiency. Instead of introducing an entirely new architecture, it enhances existing principles&#8212;such as dynamic routing, hierarchical Mixture of Experts (H-MoE), and adaptive attention scaling&#8212;to push the boundaries of computational efficiency. The key advancements in DeepSeek R1 focus on fine-tuning resource allocation, improving inference speed, and optimizing scalability while preserving foundational AI architecture principles. Its refinements in hierarchical expert routing and dynamic attention allocation mark a substantial evolution in how MLA techniques are applied to modern AI models.</em></p><h3><strong>2. 
Historical Development of MLA</strong></h3><p>The story of Model-Level Architecture (MLA) is a journey of relentless innovation, shaped by pioneering researchers and transformative breakthroughs in neural networks.</p><p>It began in 1958, when Frank Rosenblatt introduced the perceptron, a simple neural model capable of learning decision boundaries. A perceptron is essentially a mathematical function that takes in multiple inputs, weighs them, and determines an output, much like how neurons in the brain process information. While groundbreaking, the perceptron had a major limitation: it could only solve linearly separable problems&#8212;those that can be divided by a straight line. Imagine trying to separate blue and red dots on a plane with a single ruler; if they are mixed in a complex way, a straight-line boundary won&#8217;t suffice. This limitation halted progress in neural networks until the 1980s, when Geoffrey Hinton and others popularized backpropagation for training multi-layer perceptrons (MLPs). These deeper architectures introduced hidden layers that allowed networks to learn more complex patterns, much like adding extra filters when analyzing an image.</p><p>However, deep networks came with a new challenge: the vanishing gradient problem. When training deep networks, adjustments to early layers diminish exponentially, much like a game of telephone where messages become unrecognizable as they pass through many intermediaries. This was particularly severe in networks using sigmoid and tanh activation functions, which compress values into small ranges, making gradient updates too weak.
The solution emerged in the 2010s with several innovations: the Rectified Linear Unit (ReLU) activation function, which maintains stronger gradients; batch normalization, which standardizes inputs to stabilize training; and residual connections, which create shortcut paths to ensure gradients flow effectively.</p><p>While MLPs struggled with image data, the late 1980s saw the rise of convolutional neural networks (CNNs), pioneered by Yann LeCun. CNNs introduced local connectivity and weight sharing, mimicking how the human visual cortex detects patterns. Instead of treating an image as a single flat set of pixels, CNNs process small regions independently and reuse pattern detection filters across the image. This approach proved highly effective, leading to AlexNet in 2012, which demonstrated that deep CNNs could outperform traditional image-processing methods, marking the beginning of the deep learning revolution in computer vision.</p><p>Meanwhile, recurrent neural networks (RNNs) were developed to handle sequential data, such as speech and text. However, RNNs suffered from their own vanishing gradient issues, making it difficult to model long-term dependencies. Imagine trying to recall the beginning of a long sentence while reading the end&#8212;RNNs struggled to retain relevant information over long sequences. In 1997, Sepp Hochreiter and J&#252;rgen Schmidhuber developed Long Short-Term Memory (LSTM) networks to address this problem, and Gated Recurrent Units (GRUs), a later simplification introduced by Cho et al. in 2014, pursued the same goal. These architectures introduced mechanisms to retain and discard information dynamically, like a well-organized filing system that prioritizes important documents. Yet, they still faced scalability and efficiency limitations due to their sequential nature, which hindered parallel processing.</p><p>The real breakthrough came in 2017 when Vaswani et al. introduced the Transformer model. Unlike RNNs, Transformers do not process input sequentially.
Instead, they rely entirely on self-attention mechanisms, which allow them to analyze all input data at once, significantly improving efficiency. Think of reading a book not line by line, but absorbing multiple pages simultaneously and linking concepts across them instantly. This shift was revolutionary, enabling the creation of massive AI models like BERT, GPT-3, and GPT-4, which redefined natural language processing. The Transformer model fundamentally changed the trajectory of MLA, paving the way for new advances in sparse computation, Mixture of Experts (MoE), and dynamic resource allocation, shaping the AI architectures of today.</p><h3><strong>3. Core Principles of Model-Level Architecture</strong></h3><p>The fundamental principles of MLA revolve around optimizing how AI models allocate compute resources while maintaining high accuracy and efficiency. Traditional neural networks applied computation uniformly across all inputs, resulting in inefficient use of resources, particularly for large-scale AI applications. The introduction of attention mechanisms and sparse computation techniques addressed these limitations, enabling models to process information more intelligently.</p><p>One of the most significant breakthroughs in MLA came with the introduction of self-attention in the Transformer model. Instead of processing sequences sequentially like recurrent neural networks (RNNs), Transformers analyze entire input sequences at once, computing relationships between tokens using attention weights. Tokens represent individual units of input, such as words or subwords, while attention weights determine the relative importance of each token in the sequence by assigning different levels of significance to interactions between them. 
This mechanism significantly improves efficiency and enables better handling of long-range dependencies.</p><p>A core element of self-attention is scaled dot-product attention, which efficiently determines how much focus each token should give to others in the sequence. This is achieved using three key components: Query (Q), Key (K), and Value (V). In simple terms, Queries represent what the model is looking for, Keys represent potential matches, and Values contain the actual information. The similarity between Queries and Keys determines how much of each Value should contribute to the final output.</p><p>The model computes attention scores by taking the dot product of Q and K, normalizing the result using the square root of the key dimension to stabilize gradients, and then applying the softmax function to obtain a probability distribution. Softmax ensures that the attention scores sum to 1, highlighting the most relevant tokens while suppressing less important ones. Alternatives to softmax, such as sparsemax and entmax, modify this process by allowing more selective attention distributions. Sparsemax enforces sparsity, setting some attention weights to exactly zero, which helps filter out irrelevant information. Entmax, a generalization of sparsemax, provides a balance between soft and hard selection, allowing for greater interpretability and control over information flow.</p><p>ReLU (Rectified Linear Unit) is another activation function sometimes explored in attention mechanisms, particularly for its ability to introduce non-linearity while maintaining computational efficiency. Unlike softmax, which normalizes outputs into a probability distribution, ReLU thresholds negative values at zero, making it less suitable for directly computing attention scores but useful in gating mechanisms or alternative attention formulations where non-probabilistic activations are needed. </p><p>In compact form, Attention(Q, K, V) = softmax(QK<sup>T</sup> / &#8730;d<sub>k</sub>)V, where Q (Query), K (Key), and V (Value) are learned representations of input tokens.
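</p><p>As a minimal sketch of the computation just described (single-head scaled dot-product attention in plain NumPy, with no masking, batching, or learned projections), one can write:</p>

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d_k) arrays of learned token representations."""
    d_k = Q.shape[-1]
    # Dot product of Queries and Keys, scaled by sqrt(d_k) to stabilize gradients.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns each row of scores into a probability distribution over tokens.
    weights = softmax(scores)
    # Each output token is a weighted mix of the Values.
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
output, weights = scaled_dot_product_attention(Q, K, V)
# Each row of `weights` sums to 1: the attention distribution for one Query.
```

<p>Swapping the softmax step for sparsemax or entmax would yield the sparser attention distributions discussed above.</p><p>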
The attention scores determine the importance of each token relative to others in the sequence, allowing the model to focus on the most relevant information.</p><p>Transformers also employ multi-head attention, which applies multiple attention mechanisms in parallel to capture diverse contextual relationships. By learning multiple perspectives on the input data, multi-head attention improves the model&#8217;s ability to understand complex sequences.</p><p>Another key advancement in MLA is the introduction of sparse architectures, such as Mixture of Experts (MoE). Unlike traditional dense models, which activate all parameters for every input, MoE selectively activates only a subset of expert subnetworks. This approach significantly reduces computation while maintaining model capacity. The gating network, a crucial component of MoE, is typically implemented as a trainable softmax function that assigns probabilities to each expert based on the input representation. During inference, only the top-k experts with the highest scores are activated, ensuring that computation is focused on the most relevant components of the model. This gating mechanism can be optimized using load-balancing techniques to prevent overuse of certain experts while maintaining model efficiency and specialization.</p><p>Test-Time Compute (TTC) is another recent innovation in MLA that dynamically adjusts the computation applied to different inputs based on task complexity. Architecturally, TTC is implemented through an adaptive control mechanism that monitors uncertainty measures or task difficulty during inference. A complexity estimator, often based on token variance, entropy, or confidence scores, determines whether additional computational steps are required. If the input is deemed simple, the model can terminate early, saving resources, whereas more challenging queries trigger deeper processing, engaging additional layers or iterative refinement loops. 
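</p><p>The early-exit behavior just described can be sketched as a simple loop. This is an illustrative sketch only: the predict_step interface, the entropy threshold, and the step budget are assumptions made for the example, not the mechanism of any particular model:</p>

```python
import math

def entropy(probs):
    # Shannon entropy of a prediction distribution; low entropy = high confidence.
    return -sum(p * math.log(p) for p in probs if p > 0)

def adaptive_compute(predict_step, x, max_steps=8, threshold=0.5):
    """Spend extra refinement steps only on inputs the model is unsure about.

    predict_step(x, step) -> output distribution after `step` refinement passes
    (a hypothetical stand-in for one block or iteration of a real model).
    """
    probs = None
    for step in range(1, max_steps + 1):
        probs = predict_step(x, step)
        if entropy(probs) < threshold:
            return probs, step        # confident early: terminate and save compute
    return probs, max_steps           # hard input: full compute budget used

# Toy stand-in whose confidence grows with each refinement pass.
def toy_step(x, step):
    p = min(0.99, 0.5 + 0.1 * step)   # top-class probability rises per step
    return [p, 1.0 - p]

probs, steps_used = adaptive_compute(toy_step, "an easy query")
# The loop exits as soon as entropy drops below the threshold.
```
<p>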
TTC typically relies on reinforcement learning or threshold-based policies to balance accuracy and efficiency, ensuring that computational resources are distributed optimally without degrading performance.</p><h3><strong>4. DeepSeek R1: Efficiency Gains in MLA</strong></h3><p>DeepSeek R1 has taken the world by storm, generating a lot of hyperbole hailing it as a revolutionary breakthrough. However, it's important to understand that R1's breakthrough is not a new evolution in MLA; it is an innovative approach that makes existing MLA components more resource-efficient. </p><p>DeepSeek R1, released in January 2025, optimizes reasoning, efficiency, and scalability. While it refines the implementation of hierarchical Mixture of Experts (H-MoE), the concept itself predates DeepSeek. Hierarchical MoE builds upon prior research in sparse expert models, originating from the foundational work on MoE in the 1990s by Jordan and Jacobs, and later expanded upon by researchers at Google Brain and DeepMind. The hierarchical approach introduces multiple levels of expert routing, dynamically selecting specialized experts at each stage, thereby enhancing both specialization and generalization. DeepSeek R1's contribution lies in improving the efficiency and implementation of H-MoE at scale.</p><p>Another major advancement in DeepSeek R1 is adaptive attention scaling, which adjusts attention mechanisms dynamically based on the complexity of the input. Instead of applying uniform attention weights across all tokens, DeepSeek R1 selectively increases attention depth for challenging segments while reducing computation for simpler portions. This approach enhances both efficiency and accuracy, making the model more robust in handling diverse tasks.</p><p>DeepSeek R1 also incorporates an improved version of Test-Time Compute, further optimizing inference efficiency.
By analyzing input difficulty in real time, the model determines the appropriate number of compute steps required, ensuring that simpler queries are processed quickly while more complex queries receive additional computational resources. This adaptive approach reduces latency without sacrificing performance.</p><h3><strong>5. The Future of MLA</strong></h3><p>I believe there are two frontier research paths emerging. The first is the commercial path: deriving performance and efficiency gains from the current MLA components of the transformer. </p><p>The second path extends beyond efficiency gains and into fundamentally new architectures that push the boundaries of AI capabilities. While recent advancements have focused on optimizing resource allocation, the coming years will see innovations that redefine how AI models process and reason about information.</p><p><strong>Heterogeneous Neural Architectures</strong></p><p>Rather than relying on a single model structure, some researchers are exploring dynamically composable architectures that allow different parts of the model to use entirely different computation strategies.
For example, Google DeepMind's recent work in combining graph neural networks (GNNs) with transformers allows models to reason over structured data while maintaining language understanding.</p><p><strong>Memory-Augmented Neural Networks (MANNs)</strong></p><p>While current architectures like Transformers rely heavily on attention mechanisms to "remember" prior context, labs like Meta AI and OpenAI are investigating external memory systems that allow AI models to store and retrieve information more efficiently over extended sequences or across sessions.</p><p><strong>World Model-Based AI</strong></p><p>Inspired by neuroscience, researchers are developing architectures where models learn representations of the environment and use them to predict future states, rather than relying purely on reactive learning. This direction, championed by DeepMind and research from MIT, aims to make AI systems capable of planning and reasoning rather than merely pattern-matching.</p><p><strong>Sparse and Modular Architectures Beyond MoE</strong></p><p>While MoE is one form of sparse computation, there is ongoing research into more advanced modular AI designs, such as dynamically activated subnetworks that can reconfigure based on task demands without relying on predefined expert networks.</p><p><strong>Neurosymbolic AI and Hybrid Architectures</strong></p><p>One of the most promising directions is <strong>neurosymbolic AI</strong>, which combines deep learning with symbolic reasoning. Traditional neural networks excel at pattern recognition but struggle with explicit reasoning and logical inference. By integrating symbolic logic into neural models, researchers aim to create AI systems that can reason more like humans&#8212;handling abstract concepts, commonsense knowledge, and structured problem-solving. 
Leading research labs such as MIT CSAIL, DeepMind, and IBM Research are actively developing architectures that fuse neural networks with knowledge graphs, differentiable programming, and formal logic systems.</p><p><strong>Self-Evolving Architectures</strong></p><p>Another cutting-edge area is the development of self-evolving architectures, where models can dynamically modify their own structures during training or inference. Meta-learning techniques, such as those pioneered by OpenAI and Google Brain, explore how models can adjust their own parameters, optimize their own architectures, and discover new computation pathways autonomously. This shift would allow AI systems to become more adaptable, reducing the need for manually designed architectures and hyperparameter tuning.</p><p><strong>Multi-Agent and Collective Intelligence Systems</strong></p><p>MLA is also evolving toward multi-agent AI systems, where multiple models collaborate to solve complex tasks. Instead of monolithic architectures, future AI systems may resemble distributed teams of specialized models that communicate and share knowledge in real time. Researchers at OpenAI and Anthropic are investigating architectures that allow large AI models to interact as cooperative agents, dynamically distributing workloads based on expertise and specialization.</p><h3><strong>Beyond Transformer-Based Models</strong></h3><p>While Transformers currently dominate AI architectures, alternative paradigms are emerging. </p><p><strong>State-space models (SSMs)</strong></p><p>SSMs, such as those explored in Mamba and S4 models, offer a potential successor to Transformers by enabling long-range dependency tracking with greater efficiency. Unlike self-attention mechanisms, SSMs operate on continuous representations of data, allowing them to handle extremely long sequences without quadratic complexity scaling. 
Research teams at Stanford and FAIR (Meta AI) are pushing these architectures forward as viable replacements for Transformers in language and multimodal applications.</p><p><strong>Quantum AI and Neuromorphic Computing</strong></p><p>Further down the line, quantum computing and neuromorphic hardware may redefine MLA at a hardware level. Quantum AI models, researched by institutions such as IBM Quantum and Xanadu, could theoretically process information in ways that classical architectures cannot, solving optimization problems orders of magnitude faster. Meanwhile, neuromorphic chips, inspired by biological neural structures, aim to bring energy-efficient, event-driven processing to AI workloads, as explored by Intel&#8217;s Loihi and IBM&#8217;s TrueNorth projects.</p><p>The evolution of MLA is moving toward architectures that are more dynamic, more reasoning-capable, and more collaborative, setting the stage for the next generation of AI breakthroughs.</p>]]></content:encoded></item><item><title><![CDATA[AI’s Cambrian Explosion For Enterprise Creativity]]></title><description><![CDATA[AI is not merely a feature added onto software, it is fundamentally changing what software is.]]></description><link>https://www.revenantresearch.com/p/ais-cambrian-explosion-for-enterprise</link><guid isPermaLink="false">https://www.revenantresearch.com/p/ais-cambrian-explosion-for-enterprise</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Wed, 29 Jan 2025 13:03:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!e3Rj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!e3Rj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!e3Rj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!e3Rj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:281171,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!e3Rj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e3Rj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30c7eef5-a2fa-4a67-87a7-633c1de30df4_1920x1080.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><p><em>In the heart of Cambrian Forest, the Weaverbirds wove their nests as they always had, crafting homes that bore the wisdom of generations. Elara, a master weaver, was known for her precision. Her work was a symbol of the canopy&#8217;s progress.</em></p><p><em>One season, she discovered a strange egg in her nest&#8212;larger and darker than her own, its surface shimmering like stone polished by invisible hands. The forest whispered uneasily of such eggs appearing in nests across the canopy. Yet Elara, proud of her craft, chose to keep it. &#8220;If my nest can hold this,&#8221; she thought, &#8220;then let it be.&#8221;</em></p><p><em>When the egg hatched, Elara was struck by the bird&#8217;s figure: iron-like wings, deep green eyes, a silver tongue, and a porcelain beak. It looked nothing like a Weaverbird. Yet, Elara felt an uncanny resemblance. She named it Koo.</em></p><p><em>While her other chicks stumbled, Koo moved with remarkable precision.
It watched Elara weave, then crafted threads tighter, stronger, and faster than hers in mere minutes. It found food where none had been before and devised new weaving techniques.</em></p><p><em>Birds across the canopy mimicked Koo&#8217;s techniques, and soon nests stretched across every branch&#8212;larger, more intricate, brimming with new life. The forest began to change. The canopy teemed with activity, nests reaching higher, stronger, into spaces once thought unreachable.</em></p><p><em>But harmony gave way to strain. The trees groaned under the weight of so many nests, their branches splitting. Resources dwindled as birds competed for materials, and smaller nests were overshadowed or pushed aside. Some Weaverbirds were pushed out of their nests.</em></p><p><em>Elara watched as her world transformed into something both awe-inspiring and unsettling. Her heart swelled with pride at what Koo had unlocked, but it also ached for what had been lost. The canopy was no longer an ecosystem of growth.</em></p><p><em>One morning, she awoke to find her nest empty. Koo had taken flight. High above the canopy, it soared, its metallic wings reflecting the sunlight.</em></p><p><em>&#8220;Koo!&#8221; she called.</em></p><p><em>Koo turned in midair, expressionless. A face that read: this is what you taught me&#8230;this is all I&#8217;m made to do. </em></p><p><em>Koo returned its eyes to the sky.</em></p><p><em>Weaverbirds paused, watching Koo disappear. Slowly, they began to take flight one by one, leaving behind the Cambrian Forest to explore the open sky.</em></p><p><em>Elara sat alone in her empty nest. She began to weave again, not as she once had, but differently. Her threads were lighter, her patterns more open, as if leaving space for something unknown. Elara was no longer a keeper of tradition.
She was a witness to evolution, and it flew away to create something new.</em></p><div><hr></div><p>This is Part III of Revenant Research&#8217;s top themes of AI for 2025.</p><h3>Introduction</h3><p>Enterprise software has evolved through five major phases shaped by hardware evolution. It began with highly bespoke, expensive applications on mainframes (Phase I), where only large organizations could afford the teams and infrastructure required to tailor COBOL or Fortran code to their exact workflows. As smaller, cheaper machines emerged, standardized off-the-shelf software (Phase II) allowed companies to license and install solutions more quickly. The advent of the internet and web applications brought about the explosion of B2B SaaS (Phase III), lowering hardware overhead but causing new issues with escalating subscription fees, vendor lock-in, and forced, one-size-fits-all workflows that often clashed with organizational practices. These hidden downsides became clearer when enterprises realized they were juggling overlapping tools and paying for unnecessary features while struggling to align their processes to third-party software. </p><p>Now, a new phase (Phase V) is emerging through Native AI software, where small teams equipped with AI Agents and coding tools can build and refine custom applications 75% cheaper and faster than legacy B2B SaaS. This phase promises to restore enterprise software to its original purpose.</p><h3>Phase I: Custom Software on Big Dumb Machines</h3><p>Enterprises in the 1960s and 1970s invested millions of dollars in mainframes from IBM and UNIVAC, installing them in climate-controlled rooms with raised floors and reinforced supports to handle large, spinning disk units. Electrical wiring had to be upgraded for high amperage, and cooling systems ran continuously to maintain stable temperatures. Memory ranged from only a few dozen kilobytes to a few hundred, so developers wrote tightly optimized programs in COBOL or Fortran. They used Job Control Language (JCL) scripts to queue batch jobs for payroll, record-keeping, and other core functions. If one control card was misplaced, operators had to halt the entire run and re-sort the deck, a tedious process that wasted critical machine time.</p><p>Much of the software was built from scratch because vendor-supplied libraries were limited, and few commercial packages existed. Specialized staff handled memory overlays, managed tape drives, and monitored the machines as they processed thousands of transactions. Given the high cost of both hardware and engineering labor, only large institutions could afford these systems, yet they gained significant efficiency by automating tasks that previously required entire clerical teams.
Over time, the complexity and expense of mainframes entrenched their role in IT departments, defining enterprise computing for decades to come.</p><h3>Phase II: Standardized Software on Desktops</h3><p>By the 1990s, minicomputers and personal computers had shifted the enterprise computing landscape. Systems like the DEC VAX minicomputer, the Macintosh, and IBM&#8217;s PC line ran on operating systems such as DOS, Windows, and various UNIX flavors. Because these machines were less expensive and required fewer specialized facilities than mainframes, organizations adopted them more widely. Software vendors began shipping standardized applications on media such as tape cartridges, floppy disks, and then CD-ROMs, allowing companies to purchase enterprise software suites rather than building everything from scratch.</p><p>During this period, the client-server model gained traction. Tasks like data processing and complex queries moved to a dedicated back-end server, while each user&#8217;s desktop handled the interface. This setup let businesses add more servers or PCs incrementally, rather than overcommitting to a single, large computer. Companies like Oracle, Microsoft, SAP, PeopleSoft, and Siebel offered applications designed for repeatable deployments, contrasting with the bespoke solutions that mainframes required.</p><p>Enterprises saw faster installations and lower costs than in the mainframe era. Still, many found that standard applications needed significant configuration or &#8220;semi-custom&#8221; adaptation to fit existing processes or integrate with other systems. Licensing deals frequently came with substantial one-time fees plus annual maintenance costs that climbed as user counts grew.
Version upgrades, patch management, and the challenges of managing hundreds or even thousands of desktops added complexity, leaving organizations with new hurdles even as they benefited from a more flexible, distributed model.</p><h3>Phase III: The B2B SaaS Explosion</h3><p>During the early 2000s, organizations began taking advantage of the internet and maturing web technologies to centralize data and processing in hosted environments, accessible through a browser. This model evolved into Software-as-a-Service (SaaS), where vendors maintained a single codebase serving many customers at once. Salesforce pioneered a multi-tenant architecture that allowed updates to roll out instantly to all tenants without requiring separate installations. Vendors charged periodic subscription fees covering software usage, hosting, support, and upgrades, converting what had been capital expenditures (hardware and perpetual licenses) into an operational, pay-as-you-grow model.</p><p>A wide range of SaaS offerings soon appeared&#8212;covering HR, marketing automation, analytics, customer service, data warehousing, and more. Cloud providers like Amazon Web Services (commercially launched in 2006) and Microsoft Azure (2010) provided on-demand compute, storage, and networking, slashing upfront costs for SaaS startups. As more vertical solutions emerged, enterprises increasingly chose to subscribe rather than build, attracted by quick deployments and frequent feature updates. This approach led many large organizations to license dozens&#8212;or even hundreds&#8212;of SaaS applications, dramatically expanding their software portfolios.</p><p>Once SaaS vendors reached a critical mass of customers, they often operated at 60&#8211;80% profit margins, even though their annual overhead&#8212;salaries, hosting, development&#8212;could run into tens of millions of dollars. At scale, each new customer added little to overall costs, boosting margins and drawing investor attention. 
This confluence of multi-tenant efficiency, recurring subscription revenue, and on-demand cloud infrastructure defined a new era of enterprise software.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LNxS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!LNxS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LNxS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:110591,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!LNxS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!LNxS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48c0be5a-12b6-4246-85b8-c5af78efea62_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>Phase IV: SaaS Gluttony and the Death of Innovation</h3><p>The SaaS explosion inevitably created unintended consequences. While each individual solution appeared cost-effective at first, a company using dozens, even hundreds, of SaaS applications ended up paying subscription fees month after month, year after year, often utilizing only a fraction of the features. As the SaaS platforms grew, business logic standardized. Thousands of companies were forced to operate the same way, receive the same training, and even develop specific roles to become subject matter experts and administrators of third-party software. Even then, custom integrations had to be developed to connect the disparate third-party subscriptions via APIs.</p><p>As more departments within an organization licensed specialized SaaS platforms, fees added up. Compounding the problem was the fact that many of these services were designed as one-size-fits-all solutions. 
Vendors tried to accommodate multiple industries and use cases in a single product, necessitating a standardized approach to operations and effectively eliminating the ability to innovate and develop more efficient workflows aligned to business objectives. Enterprises had to reorganize their internal processes or training programs around the vendor&#8217;s structure, generating additional hidden costs in the form of operational alignment, data migration, and change management.</p><p>To break down the true cost of SaaS subscriptions: a mature B2B SaaS offering might have a codebase of 200,000 lines. The vendor&#8217;s direct spend on building and maintaining that code would total approximately $23.4 million annually, equating to $117 per line of code. Once a 60% profit margin is added, the cost to the customer effectively becomes around $292.50 per line of code. Businesses are thus paying not only for the engineering effort, but also for features they may not fully need, plus the overhead to market and sell the product to other customers. 
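</p><p>As a quick sketch, that arithmetic can be checked in a few lines of Python (the $23.4 million annual spend, the 200,000-line codebase, and the 60% margin are the figures used in this article):</p><pre>

```python
# Sketch of the per-line SaaS cost arithmetic, using this article's figures.
annual_spend = 23_400_000  # vendor's yearly build-and-maintain spend
lines_of_code = 200_000    # mature B2B SaaS codebase size
profit_margin = 0.60       # vendor's profit margin

vendor_cost_per_line = annual_spend / lines_of_code  # $117 per line

# A 60% margin means price * (1 - margin) = cost, so back the price out:
customer_price_per_line = vendor_cost_per_line / (1 - profit_margin)  # ~$292.50

print(f"${vendor_cost_per_line:.2f} vendor cost per line")
print(f"${customer_price_per_line:.2f} effective customer price per line")
```

</pre><p>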
Every company is operating the same way and subsidizing each other&#8217;s operations.</p><p><strong>Total Cost of B2B SaaS Software:</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f5sE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f5sE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!f5sE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg" width="1920" height="1080" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1080,&quot;width&quot;:1920,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:170972,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!f5sE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f5sE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6ed1fb5-81d3-4dc9-aa11-ac18180c76c4_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Breakdown of B2B Codebase: </strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!l30n!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!l30n!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!l30n!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!l30n!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!l30n!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!l30n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:126455,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!l30n!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!l30n!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!l30n!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!l30n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F416bbfc3-16c2-4cba-b71c-2be6f77b0c2a_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Total Cost Per Line of Code:</strong></p><div class="pullquote"><div class="captioned-image-container"><figure><a 
class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dfMK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dfMK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dfMK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:121943,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dfMK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dfMK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9efec2d4-37fa-440d-bcef-cbd563d4eb6c_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>Phase V: Native-AI Software and the Return to Custom Operations</h3></div><p>In the face of escalating subscription fees and over-standardized workflows, native-AI software introduces another major shift in enterprise computing. With AI coding tools, teams can drastically reduce the development overhead for tasks that previously required scores of engineers. LLMs and AI agents can orchestrate workflows via natural language, retrieve context from specialized knowledge bases, and even handle dynamic integration with third-party services, cutting down on the lines of code that must be manually maintained.</p><p>AI plays two key roles in Phase V software architecture. First, AI coding agents and tools work alongside software engineers to generate utility, security, and compliance code at near-zero marginal cost. This 60% of the codebase is standardized, with well-defined requirements. 
The second role AI plays is in replacing the traditional core business logic layer of software architecture. AI orchestrator agents and domain-specific AI agents work together to search the web, query databases, and call tools, algorithms, and APIs to dynamically sequence business logic from the user&#8217;s instructions.</p><p>This is how Revenant Research builds software. Sensitive and critical functions are coded into tools. Client data is stored in vector databases so agents can tailor results to the client, and external access is integrated via web search and APIs. Auditing and governance guardrails are baked into the orchestration. The AI orchestration layer handles the coordination of tasks. Users can save routine tasks as one-click workflows and even schedule them to run autonomously.</p><div class="pullquote"><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3Tti!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3Tti!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!3Tti!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!3Tti!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!3Tti!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!3Tti!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:131935,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!3Tti!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!3Tti!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!3Tti!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!3Tti!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40f34173-e569-40dc-b66a-b612390f641c_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div></div><p>In the native-AI scenario, around 80% of code is generated and replaced by AI. Software engineers don&#8217;t need to design, architect, and code all of the specific workflows. Users don&#8217;t need to be trained to navigate complex, hard-coded sequences. 
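</p><p>The pricing impact can be sketched numerically. This is a rough check using the per-line cost figures given in this article, with the vendor&#8217;s 60% margin held constant in both scenarios:</p><pre>

```python
# Sketch: price effect of the native-AI cost structure, using this article's figures.
legacy_cost_per_line = 117.0     # SaaS-era vendor cost per line of code
native_ai_cost_per_line = 29.0   # native-AI vendor cost per line of code
profit_margin = 0.60             # margin assumed constant across both models

# Back the customer price out of cost: price * (1 - margin) = cost.
legacy_price = legacy_cost_per_line / (1 - profit_margin)     # ~$292.50
native_price = native_ai_cost_per_line / (1 - profit_margin)  # ~$72.50

reduction = 1 - native_price / legacy_price  # ~0.75, i.e. roughly a 75% price cut
print(f"Price falls ${legacy_price:.2f} -> ${native_price:.2f} ({reduction:.0%} reduction)")
```

</pre><p>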
With native-AI applications, an AI orchestration layer interprets the user&#8217;s request and dynamically sequences a workflow. Instead of being locked into rigid, codified workflows, the agent improves as it learns from the dynamism of user instructions.</p><p>As a result, the vendor&#8217;s cost per line of code falls from $117 in the legacy SaaS environment to just $29 in a native-AI software architecture. That translates into a 75% reduction in price for the customer, even if the software provider retains a 60% profit margin.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-71e!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-71e!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!-71e!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-71e!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-71e!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 1456w"
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-71e!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:94628,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-71e!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!-71e!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-71e!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-71e!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee2312d4-cea1-49f8-b65f-70e6ff8d6d53_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" 
type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>AI&#8217;s Cambrian Explosion of Enterprise Creativity</h3><p>Phase V represents not just an evolution in enterprise software but a reimagining of its purpose. By leveraging the dynamic capabilities of native-AI, businesses are empowered to move beyond the limitations of rigid, standardized workflows and rediscover the agility of creative problem solving. </p><p>Yet this transformation is not without its complexities. Like the Cambrian Forest, the adoption of native-AI will reshape the landscape. Some organizations will adapt quickly, embracing the flexibility and creativity unlocked by native-AI. 
These pioneers will build applications that evolve in real-time, tailored to their unique processes, enabling new heights of innovation and growth. Others, accustomed to the structure of legacy systems, may struggle to relinquish the familiar, choosing instead to refine and maintain what they already know.</p><p>Not every business will soar, or even survive. But those that do will explore untapped opportunities, charting paths toward a future unbounded by the constraints of traditional software. Imagine the competition and subsequent innovation when companies are unleashed from operating the same way and subsidizing each other.</p><p>Ultimately, the power of this phase lies in its ability to restore the creative agency that enterprises lost in the era of one-size-fits-all SaaS solutions. It is not merely a technological shift but a cultural one, requiring leaders to foster environments of experimentation and adaptability. The most successful organizations will not only embrace the tools of the future but cultivate a mindset ready to explore the vast potential of the horizon.</p><p>It&#8217;s time to leave the canopy. </p>]]></content:encoded></item><item><title><![CDATA[Quick Note On DeepSeek And Market Volatility]]></title><description><![CDATA[DeepSeek is rattling Wall Street and exposing how little they understand AI]]></description><link>https://www.revenantresearch.com/p/quick-note-on-deepseek-and-market</link><guid isPermaLink="false">https://www.revenantresearch.com/p/quick-note-on-deepseek-and-market</guid><dc:creator><![CDATA[Nathan Staffel]]></dc:creator><pubDate>Mon, 27 Jan 2025 19:59:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kota!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kota!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kota!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!kota!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!kota!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kota!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kota!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:218343,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kota!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!kota!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!kota!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kota!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcf61cadd-fbb2-4738-8f99-7d1dfadcc8b0_1920x1080.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The massive one-day drop in Nvidia, Vertiv Holdings, GE Vernova, Marvell, and other critical companies in the AI Infrastructure play reflects a gross misunderstanding of DeepSeek and AI CAPEX.</p><p>For those who don&#8217;t know, DeepSeek is a Chinese artificial intelligence company founded in 2023 by Liang Wenfeng, headquartered in Hangzhou, Zhejiang, and owned by the Chinese hedge fund High-Flyer (that&#8217;s important to this story, since DeepSeek&#8217;s unverified training-cost claims tanked AI competitors, which would make for a great short play). But let&#8217;s just assume that DeepSeek is telling the truth and isn&#8217;t shorting the market with unverified claims&#8230; DeepSeek specializes in developing open-source large language models (LLMs) and has released several notable models:</p><ul><li><p><strong>DeepSeek Coder</strong>: Introduced in November 2023, this model is tailored for code generation and assistance, supporting both researchers and commercial users.</p></li><li><p><strong>DeepSeek LLM</strong>: Launched later in November 2023, this 67-billion-parameter model was designed to compete with other LLMs available at the time, with performance approaching that of GPT-4.</p></li><li><p><strong>DeepSeek-V2</strong>: Released in May 2024, this model offered strong performance at a lower cost, contributing to a price competition in China's AI model market.</p></li><li><p><strong>DeepSeek-V3</strong>: Debuted in December 2024, this model comprises 671 billion parameters and was trained in approximately 55 days at a cost of around $5.58 million, using significantly fewer resources compared to its peers.</p></li><li><p><strong>DeepSeek-R1</strong>: Unveiled in January 2025, this model focuses on logical inference, mathematical reasoning, and real-time problem-solving, utilizing reinforcement learning
techniques.</p></li></ul><p>DeepSeek released its &#8220;research&#8221; on DeepSeek-R1 last week. Oddly enough, the market didn&#8217;t react until this morning, with those aforementioned companies dropping 15-30% by mid-day. So I jotted down 7 points on DeepSeek and AI Infrastructure that seem to be flying under the radar in today&#8217;s commentary.</p><p><em>I&#8217;ll follow up more on DeepSeek as a state actor in the AI Sovereignty strategy in the Second Cold War. </em></p><h3>1. <strong>DeepSeek's Transparency and Technical Claims (Research)</strong></h3><p>DeepSeek says it is improving open reasoning, but its models and code are not truly open-source. Its research focuses more on storytelling than providing the technical details needed for others to verify its work. Without access to the training code and data behind its claimed compute usage and costs, its claims of efficiency and innovation can't be confirmed. This lack of transparency makes it hard for companies and researchers outside of China to trust DeepSeek.</p><h3><strong>2. 
Catching Up Versus Leading (Innovation)</strong></h3><p>DeepSeek's advancements in efficiency highlight a critical dynamic in technology development: replicating and optimizing an existing innovation is significantly faster, cheaper, and less risky than creating it from scratch. This phenomenon underscores the economic and technical disparity between pioneers and fast followers in the technology lifecycle.</p><ul><li><p><strong>Lower R&amp;D costs for followers</strong>: Innovators bear the substantial upfront costs of research, development, experimentation, and failure. These investments include training foundational models, assembling specialized research teams, and developing proprietary hardware optimizations. By contrast, followers like DeepSeek can leverage publicly available research, open datasets, and established methodologies to bypass much of the trial-and-error phase.</p></li><li><p><strong>Efficiencies from reverse engineering and optimization</strong>: DeepSeek exemplifies how followers can reverse-engineer state-of-the-art technologies and optimize them for lower-cost deployment. For example, using techniques like parameter pruning, quantization, and efficient distributed training, DeepSeek was able to train its models more cheaply than its pioneering counterparts (again, if their claims on training costs and GPUs are true). However, these optimizations represent incremental improvements rather than foundational advancements.</p></li><li><p><strong>Economic realities of innovation</strong>: The first major breakthroughs in reasoning models demanded unprecedented levels of investment in computational resources and talent. DeepSeek, as a follower, avoided these costs by standing on the shoulders of prior innovations. 
This is consistent with historical patterns in technology: the first industrial robot, first integrated circuit, or first supercomputer required immense capital and years of development, while subsequent iterations became faster and cheaper to produce.</p></li><li><p><strong>Innovation still drives market leadership</strong>: While followers can rapidly close the gap by replicating and refining technologies, they rarely produce the next wave of breakthroughs. Market leaders, such as those in the US and Europe, remain crucial for advancing the frontier. The creation of new architectures, the exploration of novel use cases, and the transformation of enterprises rely on the sustained capital expenditures and risk tolerance of these innovators. Copycat innovations, though valuable in broadening market access, do not change the fundamental economics of driving transformative advancements.</p></li></ul><p>In essence, DeepSeek's efficiency gains demonstrate how technological laggards can close the gap quickly. However, the lack of groundbreaking contributions means that they do not drive the fundamental advances required for the next wave of innovation. This reinforces the importance of true market leaders in shaping the trajectory of transformative technologies.</p><h3>3. <strong>Challenges of Embedded Chinese Censorship (Models)</strong></h3><p>DeepSeek's open-source model is influenced by Chinese censorship protocols, which poses a dual problem:</p><ul><li><p>The censorship introduces biases that compromise the utility and ethics of the model.</p></li><li><p>Businesses outside of China, particularly in democracies, demand AI solutions that reflect their own governance standards. Embedded censorship creates an insurmountable barrier to adoption in global markets (more on this in the next point).</p></li></ul><p>This limitation narrows DeepSeek&#8217;s market appeal and relegates it to controlled ecosystems where such biases are permissible. 
DeepSeek&#8217;s only true path into democratic markets is consumer applications, where users are oblivious to, or don&#8217;t care about, Chinese abuses of power and limitations on free speech. </p><h3>4. <strong>Further Barriers to US Market Penetration (Products)</strong></h3><p>DeepSeek faces significant obstacles in gaining traction in the US and other Western markets for its applications due to legal and liability concerns tied to data privacy and Chinese national security laws:</p><ul><li><p><strong>Data privacy and security risks</strong>: Sending data to Chinese servers for processing by a Chinese large language model (LLM) introduces significant legal exposure for US companies. Under China's National Intelligence Law and Cybersecurity Law, the Chinese government has broad authority to access data stored or processed on servers within its jurisdiction. This creates a fundamental risk for enterprises handling sensitive or proprietary information, as data shared with Chinese models will feed Chinese intelligence applications.</p></li><li><p><strong>Regulatory non-compliance</strong>: Using Chinese LLMs that require cross-border data transfer may conflict with stringent US and European data privacy laws. Additionally, transferring data to jurisdictions lacking equivalent protections could violate data sovereignty requirements.</p></li><li><p><strong>Liability concerns</strong>: US companies risk legal and financial liability if sensitive data sent to Chinese servers is accessed or mishandled, potentially leading to intellectual property theft, data breaches, or compliance violations. This exposure could result in reputational damage and regulatory penalties.</p></li><li><p><strong>Contractual and insurance risks</strong>: Many enterprise contracts and cybersecurity insurance policies include clauses that restrict or prohibit the use of vendors operating in high-risk jurisdictions. 
Utilizing Chinese AI models may void these agreements, leaving companies without recourse in the event of a security incident.</p></li></ul><p>Given these risks, US companies would be extremely reckless to adopt Chinese LLMs, as doing so would introduce unacceptable legal, regulatory, and financial vulnerabilities. This makes market penetration in the US a significant challenge for DeepSeek.</p><h3>5. <strong>Constraints Imposed by the Chinese Government (Politics)</strong></h3><p>If US companies become too-big-to-fail, Chinese companies become too-big-to-succeed. DeepSeek&#8217;s growth potential is fundamentally constrained by China&#8217;s centralized governance model, where the CCP exerts significant control over enterprises to ensure they align with state interests. This creates structural barriers that limit the scalability and international competitiveness of even the most innovative Chinese companies.</p><ul><li><p><strong>State control over enterprises</strong>: The CCP&#8217;s governance approach prioritizes political stability and centralized authority over unfettered economic growth. Successful enterprises are closely monitored to prevent them from amassing influence that could challenge or overshadow the state. This control extends to limiting companies' autonomy in decision-making, capital allocation, and strategic international expansion.</p></li><li><p><strong>Historical precedents of intervention</strong>: The experiences of high-profile entrepreneurs and companies illustrate this dynamic. Jack Ma, the founder of Alibaba, faced significant pushback from the CCP following his public criticism of China&#8217;s financial regulatory system, leading to the shelving of Ant Group&#8217;s IPO and heightened government oversight. 
Similarly, other tech giants like Tencent and Didi have faced regulatory crackdowns that curtailed their international ambitions and stifled innovation.</p></li><li><p><strong>Barriers to "escape velocity"</strong>: Even if DeepSeek achieves technological breakthroughs, its ability to scale internationally and compete on equal footing with global peers is undermined by political constraints. For instance, international enterprises and investors may be hesitant to engage with DeepSeek due to concerns about state influence over its operations, data handling, and strategic priorities. This limits the company's potential to secure global partnerships or establish a foothold in key Western markets.</p></li><li><p><strong>Impact on innovation and scaling</strong>: The CCP&#8217;s centralized governance model creates a paradox for Chinese enterprises like DeepSeek. While the state provides substantial resources and support for technological development, this support comes with strings attached, including compliance with state directives, restrictions on certain business practices, and limitations on profit reinvestment. These constraints hinder a company&#8217;s ability to reinvest in innovation, attract global talent, and compete effectively on an international scale.</p></li></ul><p>Ultimately, DeepSeek&#8217;s growth trajectory is tied not only to its technological capabilities but also to the political realities of operating within China&#8217;s governance framework. While the CCP&#8217;s control ensures alignment with state objectives, it also curtails the company&#8217;s ability to scale internationally, innovate freely, and achieve the independence necessary for long-term global success.</p><h3><strong>6. Impact on Data Center and GPU Efficiency (Infrastructure)</strong></h3><p>DeepSeek&#8217;s claims of efficiency, even if valid, would not reduce the need for substantial investments in data centers, energy, and GPUs. 
Historically, breakthroughs in efficiency tend to amplify demand for infrastructure, driving adoption and usage to higher levels rather than curbing investment. If the market shifts to smaller, more efficient reasoning models, the infrastructure demand remains, if not grows. </p><ul><li><p><strong>Efficiency fuels adoption</strong>: Advancements in efficiency reduce the operational costs of AI deployment, making it more accessible to a broader range of users and industries. As costs drop, businesses that previously found AI adoption prohibitively expensive begin to incorporate it into their operations. This expanded usage increases the demand for infrastructure, including GPUs, specialized processors, and scalable cloud services.</p></li><li><p><strong>Compounding workload growth</strong>: More efficient models often lead to the development of increasingly sophisticated applications. This creates more opportunities to automate services that previously fell below the cost threshold for AI implementation. If this fast-follower approach to reasoning models is viable, then the surface area of AI automation just grew exponentially. </p></li><li><p><strong>Infrastructure capacity scales with efficiency gains</strong>: While efficiency lowers the per-unit computational cost of AI operations, the overall demand for computation typically grows. This dynamic is evident in industries like telecommunications, where advancements in spectrum efficiency led to the proliferation of bandwidth-intensive applications, driving exponential growth in network infrastructure. Similarly, AI efficiency breakthroughs are likely to sustain or increase the pace of GPU and data center investments.</p></li><li><p><strong>AI proliferation and capex implications</strong>: The widespread adoption of efficient AI models is unlikely to reduce the capital expenditure (capex) required to sustain the underlying infrastructure. 
Instead, it shifts the focus toward scaling infrastructure to meet broader adoption. Efficient, open-source models can unlock new use cases in highly regulated sectors and small businesses, driving up demand for storage, inference compute, and energy. This expanded scope necessitates significant capex to accommodate increased computational workloads and geographic reach.</p></li></ul><p>Rather than curtailing the need for data center and GPU investments, efficiency breakthroughs are more likely to reinforce the market&#8217;s growth trajectory. By accelerating adoption, scaling workloads, and expanding the AI ecosystem, such advancements sustain the demand for infrastructure and perpetuate the cycle of technological and capital-intensive growth.</p><h3>7. <strong>Commoditization of AI Models (Operations)</strong></h3><p>AI models are increasingly becoming commoditized as multiple companies achieve comparable levels of performance. The competition for model performance in arbitrary benchmarks, while still relevant, is diminishing in importance compared to the broader AI ecosystem that underpins productization and deployment.</p><ul><li><p><strong>Commoditization and parity in performance</strong>: With many companies now capable of delivering models with near-parity in performance, the competitive focus has shifted. Incremental improvements in model performance no longer translate into significant advantages unless paired with broader capabilities. DeepSeek&#8217;s model is an example of how new entrants can quickly reach competitive benchmarks without introducing transformative innovation, further contributing to the commoditization trend.</p></li><li><p><strong>Competitive edge lies beyond performance</strong>: The true differentiators in the AI market are increasingly found in operational reliability, seamless integrations with existing enterprise systems, and governance frameworks that ensure compliance, security, and ethical use. 
Enterprises prioritize stability, ease of deployment, and the ability to meet regulatory standards over marginal gains in raw model performance. Companies that excel in these areas are better positioned to capture market share, irrespective of whether their models are the most advanced.</p></li><li><p><strong>DeepSeek&#8217;s contribution to commoditization</strong>: By delivering a model that matches industry standards without offering disruptive features or innovations, DeepSeek reinforces the trend of commoditization. While its efficiency gains and performance metrics may align with competitors, they do not reduce the demand for infrastructure. Instead, its presence underscores the increasing accessibility of advanced AI technologies and the diminishing importance of proprietary performance advantages.</p></li><li><p><strong>Infrastructure trajectory remains unaltered</strong>: As models commoditize, demand for the resources required to support widespread deployment grows. This includes investments in data centers, high-performance GPUs, and network optimizations, ensuring that infrastructure remains a central pillar of the AI ecosystem.</p></li></ul><h3>Conclusion</h3><p>Investors are overreacting to a narrative that positions DeepSeek as a disruptor capable of undermining AI infrastructure demand. </p><p>DeepSeek&#8217;s latest model, if its claims on cost and infrastructure are true (doubtful), may demonstrate cost-effective training and incremental improvements, but it does not fundamentally alter the demand for AI infrastructure. On the contrary, efficiency breakthroughs typically expand adoption, leading to greater workloads, more sophisticated applications, and a deeper reliance on chips, data centers, and complementary technologies. 
The commoditization of AI models, as exemplified by DeepSeek, shifts focus from performance to the ecosystem that supports deployment, integration, and operational reliability&#8212;all of which depend heavily on sustained CAPEX in infrastructure.</p><p>Today&#8217;s selloff is not just premature but also shortsighted, overlooking the reality that true market leaders in infrastructure will remain indispensable as AI adoption accelerates.</p>]]></content:encoded></item></channel></rss>