information is now dirt cheap with ai — the scarce resource is logic and reasoning
i was writing a research paper recently and something clicked that i hadn’t fully processed before. i was using ai heavily through the whole thing — giving it context, having it generate sections, iterating on those sections with custom logic chains and framings i wanted for specific parts. standard workflow at this point. but then i started thinking about the person on the other end. someone who gets this paper and wants to understand it. they could use ai to target specific parts at different levels — a high-level overview, a mid-level dive into methodology, a deep dive into a couple of the results. the paper just needs to have the in-depth material there to reference. ai handles the rest.
and that’s when i realized the shift is bigger than just “ai helps you write faster.” what’s actually happening is a hyper-modularization of information across levels of complexity, because ai can switch between those levels and translate across them so easily. it changes how you’d even think about crafting a document — you could almost throw out formalities and conventions and lean fully into whatever is most functionally useful for downstream processing.
this sounds like a writing trick but it’s actually pointing at something structural.
the old model of information exchange looked like this: you create a document at one resolution. a research paper is dense. a policy brief is compressed. a presentation is skeletal. if you want five versions for five audiences, you do five times the work. each version is a fixed-resolution artifact — one level of detail, one assumed audience, one structure.
the new model: you create a dense information substrate — the maximally detailed, structured version — and ai generates arbitrary projections of that substrate at whatever resolution a given audience needs. you write the substrate once. the projections are generated on demand and cost almost nothing.
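to make the substrate-and-projection idea concrete, here’s a toy sketch. everything in it is made up for illustration — in practice the projection step would be an ai call that rewrites content for an audience, not a simple filter — but the shape is the same: one maximally detailed artifact, many cheap views of it.

```python
# toy sketch: a "substrate" is the maximally detailed document,
# stored as sections tagged with a detail level. a "projection"
# filters the substrate down to whatever resolution a reader wants.
# (all names here are hypothetical.)

from dataclasses import dataclass

@dataclass
class Section:
    title: str
    level: int   # 1 = high-level overview, 3 = deep detail
    body: str

SUBSTRATE = [
    Section("abstract", 1, "what the paper shows, in two sentences"),
    Section("methodology", 2, "how the experiment was run"),
    Section("derivations", 3, "full math behind the results"),
]

def project(substrate, max_level):
    """return only the sections at or below the requested resolution."""
    return [s for s in substrate if s.level <= max_level]

overview = project(SUBSTRATE, 1)   # just the abstract
mid_dive = project(SUBSTRATE, 2)   # abstract + methodology
print([s.title for s in mid_dive])
```

the point of the sketch: you author `SUBSTRATE` once, at full depth, and every audience-specific version is generated on demand from it — the projections cost almost nothing compared to writing five separate documents.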
this is basically what happened to web design a decade ago with responsive design. you stopped building separate sites for mobile and desktop and started building one site that adapts. same shift, applied to information itself.
but here’s the part that i think most people are missing. this isn’t just about efficiency. it’s about what becomes valuable when information is this cheap.
think about the old cost structure. acquiring information used to be expensive — you needed education, research access, experience, the right networks. distributing it was medium cost — writing, publishing, teaching. so society built itself around treating information as a scarce, high-value resource. “i know things” was a legitimate differentiator. credentials, degrees, expertise — all fundamentally certificates that say “this person has information you don’t.”
ai just crushed that whole structure. acquiring information is now approaching free. distributing it is approaching free. repackaging it at different complexity levels is approaching free. the entire information layer of the economy just got commoditized.
so what’s left? what’s still scarce?
logic. judgment. reasoning. the ability to look at a pile of information and decide what matters, why it matters, what’s missing, what’s wrong, and what to do about it. selection, framing, synthesis, evaluation — these are still expensive because they require genuine understanding, not just pattern matching over a corpus.
there’s an analogy from thermodynamics that i think captures this well. information is like energy — abundant but diffuse. raw information is everywhere, just like solar radiation hitting the earth’s surface constantly. but energy doesn’t do useful work on its own. you need a gradient — a structured difference — to extract work from it. that’s what an engine does. logic is the gradient. it’s what converts raw, abundant information into useful intellectual work. without it, you just have heat — lots of data, no output.
and there’s a game theory angle too. in a world where everyone has access to the same information (because ai equalizes access), competitive advantage shifts entirely to who reasons better about what everyone knows. the game moves from “who knows what” to “who thinks better about what everyone knows.” some work in mechanism design (myerson, maskin) formalizes how equilibria shift when information asymmetries collapse — the strategic landscape changes fundamentally when the information layer is no longer the bottleneck.
what’s wild to me is how miscalibrated most of society still is for this. the education system still mostly rewards information retention. “learn this material, reproduce it on the test.” professional credentials still mostly certify “this person was exposed to information in a structured way.” people still treat “i read about this” or “i know about this” as meaningful differentiators in conversations and careers.
this is basically price stickiness applied to mental models. information used to be expensive, so people valued it highly. it’s now cheap, but the mental model hasn’t updated. it’s like still paying 2005 prices for a phone call because you remember when long distance was expensive — except applied to how we think about knowledge itself.
the person who reads 100 articles and forms an opinion is now less valuable than the person who reads 3 and builds a framework for evaluating which articles matter and why. the professor who lectures facts is less valuable than the one who teaches students how to evaluate, synthesize, and decide. the worker who “knows the system” is less valuable than the one who can redesign the system when conditions change.
now i want to be fair to a potential objection here. information isn’t worthless. you still need the lego blocks to build anything. the end product of good logic is still… information, in some sense. it’s just information that’s been run through selection and reasoning to become something more structured and useful than the raw inputs. so it’s not that information doesn’t matter — it’s that raw, unprocessed information has lost almost all of its standalone value. the value has migrated to the processing layer.
another fair objection: maybe this doesn’t change much if most people can’t actually do the logic part. and yeah — that’s real. the gap between “information is cheap” and “everyone suddenly reasons better” is enormous. humans are slow to change. institutions are slower. it might take a generation for this to really permeate. but the people who get it now and adjust have a compounding advantage over those who don’t, precisely because the shift is still underrecognized.
the part that really excites me is the epistemic implication. if you take this seriously — that the constraint on intellectual progress has shifted from information access to reasoning quality — then the highest leverage intervention isn’t producing more information. it’s improving the logic layer. teaching people to reason better, building tools that scaffold judgment, designing systems that help humans do the selection-and-framing work more effectively. that’s where the bottleneck is now. and it’s a bottleneck most of the world hasn’t even identified yet because they’re still operating on the old model where information was the scarce thing.
what makes this different from past information revolutions — the printing press, the internet, search engines — is that those shifted access. they made it easier to find information. this one is shifting something closer to the qualia of working with information. the actual felt experience of thinking, writing, researching, and building is different now. it’s not just that i can google something faster. it’s that the entire texture of intellectual work has changed — how ideas get generated, tested, recombined, and packaged. that’s not an access revolution. that’s a cognitive one. and because it’s happening at the level of experience rather than just infrastructure, most people won’t fully register it until they’ve felt it themselves. which means there’s a window — right now, while this is still being assimilated — where the people who lean into these shifts and actually restructure how they think and work have a compounding advantage that’s hard to overstate. not because they’re smarter, but because they’ve updated their model of what intellectual work even is while everyone else is still using ai to do the old thing slightly faster.
Written by Teddy. Comments welcome at theodorewrightwork@gmail.com.