The Headline

Source: Business Insider

Translation: The co-creator of one of the internet’s foundational web frameworks is describing his own professional obsolescence in real time, and the metaphor he reaches for tells you more than the prediction does.

What’s Actually Happening

Simon Willison, who co-created Django, the web framework that helped build Instagram and thousands of other sites, says he did not type 95% of the code he now produces. He is describing not a future scenario but his current working reality. The progression he outlines is specific: first, professionals tell AI what they want; then they monitor its progress; then they review the finished output. The dark factory is the logical endpoint of that sequence. It is the moment when human oversight is removed from the loop entirely, and the system operates without requiring anyone in the building.

The factory automation analogy is precise. Automated factories run in darkness not because darkness is the goal but because darkness is what remains when the last human has been removed from a process that no longer requires one. Willison is applying that logic to software development: when AI can write, review, test, and deploy code without human intervention, the developers are not augmented. They are the lights being switched off.

The article notes that some companies are already telling human staffers to stop writing code. Willison says he found this unthinkable six months ago and now finds it credible. That is a six-month horizon on a professional identity that took decades to build.

The Distortion

The primary distortion is the optimistic addendum the article appends to the dark factory concept: “Having an original, creative idea is just as important as having good tech to make it happen.” This is the standard human-value consolation inserted into AI displacement narratives (i.e., the claim that creativity, vision, and original thought will remain irreducibly human). It may be true. It is also unfalsifiable in the short run and inconsistent with the trajectory Willison is describing. A developer who no longer writes code is not a creative visionary with better tools. They are a prompt engineer whose leverage depends entirely on the continued willingness of the system to require their direction — a dependency that the dark factory concept explicitly imagines removing.

The secondary distortion is the framing of the dark factory as a future scenario rather than a present trajectory. Willison’s own account (95% of his code produced without his typing, companies already telling developers to stop writing) suggests the trajectory is not hypothetical. It is in progress. The dark factory is not a prediction. It is a description of where the current sequence terminates, stated by someone who is living through the intermediate stages.

The deepest distortion is the absence of the accountability question. Factories that run in darkness without human intervention have had decades of regulatory, legal, and insurance infrastructure built around their failure modes. When a dark factory malfunctions (e.g., injures a worker, produces a defective product, causes an environmental incident) there is a legal framework for determining responsibility. When a dark software factory ships code that fails in production, exposes user data, introduces security vulnerabilities, or makes consequential automated decisions incorrectly, the accountability infrastructure does not exist. The lights are going off before anyone has built the framework for what happens when something goes wrong in the dark.

The Incentive

For Willison, the incentive to name this concept publicly is intellectual honesty from someone with the credibility and proximity to speak accurately about where the technology is. He is not an AI booster selling a product or a catastrophist generating attention. He is a practitioner describing his own workflow and following its logic to its conclusion. That makes his account more credible and more uncomfortable than most commentary on AI and work, because he is not predicting what will happen to other people. He is describing what is happening to him.

For the companies already telling developers to stop writing code, the incentive is the same unspoken math we identified in the Amazon labor piece: if AI can produce the output, the human producing it is a cost that can be removed. The dark factory is not a philosophical position for these companies. It is a financial target. The question of what developers do when they no longer write code is a question those companies are not structurally incentivized to answer.

For the AI industry, the dark factory concept is simultaneously a marketing claim and a liability acknowledgment. It is a marketing claim because it demonstrates capability (i.e., look how far the automation has advanced). It is a liability acknowledgment because it names the endpoint of a trajectory that makes the human-in-the-loop safety argument increasingly difficult to sustain. If the goal is a factory that operates in complete darkness, the humans whose oversight currently justifies autonomous AI deployment are not a permanent feature. They are a transitional one.

For the broader workforce, the incentive to understand the dark factory concept clearly, rather than through the softening lens of "creative ideas still matter," is significant. The sequence Willison describes is already operating in software development. The same sequence (tell the AI what you want, monitor progress, review output, remove the reviewer) is the logic being applied in legal research, financial analysis, medical diagnosis support, content production, and customer service. The dark factory is not a software development story. It is a white-collar work story, and the lights are going off sector by sector.

The Consequence

The immediate consequence is visible in Willison's own account: professional identity built around a specific skill (writing code) is being decoupled from the output that skill produces. A developer who did not type 95% of their code today is still called a developer, still paid as a developer, and still reviewed as a developer. The category persists while the activity it describes is being transferred to the system. That decoupling is not stable. At some point, the category is renegotiated (in job descriptions, in compensation structures, in hiring decisions), and the people whose identity and livelihood were built on the original definition of the role find themselves on the wrong side of the renegotiation.

The structural consequence is the accountability vacuum the dark factory creates at scale. Software is not a contained industrial product. It governs financial systems, medical devices, transportation infrastructure, electoral processes, and social communication at a scale and speed that factory products never achieved. Dark factories running in the physical world produce defective goods at the rate of one factory’s output. Dark software factories can deploy defective code across millions of systems simultaneously. The absence of human oversight in the loop is not a productivity feature in that context. It is a systemic risk that scales with capability.

The longer-term consequence is the one the AI brain fry research, the Amazon labor piece, and the education pieces all point toward from different directions: the human capacities that make oversight meaningful (the ability to review code and identify errors, to evaluate AI output and catch failures, to maintain the professional judgment that distinguishes a good decision from a plausible-looking one) are being degraded by the same deployment that is removing the need for them. Early-career developers who never write code do not develop the ability to review it. Knowledge workers whose cognitive capacity is being eroded by AI supervision load are not maintaining the judgment that makes their oversight valuable. The dark factory removes the humans from the floor at exactly the moment when their capacity to operate safely in the dark has been most diminished.

The Calibration

The dark factory metaphor is the most honest framing of the AI automation trajectory that has appeared in mainstream business coverage, precisely because it does not soften the endpoint. Factories run in darkness not because the humans are doing more creative work elsewhere. They run in darkness because the humans are gone.

The calibration that the concept requires is to ask who is accountable for what happens in the dark. Physical factory automation took decades to develop the legal, regulatory, and insurance infrastructure that governs its failure modes. Software development is being automated on a timescale of months, in an environment where the accountability infrastructure for AI-generated code does not exist, where the legal framework for AI-caused harm is unsettled, and where the professional categories that would bear responsibility are being dissolved at the same time.

The calibration for developers and knowledge workers is to understand clearly what the sequence terminates in, and to avoid confusing the intermediate stages with a stable destination. Being the person who tells the AI what to do is a viable role until the AI no longer needs to be told. Being the person who reviews the output is a viable role until the review is automated. The dark factory is not a metaphor for a different kind of work. It is a metaphor for the absence of workers.

The calibration for organizations and policymakers is the question the article does not ask: what governance infrastructure needs to exist before the lights go out? Physical factories are required to have emergency systems, safety protocols, and human override capabilities even in their most automated configurations. The equivalent requirements for dark software factories (think auditable decision trails, human intervention mechanisms, liability frameworks, output verification systems) are not being built at anything like the pace the automation is advancing.

The honest version of Willison’s observation is not that the dark factory is the next big thing. It is that the dark factory is the current trajectory, the accountability infrastructure for it does not exist, and the people best positioned to notice when something goes wrong in the dark are being removed from the building while it is still lit.

Next calibration: 1 pm (GMT). Stay sharp.