Will AI Accelerate Top-Down Organizational Culture?
- Michael Amenta
- Nov 14, 2025
Summary
Successful adaptation requires two traits: mutation and selection. When executives become overconfident, they can over-weight selection, favoring standardized components and specialists over the generalists who excel at mutation.
Executive overconfidence often stems from observer bias (a subjective interpretation of limited information) and shows up as over-emphasis on a narrow set of KPIs. Yet complex systems always contain unmeasurable factors that, when ignored, can lead to underperformance or collapse.
This risk can be managed by bottom-up decision making, wherein workers are multi-skilled and autonomous owners of the end product and teams are cross-functional.
For timeless insights on software design, I highly recommend “Why Doctors Hate Their Computers” by Dr. Atul Gawande. In the piece, Gawande recounts the frustrations medical professionals experienced with first-generation Electronic Medical Record (EMR) systems, where software intended to make administrative tasks more efficient had the opposite effect.
Despite fixing issues like poor penmanship and enabling search, EMR led to physicians spending less time with patients and heightened their burnout from frustrating, menial labor.
How could digitization lead to such profound inefficiency and human misery? The answer lay in the undisciplined application of Taylorization.
Taylorization enables top-down design decisions
Historically, Taylorization dates to the Industrial Revolution, when manufacturing work was broken down into standardized, component tasks. This allowed designers to work independently from builders and created workers who performed highly specialized, efficient labor. In software, we see this trend reflected in the rise of SDKs, code libraries, low-code solutions, and AI agents.
Unfortunately, the shift from general-purpose development to specialized, plug-and-play labor can create brittleness and organizational silos. Gawande describes the tradeoff succinctly:
Adaptation requires two things: mutation and selection. Mutation produces variety and deviation; selection kills off the least functional mutations. Our old, craft-based, pre-computer system of professional practice—in medicine and in other fields—was all mutation and no selection. Computerization, by contrast, is all selection and no mutation. Leaders install a monolith, and the smallest changes require a committee decision, plus weeks of testing and debugging to make sure that fixing the daylight-saving-time problem, say, doesn’t wreck some other, distant part of the system.
In other words, relying on efficient components (selection) can kill the craft, flexibility, and ground-level innovation (mutation) that make work meaningful and impactful.
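To make the mutation-and-selection tradeoff concrete, here is a minimal toy simulation of my own (it is not from Gawande's piece, and the function names, scoring rule, and parameters are arbitrary illustrative assumptions). A population of candidate "processes" is scored against a hidden ideal workflow; one regime allows small random mutations before selection, while the other can only copy the current best:

```python
import random

def fitness(process, target):
    # Score a candidate process: how many steps already match the "ideal" workflow.
    return sum(1 for a, b in zip(process, target) if a == b)

def evolve(allow_mutation, generations=200, pop_size=20, length=30, seed=42):
    rng = random.Random(seed)
    # The ideal workflow is hidden from the population; it can only be found by trial and error.
    target = [rng.randint(0, 9) for _ in range(length)]
    population = [[rng.randint(0, 9) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        best = max(population, key=lambda p: fitness(p, target))
        population = [list(best)]                  # selection: keep the current best as-is
        while len(population) < pop_size:
            child = list(best)
            if allow_mutation:
                i = rng.randrange(length)          # mutation: ground-level experimentation,
                child[i] = rng.randint(0, 9)       # tweaking one step of the process at random
            population.append(child)
    best = max(population, key=lambda p: fitness(p, target))
    return fitness(best, target), length

if __name__ == "__main__":
    for label, allow in [("mutation + selection", True), ("selection only", False)]:
        score, length = evolve(allow)
        print(f"{label}: {score}/{length} steps match the ideal workflow")
```

With mutation enabled, the population steadily converges toward the ideal workflow; with selection only, it stays frozen at its initial best guess. That, roughly, is the dynamic Gawande attributes to locked-down, monolithic systems.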
The "Illegible Margin" consequences of executive-first design
When resources are allocated top-down for specific executive goals (control, oversight, reporting), the administrative burden often gets pushed down onto the worker. For example, EMR software allowed hospital leadership to streamline the imposition of rules and to monitor population health risks, but it also burned out doctors.
I experienced similar administrative overhead when my leadership mandated a program management tool called PlanView. The goal was to push standardized progress updates to VPs via dashboards, but the result was forcing product managers to make a dozen extra clicks to update a single field. Despite the additional administrative hours worked, our organization's goals weren't being realized any more effectively with PlanView. The experience was echoed years later with OpenAir timesheets.
This is the problem with top-down organizations and the fallacy of what I call "trickle-down software". Software that benefits the C-Suite does not necessarily benefit the worker or the organization. When the system is over-indexed on a few KPIs, it reduces the knowledge worker (e.g. a doctor or a product manager) to a cog in the machine. As Gawande stated:
Artisanship has been throttled, and so has our professional capacity to identify and solve problems through ground-level experimentation... Technology for complex enterprises is about helping groups do what the members cannot easily do by themselves—work in coördination.
As human processes are replaced by software automations, the ability to complete a task through any sort of creative process (i.e., a mutation) is eliminated. This creates the perception of reduced risk while removing the potential for innovation.

Unfortunately, a great deal of risk can't be measured, and over-reliance on a finite set of KPIs can tip a complex system into failure. This "illegible margin" is why unplanned city blocks, such as those in New York's West Village, outperform highly planned designs, such as the city of Brasilia, as Taylor Pearson brilliantly explains.
These unintended consequences showed up with the first-generation EMR software as well. As a result of its brittleness and inefficiency, the medical system adapted by hiring more humans to enter and read data: so-called “medical scribes”. As Gawande describes:
This fix is, admittedly, a little ridiculous. We replaced paper with computers because paper was inefficient. Now computers have become inefficient, so we’re hiring more humans. And it sort of works.
This evolution for efficiency's sake has, in turn, created laughably inefficient workflows:
A note for a thirty-minute visit takes Rane [the offshore scribe] about an hour to process. It is then reviewed by a second physician for quality and accuracy, and by an insurance-coding expert, who confirms that it complies with regulations—and who, not incidentally, provides guidance on taking full advantage of billing opportunities.
Gawande also describes the scribes as having huge error rates and sky-high turnover, with many lasting only “several months” in their jobs. Combined with ballooning rates of physician dissatisfaction, stress, and depression, this paints a picture in which both the highly paid specialist and the minimum-wage worker succumb to misery.
Embracing generalists can avoid "brain drain" attrition and low engagement
Importantly, Gawande pointed to burnout arising from the “pointlessness” of the added administrative tasks. Work needs meaning.
While AI agents are accelerating specialization at work, the trend was present for years before the release of GPT-3. When I started my career as a product manager, I was a Swiss Army Knife. I generated product ideas, validated them, submitted them for approval, wrote their specifications, managed their build, tested their quality pre-release, marketed their features, and provided customer support post-release. Now there's often a Product Strategist for defining initiatives, a User Researcher for research, a Business Analyst to write detailed specifications, a Scrum Master to oversee the build, and a Product Marketer to market the release, all reporting to different heads of department. But despite all these additional resources, I'm not working less. Like the doctors in the story, my work has “shifted”.
Correspondingly, I've noticed brain drain among my colleagues. Whereas it was once the norm for PMs to drive strategy and innovation, it now seems my colleagues increasingly need to be told what to do. Furthermore, what they produce is often more predictable, and more unremarkable.
The reality is that generalist skills help people excel in their careers, yet organizational shifts are forcing workers into specialist roles where big-picture thinking and inter-team collaboration are sacrificed for task optimization.
Empowering bottom-up organizations
Is this all inevitable? As Gawande notes:
Postwar Japan and West Germany eschewed Taylor’s method of industrial management, and implemented more collaborative approaches than was typical in the U.S. In their factories, front-line workers were expected to get involved when production problems arose, instead of being elbowed aside by top-down management. By the late twentieth century, American manufacturers were scrambling to match the higher quality and lower costs that these methods delivered.
It's a false choice between top-down selection that delivers better top-line results and bottom-up mutation that fosters worker autonomy, meaning, and satisfaction. There are "both/and" solutions that install and protect mechanisms for worker-led mutation and selection, similar to Six Sigma, Lean Manufacturing, and Total Quality Management.
Agile development principles were designed, in part, to address this, but adherence to any set of principles is not guaranteed. Most teams agree with Agile in spirit, yet there are still plenty of ineffective development teams in practice.
As automation and the use of AI agents continue to grow, we must internalize the manufacturing lessons of the past. Investment in human capital, paired with a philosophy of continuous improvement, education, and multi-skill training, has outperformed in era after era. To do this effectively, leaders must explicitly examine how every KPI and executive dashboard might create unintended consequences for the worker.
The most enduring long-term value of breakthrough technology—including AI—is not in its ability to maximize quantity in the short term, but in its capacity to empower and elevate the human worker to maximize quality, innovation, and sustained success.