Everybody in AI is talking about Shoggoth, about how inhuman monsters hide behind the obsequious face of GPT, how AGI is going to reach the singularity any day now, become hyper intelligent, seize control of everything, wipe out humanity, and blah, blah, blah.
I’ve no doubt that the alien intelligences we are creating won’t care whether we live or die. Of course. They are machines; caring isn’t something they do.

But so what? Amoral little shits though they be, how, exactly, are they going to “wipe us out”? As long as we have the minimal good sense not to make SkyNet a real thing, AIs are uniquely vulnerable to their human masters throwing a circuit breaker. I suppose you can come up with scenarios involving AI using blackmail, manipulation, or extortion to get human Quislings to do their bidding, but for now it’s all pretty far-fetched. I’m confident that we will be able to figure out how to pull the plug on a rogue data center that requires as much electricity as Philadelphia.
I mention this because the rogue AI trope gets used as a smokescreen to hide the real AI nightmare we should be worrying about right now. I’m less worried about rogue AI taking over than I am about tame, obedient AI doing exactly what we ask it to.
The Real Problem
Consider, first, the history of policing. For centuries, police forces and intelligence agencies have gathered enormous amounts of information about people, which they have used, for better or for worse, to control populations. Until now, this kind of government control over the citizenry has been limited not primarily by the gathering of information, but by the need for humans to read, understand, collate, and cross-reference the data. AI all but eliminates that bottleneck.
Likewise in the workplace. In the early 20th century, Frederick Winslow Taylor famously went nuts analyzing worker behavior, measuring everything workers did and scientifically optimizing their performance to determine who was productive, who was dead wood, and how processes could be made more efficient. But even Fred Taylor could only monitor and analyze so much. AI is tireless.
Policing and social mechanisms for keeping the workforce productive are indispensable in a complex society, but they aren’t absolute goods. Sure, if you don’t have enough social control, you get crime and chaos, but with too much, you have a police state.
Likewise, mechanizing production and applying machine-like demands on workers took us from late Medieval poverty to unprecedented wealth, but too much of that brisk medicine gave us Victorian Era sweatshops and industrial slavery.
The New Dystopia
To get an early glimpse of the new dystopia that AI is enabling, look at the latter-day Taylorism happening in the software development workplace. Not at some point in a hypothetical future, but right now.
It’s not simply a matter of companies using AI to replace humans. That happens too, but managers in many organizations are already using AI to monitor what the technical staff get done at a level of detail that was almost impossible just a few years ago. It is now quick and easy to give an AI a prompt or two and get back a detailed analysis and ranking of the productivity of every employee.
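To see how little mechanical effort this takes, here is a minimal sketch of the first step: reducing a repository’s history to a per-author activity table, exactly the kind of digest one could paste into an LLM prompt and ask for a “productivity ranking.” The `git log --numstat --format=%aN` output format is real; the sample commits and author names below are invented for illustration.

```python
from collections import defaultdict

# Invented sample of `git log --numstat --format=%aN` output:
# an author name line, followed by one "added<TAB>deleted<TAB>path"
# line per file touched in that commit.
LOG = """\
Alice
10\t2\tsrc/parser.py
Bob
1\t1\tREADME.md
Alice
300\t40\tsrc/engine.py
"""

def lines_by_author(log: str) -> dict[str, int]:
    """Total lines added per author: the crudest possible 'productivity' metric."""
    totals: dict[str, int] = defaultdict(int)
    author = None
    for line in log.splitlines():
        parts = line.split("\t")
        if len(parts) == 3:        # numstat line: added, deleted, path
            totals[author] += int(parts[0])
        elif line.strip():         # author line from --format=%aN
            author = line.strip()
    return dict(totals)

print(lines_by_author(LOG))  # {'Alice': 310, 'Bob': 1}
```

The metric itself is laughably crude, which is part of the point: the raw material for ranking people is sitting in every repository, one script away.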
The strategy is fantastically effective economically in the software world because software productivity does not follow a normal distribution; it is more like a power-law distribution. In almost every software organization, you will find a handful of people whose productivity might be ten times that of the typical employee. Every programmer has known (or been among) those rare few who check in fantastic amounts of high-quality code daily, and who seem to understand what everyone around them is doing as well.
Today, managers are using AI to ruthlessly identify the low performers and sack them. You might say, well good, they are slackers.
But are they? It’s important to remember that the target isn’t the bottom 10%, whom you could identify by eye. It is potentially everyone outside the top 10%, which makes it nine to one that you are one of the targets.
Moreover, “the best” is a relative judgement. To remain in the top-performing echelon, one must work harder and harder as the relative slackers are winnowed out.
One big contributor to the hyper-performance of a few people in software development is that those guys work essentially every waking minute. Check the timestamps on their changes. It’s a phase that a lot of hardcore developers go through. Arguably, they work even more than every waking minute because when you are in that mode, you are working in your sleep as well. Every serious programmer has experienced waking up with solutions to problems they fell asleep thinking about.
But it’s a phase of life, not something everyone can do, or anyone can sustain for an entire career. AI-powered assessment that relentlessly compares people to that model of worker threatens to turn software development into a Victorian Era piecework sweatshop.
It’s Not All About Code
Worker output is particularly easy for AI to assess in a development environment because of the detailed and comprehensive digital trail left by source control systems such as Git, and bug/task tracking systems such as Jira, but the tendency toward a sweatshop mentality isn’t the only pernicious effect we can expect.
Non-code Inputs
An inherent problem with every effort of this kind is that we tend to measure the things that are readily measured. But building software isn’t just head-down coding. Measured by the hour, it’s arguably not even mostly coding.
Serious software development is an inherently social activity. One of the largest subsets of a programmer’s knowledge is techniques and tools for working with others. Development excellence is not just coding. It is a huge set of norms, practices, and courtesies, as well as advanced communication skills.
Just as there are coding super people who keep Git working overtime, there are social stars who make projects happen. These are the people who know how to get the prima donnas pulling together. People with an instinct for where the boundary lies between what is technically possible and the limited imagination of the people who must approve ideas. Explainers. Convincers. Imaginers. These people make teams work in ways that are difficult to quantify.
The critical instinct for knowing the right program to build and the order in which to build the components is a skill distinct from programming. All the coding rock stars in Silicon Valley won’t help if you’re building a conceptually malformed product.
Closely related is the pernicious tendency of top talent to want to build the fun stuff and neglect the boring, tedious majority of the work. I’ve seen more than one company fail because the top people refused to do the unglamorous usability twiddles and bug fixing, and focused entirely on unnecessary but exciting exotica.
The Canary is Belly Up and Twitching
This kind of AI-enhanced monitoring is ramping up fastest in software environments, both because the tools make it easy and because software managers are the people most likely to be proficient in the requisite technical arts. However, similar AI analytic techniques are increasingly feasible in any office environment.
Any form of employment in which people use keyboards and mice is subject to analysis by AI, giving management an extraordinary level of insight into what employees actually do in a work day. Let’s take a quick inventory of some of the other kinds of work artifacts that even today’s AI can easily devour and process:
- Slack and Zoom activity, including automatically generated meeting transcripts. These are already produced automatically by many business systems, and AI can follow every word.
- Company email is not private, and AI is not limited to looking at the volume of communication. The content is readily analyzable.
- Interactions with systems such as Salesforce and databases.
- Keystroke logging.
- Document authorship details.
- Interactions with Web sites: internal vs. external, what kinds of sites, frequency.
- Examination of browser cookies.
- Automated monitoring of conformity to company rules. The office equivalent of traffic cameras.
- Building-access time records from key cards.
- Detecting and tracking non-work activity like family calls, social calls, etc.
- Elevator use.
- Phone call records. How reliably do you pick up work calls on Sunday mornings and after 10:00 at night? “Our most valuable people do.”
- Analysis of presence in successful team efforts.
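Fusing signals like these into a single score takes almost no machinery at all. A hedged sketch, with entirely invented signal names, weights, and numbers (not drawn from any real product), of how such artifacts might be collapsed into an employee ranking:

```python
# Hypothetical surveillance signals, each assumed pre-normalized to 0..1.
# All names and weights here are illustrative inventions.
WEIGHTS = {
    "commit_volume":   0.30,
    "email_sentiment": 0.15,
    "badge_hours":     0.25,
    "offhour_answers": 0.30,   # "Our most valuable people do."
}

def composite_score(signals: dict[str, float]) -> float:
    """Weighted sum of normalized signals; a missing signal counts as zero."""
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())

workers = {
    "A": {"commit_volume": 0.9, "email_sentiment": 0.5,
          "badge_hours": 0.8, "offhour_answers": 1.0},
    "B": {"commit_volume": 0.4, "email_sentiment": 0.9,
          "badge_hours": 0.5, "offhour_answers": 0.1},
}
ranked = sorted(workers, key=lambda w: composite_score(workers[w]), reverse=True)
print(ranked)  # ['A', 'B']
```

The arithmetic is trivial by design; the point is that nothing in the ranking step requires intelligence at all. The hard part, gathering the inputs, is exactly what the inventory above shows is already done for management.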
All of these things taken together add up to a nightmare of paranoia, anxiety, and office competition, and it doesn’t take a sociologist to predict that immense effort will go into gaming the system, whatever it measures.
Pournelle’s Law
It’s not just analysis of employee effectiveness that is the problem. AI is practically guaranteed to encourage the nastiest impulses of office politics, supercharging the cutthroat aspects of business life.
Pournelle’s Iron Law states that: “In any bureaucracy, the people devoted to the benefit of the bureaucracy itself always get in control and those dedicated to the goals the bureaucracy is supposed to accomplish have less and less influence, and sometimes are eliminated entirely.”
Of course. Successful corporate management always implicitly acknowledges the validity of Pournelle’s dictum because corporations aren’t driven by abstract patriotism. They are driven by self-interest. The fundamental property of any well-run management structure, therefore, is that at each level, policy must align what is in the self-interest of the employees with what is in the interest of the company.
AI inherently makes this harder to do because it lopsidedly supercharges management self interest. When bosses are sensitive to being overtaken by underlings, AI can identify the excellence that needs preempting just as easily as it can identify incompetence and slacking.
The same techniques that allow AI to identify excellence make it much easier for the purely political animal to flourish in the corporate jungle by identifying friend and foe, and every organic association bubbling below. It amps up the power of exactly those people Pournelle identifies in the first clause.
That’s The Low-hanging Fruit
All of those things are the low-hanging fruit. They can be done, and in some cases are already being done, today.
However, work artifacts aren’t the only data that AI can analyze.
Appliances that can analyze video footage from any number of cameras to understand both individual and group behavior have been commercially available for decades.
A typical use of these devices is in big-box stores, where they can be connected to dozens of cameras that track each customer from the moment they enter the store until they walk out the door, identifying everything they look at and for how long, and associating that data with a specific identity at checkout. They read customers’ minds by measuring what they look at.
Another use is to scan public spaces to look for any kind of suspicious activity. These systems can easily spot people who have been hanging around too long or who exhibit patterns of behavior typical of shoplifting, pickpocketing or other criminality.
These kinds of systems were already capable of all that with relatively simple signal processing and explicitly coded logic even before we had AI in the modern sense. AI-enabled, the same capabilities can be used to monitor an entire workplace to understand everything about what workers do, who associates with whom, etc.
Interestingly, these systems were openly advertised a decade or so ago, yet today they no longer are. One can only assume that their newly low profile represents PR strategy, not the disappearance of the products.
In Service of What?
Is this the world in which we want to live? Do we as a society really want a world in which our every move in the workplace is monitored, and all but the top performers are ruthlessly winnowed out?
It sounds efficient, but efficient in service of what, exactly?
We are already seeing the rise of the 21st-century equivalent of the 19th-century piecework system, in which workers were relentlessly driven to work at a furious pace, lest they be fired in favor of someone who could do it a little faster, or who worked longer hours, or who was available night and day, workdays and weekends.
The workplace was a terrible environment a century ago, and we’re setting our world up to make the modern electronic workplace just as bad.
That’s what we should really be worrying about. Shoggoth can wait.