The Undefined Standard

A staffer asked a machine to decide which grants related to DEI without ever telling the machine what DEI means. He instructed it to answer in no more than 120 characters, beginning with "Yes." or "No." and followed by a brief explanation. He applied a list of detection codes (BIPOC, minorities, Native, Tribal, Indigenous, Immigrant, LGBTQ, Homosexual, Gay) and let the answers determine the fate of over 1,400 National Endowment for the Humanities proposals. He later testified that he had no idea how the machine understood a term he had never defined.
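The mechanics are worth pausing on. What follows is a minimal sketch of such a workflow, reconstructed from the account above: the detection codes and the 120-character Yes/No format come from that description, while the prompt wording, the call_model callable, and the return structure are illustrative assumptions, not the staffer's actual tooling.

```python
# Sketch of a keyword-plus-chatbot flagging pipeline (illustrative only).
# call_model is a hypothetical stand-in for whatever chat interface was used.

DETECTION_CODES = [
    "BIPOC", "minorities", "Native", "Tribal", "Indigenous",
    "Immigrant", "LGBTQ", "Homosexual", "Gay",
]

# Assumed prompt wording; note that "DEI" is never defined anywhere in it.
PROMPT_TEMPLATE = (
    "In 120 characters or fewer, answer 'Yes.' or 'No.' followed by a brief "
    "explanation: does this grant relate to DEI?\n\n{description}"
)


def keyword_flag(description: str) -> list[str]:
    """Return the detection codes that appear in a grant description."""
    text = description.lower()
    return [code for code in DETECTION_CODES if code.lower() in text]


def classify(description: str, call_model) -> dict:
    """Flag a grant by keyword, then put the undefined question to the model.

    The deciding criterion is nowhere in this code; it lives entirely inside
    the model's opaque weights.
    """
    verdict = call_model(PROMPT_TEMPLATE.format(description=description))
    return {
        "codes": keyword_flag(description),
        "verdict": verdict[:120],  # hard cap mirrors the 120-character limit
        "cancel": verdict.strip().startswith("Yes"),
    }
```

The point of the sketch is what is missing: at no step does the delegator state what the standard is.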

The cancellations included studies of the Holocaust, civil rights movements, and indigenous knowledge systems. Congress had explicitly made those subjects germane to the NEH's mission. Yet the algorithm flagged them as wasteful, as ideological contamination, as evidence of DEI. When the decisions were challenged, the defense was simple: the machine did it.

The judge's ruling cut past the excuse. There is no distinction, she wrote, between the government and the instrument it chose. ChatGPT was that instrument; to hide behind its opacity is to abdicate the very judgment the law requires.

The gesture is recognizable from deep in human history. People have long sought mechanisms that appear to decide for them: oracles, lots, automated systems whose inner workings stay mysterious. What is new is the scale of the black box, the illusion of objectivity it carries, and the completeness of the abdication. Judgment was requested, then disavowed, while the criteria remained undefined even by the delegator.

What does it mean to wield a measure that you cannot explain?