close04 21 hours ago

> When centralized systems attempt to institutionalize "fairness" as a primary directive, the resulting information-calculation problems and rent-seeking often lead to catastrophic externalities.

Sounds like you’re focusing on Eastern-society examples, and some of them are a stretch. If you believe “institutionalized fairness” is unequivocally wrong, what do you think of the more Western “DEI”? It’s a standout example of “equity”.

Is your opinion that DEI results in the same kind of bad outcome? Do you think that Western societies can pull off “institutionalized fairness” better than Eastern ones? Are you drawing a biased picture by highlighting the failures without putting them in the larger context along with any possible successes?

jkollue 20 hours ago

DEI is fine. The problem isn’t the goal of treating people well; it’s the structural error of trying to institutionalize "fairness" as a top-down directive.

Whether it’s an AI or a government, centralized systems are remarkably bad at optimizing for vague moral proxies because they lack the local feedback loops required to avoid catastrophe.

Western history is littered with these feedback failures. The British government’s commitment to an ideological "fairness" in market non-interference during the Irish Potato Famine led to 1 million deaths. Their wartime resource distribution in the 1943 Bengal Famine killed 3 million more. Even the American eugenics movement was framed as a "fair" optimization of the population; it sterilized 64,000 citizens and provided the foundational model for the Nazi T4 program.

In the context of IP, claiming it’s "fair" to deny a creator compensation for their labor is just a way to subsidize an abstraction at the expense of individual incentive. When you replace objective market signals with a bureaucrat’s (or an algorithm’s) definition of equity, you don’t get a more just system; you just get a system that has stopped solving for reality.