For most of our working lives, systems were tools. They helped us move faster, store more, coordinate better. We trusted them the way we trust roads or electricity—quiet infrastructure that enabled, but did not interfere. Over time, that relationship changed. Systems began to observe, then to learn, and eventually to anticipate. What started as support slowly became interpretation. And somewhere along the way, we stopped being just users of systems and started becoming inputs to them. The shift was gradual enough to feel natural, but significant enough to reshape how decisions are made—about people, about risk, about truth.
The recent manifesto from Palantir Technologies argues that technology companies should more actively align with state power—that building systems of intelligence, visibility, and coordination is not just a capability, but a responsibility. It frames data aggregation and algorithmic decision-making as necessary foundations for security and order. But that framing carries an assumption: that more visibility leads to better outcomes. In practice, visibility tends to concentrate in the hands of those who build and operate the systems. Data does not simply inform—it creates asymmetry. Those who design and run these systems gain a vantage point others cannot access, not because they are more accountable, but because they are more informed. Over time, this shifts decision-making into systems that are neither fully visible nor easily questioned.
The alternative is not the absence of such systems, but a different way of building them. One that resists the instinct to centralize, even when it is efficient. One that treats data minimization as a feature, not a limitation. If the manifesto calls for deeper integration between technology and power, the counterpoint is separation—clear boundaries between those who generate data and those who seek to interpret it. Privacy, in this context, is not a constraint on progress; it is a structural safeguard. Because if the infrastructure of the future is built on the premise that everything must be seen to be understood, then the real risk is not what these systems can do, but how little remains outside their view.
