The Feeding Machine Is Back (and This Time It Works)

Contemporary AI is framed as an aid to decision-making. This article argues that it reshapes how decisions form by organising attention and pre-structuring choices. The space for deliberation narrows, making judgment appear efficient while altering agency itself.
The feeding machine introduced in the factory in Modern Times was not merely a comic contraption. It was a sincere proposal. Its purpose was to remove the interruption of lunch from the working day and replace it with a continuous flow of production. The logic behind it was plain and, in its own way, rational. Eating took time. Time was money. If the body could be fed without stopping the machine, productivity would rise. The device failed in practice, and the scene remains memorable because of its farce. Yet the deeper lesson lies not in the malfunction but in the calm seriousness with which the proposal was made. No one in the room questioned whether removing the pause of lunch altered the meaning of work itself. The only question asked was whether the machine could be made to function.

That logic did not perish with the feeding machine. It has matured, refined itself, and returned in more subtle form. Today it appears in the language of intelligent systems that promise to streamline decision-making, remove friction, and ease the burdens of human judgment. Contemporary systems are offered as helpers, assistants, and guides. They promise to save time, reduce effort, and make choice more efficient. The promise is always the same. Human attention is scarce. Human patience is limited. Human judgment is slow. The solution is to design environments in which fewer decisions need to be made and those that remain can be made quickly.

The danger does not lie in the presence of machines that assist human activity. It lies in the quiet redefinition of what counts as a decision, and in the gradual disappearance of the pause in which judgment once took shape.

The lunch break in the factory was not merely a biological necessity. It was a temporal boundary. It marked a moment in which the worker stepped outside the rhythm of the machine. Even if only briefly, the body reclaimed its own tempo. The feeding machine sought to erase that boundary. It aimed to make the human body conform entirely to the rhythm of production. Its failure was physical and visible. The body resisted the machine. The body choked, faltered, and broke the illusion that efficiency could be imposed without remainder.

The contemporary feeding machine does not attempt to feed the body. It feeds the mind. It does not impose its rhythm through gears and levers but through interfaces, rankings, notifications, and automated suggestions. It does not force itself upon the subject. It persuades. It does not break down in public. It succeeds in private. It works precisely because it trains attention to move at its pace. The pause is not removed by decree. It is made to feel unnecessary.

Modern systems present themselves as instruments of convenience. They anticipate needs, complete sentences, rank options, and filter information. In doing so, they reshape the conditions under which perception operates. What appears first appears important. What appears frequently appears normal. What appears easily appears preferable. The user remains free to refuse any particular suggestion. Yet the field within which refusal occurs has already been prepared. The environment has been arranged so that certain paths appear natural and others recede into obscurity.


We continue to speak of freedom as though it were exhausted by the moment of choice. Did the user click or not click? Did the officer approve or deny? Did the administrator accept or reject? This way of speaking assumes that the space of possible action presents itself neutrally. In truth, the space of possible action is always shaped. It is shaped by what is made visible, by what is made salient, and by what is made timely. The feeding machine did not ask whether the worker was free to accept the spoon. It assumed that the only relevant question was whether the spoon could be delivered efficiently. Contemporary systems assume that the only relevant question is whether options can be delivered in a form that reduces effort.

The moral weight of judgment lies not only in the act itself but in the conditions under which the act becomes thinkable. A decision that emerges after deliberation is not the same as a decision that emerges from pre-arranged convenience. The former involves the slow work of weighing, hesitating, and reconsidering. The latter involves the quick recognition of what has already been placed before the mind as relevant. When environments are designed to reduce the need for hesitation, they do not merely assist decision. They transform the nature of judgment.

Institutions are particularly susceptible to this transformation. Administrative systems prize regularity, speed, and consistency. Delays are treated as inefficiencies. Friction is framed as failure. Over time, institutional practices adapt themselves to the tools that promise to remove friction. Procedures are redesigned to fit automated pipelines. Human review is recast as an obstacle to throughput. The pause that once belonged to judgment becomes a procedural inconvenience. Accountability shifts from substantive engagement with outcomes to formal confirmation that a system has been followed correctly.

This shift has profound consequences for how responsibility is lived. Responsibility, in its deeper sense, is not merely the assignment of blame after an outcome. It is the lived experience of standing behind a decision. It presupposes that the decision was genuinely confronted, that alternatives were considered, and that the burden of choosing was felt. When decision environments are shaped to make certain outcomes appear obvious and others remote, the experience of responsibility thins. The subject becomes a conduit through which pre-arranged options flow. The system has not removed agency, but it has narrowed the space in which agency can be meaningfully exercised.

The language of efficiency disguises these changes by presenting them as natural progress. To save time appears uncontroversial. To reduce effort appears humane. Yet time saved is time taken from somewhere. Effort reduced is effort displaced. When the effort of deliberation is reduced, what replaces it is not neutrality but design. The environment decides in advance which burdens are worth carrying and which are not. The feeding machine promised to remove the burden of eating from the working day. Contemporary systems promise to remove the burden of thinking from the act of choosing. In both cases, what is lost is not merely time but a mode of relation to one's own activity.

There is a temptation to describe these developments as inevitable. Technology advances. Systems become more capable. Humans adapt. Yet history suggests that the forms technology takes are shaped by values long before they are embedded in machines. Design choices encode judgments about which human capacities deserve time and which are to be treated as inefficiencies. The defense of the lunch break was once a defense of dignity. It was a claim that the body could not be reduced entirely to an instrument of production. A similar defense is now required for the pause in judgment. It is a claim that the mind cannot be reduced entirely to an instrument of optimization.

The pause is not a luxury. It is a condition of moral life. In the pause, uncertainty is allowed to speak. In the pause, competing considerations can be held together without immediate resolution. In the pause, the subject encounters the weight of choosing. Systems that compress this temporal space do not merely accelerate decision. They reshape the form of agency that decision presupposes.

This does not require a rejection of intelligent systems. It requires a refusal to allow convenience to become the sole measure of value. In domains where decisions bear moral weight, slowness can be a civic good. Friction can serve as a safeguard. The design of decision environments can be guided by principles that preserve spaces for reflection rather than eliminate them. Temporal requirements can be imposed alongside technical ones. Procedures can be structured to protect moments of human hesitation rather than penalize them as inefficiencies.

The feeding machine failed because the body could not accommodate the rhythm imposed upon it. Contemporary systems succeed because they cultivate accommodation within perception itself. They teach attention to move as the system moves. They make speed feel natural. The success of these systems is therefore ambiguous. A society that learns to decide without stopping risks forgetting why stopping once mattered. When every hesitation is framed as waste, the very capacity to hesitate comes to appear as a defect.

The task of ethical reflection is not to lament the presence of machines but to recover the value of the pause. The question is not whether systems can make decision easier. The question is whether they leave room for judgment to remain human. Convenience, left unchecked, reshapes the conditions of agency in ways that are difficult to perceive from within. The feeding machine is back, and this time it works because it no longer needs to force itself upon the body. It works by teaching the mind to move at its pace. If this succeeds without resistance, we may find that what has been optimized away is not inefficiency but the very space in which moral life takes form.
