The Algorithmic Takeover of Australia's Care Systems
A profound and troubling transformation is quietly unfolding across Australia's care infrastructure. Every day, hundreds of older people and people living with disability are assessed for essential supports: home care modifications, mobility aids, and therapies designed to help them live safely and with dignity in their own homes and communities.
From Clinical Expertise to Computerized Questionnaires
Historically, these vital care decisions were the exclusive domain of trained health professionals, who relied on a blend of clinical expertise and the fundamental human capacity to recognize and respond empathetically to the nuanced needs of others. Computers lack both of these qualities. Yet in an era dominated by AI hype and a near-fetishistic worship of automation, society is increasingly turning to machines for guidance on deeply human questions of care, vulnerability, and individual need.
The shift toward algorithmic decision-making is starkly evident in the newly implemented Integrated Assessment Tool (IAT). Introduced on November 1, 2025, under the Albanese government's Aged Care Act, this rules-based algorithm sorts aged care applicants into one of eight funding levels, determining both how much home care a person receives and their position in the often-lengthy queue for these essential services.
The Promise Versus the Painful Reality
The IAT was ostensibly designed to create a faster, fairer, and more consistent process for determining eligibility for government-subsidized aged care. It functions as a computerized questionnaire, using scored questions and rigid rules to categorize applicants by their perceived level of need. While assessments are conducted face-to-face, the human assessor's role has been drastically diminished, often reduced to merely inputting data into the algorithm's framework.
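To make the mechanism concrete, here is a minimal sketch in Python of how a rules-based scoring questionnaire of this general kind works. Every specific in it, the domains, weights, and cut-off scores, is invented for illustration; the IAT's actual questions and thresholds are not reproduced here. The structural point the sketch shows is that fixed inputs are summed against fixed rules, and anything the questionnaire does not ask simply never enters the decision.

```python
# Illustrative sketch only: the IAT's real questions, weights, and
# cut-offs are not public here, so every number below is hypothetical.
# The mechanism is the point: scored answers in, a funding level out.

# Hypothetical scored answers (0 = no difficulty, 3 = severe difficulty)
ANSWERS = {
    "mobility": 2,
    "self_care": 1,
    "cognition": 0,
    "domestic_tasks": 3,
}

# Hypothetical weights per domain
WEIGHTS = {
    "mobility": 3,
    "self_care": 2,
    "cognition": 2,
    "domestic_tasks": 1,
}

# Hypothetical cut-offs mapping a total score to one of eight levels
THRESHOLDS = [3, 6, 9, 12, 15, 18, 21]

def classify(answers: dict[str, int]) -> int:
    """Map scored answers to a funding level 1-8 by fixed rules."""
    total = sum(WEIGHTS[q] * score for q, score in answers.items())
    return 1 + sum(total >= t for t in THRESHOLDS)

print(classify(ANSWERS))  # -> 4: a single number; nothing else is recorded
```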
This tool mirrors similar systems deployed within the National Disability Insurance Scheme (NDIS), where human discretion is also being systematically reduced in favor of standardized, algorithmic assessments. Across both critical support systems, a perverse role reversal is occurring. The profoundly human experience of aging or living with a disability is relegated to a machine's calculation, while the professional assessor is increasingly robotized, becoming a mere ancillary component of the algorithm.
Systemic Neglect and the Removal of Human Override
In both disability and aged care sectors, the promised efficiency of algorithms has been completely overshadowed by widespread stories of delay, profound frustration, and systemic neglect. Aged care clinicians and frontline carers have described these automated tools as "cruel" and "inhumane," arguing they strip away essential clinical expertise and leave elderly individuals with dangerously inadequate support.
Since mid-2024, changes to the NDIS have allowed a person's support needs to be algorithmically reclassified, with supports potentially cut and no right to appeal the final, automated decision. The aged care system has gone even further, removing entirely any mechanism for human professionals to override the algorithm's determinations.
The Stark Risks of Removing Human Judgment
The risks inherent in this automation are severe and unambiguous. When human judgment is excised from the process, outcomes are determined solely by pre-programmed rules and numerical scores. If the input data is incomplete, if key variables fail to capture what truly matters in a person's life, or if factors are weighted incorrectly, the system fundamentally misreads the situation. The inevitable result is that the individual is left under-supported or completely unsupported.
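One hedged illustration of that failure mode, continuing the hypothetical scoring sketch above: if a person's needs fluctuate and the questionnaire happens to capture a good day, the fixed rules have no way to represent the fluctuation, so the score, and with it the funding level, simply follows the snapshot.

```python
# Hypothetical continuation of the earlier sketch: the same fixed rules
# applied to two snapshots of the same person's fluctuating condition.
WEIGHTS = {"mobility": 3, "self_care": 2, "cognition": 2, "domestic_tasks": 1}
THRESHOLDS = [3, 6, 9, 12, 15, 18, 21]

def classify(answers: dict[str, int]) -> int:
    total = sum(WEIGHTS[q] * s for q, s in answers.items())
    return 1 + sum(total >= t for t in THRESHOLDS)

good_day = {"mobility": 1, "self_care": 0, "cognition": 0, "domestic_tasks": 1}
bad_day  = {"mobility": 3, "self_care": 2, "cognition": 1, "domestic_tasks": 3}

print(classify(good_day))  # level 2: the day the assessment captured
print(classify(bad_day))   # level 7: the reality in a bad week
# Same person, same rules: the level turns entirely on which day
# the questionnaire happened to measure.
```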
Such algorithmically generated decisions may appear fair on a spreadsheet but systematically disadvantage those whose lives and needs cannot be neatly reduced to quantitative data. This typically includes people with complex, fluctuating, or highly atypical support requirements. Cultural or language barriers, limited personal capacity, lack of resources, or poor assessment practices can magnify these errors, resulting in a distorted and partial view of an individual's true circumstances. What the algorithm captures is accepted as absolute truth, even when it catastrophically fails to reflect the person's lived reality.
International Warnings and the Path Forward
We need only examine international precedents to foresee where this path leads. In the U.S. state of Arkansas, an algorithm introduced to ration care for people with severe impairments led to recipients having their support hours drastically cut. In subsequent Senate hearings, lawmakers heard harrowing evidence of "people lying in their own waste, going without food, going without any sort of community contact."
All public resource systems grapple with the tension between consistent process and fair outcomes. By almost entirely removing human discretion, Australia risks creating systems that sacrifice essential nuance for the false god of consistency. Consistency of process does not guarantee fair outcomes; it can entrench inequality and amplify harm. Excessive standardization produces impersonal processes that overlook individual needs and the complexities of lived experience. This has been termed algorithmic mis-recognition: a form of moral injury experienced when our lived reality is ignored, erased, or treated as irrelevant by the very systems meant to support us.
Social care services demand a radically different approach. They require systems that are fundamentally attentive to lived experience and fairness of outcomes, which means providing the right support tailored to a person's unique life circumstances. Well-governed systems should support and enhance human judgment and accountability, using technology in limited, safe, and transparent ways to inform—not replace—critical decision-making.
The IAT serves as a stark warning. Care for vulnerable citizens cannot be reduced to rules and scores alone. Aging and disability are profoundly human experiences, and decisions about care carry life-altering consequences. When society relinquishes human judgment and the capacity to override automated decisions, we place countless lives at the mercy of a flawed and unfeeling system.