The role of algorithms has come under question after front-line workers at Stanford Health Care were passed over for the first wave of coronavirus vaccines. Following protests by medical staff at the California hospital, officials blamed the “very complex algorithm” the hospital had built to decide employees’ place in line.

But a breakdown of the algorithm, sent to medical residents and first published by MIT Technology Review, shows that its real error came from the humans who had designed it: namely, prioritizing employees based on age rather than their exposure to the virus at work.

As local governments begin their vaccine rollout, many are turning to similar algorithms and other scoring systems in hopes of ordering their ranks in a fair, explainable way. Many companies have already begun surveying their workers for data they can plug into the formula for clean results.

But Stanford’s example shows how algorithms can become scapegoats for people’s decision-making flaws. Stanford officials’ blaming of the algorithm glossed over the fact that the system was simply following a set of rules put in place by people. Algorithm designers worry that casting blame on what in fact are mathematical formulas created by humans will feed public distrust of the vaccines’ rollout.

The real problem, they argue, is arbitrary and opaque decision-making that doesn’t engage with the people who would be most affected by the decisions. Had the algorithm’s workings been discussed more transparently beforehand, medical professionals would have been more confident in the results — or even, perhaps, spotted the oversight in advance.

“You can’t cut humans out of the process,” said Cat Hicks, a senior research scientist at the Khan Academy. “There needs to be testing, verifying, trust-building so we have more transparency … and can think about which people might get left out?”

But algorithms are already playing a key role in deciding vaccine deployments. Some of the first people to be vaccinated in the United States, earlier this month at George Washington University Hospital, were selected by an algorithm that scored their age, medical conditions and infection risk, a federal official told the New York Times.

Algorithms are also guiding the federal government’s mass distribution of the vaccine nationwide. The Trump administration’s Operation Warp Speed is using Tiberius, a system developed by the data-mining company Palantir, to analyze population and public-health data and determine the size and priority of vaccine shipments to individual states. The effort has already been marred by supply miscommunication and shipment delays.

The Stanford “vaccination sequence score” algorithm used a handful of limited variables — including an employee’s age and the estimated prevalence of coronavirus infection in their job role — to score each employee and establish their place in line. People age 65 or older, or 25 or younger, got a bonus of 0.5 points.

But the issue, some algorithm designers said, was in how the different scores were calculated. A 65-year-old employee working from home would get 1.15 points added to their score — based purely on age. But an employee who worked in a job where half the employees tested positive for the coronavirus would get only 0.5 additional points.
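Those figures illustrate how an unweighted, additive score lets age swamp workplace exposure. The sketch below is a hypothetical reconstruction, not Stanford's published formula: the constants are chosen only to reproduce the two numbers quoted above (about 1.15 age points for a 65-year-old and 0.5 exposure points for a unit where half the staff tested positive), and every function name and scaling choice is an assumption.

```python
# Illustrative sketch only, not Stanford's actual formula. The constants are
# chosen to reproduce the figures quoted above: a 65-year-old gets about
# 1.15 points from age alone, while working in a unit where half the staff
# tested positive adds only 0.5 points.

def age_points(age: int) -> float:
    """Hypothetical age contribution, scaled so a 65-year-old earns ~1.15."""
    base = (age / 65) * 0.65                         # assumed linear age term
    bonus = 0.5 if age >= 65 or age <= 25 else 0.0   # the 0.5-point bonus described above
    return base + bonus

def exposure_points(unit_positivity: float) -> float:
    """Hypothetical exposure contribution: 50% positivity -> 0.5 points."""
    return unit_positivity

def sequence_score(age: int, unit_positivity: float) -> float:
    """Unweighted additive score: nothing stops age from dominating exposure."""
    return age_points(age) + exposure_points(unit_positivity)

remote_admin = sequence_score(age=65, unit_positivity=0.0)        # ~1.15, works from home
frontline_resident = sequence_score(age=28, unit_positivity=0.5)  # ~0.78, high-exposure unit

print(remote_admin > frontline_resident)  # True: the remote administrator ranks higher
```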

That scoring-system flaw, among others, frustrated and confused medical residents when they saw that high-level administrators and doctors working from home were being vaccinated before medical professionals who spend every day in patient rooms. Only seven of the hospital’s resident physicians were among the first 5,000 in line.

Unlike more sophisticated algorithms, the Stanford score does not appear to have added additional “weight” to more important factors, which could have prevented any data point, such as age, from overly skewing the results. Further compounding the problem: Some residents told ProPublica that they were disadvantaged because they worked in multiple areas across the hospital and could not designate a single department, which lowered their score.
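By contrast, a minimal sketch of explicit weighting, reusing the hypothetical point values above, shows how an exposure weight larger than the age weight would put the front-line resident ahead of the remote administrator. The weight values here are invented purely for illustration.

```python
# Continuing the hypothetical point values above, but with explicit weights so
# that workplace exposure counts for more than age. The weight values are
# invented for illustration only.

AGE_WEIGHT = 0.5
EXPOSURE_WEIGHT = 2.0

def weighted_score(age_pts: float, exposure_pts: float) -> float:
    return AGE_WEIGHT * age_pts + EXPOSURE_WEIGHT * exposure_pts

print(weighted_score(1.15, 0.0))  # remote 65-year-old administrator: ~0.58
print(weighted_score(0.28, 0.5))  # 28-year-old front-line resident: ~1.14
```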

When dozens of residents and others flooded the medical center in protest, a Stanford director defended the hospital by saying that ethicists and infectious-disease experts had worked for weeks to prepare a fair system but that their “very complex algorithm clearly didn’t work.”

In an email to pediatrics residents and fellows obtained by The Washington Post, officials said that “the Stanford vaccine algorithm failed to prioritize house staff,” such as young doctors and medical-school residents, and that “this should never have happened or unfolded the way it did.”

Stanford Medicine has since said in a statement that it takes “complete responsibility” for errors in the vaccine distribution plan and that it will move quickly to provide vaccinations for its front-line workforce.

“If we lived in a thriving, economically just society, people might trust algorithms. But we (mostly) don’t,” Jack Clark, the policy director for the research lab OpenAI, wrote in a newsletter last week. “We live in societies which are using opaque systems to make determinations that affect the lives of people, which seems increasingly unfair to most.” The Stanford backlash, he added, is just “a harbinger of things to come.”