The American View: COVID-19 Process Failures and Dallas’s “Missing 5,200”

2020 has been a brutal year for news. Even if you somehow managed to ignore the political cacophony, economic collapse harbingers, and increasing domestic terrorism trends, there's been one consistent news story that everyone can't help but track obsessively: the daily increase in local coronavirus infections. It's the most important story on every rational adult's mind because it informs our estimation of how likely we are to get exposed, infected, and potentially killed. Kinda makes sense that we'd all be glued to our screens. 

On the last two days of March, Dallas County's daily infection rate increased from 61 to 82 cases. On the last two days of April, it increased from 112 to 179. In May, 219 to 228. In June, 572 to 601. In July, 537 to 707. This is all just for Dallas County, mind; the Dallas–Fort Worth Metroplex consists of 11 counties and 7.5 million people. Dallas just happens to be the most populous of the 11 counties, so it's our local bellwether. Even when new infections were rising from ~500 each day to 700, the coronavirus threat still seemed … manageable. Then, on Sunday, 16th August, I opened the Dallas Morning News to this headline:

Backlog in state reporting adds more than 5,000 coronavirus cases in Dallas County

Having ruined my newsprint edition with a blast of explosively spewed coffee, I fumbled with my phone until I found the digital edition. In the article, reporters Dana Branham and Nataly Keomoungkhoun (somehow) calmly reported:

“Of the 5,361 cases reported Sunday, 5,195 came from the backlog, according to a news release from Dallas County Judge Clay Jenkins. That leaves 166 new cases being reported Sunday, which would be the lowest daily number the county has reported in months … 

“The majority of those cases – 4,298 of them – are from tests conducted in July. Thirteen were from tests conducted in March, 149 were from April, 80 were from May, 52 were from June and 603 were from August, according to data released by the county.”

That's more than double the crowd at the last sold-out concert I attended in the "before-time" (Sabaton at Dallas's House of Blues, for the record).

So, okay. A new daily count significantly lower than the 500-700 new cases that had been reported nearly every morning throughout the mid-summer. That's … good news … right? It sort of feels like it should be, except … The "good" part of the news pales when you consider the larger picture. The news we'd been greedily devouring like addicts every morning turned out to have been dangerously wrong. The problem – and the existential threat – was markedly larger than we'd been led to believe. 

Was this revelation proof of a wicked government conspiracy to downplay the severity of the virus? A wilful deception, cynically published to trick people into returning to work in the hopes of revitalizing the stock market for the sole benefit of the already-obscenely wealthy? No. No, it wasn't. As believable as that conspiracy theory sounds, the underreporting came thanks to a coding error in the state's computerized case reporting system. A bug. Perhaps a bit more severe than the bug that causes your favourite videogame to crash when saving. Still, though, just a software bug (as far as we can tell).

Let’s remember that this public health emergency is just that: an emergency. Good people are scrambling in good faith to deal with a grand-scale crisis that most communities and agencies weren’t prepared for. Mistakes are going to happen. Everything about the pandemic response is going to stress systems, networks, processes, and protocols until they break. This is, unfortunately, to be expected.

Here's why I said it's dangerous, though. Dana and Nataly's article went on to say: "Because of the errors, none of the 5,195 cases from the backlog had been revealed to the county's public health team, so no tracing had been done on those cases." [1]

No … contact … tracing … on 5,000+ confirmed infections. Nnnnnnnnnnghhhhhh …[2]

Seems like there’s been, on average, at least one scream-yourself-insensible moment every day of the botched pandemic response. 

The "missing 5,200" reminds me (bizarrely) of that science fiction drama The 4400 from the early 2000s. A huge population of people appear out of nowhere, integrating with normal society, only their blood contains a chemical that has a 50% chance of killing someone if they're exposed to it … and most people interacting with them don't realize the danger they're in. Come to think of it, I should probably finish watching that series. It's not like I'm leaving the house anymore …

This “missing 5,200” drama seems more like a cheap made-for-cable re-make of a better-written drama. Besides, these people knew that they were sick and could take responsible precautions to avoid infecting others; it was only the epidemiologists who didn’t know. It’s not like there were 5,200 homicidal maniacs hiding in back alleys, waiting to leap out and cough on someone. At worst, the lack of contact tracing meant that some potentially infected people could have been tested earlier … but if they contracted the disease, they probably showed up in the next round of statistics. The number is scary, but the actual threat they posed has largely passed. Hopefully. 

Looking at it pragmatically, for six months us North Texans believed that our public health officials had a good lock on the problem. Politicians, maybe not so much, but they're always shifty varmints at the best of times. Health experts, though, have been reliably transparent, forthright, and thoughtful in their interactions with us in the stressed-out public. We've all accepted the doctors' and scientists' unpleasant news and done our best to tactically plan our days according to our own individual risk tolerances. To then learn that 5,000+ people had gotten infected and nothing had been done to trace their potential spread of the disease felt like being unexpectedly ejected from an airplane. 

Again, though, this was – much as no one likes to admit it – to be expected. There were always going to be process errors! Everyone with experience in large-scale operations planning is painfully aware that reports coming in from "the field" during a chaotic event are often late, usually insufficient, and sometimes flat wrong. The workers on the ground aren't trying to be difficult; rapidly-evolving situations suffer from "fog of war" complications, making situational awareness, data integrity, and communications extraordinarily difficult to maintain. Information processing channels – like Texas's COVID-19 reporting system – can malfunction without anyone noticing simply because the flood of information that does get through is overwhelming enough on its own. 

Bottom line: processes and systems will fail. Maybe not all at once, and maybe not to the extent that people notice the failure in time to pre-empt a disaster. Like rust on a ship’s hull or cracks in an aircraft’s wing root, the stress of excessive demand will find and exacerbate all of the potential weaknesses in a system that always worked reliably under non-emergency conditions. 

That’s why the military is so fond of massive, complicated war games and exercises. Stress-testing everything in your arsenal is the only way to be sure it’s all going to work dependably when you really need it.  

There’s a great takeaway lesson here for business leaders (assuming any of us survive the pandemic). Disaster response and other, similar, emergency operations plans need to be regularly stress-tested past the point of failure. Not just sand-tabled at a leisurely pace but pushed up to and beyond their designed redline. Processes, systems, and plans need to be pushed until they break so that they can be redesigned and rebuilt to a higher performance standard. Then tested again until something else breaks. Repeat as often as budget and endurance allow. We do this with critical machines; we have to do the same with critical protocols. 

I’m not trying to imply that Dallas public health officials did anything wrong; not at all. We all thought the reporting and tracking processes were working more-or-less fine for six months. Then some cracks appeared in the essential reporting process, everyone experienced a bone-chilling fright … and the people in charge owned up to it before shifting effort to fix the newly-identified problems. 

Yes, it would have been better if the system errors could have been spotted and corrected before we had a jumbo-sized pandemic to contend with. Thing is, those system errors didn’t (as far as we know) manifest during the H1N1 pandemic … which only featured 6,128 total cases in Texas. So, the overload point seems to be somewhere between … um … 6,128 cases and just over 500,000. Good … to … know. *sigh*

So, yeah. Let’s fix the reporting system now and then maybe stress-test a new version of it for its next probable failure point. I’m guessing we’ll need to build it to operate effectively for at least 1,000,000 cases … and I’d wager that given how things have gone this summer, Texas is going to reach that apocalyptic milestone waaaaaay too quickly. 

Did I mention we’re reopening schools for in-person instruction this month? Yeah. That … ah … might influence the numbers. 


[1] Emphasis added. If you’re reading this article out loud, I suggest ending this paragraph in a rising atavistic shriek.

[2] Please add your favourite profane exclamation here. 

Pop Culture Allusion: René Echevarria and Scott Peters, The 4400 (2004-2007 television show)

© Business Reporter 2020