Humanity Supersedes Technology: Lessons from the most important tech adoption of our century, the COVID-19 vaccine

Cameron Turner
Dec 15, 2020

By Cameron Turner, VP of Data Science, and Karl Hampson, CTO of AI & Data, Kin+Carta

Photo by CDC on Unsplash

In 2020, lack of predictability has become predictable. The abnormal has become normal. "2020" as a moniker is itself ironic: the American Optometric Association (AOA) defines 20/20 vision as the ability "to see clearly at 20 feet what should normally be seen at that distance." Yet as we enter a new phase of lockdowns globally, this is a year in which very little can be known about next week, let alone next month.

However, the continuous state of “predictable unpredictability” has spurred us to think, reflect, and perhaps match patterns between this generational crisis and our day-to-day work. Whatever solutions are to be delivered for the global pandemic, they will draw on what we’ve learned in other spheres of our lives.

Having spent our careers in the business of prediction (applied AI), we have watched the global distribution of vaccines unfold and seen several patterns emerge that follow the contours of any technology release. The audacious endeavor of vaccinating a majority of the world's 7.8B people bears some similarities to the global adoption of other technologies such as mobile phones, Microsoft Windows, or Facebook.

Here we draw three parallels: global Covid-19 vaccination will represent the largest technology adoption, on the fastest schedule, ever attempted.

Lesson 1: Cultural acceptance must precede distribution

A key to technology adoption is fostering trust. In the US, the role of science has been challenged directly during Covid-19. In both US political parties, facts have been squelched or cherry-picked in favor of particular desired impressions as leaders and media outlets seek to reduce complex topics to digestible soundbites. In turn, the trust of the population is lost.

In a September survey by Pew Research, 49% of Americans said they "definitely or probably would not get vaccinated". While these numbers will no doubt swing with FDA approval, the figure speaks to a natural distrust of new technology, one that could have unknown and unintended consequences.

In the course of studying digital transformation among consumers and the enterprise, we see a similar pattern. A default “no” to additional risk and cost is only flipped by overwhelming evidence to the contrary, often setting up a chicken-and-egg situation, where technology pilots are delayed or cancelled.

Photo by Ehimetalor Akhere Unuabona on Unsplash

In the realm of Artificial Intelligence specifically, the challenges are greater. Machine learning intends to outperform human judgment, using the data from many to influence the outcome for one.

In theory, it's a reasonable proposition (why trust the opinion of your doctor, based on her experience alone, when you can get the opinion of an AI based on all doctors' experiences?). In practice, however, it's complicated. Cultural acceptance of technology change takes time, and must be approached through the lens of subjectivity (hope and fear) rather than objectivity (statistics and ROI calculations).

Lesson 2: Logistics supersede technical merit

Just as in data science, where "the best model" rarely wins, in vaccines we are seeing that the logistics of distribution, not vaccine performance, will decide which vaccine each of us is most likely to receive.

The BBC recently published a table showing the efficacy, cost and storage temperature for the leading Covid-19 vaccines.

Source: BBC December 2020

While the Pfizer and Moderna vaccines are tied for maximum efficacy, at storage temperatures of -70°C and -20°C respectively, the logistics of vaccination in remote regions of the world are mind-boggling.

Too often as technologists building software for large-scale distribution, we focus on the tech rather than the people who use it. Developing hardware and software that works only in ideal circumstances by definition limits the total addressable market. In short, the best technologies then become available only to those in the best of circumstances.

Covid-19 has taught us the importance of avoiding such assumptions and instead putting humans first, designing technology for non-ideal scenarios in which seemingly reliable broadband can glitch and ideal working conditions can no longer be assumed. Companies with this mindset have hurtled forward; Zoom, for example, is engineered to tolerate intermittent connectivity and packet loss. Solutions succeed when designed for both best and worst cases simultaneously, ensuring the best experience for everyone.
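As a concrete illustration of designing for the worst case, here is a minimal Python sketch of one such pattern: retrying an unreliable call with jittered exponential backoff, then degrading gracefully rather than failing outright. This is our own sketch of the general technique, not Zoom's actual implementation; the function names and payloads are hypothetical.

```python
import random
import time

def flaky_fetch() -> bytes:
    """Simulated network call that fails ~60% of the time."""
    if random.random() < 0.6:
        raise ConnectionError("packet loss")
    return b"high-quality payload"

def fetch_with_fallback(max_retries: int = 4) -> bytes:
    delay = 0.1  # initial backoff, in seconds
    for _ in range(max_retries):
        try:
            return flaky_fetch()
        except ConnectionError:
            # Jittered exponential backoff avoids synchronized retry storms.
            time.sleep(delay + random.uniform(0, delay))
            delay *= 2
    # Worst case: degrade gracefully (e.g. lower quality) instead of erroring out.
    return b"low-bandwidth fallback payload"

print(fetch_with_fallback())
```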

Lesson 3: There is no 100% solution

Finally, as euphoria builds over upcoming inoculations, it's important to consider the empty part of the glass as well as the full. Even at 95% efficacy, several confounding factors prevent us from having a "champagne moment" when we flip the switch from a Covid state to a Covid-free state.

Citing the work of Britton, Ball and Trapman (Science, 2020), Anderson et al. calculate the population vaccination coverage required to achieve herd immunity as 1 − (1 / R0), where R0 is the basic reproduction number.

With a Covid-19 R0 of 2.5 to 3.5, this formula estimates that herd immunity is achieved at about 60–71% inoculation.

If efficacy, ε, is considered, this becomes (1 − (1 / R0)) / ε. If we assume ε is 95% (Pfizer/Moderna), then the required coverage becomes 63–75% for the defined range of R0 values.
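These figures are easy to verify. Below is a quick Python sketch of the arithmetic; the formula and the R0 and efficacy ranges come from the text above, while the function name is our own.

```python
def herd_immunity_threshold(r0: float, efficacy: float = 1.0) -> float:
    """Fraction of the population that must be vaccinated: (1 - 1/R0) / ε."""
    return (1 - 1 / r0) / efficacy

for r0 in (2.5, 3.5):
    print(f"R0={r0}: perfect vaccine {herd_immunity_threshold(r0):.0%}, "
          f"95% efficacy {herd_immunity_threshold(r0, 0.95):.0%}")
# R0=2.5: perfect vaccine 60%, 95% efficacy 63%
# R0=3.5: perfect vaccine 71%, 95% efficacy 75%
```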

The problem here is one often seen in digital transformation: few step up to the bleeding edge in adopting new technologies when the risk of unknown outcomes is at its highest. This sets up a paradox: the higher the vaccine efficacy, the lower the percentage of the population that must be vaccinated to achieve herd immunity. In turn, each individual may ask: "If there is risk, why not be among the minority that reaps the benefit later without the risk?", thereby jeopardizing the required level of population inoculation.

Despite the apparent zeal in the market for artificial intelligence (AI) adoption, in Gartner's 2019 CIO Agenda Survey only 14% of organizations reported having deployed AI systems, and fewer than half reported plans to do so by the end of 2021.

While maturity (skills, data and governance) tops the list of reasons reported for this glacial rate of adoption, the second most cited reason is general fears around value, governance and customer privacy.

CIO Agenda Survey, Gartner 2019

While the majority of the risk and benefit of such AI investments is typically realized at the organizational level, in increasingly connected ecosystems of digital services some level of adoption across the value chain must be reached before the benefit can be reaped by any one member of the chain.

Often in our work building AI systems, clients require "real-time" prediction, explanation and recommendation. This works well on highly connected underlying data platforms. However, when there is a dependency on a critical predictive signal that lies in data with high latency (e.g., quarterly earnings reports), the generated model languishes with outdated data.

To solve this nearly universal issue (no one's data is perfect!), we often fill or impute missing or outdated data through methods such as surrogate substitution, last observation carried forward (LOCF) or applied collinearity, as sketched below. Through these methods we can often allay the #2 enterprise maturity concern cited above: data quality.
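As an illustration, here is a minimal pandas sketch of LOCF imputation on made-up data; the column names and values are purely hypothetical.

```python
import pandas as pd

days = pd.date_range("2020-01-01", periods=8, freq="D")
df = pd.DataFrame({
    "price": [10.0, 10.2, 10.1, 10.4, 10.3, 10.5, 10.6, 10.4],
    # A slow-moving signal (e.g. quarterly earnings per share) is mostly
    # missing at daily granularity.
    "eps": [1.5, None, None, None, 1.7, None, None, None],
}, index=days)

# LOCF: carry the last reported value forward until a new one arrives.
df["eps_locf"] = df["eps"].ffill()
print(df)
```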

Similarly, by deploying multiple models simultaneously, properly weighted ensembles (of several models) can produce much higher accuracy and explainability than any one approach, while reducing prediction risk (variance). While no single solution is 100%, using multiple "camera angles" can get us much closer to the objective than any one approach.
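For example, here is a minimal scikit-learn sketch of a weighted ensemble on synthetic data; the choice of models and weights is illustrative, not a recommendation.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, VotingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

linear = LinearRegression()
forest = RandomForestRegressor(n_estimators=100, random_state=0)
# Averaging an interpretable linear model with a flexible forest reduces
# the variance of any single model's predictions.
ensemble = VotingRegressor([("lin", linear), ("rf", forest)], weights=[0.6, 0.4])

for name, model in [("linear", linear), ("forest", forest), ("ensemble", ensemble)]:
    model.fit(X_train, y_train)
    print(name, round(r2_score(y_test, model.predict(X_test)), 3))
```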

Generally, it’s agreed that the same mindset must be applied to vaccination. As in multifaceted business problems, the variations in coronavirus itself suggest a problem-specific response: what is good for one region or population may not be the best treatment for another.

In both contexts, ongoing and cyclical learning is implied by the need to measure a system that is being intentionally altered by the treatment. As the cycle goes on, the recommended next action will necessarily change as well: models update, and dosage and formulation are altered, all to further optimize the ever-adapting system.

Phylogeny of the novel coronavirus, www.nextstrain.org, December 2020

Conclusion

The 1918 global pandemic infected about 500 million people and killed at least 50 million of the world’s 1.8 billion. 675,000 died in the United States.

Time Magazine, 2020

In the end, that pandemic faded with a fizzle: there was no single moment when the economy was "reopened" and life continued as usual. The final slowdown in infections was driven by herd immunity, achieved at great cost of life, plus widespread behavior change including prevalent use of face masks.

As a technology, the vaccines entering the market in the next few months may be the most impactful technology ever to touch our lives. However, if vaccines are the "what", the "how" (logistics and distribution, cultural integration, trust and continuous research) will define the speed and benefit of their adoption.

The Covid-19 vaccination deployment reminds us that technology alone cannot improve the day-to-day lives of its intended beneficiaries. Only by considering human need over technology can we reap the promised benefits that innovation holds.


Cameron Turner

Cameron is the VP of Data Science for Kin+Carta (LON:KCT), a global technology consulting firm.