MedPageToday posted an article yesterday titled, *Most COVID Patients Landing in Hospitals Aren’t Fully Vaccinated*\*. (I think you’ll have to be a [free] member to read it in its entirety.)

The article indicated that “The vast majority of people hospitalized with COVID-19 at the Cleveland Clinic weren’t fully vaccinated^{1}, the institution said in a statement,” and the conclusion was that “It cannot be more clear the message that vaccines work and it’s the key action that we need to do to get back to our normal lives as they were before coronavirus” (according to the Cleveland Clinic’s ICU director).

The problem is, **the data they’re reporting are insufficient to draw that** — or any other related — **conclusion.**

## The Actual Data

The only bit of actual data they’re working with (at least as reported in the article) is this:

“Among the 4,300 hospital admissions that occurred from Jan. 1 to mid-April, 99% were not fully vaccinated.”

At first blush, yes, that seems like slam-dunk proof that the vaccines are highly effective. But is it? Let’s think a little longer.

## How Do Statistics Work?

We’ll start with a really obvious — though likely not real-life — example. If 99% of the *population* is unvaccinated and 99% of the *hospital admissions* are unvaccinated, what does that tell us about the vaccine? At that point, the vaccine is likely 0% effective, right? *Because the hospital admissions are exactly representative of the general public.*
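The intuition above can be written out as a toy calculation (a sketch only, assuming hospital admissions are a representative draw from the general public — which real admissions are not):

```python
# Toy base-rate check: if the vaccinated share among admissions matches
# the vaccinated share of the public, no protective effect is visible.

def implied_effectiveness(admissions_vax, population_vax):
    """1 - (risk in vaccinated / risk in unvaccinated), in relative units.
    Simplifying assumption: admissions mirror the general population."""
    risk_vax = admissions_vax / population_vax
    risk_unvax = (1 - admissions_vax) / (1 - population_vax)
    return 1 - risk_vax / risk_unvax

# 99% of the public AND 99% of admissions unvaccinated
# -> 1% vaccinated in both groups -> zero apparent effect
print(implied_effectiveness(0.01, 0.01))  # 0.0
```

The point of the sketch is only that the hospital percentage means nothing without the population percentage next to it.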

Now, the population is obviously not 99% unvaccinated — at least not now. But I wanted to start from that point because it’s easy to understand. In order for the percentage *hospitalized* to be meaningful, we have to *know* the percentage in the general public.

## What About the General Public?

Unfortunately, we *don’t* know the percentage in the general public. As of now (May 13), the CDC is reporting that 36% of the public is fully vaccinated.^{2} That’s up from 32% just in the last week.^{3}

As of April 14th, only 23% of the population was fully vaccinated.^{4} So, at the *most-vaccinated* point in their range, we’re looking at 99% of hospital admissions not fully vaccinated, compared to 77% of the general public not fully vaccinated. That does still seem to portray a notable difference, though not as much as the first impression implies.
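As a rough check on that "not as much as it seems" point, the implied risk ratio can be computed from these two figures directly — under the (over-generous) simplifying assumption that all admissions occurred at the mid-April coverage level:

```python
# Rough check using the mid-April figures cited above.
# Simplification: treats all 4,300 admissions as if they occurred when
# population coverage was at its 23% peak for this window.
admissions_unvax = 0.99   # share of admissions not fully vaccinated (reported)
population_unvax = 0.77   # share of public not fully vaccinated (mid-April)

# Hospitalization risk per person in each group (relative units)
risk_unvax = admissions_unvax / population_unvax
risk_vax = (1 - admissions_unvax) / (1 - population_unvax)

relative_risk = risk_vax / risk_unvax
print(f"relative risk (vaccinated vs. not): {relative_risk:.3f}")
```

Under that most-favorable assumption the difference is large — but, as the next section argues, that assumption is exactly what the reported number doesn’t let us make.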

## Time Span

There’s another major problem here, though, in analyzing the data. The number we’re given is a percentage for January 1 through mid-April, inclusive. We’re not told how many of these admissions occurred when. For all we know, they could all have been in the first week of January or all in the second week of April. (Obviously, it’s more likely that they were spread out, but the point is we don’t *know* how they were distributed.)

Meanwhile, the vaccination rate was changing *very* rapidly during that time. As of January 30th, 5,259,693 people had received 2 doses.^{5} That’s just under 1.6% of the population.^{6} (And at that point, most of those were healthcare workers, so this was not an evenly-distributed 1.6% of the public.)

So we’re left with a variety of possibilities.

**If the cases were largely weighted toward, say, the first couple weeks of January** (a distinct possibility, given the expected trajectories of viral illness), then we’re looking at a vaccination status among the hospitalized that so closely resembles the vaccination status of the general public that no effectiveness is apparent.

**If the cases were largely weighted toward the first couple weeks of April,** then the effectiveness of the vaccine looks better, but this raises other questions — like why the increased and increasing vaccine coverage is correlated with higher *overall *rates of hospitalization. Is it possible, for instance, that vaccinated individuals are less likely to be hospitalized themselves, but more likely to contribute to spread? (This is not entirely implausible, given what we do and don’t know about the vaccines.)

**If the cases were somewhat evenly distributed,** then we’re left wondering how any of the numbers compare, since the fully vaccinated share of the public ranged from under 1% through most of January to 23% in mid-April, which is a pretty broad range.
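The three scenarios above can be sketched as a sensitivity check: hold the reported 99%-unvaccinated admissions figure fixed and vary the assumed population coverage. (Illustrative arithmetic only — the 1.6% and 23% endpoints come from the CDC figures cited above, the 12% midpoint is an assumed round number, and the calculation assumes admissions mirror the population at each coverage level.)

```python
# How much the implied effectiveness depends on the assumed baseline,
# holding the reported admissions figure (1% fully vaccinated) fixed.

def implied_effectiveness(admissions_vax, population_vax):
    # 1 - (risk in vaccinated / risk in unvaccinated), relative units;
    # assumes admissions are a representative draw from the population.
    risk_vax = admissions_vax / population_vax
    risk_unvax = (1 - admissions_vax) / (1 - population_vax)
    return 1 - risk_vax / risk_unvax

admissions_vax = 0.01  # 1% of admissions fully vaccinated (reported)
for label, coverage in [("late January (~1.6%)", 0.016),
                        ("assumed midpoint (~12%)", 0.12),
                        ("mid-April (23%)", 0.23)]:
    eff = implied_effectiveness(admissions_vax, coverage)
    print(f"{label}: implied effectiveness ~ {eff:.0%}")
```

The same 99% figure implies anything from a modest effect (if admissions clustered in January) to a very large one (if they clustered in April) — which is the point: the single aggregate percentage can’t distinguish between them.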

**There simply isn’t enough information here to draw any valid conclusions.**

Yet this kind of data is being presented by the “experts” we’re told to trust, as evidence for the efficacy of COVID-19 vaccines and the reason we should all get them. Whether they’re being intentionally misleading or simply inept or thoughtless, the outcome is the same, and we the public need to be paying attention.

\**I’m using the word “vaccinated” (and “vaccines”) because it’s the terminology in the article I’m discussing, and in the CDC data reports. However, I don’t believe these products are properly considered “vaccines,” as they don’t meet the conventional definitions of a vaccine.*

- Also notice the phrase “*with* COVID-19.” It isn’t clear from this if they were hospitalized *for* COVID-19.
- https://covid.cdc.gov/covid-data-tracker/#vaccinations
- https://web.archive.org/web/20210506003640/https://covid.cdc.gov/covid-data-tracker/#vaccinations
- https://web.archive.org/web/20210201000352/https://covid.cdc.gov/covid-data-tracker/#vaccinations
- The CDC wasn’t reporting this figure as percentage of the population at that point, so I’ve worked backward from the later reports to determine what they were using as their baseline.
