Can Employers Require Covid Vaccinations?

By Kathy Morris - Nov. 23, 2020


Many employers and employees are eager to return to work as normal. Promising news of a Covid vaccine means normal just might be on the horizon.

However, 1-in-3 Americans say they will not get a Covid vaccine, creating a dilemma for companies.

How will employers handle the unvaccinated masses in the workplace? After all, Covid outbreaks are costly for employers and hamper productivity. Will companies incentivize employees to get the jab or even mandate the vaccine?

Are employers even legally allowed to require employees to take the vaccine? The short answer is yes, your employer can require you to get the Covid-19 vaccine.

Below we break down why and how likely it is you’ll be required, along with who is exempt from vaccine mandates.

Why Employers Can Require The Covid Vaccine

Typically, employment in the United States is at will.

Since “at-will employment” means an employee can be terminated at any time without any reason, employees have relatively few protections from the desires of employers, barring discrimination or violations of labor law.

At-will employment goes both ways, meaning you are free to give your boss a big fat no if you don’t want a Covid vaccine.

However, your employer is perfectly within their legal rights to make vaccination a job requirement.

Already, many employers mandate flu shots, restrict smoking, and regulate other deeply personal health decisions.


However, there are legally protected exceptions to vaccine mandates. You can find a breakdown of exemptions below.

Who Is Exempt From Mandatory Covid Vaccinations

Vaccine exemptions fall under two buckets:

  • Medical exemptions. Thanks to the Americans with Disabilities Act, if you have a condition or illness that puts you at greater risk of negative consequences, you can be exempted from a workplace vaccine mandate.

  • Religious/philosophical exemptions. Title VII of the Civil Rights Act means workers opposed to vaccinations due to “moral or ethical beliefs” that are “sincerely held with the strength of traditional religious views” are exempt.

Which Employers Will Require Covid Vaccines

While most employers could legally mandate Covid vaccines, most won’t.

Instead, most companies will focus on incentivizing employees to get inoculated. Expect to see lures such as free and on-site vaccinations. Other employers may go further, offering gift cards and other fun perks.

However, some employers will no doubt mandate Covid vaccines. Hospitals and nursing homes that work with high-risk and vulnerable populations are an example of workplaces likely to mandate Covid vaccines.

Many of these employers already mandate flu vaccines, TB tests, and other medical intrusions, making it a short leap.

Other workplaces prone to super spreader events, such as prisons and meat processing plants, may join them to protect staff and business functions.

However, it probably won’t just be employers mandating vaccines.

Vaccines could be required to gain access to schools, sporting events, or to travel between states without restriction.

While your particular workplace might not require a vaccine, many other facets of your life may be dependent upon receiving one.


Author

Kathy Morris

Kathy is the head of content at Zippia with a knack for engaging audiences. Prior to joining Zippia, Kathy worked at Gateway Blend growing audiences across diverse brands. She graduated from Troy University with a degree in Social Science Education.
