Is a business required to provide health, life and other insurance coverage for its employees?
UPDATED: June 19, 2018
While many employers provide health insurance and other benefits for their workers, the law generally does not require a business to do so. Most businesses offer these benefits to attract and retain good employees and as an additional form of compensation. There are exceptions, however. Employers generally must carry workers' compensation insurance to cover workplace injuries and illnesses. Businesses that employ unionized workers must provide whatever benefits their union contracts require. It may also be necessary to provide certain types of employee benefits as a condition of doing business with or for certain governmental entities or agencies.