Do I need workers' compensation insurance?

The short answer is yes. Workers' comp is mandatory in nearly every state for businesses with employees (Texas is the main exception). Even if you're not legally required to carry it, workers' comp can still be a valuable asset, protecting both your business and your employees. It's the only type of insurance that provides medical and disability benefits for injured workers while also shielding employers from most workplace-injury lawsuits.

If you're unsure whether you need workers' comp insurance, consult an insurance agent or check your state's requirements through your department of labor and/or workers' compensation board.