Should Employers Be Required to Provide Health Benefits?

Source: Employee Benefit Research Institute

How do Americans feel about requiring employers to provide and contribute to health insurance coverage for their workers? Would Americans be willing to pay more in federal income taxes to ensure that everyone has health insurance? The 2007 Health Confidence Survey, released recently by the nonpartisan Employee Benefit Research Institute (EBRI), asked basic questions to gauge reactions to several health care policy changes currently under consideration at the national level. Some of these questions concerned ways in which health care coverage could be expanded to include all Americans; others concerned the tax treatment of health care benefits.