Should Employers Be Required to Provide Health Insurance to Their Employees?

The Obamacare mandate for large employers kicks in this year; for smaller employers it kicks in next year. But a growing number of economists, on both the right and the left, are saying that mandated health insurance benefits in the workplace are a bad idea. Are they right?
Also published in Forbes, Thursday, February 19, 2015
John C. Goodman is a Senior Fellow at the Independent Institute, author of Priceless: Curing the Healthcare Crisis, and President of the Goodman Institute for Public Policy Research.