Healthiest Companies in America

The Healthiest Companies in America award recognizes select corporations nationwide for successfully helping their employees live healthier lives. It honors companies for their commitment to the health and well-being of their employees and for creating a corporate culture that encourages everyone to take an active role in managing their own health.

Most importantly, the award recognizes health performance and outcomes. Organizations that are named Healthiest Companies in America have achieved a company-wide low-risk health status by helping employees improve their health.

2013 Healthiest Companies in America