Opinion: Private equity firms are gnawing away at U.S. healthcare

By The Washington Post  
   January 11, 2024

The increasing role of private equity in healthcare is a trend that should have everyone's attention, from politicians to patients, because it can significantly increase costs, reduce access and even threaten patient safety.
