Opinion: Do we really need DEI in medicine?

By Physician's Weekly  
   July 16, 2025

Public debate about diversity, equity, and inclusion (DEI) programs has intensified, and the medical field has not been spared scrutiny. We have seen people lose their jobs because their roles were to bring DEI into their workplaces. In my view, DEI initiatives are crucial in every field. But are they truly necessary in medicine?
