How nurses can learn to love their bodies

By Atlanta Journal-Constitution  
   January 02, 2020

Body positivity is a movement that promotes the idea of viewing your body – as well as others’ – positively, despite societal pressure to look a certain way. Its goals include addressing unrealistic body standards and helping people feel confident in and accepting of their bodies.
