Body positivity is a movement that encourages people to view their own bodies, and others', positively despite societal pressure to look a certain way. Its goals include challenging unrealistic body standards and helping people feel confident in and accepting of their bodies.