My body belongs to me, but it hasn't always felt that way. Growing up as a female in the United States, I've experienced many things that have made me feel like I don't own my body or that I'm not the only one who has a say in what happens to it.
Ownership & embodiment often go hand in hand. Only I know what's best for my body and what it needs, but sometimes that feels like the furthest thing from the truth. It can feel like parents, doctors, & even partners all know my body better than I do. But giving ownership away means I start to dissociate from my body & give up my power.
I encourage you to explore this yourself. Ask yourself these questions:
In what ways was your body restricted when you were younger?
In what ways was your body allowed to be free?
What early messages did you learn about bodies?
If you could go back & tell your younger self something loving about your body, what age would you go back to? What would you say?
What kind of friend are you on your most loving days? What would it be like to treat yourself the same way? What love languages would you use? (e.g., acts of service, quality time, words of affirmation, gifts, physical touch)
What social or cultural messages shaped how you experience your bodily self?
When have you felt the most social power? How is that experience related to your body?
(Full credit given to Hillary L. McBride, PhD & her book "The Wisdom of Your Body" for these thought-provoking questions)
Wishing you wholeness,