For a nice long while now a friend of mine has been posting links on Facebook to stories about naturism, and the more I read, the more I like it — and the more I appreciate how beautifully it dovetails with body acceptance. From Well-Being Magazine:
"When we shed our clothes, we also shed the constraints of social status by which we are often judged. Naturists make no such judgements and simply get to know the true person within.
Then of course there are the positive benefits for body image. How many people go through their lives tortured by insecurity about perceived 'faults' with their bodies, suffering low self-esteem as a result? Our blemishes, scars and 'imperfections' are not unique; we all have them. Naturism helps us to understand this and also to realise that the human body is to be celebrated for its rich diversity, not held as an object of shame."
When you're constantly seeing different bodies — men's and women's, of all ages, shapes and sizes — you quickly come to the realization that we're all 'flawed', and that the bodies we see so often in the media are only a very narrow slice of all the beauty and diversity humans enjoy. By seeing older bodies, we can learn to accept what comes to all of us who live long enough. And when children in naturist families see all those different bodies, they gain confidence in their own and develop a kind of mental and emotional armor against the garbage body-image messages they're so frequently attacked by.
But isn't being naked sexual? Isn't being naked around kids a crime? No and no. The naked body has long been celebrated in art, and it only becomes sexualized when *we* sexualize it. If it were inherently sexual, men would never be allowed to walk around topless, as they so often can and do, and women (in Ontario, anyway) wouldn't have won the fight to do the same.
What do you think? Is nudism something you enjoy already or something that you'd never do? Are there only certain times or places where it's OK to be naked?