This isn't a political comment, though some may go there. Please try not to.
It would appear that the role women have been naturally endowed with has become hidden in modern culture. (Note: there is a spectrum of personality from what is considered masculine to feminine, and I respect all expressions.)
Women have long been credited with roles such as nurturer, healer, spiritual guide, peacekeeper, environmentalist, and equal-rights advocate, as well as with ushering the next generation into adulthood with stability, respectfulness, and responsibility, all carried out with great, fierce love. These are the attributes women have most commonly been seen as embodying.
Now, in current culture, in a misguided attempt at equal rights, the media gives us stories and shows of women taking on male-like physical toughness: wielding weapons, blowing up places with bombs, fighting, and basically moving backwards in the evolution of our species. It is truly cringeworthy and sets off alarms, or at least it should.
Creating a better, more peaceful world isn't advanced by women making the same mistakes men have made. We have the power to bring a peaceful balance to society. It is in our nature, and it takes more courage than hate and violence do. So yes, we need to find positions where we can make change.