Feminists believe that men and women are equal, and that women deserve the same rights as men in society. Women shouldn't be treated as less important, because without us men wouldn't even be here. We can do the same things men can. And it's not "cancer" to be a feminist.
We should not have feminism. Feminists SAY they believe in equality, yet want to take away ANY rights a man has. They also never discuss men's issues, such as unfair custody rights, having no say in abortion whatsoever, and being automatically painted as the bad guy in any situation. If they talk about equality, then they should talk about men's issues as well, not just how "oppressed" they are.
The departure of the Allied nation of Russia in late 1917 was a turning point in the war: Russia withdrew from the war due to an internal revolution, which brought about a new era of communism.