Feminists believe that men and women are equal and that women deserve the same rights as men in society. Women shouldn't be treated as less important, because without us men wouldn't even be here. We can do the same things men can. And it's not "cancer" to be a feminist.
Well, they were affected in many ways. Some had to live without fathers, and some even had their homes destroyed, but most important of all is that they all had FREEDOM.
The Maurya Empire and the Gupta Empire are the most significant, but there have been more than those.
Hope this helps!
They are more likely to have a good education.