It should be the British,
because ever since the United States of America was formed, African Americans have been looked down upon and slandered. Mexicans have also been targeted in recent years, and the Japanese fought against the US during WWII, so people hated them greatly.
Answer:
For African Americans in the South, life after slavery was a world transformed. Gone were the brutalities and indignities of slave life: the whippings and sexual assaults, the selling and forcible relocation of family members, and the denial of education, wages, legal marriage, homeownership, and more. African Americans celebrated their newfound freedom both privately and in public jubilees.