This is true. Although there is considerable debate over the scope of the war's influence, it is undeniable that WWI shaped American society after 1917. Debates over the war's legacy reflected existing political and social divisions within American society during the twenties and thirties, as Americans differed over whether the war's impact should be celebrated. American scholars continue to investigate the wide range of American responses to the war.
The basis of the division was social class: I believe society was divided among the wealthy, middle, and lower classes, with the lower class including enslaved people and indentured servants.
Because of greater local control, the colonists felt they had more say over their lives and government, as opposed to control resting solely with royalty.