No. At first it wasn't, but slowly lawmakers started to realize how bad conditions were and created laws and regulations to protect the working class.
With the end of the Civil War came a great transition into a mechanized, factory-based economy, which took workers away from the farms and put them into the factories. Many people saw this as corporate greed, which was in some ways true--this led to the formation of many unions and workers' organizations.
It's when you have completely schooled a person and shown them that it's over and they've lost, point blank, period.
Wealthy residents bought works of art, which supported artists.