Well, an obvious answer would be slavery, depending on how late a period you mean. After slavery ended, the country entered what is known as the Industrial Revolution. The North had already been industrializing for a while, but from the 1870s onward things ramped up considerably. Factories producing clothing, shoes, farm equipment, packaged foods, and so on became more common, alongside coal mines, railroads, and steel mills. The country was moving toward mass production and away from agricultural ways of living. This also meant child labor, which led to child labor laws in later years, and it meant that many people left farming communities for larger, noisier, more crowded cities. Many also immigrated to the US to take advantage of those opportunities, which led to some tension between communities.
Hope that helps!
The answer to your first question is C, and the answer to your second question is B.
Whose reign are you asking about?
They should, because they are the ones who pass the laws. They should take a more active role and stay informed about what is happening around them so they can make better choices.
They began to get greedy and wanted to make a profit from the money people had entrusted to them.