Was too weak while the state governments were too strong
I do not think so.
In my opinion, the foreign policy of the current US government has diminished the country's standing in the world. The United States has always been seen as the country of freedom and the American dream. However, that sense of freedom has faded in the face of nationalist rhetoric. Even so, I believe it is possible for the US to once again be the best country in the world. Governments and their policies are transient; we have a solid democracy, and our culture is still hegemonic.
Group A
(the only reason I know this is because you told me)