I believe that it is true, but I'm not 100% sure.
D is the answer. Hope this helps.
B.
The US never declared war on Germany over the invasion of Poland; it was Britain and France that declared war on Germany for that reason.
The United States' main conflict was with Japan, and the US declared war on Japan shortly after the attack on Pearl Harbor.
Once the US declared war on Japan, Germany declared war on the US due to the Tripartite Pact.