The United States became involved in World War II only "c. after Japan attacked Pearl Harbor," since the attack was an unambiguous act of war against the US that demanded a military response.
Military conquests by the Arab Empire and the activity of Muslim traders were the primary means by which Islam spread to Africa, Europe, and Asia.