How Did World War One Change American Society?
Introduction
In 1917 America entered World War One. By doing so, America played a significant role in defeating Germany and restoring peace to Europe. However, the Great War also changed the United States dramatically, bringing far-reaching changes to American society. Indus....