Between 1865 and 1920, the United States played an increasingly active role in global affairs. During the age of imperialism, the United States acquired territories (Alaska, Hawaii, the Philippines, Puerto Rico, Guam) and intervened in the affairs of other nations (Mexico, China, Latin America) for a variety of political, economic, and strategic reasons. The United States also took part in the largest and deadliest global conflict the world had yet experienced, the First World War. Was American intervention in global affairs between 1865 and 1920 justified? Overall, did American intervention have a net positive or negative effect on the world?