• Imperialism

    Americans had always sought to expand the size of their nation, and throughout the nineteenth century they extended their control toward the Pacific Ocean. However, by the 1880s many American leaders had become convinced that the United States should join the imperialistic powers of Europe and establish colonies overseas.

    Americans Claim an Empire - The Americans textbook, Chapter 10, pp. 342-365

    Imperialism Google Presentations

    Chapter 10 Readers' Guides

    Acquiring New Lands Grid

    Chapter 10 Review Sheet