Imperialism

Brian Amaya

Imperialism is a term used to describe the domination of one state over one or more others. In the early twenty-first century, imperialism is generally regarded as indefensible. After World War II ended in 1945, and increasingly during the late twentieth century, most people came to view imperialist policies as both morally reprehensible and economically unsound.