We have been reading about the age of imperialism. In the early 1900s, imperialism was accepted practice among the powerful countries of the world. Since then, it has fallen out of favor and is now viewed negatively. The questions I want you to consider are: Is the United States an imperialist country today? And is imperialism actually bad?