Definition: Imperialism
Imperialism is the practice by which a powerful nation takes over another territory and imposes its political, economic, and social ideals on the conquered people. It was typically rationalized through Social Darwinism, a belief in "survival of the fittest." Most European nations pursued imperialism in the mid- to late nineteenth century. Britain amassed the largest empire, focusing mainly on Africa and Asia.