Colonialism
Medical Dictionary
The aggregate of various economic, political, and social policies by which an imperial power maintains or extends its control over other areas or peoples. It includes the practice of or belief in acquiring and retaining colonies. The emphasis is less on its identity as an ideological political system than on its designation in a period of history. (Webster, 3d ed; from Dr. J. Cassedy, NLM History of Medicine Division)
© MedicalDictionaryweb.com 2012