Colonialism
Colonialism is a practice by which a country controls people or areas, often by establishing colonies,[1] generally for strategic and economic advancement.[2] There is no clear definition of colonialism; definitions may vary depending on the use and context.[3][4][5][6]
The word colonialism derives from the Roman colonia, a term for a farm and, later, an outpost or the largest class of Roman city. It is formed by adding the -ism suffix and has been associated with a variety of philosophies and structural understandings of colonies.[6]
Though colonialism has existed since ancient times, the concept is most strongly associated with the European colonial period beginning in the 15th century, when some European states established colonising empires. At first, European colonising countries followed policies of mercantilism, aiming to strengthen the home-country economy, so agreements usually restricted the colony to trading only with the metropole (mother country). By the mid-19th century, the British Empire had given up mercantilism and trade restrictions and adopted the principle of free trade, with few restrictions or tariffs.
Christian missionaries were active in practically all of the European-controlled colonies because the metropoles were Christian. Historian Philip Hoffman calculated that by 1800, before the Industrial Revolution, Europeans already controlled at least 35% of the globe, and by 1914 they had gained control of 84% of it.[7] In the aftermath of World War II, colonial powers retreated between 1945 and 1975, during which time nearly all colonies gained independence, entering into changed, so-called postcolonial and neocolonial relations.
Postcolonialism and neocolonialism have continued or shifted the relations and ideologies of colonialism, justifying its continuation with concepts such as development and new frontiers, as in the exploration of outer space for colonization.