Saturday, 21 September 2013

What are colonies?

Colonies are territories occupied by a nation outside its own borders. People living in colonies do not usually enjoy political or economic rights; they are generally forced to obey the dictates of the powers that govern the parent country, and government officials in the colonies represent the interests of the colonial rulers. From the discovery of America to the First World War, European powers such as Spain, Portugal, France, Great Britain, and the Netherlands captured, colonized, and often exploited other countries. Even Germany held colonies in Africa and on some islands in the Pacific Ocean. The indigenous peoples of the colonies were often forcefully subjugated by foreign settlers, as in North America and Australia, or brutally enslaved and sold off, as in Africa and South America. It was only in the middle of the 20th century that many colonies succeeded in becoming independent.
