DICTIONARY


Definition of NAMIBIA

noun : NAMIBIA

Source: WordNet 3.1

  • 1. a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high Namibian plateau of South Africa
