Definitions
There is 1 meaning of the phrase Republic Of Namibia.
Republic Of Namibia - as a noun
A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high Namibian plateau of South Africa
A-Z Proximities
- reptiliferous
- reptilious
- reptilium
- reptiloid
- republic
- Republic Of Namibia
- republican
- republicanise
- republicanised
- republicanises
- republicanising