
WEST GERMANY

(noun)

Definitions

There is 1 meaning of the phrase West Germany.

West Germany - as a noun

A republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990

Synonyms (Exact Relations)
Federal Republic of Germany

