Definitions
There is 1 meaning of the phrase "The Indies".
The Indies - as a noun
The string of islands between North America and South America; a popular resort area