
WEST INDIES

(noun)

Definitions

There is 1 meaning of the phrase West Indies.

West Indies - as a noun

The string of islands between North America and South America; a popular resort area.

Synonyms (Exact Relations)
the Indies

Word Variations & Relations

A-Z Proximities

Add 1 Letter To Make These Words...
