
WEST COAST

(noun)

Definitions

There is 1 meaning of the phrase West Coast.

West Coast - as a noun

The western seaboard of the United States, from Washington to southern California.

Synonyms (Exact Relations)
West Coast of the United States

Example Sentences

"I love the west coast weather."
"They went on a west coast tour."
"We enjoyed the west coast scenery."
"She prefers west coast cuisine."
"He has a west coast style of dressing."

