Wild West
noun
: the western U.S. in its frontier period characterized by roughness and lawlessness
Wild West
adjective
: of, relating to, or characteristic of the Wild West