Definition of "West":
noun: a location in the western part of a country, region, or city
noun: the countries of (originally) Europe and (now including) North America and South America
noun: the region of the United States lying to the west of the Mississippi River
noun: English painter (born in America) who became the second president of the Royal Academy (1738-1820)
noun: United States film actress (1892-1980)
noun: British writer (born in Ireland) (1892-1983)
noun: the cardinal compass point that is at 270 degrees
noun: the direction corresponding to the westward cardinal compass point
adjective: situated in, facing, or moving toward the west
adverb: to, toward, or in the west
"We moved west to Arizona."