The dictionary definition of west, covering all senses of the word.
1. a location in the western part of a country, region, or city
2. the countries of (originally) Europe and (now including) North America and South America
3. the region of the United States lying to the west of the Mississippi River
4. English painter (born in America) who became the second president of the Royal Academy (1738-1820)
5. United States film actress (1892-1980)
6. British writer (born in Ireland) (1892-1983)
7. the cardinal compass point that is at 270 degrees
8. the direction corresponding to the westward cardinal compass point
9. situated in or facing or moving toward the west
10. to, toward, or in the west
Usage examples:
1. "we moved west to Arizona"
2. "situated west of Boston"