Role of Women in the American West


In the 1800s, women were expected to stay in the “women’s sphere” of society, caring for home and family under the protection of husbands and fathers. The balance of power changed as families moved west, and women expanded their roles.

1 In Business

Some banks in the West preferred offering loans to women to start businesses, because they considered women more reliable borrowers than men.

2 Educators

Because of the need for teachers, Western women were allowed to attend universities; many of them went on to become school administrators and serve on state boards of education. They were also instrumental in helping run missions, churches and schools for Native Americans.

3 Property Owners

Western women were encouraged to hold property in their own name, so families could increase their holdings. This led to some women running ranches and farms by themselves, including supervising male employees.

4 Professionals

The demand for professionals led people in the West to accept women as doctors, lawyers and business owners much sooner than people in the Eastern United States did.

5 The Negative Side

For many women, life in the West also meant drudgery and loneliness. Because of the shortage of labor, women often had to do farm work in addition to housework and caring for children.

