Women's college
undergraduate college consisting entirely or predominantly of women
Women's colleges are institutions of higher education whose students are all or almost all women. They are typically undergraduate, bachelor's degree-granting liberal arts colleges.