For over 25 years, women have earned the majority of college degrees in the United States. Between 1990 and 2016, the number of women seeking undergraduate degrees increased by 52 percent, while the number of men seeking degrees increased by only 39 percent. The difference is even more pronounced at the post-graduate level, where the number of women has spiked by 80 percent, compared to 43 percent for men.
Various theories exist to explain the widening gender gap. From childhood, some argue, boys have more difficulty thriving in a classroom environment. Additionally, while many women excel in male-dominated fields, men have tended to be more reluctant to build careers in fields traditionally held by women, such as teaching and nursing—although that trend is slowly changing as well.
Regardless of the reasons, employers value education, and these women are reaping the benefits of a range of career possibilities. Their education also allows women graduates to command higher wages and positions, as evidenced by the rising number of women Chief Executive Officers (CEOs). Because women see higher education as excellent preparation for attaining their desired careers, an ever-increasing number of them are embracing the opportunity.