In the US right now, news articles are talking about the supposed "shortage" of teachers, caregivers, and nurses. Schools and hospitals are scrambling because they can't get enough people to sign on - so many have left the field for less stressful jobs that pay about the same.
Traditionally, people in these jobs are overworked and UNDERpaid, and the fields are also FEMALE dominated. This goes to show that society expects women to bear the brunt of the work even for less than ideal wages. These jobs also get some of the least respect.
People are starting to say that these jobs need to pay more considering the hours and the amount of work that go into them. Teachers are expected to make their classrooms look pretty, pay for supplies out of their own pockets, and then do a lot of work outside the classroom. Nurses and caregivers also do a lot of heavy labor, are on their feet all day, don't always get breaks, and put themselves in risky situations (COVID, other illnesses, even being around violent patients).
I know that if these jobs were traditionally male, there would be higher pay and more protections put into place. Look at the male-dominated fields of trucking, plumbing, and electrical work, for example - these are high-paying careers.
I'm currently looking for a different job and trying to know my worth and set boundaries, and I'm really frustrated and sickened to see how much disrespect female-dominated jobs get.