Every historical event brings major changes to the society involved in it, and every war marks a turning point in the life of the countries it touches. An average person would say that war affects the economy, politics, and perhaps a country's borders. However, many people never consider how deeply social life is shaped by such a cruel event. Surprising as it may sound, war is not always a purely negative experience; it also brings some positive changes, and World War II is no exception.
Women and minorities suffered in some ways less and in other ways more than the rest of the population. There were many cases of sexual and physical assault against women and minorities. Yet there were also cases in which families with children were left unharmed; moreover, some received more help from the opposing force than from their own people.
As men went off to fight and were killed, ever younger recruits were called up each time. This allowed women to take on jobs that social standards had defined as strictly masculine: women became mechanics and doctors, and some joined the fighting itself. Because the war left so few men, women were able to keep these jobs after it ended and to prove that they were good at them. The average age at which people started working also fell sharply, as most families had lost their providers.
More opportunities opened up for women and minorities during and after the war. This allowed them to hold on to the jobs they had taken and to do things that had once been seen as outrageous for women or minorities. For all the negative experiences the war brought, something good came of it after all.