Victory in Europe! People across the country celebrated the end of World War I.
America had taken its place as a leader in the world, and its soldiers were coming home. American life wasn’t done changing, though. Women, African Americans, and workers were restless for the rights they deserved. The war had shown how important those rights were, and how valuable every citizen was to the nation. The wheels of change kept turning, bringing promises of a brighter future.