Victory in Europe! People across the country celebrated the end of World War II.
America had taken its place as a leader in the world, and its soldiers were coming home. American life wasn’t done changing, though. Women, African Americans, and workers were restless for the rights they deserved. The war had highlighted how important those rights were, and how valuable every citizen was to the nation. The wheels of change kept turning, bringing promises of a brighter future.