The major moments in US History after the Civil War
Looking at the major moments in US history after the Civil War, what is the single most defining moment in American history? How did it impact the United States positively? It must fall between Reconstruction and 2018.