A new report by RealClearInvestigations reveals that civics education is gaining ground on college campuses across the nation. Civics is defined by Merriam-Webster as “a social science dealing with the rights and duties of citizens.” This “ambitious movement to reform”… Read More
Before there was the United States, colonists lived in America but were still ruled by their home countries. After a while, the colonists in Britain's American colonies felt the king was treating them badly, so they decided to stop following their… Read More