The United States

The Oxford Dictionary defines wisdom as the quality of having experience, knowledge, and good judgment. Looking over the history of the United States, it can be argued that the U.S. has grown wiser as a nation over the years. This is due to the various obstacles it has overcome, many of which were conflicts with other countries and, at one point, within itself. The decisions made during these times of tension and crisis, however difficult they were to get through at the moment, have on the whole been exceedingly beneficial to the country's progression as a nation and have helped citizens better define what it truly means to be American. Although the United States has existed since the late 1700s, the definition of what it means to be American has transformed dramatically since then. Through each major conflict and hardship, that definition has grown into something far greater than it ever was before, demonstrating the progress this nation has made over the past two hundred years.

The United States was founded on acts of rebellion. Beforehand, the colonies were owned by the British, but in the year 1776 a group of fifty-six congressional delegates decided that they no longer wanted to be anyone else's property, so they signed what is known as the "Declaration of Independence." From that day forward, they were to be the United States of America. No longer