Liberalism in America
Liberalism, by definition, incorporates the belief that equal rights and liberty belong to all of humankind, and when it comes to liberalism in America, many hold this belief. Liberal thought is ingrained in American society, as witnessed in the strong desire of many Americans to uphold their rights and freedoms when faced with adversity. American liberalism is a strong force that guides the decisions of many people in the country, and because America's past includes few great social revolutions, American liberalism is seen as distinct from other forms of liberalism across the globe. Liberal thought has long occupied the minds of Americans and can be traced back to the beginning of America in 1776 with the Declaration of Independence, a document that signified the desire of the American people to be free and to enjoy equal rights. As far as Britain was concerned at the time, social liberalism was not a subject it would discuss with the American people, and many in Britain felt that the monarch determined how much freedom the American people should enjoy.
The American Civil War and the civil rights movement were two events that further demonstrated the belief of the majority in equal rights and liberty for everyone. The Civil War pitted brother against brother because each side perceived an absence of freedom. For the South, that absence took the form of slave owners being unable to keep their slaves as property after abolition; for the North, the war was fought over slavery and the cause of abolishing it. Both sides of the battlefield felt strongly about their personal freedoms, which is a sure sign of the liberalism that many Americans practice.