Jimdamick
Well-Known Member
The definition of capitalism is "an economic and political system in which a country's trade and industry are controlled by private owners for profit." This concept has proven itself by driving economic growth in the US for over a century, envied by the rest of the world. But now we are living in a new century, one in which the rest of the world's countries appear to have embraced it as well. Even China, one of the last bastions of Communism, has jumped on the bandwagon, with incredible success.

But I think the dark side of capitalism is beginning to show, and it doesn't look good for the planet as a whole. I can think of no better place to see that dark side than here in the US. What we are beginning to witness here is a huge gap between the winners and losers, so to speak, and its cost to the health of society at large. In the US it has even become a political issue, with the Republicans embracing capitalism wholeheartedly and the Democrats saying that some adjustment has to be made for the economic security of the entire country.

This is the main issue we are facing today. Is capitalism sustainable, or will it over time prove detrimental to society as a whole? Can a system work in which a few amass most of the available resources, leaving a marginal amount for the rest? This is what made communism such an attractive option for many, but it lost to the economic might of the West. We simply outspent them, and except for China they couldn't keep up. Socialism is supposed to be a compromise between the two, but it seems to be losing the war.

Do you think capitalism is the answer to the economic well-being of the world, or will it in the long run prove to be its downfall? I don't know, but I do know it gives me a bad feeling about the future. There have been revolutions started for less.