The Current Role of the US in the World
In an effort to spark a real and sincere debate, I would like to know what people think the role of the United States should be in the world today. We have all heard the liberal and conservative rhetoric time and time again. There are many areas where the US has a responsibility and an opportunity to make meaningful and positive contributions. Africa is an example of this. Given that 3.1 million people were infected with HIV in 2004, is the US doing enough?