Friday, September 14, 2018

How has globalization made the world more Americanized?

First, we must note that the world has not necessarily become more Americanized. One can argue that the opposite has happened: as the world has become more globalized, other countries have gained influence. China, for example, has become far more important in places like Africa, perhaps becoming more influential there than the US....
