Quote:
Originally Posted by Magnum PEI
I'm officially boycotting this show. What the hell were Americans doing in Hawaii and the Philippines? How did they acquire these "territories?" I'd like to see Spielberg make a TV show on that subject.
Suit yourself. It was called "imperialism," and it was practiced by the US, Great Britain, France, Germany, Spain, Belgium, the Netherlands, Japan, and every other nation that acquired colonies over the last few centuries. Why should Spielberg need to explain this? This miniseries, like Band of Brothers, is about the horror of war and how it affected the men tasked with fighting it. Were they any less brave because they came from countries with questionable foreign policies? Other than blind hatred of all things American, I don't understand your argument.