Quote:
Originally Posted by Blue Player
America has never won a war on its own. The only victories America has ever had came when it had either France or the UK on its side. The moment either of those two countries pulls out, American soldiers end up in body bags.
The truth is, buddy, that if the US hadn't stepped in, you'd be speaking German right now. HA!