Is it fair to say that football in the U.S. is now a truly growing sport after many, many years of being considered a joke?
I'm English and a massive footie supporter who watches not only plenty of English games but lots of European and World football too. I have to say that, watching some MLS games, the football is played in the right manner (ball kept on the floor with lots of movement and passing).
So, has the U.S. finally embraced the beautiful game?
