Is soccer ruining America?

I came across an article titled “How Soccer Is Ruining America” and was quite astonished after reading it. Not only is it a full-on bashing of the sport, but the writer also strikes me as something of a hypocrite, considering he has three children who play soccer and are supposedly quite good! I agree with Soccer Dad, who says ignorance is ruining America, not soccer. What are your thoughts on both sides? Feel free to leave a comment.