US Women's National Team Wins The World Cup

Congratulations to the United States Women's National Team, 2015 FIFA Women's World Cup Champions! Thank you for being an inspiration to the millions of soccer players and fans across this great nation!