Forum:WW2 history

Okay, for some reason a member here believes that Germany declared war on Japan after Pearl Harbor. True? As far as I know it isn't. I just looked it up, and the source said Germany declared war on the United States after the US declared war on Japan, because that's how alliances work: if someone declares war on your ally, you support them, in this case Nazi Germany supporting Japan. Can anyone confirm this member's claim that Germany declared war on Japan? I'm finding it hard to believe, so if anyone has a link to a legit source I'd like to see it; otherwise this member needs to back down. BulletBait 133 20:29, 9 July 2009 (UTC)