I watched a documentary a couple of weeks ago, and its finale shows Americans dancing and singing happily, with a person holding up newspapers with the words splashed across the front page: "JAPAN SURRENDERS", "VICTORY".
Of course, we all know that the Japanese surrender marked the 'end' of WWII.
But the question remains: has WWII truly ended, or is it still going on?
What do you guys think? Poll coming up.