For Britain, ultimately, no. Look at where it got us now. We act like it never happened because people seem, for some reason, to be ashamed of it, and the world treats us like a worthless country even though we practically built up the superpowers of the modern world.
"Arrogance is what made the British Empire.... And lost it.... And then pretended it never happened."
In terms of the world now, yeah, I think it was. English is the dominant language, and Western culture is the dominant culture in about 60% of the world now, all because of the foundations laid down by the British Empire.