Probably not dragons.
For a while, developments in nanotechnology provoked fears that the Universe would be reduced to grey goo by ever-multiplying nanorobots. Or something. This, together with the Matrix, suggests we should be wary of any technology that appears not to be fully under our control. Creating an AI consciousness, as in the Matrix prequel, could give rise to a rival and utterly incomprehensible intelligence. Asimov's Laws of Robotics spring to mind - there are three of them - which he proposed be programmed into every hypothetical robot, forbidding it to harm humans. A kind of morality, in fact.
One scenario that could plausibly happen is the Armageddon/Deep Impact one - an asteroid or comet strike - and at present there is absolutely nothing humanity could do about it.
However, I think the most likely one, and one that would not make a good dramatic film, is everybody's favourite - climate change. Through our chronic and irreversible stupidity, humanity is sooner or later going to wreck this planet beyond repair, dramatically reducing its capacity to support life. That's if we don't wipe ourselves out in a thermonuclear war first.
If international attitudes remain as they are, I expect this war will start when the oil finally runs out. As human civilisation begins its final collapse, the countries still demanding the release of oil reserves that no longer exist will eventually lose patience and resort to violence, and we will wipe ourselves off the face of the Universe in a colossal cataclysm of our own making.