What do you think of artificial intelligence and the possibility of achieving the singularity...especially in our lifetime? Dystopia? Utopia? A Frankenstein's monster? A mere hopeful monster in need of nourishment and encouragement? Nary a monster at all? Blarney? Twaddle? Poppycock? Fearmongering? Catastrophizing? Proselytizing? Doom-and-gloomism? No "ism" at all? Personally, I say it's very Promethean: playing with fire...and more than fire, and you certainly must know what happens when you play with fire.

In a nutshell (or in a microchip, as it were [pun intended {he-he-he}]): never create a technology that could then create technologies infinitely superior to you in unfathomable ways...whether in five years or a hundred and five. It matters only that it happens, and I fear that on the path we are on (a quest, really), it's inevitable. And how much can you really trust the people in charge of this quest? You have seen an episode of The Outer Limits, haven't you? If not, take my word for it: they rarely end on a positive note.

But back to my point. That it may not happen for many decades, and then only to other generations, does not eliminate its unlimited potential for disaster, despite its admitted, lower-level boons, especially in medical science; and, hey, that's a good thing. A blessing? Perhaps. But a future disaster is no less a disaster, and it could trump any of the benefits.

One last thought: imagine the lifeforms and technologies that the singularity itself, and its offshoots and descendants, could create. What if one of those creations went rogue? Never underestimate the power of one; one is all it takes, and, after all, one begets two, and why stop there? Truly, these are things beyond ourselves...certainly beyond myself. But, hey, that's just how I see it, and it's a vast, complex topic at that, one that encompasses many related subjects. What say you?

Note: If you yourself are the singularity, please leave me alone. You scare me.