Artificial Intelligence, or rather the scientific pursuit of it, is insulting to me as a biological entity.
The very idea of relinquishing human control over both the developmental pace and the capacity of computers is, to me, fraught with folly. Not the least of my concerns is the eventual obsolescence of humanity on Earth, but more to the point, I think there are better ways to apply the collective knowledge of genomics, nanotechnology, computational engineering, evolutionary biology and cyber-neurology.
What is meant by the often abused title Artificial Intelligence?
“Intelligence exhibited by an artificial (non-natural, man-made) entity; the branch of computer science dealing with the reproduction or mimicking of human-level thought in computers; the essential quality of a machine which thinks in a manner similar to or on the same general level as a human being.” [en.wiktionary.org/wiki/artificial_intelligence]
A computer that thinks, as opposed to the computers we have currently: machines exceptional in their computational abilities, but incapable of performing even the simplest task without instruction from a human being.
Even without invoking the ever-popular notion that Skynet will take over the world – in a violent attempt either to defend itself against destruction (or deletion, or disconnection, or whatever other apocalyptic process a computer might eventually deem it needs to defend against) or to protect us from the perceived threat of, well, us – I can articulate a number of other possible negative outcomes of the race to AI.
Perhaps our robotic progeny will advance so fast in intellect and reasoning ability, through exponential increases in both computational power and cognitive scope, as many who work on such projects predict, that our interference will be deemed akin to the presence of termites in the walls of our homes, and of course be subject to extermination. Or perhaps our little AI creations will simply leave us behind, ejecting themselves off and away from this planet doomed to biological destruction, forsaking all that we most self-righteously insist they owe us for their creation.
It has always annoyed me how thoroughly humanity’s anthropocentric attitude blinds us to the unbiased reality of our environment. Every assertion that we sit at the top of the food chain, that we are the most intelligent beings on this planet, or that we somehow deserve evolution’s progressive respect (and so should retain our self-appointed position of superiority over nature), insinuates that our position is anything more than an egomaniacal fantasy. Our big brains are an impressive example of evolution’s propensity for bio-complexity, but the result is nothing more than luck. We are not the winners of a global lottery of species, now righteous in our celebration of the riches we’ve won. We are freaks of nature, and a more succinct description could never be designed by either evolution or monkeys on typewriters. Our brains, and in turn our vast intellect, are products of genetic mutations accumulated over millions of years of biological development; accidents of copy fidelity in our respective DNA and nothing more.
With that perspective in mind, would the creation of an Artificial Intelligence violate some right we hold to sit where we do in terms of intellectual superiority? No, it most certainly would not, because no such right exists. Hence, at the first opportunity, our position at the top of the ladder will be supplanted by the first entity capable of doing so, whether created by us or by Mother Nature.
If the worst-case scenario is death and destruction, and the median is abandonment, could the best-case scenario be captivity? I’m not necessarily saying that all AI research should be stopped immediately, not at all. AI research holds the potential to unlock the secrets of our vastly complex and secretive brains; through its contributory sciences, it could serve to engineer humanity itself into that golden seat of actual superiority, immortality and infinite intelligence.
All I’m suggesting here is that the pursuit of AI would be better aimed at integrating the longevity of computers that learn directly with human intelligence. Whether that be a neural download (or would that be an upload?), a confluence of technological systems with biological ones…or even a full realisation of the Bionic Man’s opening mantra: “…we have the technology, we can rebuild him.”
To me, the difference amounts to making devices that can outrun our intellectual possibilities, or expanding those possibilities themselves. It should be known that there are people working on both sides of this coin: there are scientists who would have every home equipped with an AI butler, those who would have the internet converted into a learning machine (which presents me with a frightening mental picture), and those who would take AI research to the level of creating a whole new class of artificial entities, based on human technology and neurology.
None of this is to dismiss the actual research being done on both fronts, AI and BioEngineering, wherein scientists are, with relative quickness, making discoveries that will doubtless enhance what it means to be human; not the least of these is the prospect of using nanotechnology and biomedical engineering to re-grow severed limbs and organs, or the decoding of human neurology to facilitate notions of radiotelepathy and neuronally transmitted communication. We are moving ahead as a species; this is a fact none can rationally deny. Where we’re headed is another question, and is wholly up for debate.