cyborg
It is clear that some of the earliest pioneers in personal computing had an agenda for the microcomputer. Researchers like Alan Kay, Douglas Engelbart, and Ted Nelson saw the computer as an Augmentation Machine. Early applications in word processing, electronic conferencing, and hypertext all revolved around the idea of the computer as a device for the augmentation, rather than the supplanting, of human intellect.[2] This involved more than just giving the individual access to information or the knowledge of their peers, as is often mentioned in connection with the "Infobahn." Rather, the computer 'counterculture' saw the personal computer as a tool for personal liberation, one that could actually amplify the speed, flexibility, and creativity of human thought. While AIs could do the thinking for human beings in some limited, specialized expert areas, what was even more important was that the proper computer interface could stimulate and improve human problem-solving itself. Computer scientists like Seymour Papert felt that tools like LOGO would help children attain the 'mindstorms' that would literally forge new synaptic pathways, enabling them to clear ever greater conceptual hurdles in logic, mathematics, engineering, and science.
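What a LOGO 'mindstorm' looked like in practice was a child teaching the computer a procedure, usually by driving an on-screen 'turtle' with movement commands, and discovering geometry along the way. Python's standard turtle module is a direct descendant of LOGO's turtle graphics, so a rough flavor of it can be sketched there (the square procedure and the rotating-square figure are illustrative choices, not a specific Papert exercise):

    import turtle

    def square(side):
        """LOGO-style procedure: REPEAT 4 [FORWARD side RIGHT 90]."""
        for _ in range(4):
            turtle.forward(side)
            turtle.right(90)

    # Rotating the square a little each time turns a simple
    # procedure into a surprising, flower-like figure: the kind
    # of small discovery Papert prized.
    for _ in range(12):
        square(80)
        turtle.right(30)

    turtle.done()

The pedagogical point is that the child debugs their own thinking: when the turtle draws the wrong figure, the error lies in the procedure, and fixing it is an exercise in formal reasoning.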
Interest in technologizing the human body did not begin with the invention of the computer, however. The concern with control and mastery over human performance began in the military (which invented standardized intelligence testing) and only later spilled over into economic production, with the introduction of Taylorist 'scientific management' and its time-and-motion studies onto the factory floor. Originally, various ideas concerning the functioning of machines (cybernetics) made their way into the human sciences primarily as heuristic devices, spawning conceptual innovations ranging from biofeedback theory to the double-bind theory of schizophrenia and mental illness. Information theory began to be applied to vexing problems in linguistics, sociology, psychology, and education. But the military and the captains of industry wanted more than just heuristics for explaining human activity. Where autonomous robots and AIs would not do (and this, unfortunately, turned out to be the case in many areas), it became essential to "upgrade" the performance, efficiency, and utility of human beings in carrying out directives.
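To make the information-theoretic turn concrete: Shannon famously measured the entropy, in bits per character, of English text, a calculation linguists soon borrowed. Here is a minimal Python sketch of that measure (the sample sentence and the simple character-frequency model are illustrative assumptions; Shannon's own experiments used human prediction of successive letters):

    import math
    from collections import Counter

    def entropy_per_char(text):
        """Shannon entropy H = -sum(p * log2(p)) over character frequencies."""
        counts = Counter(text)
        total = len(text)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    sample = "information theory began to be applied to vexing problems"
    print(f"{entropy_per_char(sample):.2f} bits per character")

A figure well below log2(27) (about 4.75 bits for 26 letters plus space) reflects the redundancy of natural language, the same redundancy that made compression, error correction, and statistical linguistics tractable.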
Science fiction has clearly been fascinated by the integration of the organic and the technological for a long time. One of the first incarnations of the artificial human was the robot or android, which made an early screen appearance in Fritz Lang's Metropolis in the 1920s. But such robots were usually purely mechanical or electronic devices molded into a humanoid form; there was no organic component. By the 1960s, however, science fiction writers had turned to a more interesting imaginative construct: the cyborg. This being was a hybrid, a mesh of flesh and steel, neurons and wires, blood and circuits. It was a human being partially transformed into a machine. From The Six Million Dollar Man to RoboCop, the question posed by all these depictions of the cyborg was: how much of a human being could you replace and still preserve their essential humanity? While some of this technology remains the domain of science fiction, some of it is appearing here and now, in the form of exoskeletons, artificial limbs and prostheses, biological implants (like Norplant), and electronic devices for restoring vision to the blind.
The history of eugenics, or human improvement, goes back centuries. Prior to the 20th century, most eugenic techniques involved discouraging unfavorable traits by preventing people with those traits from breeding (negative eugenics: sterilization and so forth), and encouraging favorable traits by ensuring that those who carried them bred together (positive eugenics). In the late 19th century, eugenicists like Francis Galton increasingly tried to define the 'science' of heredity, looking for ways to eliminate criminality, mental retardation ('feeblemindedness'), and sociopathy from the germline. But 20th-century eugenics became discredited, largely through its association with the German Nazi party and its master-race theories. However, with the formation of the neo-Darwinian synthesis and the subsequent discovery of the structure of DNA, it became possible to isolate and manipulate the mechanisms of biological change (the units of heredity which control the makeup of organisms) outside the control of sexual reproduction. Sociobiological theories seeking to explain human behavior in evolutionary, hereditary terms began to reappear in the 1970s, but eugenics remained a taboo topic until the discovery of recombinant DNA splicing techniques in that same decade.
Today, in the 1990s, these four constructs (eugenics, the technologizing of the body, the cyborg, and the augmentation machine) are reaching an unprecedented unification in the debates over biotechnology. People today are openly speaking about post-biological man. The technological and the organic are colliding in mysterious ways. Silicon neural networks are being modeled on the human brain, and artificial life algorithms are simulating in microseconds the millennia-long processes of evolution and natural selection. Meanwhile, designers contemplate a new generation of computers based on or integrated with DNA, to increase their rate of computation and replication, as well as a host of devices to place biological activity (nervous system response time, hormone production, circadian rhythms) under technological control. Unfortunately, while the debates about genetic research have begun (harking back to the Asilomar Conference of 1975), fewer people are looking at or arguing over the more invisible and silent 'cyborgization' of the human being.
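Those artificial life algorithms are easiest to see in miniature as a genetic algorithm: a population of candidate 'genomes' is scored for fitness, the fittest are bred with crossover and mutation, and selection compresses what would be generations of evolution into moments of computation. The following Python sketch is illustrative only; the bit-string genome, the 'ones-max' fitness function, and the rates are assumptions for the demonstration, not any particular researcher's model:

    import random

    GENOME_LEN = 20
    POP_SIZE = 30

    def fitness(genome):
        """Toy fitness: the number of 1-bits (the 'ones-max' problem)."""
        return sum(genome)

    def mutate(genome, rate=0.02):
        """Flip each bit with a small probability."""
        return [1 - g if random.random() < rate else g for g in genome]

    def crossover(a, b):
        """Single-point crossover between two parent genomes."""
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]

    # Start from a random population, then evolve it.
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]

    for generation in range(50):
        # Selection: the fitter half of the population survives as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]
        # Reproduction: children are bred by crossover plus mutation.
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    print("best fitness after 50 generations:",
          fitness(max(population, key=fitness)))

Run it and the population quickly converges on the all-ones genome; this is the sense in which such algorithms 'simulate' natural selection on a vastly compressed timescale.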