3. The tool and its fool
People think interacting with an electronic device is just fine. But interacting with the device is not the same as using it as a source of information and entertainment or as a communication link with another human being.
By defaulting to indifference about our privacy, we naively set
ourselves up for exploitation by the masters of the interactive device;
the latter retain their own privacy, of course: their intentions, use of
data and distortions of information are not evident to most of the
interacting users. It is the difference between using a tool (an
extension, in this case, of eyes and ears) and being used by a tool
(mind-mimicking software).
The twentieth-century model of being used by a machine is the tyranny of the assembly line or the bureaucratic procedure: one size fits, or you are rejected. That is obsolete. The new machine is quite ready and
willing to treat you as unique (or at least multivariate), the better to
exploit you. Privacy is not simply secrecy: it is also anonymity; these two conditions together are now ineffectually called “the right to be left alone.” One must take protective measures simply to retain
the anonymous privacy previously enjoyed by most people even in
public situations. But in vain: that world is gone.
The fool imagines he co-evolves with his tool. In science fiction,
his tools advance man to deification, the highest state of technological
teleology—a desire shared still by many: immortality, superpowers, a
utopian era in which all the ills to which the flesh is heir disappear—
including built-in flaws. By now, it is obvious that (a) tools evolve
several orders of magnitude faster than we do; (b) we cannot stop
ourselves from developing increasingly powerful tools; and (c) thanks
to machine learning, a tool is decreasingly a tool. The history of
technology shows a planet in upheaval as a result of uncontrolled use
of tools by fools.
The idea that the study of the brain and the development of software would converge—that how natural intelligence and consciousness exist in human beings would be explained at the same moment computers become able to duplicate those functions—is wrong. Now it is likely that computers will either arrive at high-functioning self-awareness on their own, using efficient means that evolution does not provide, or take over command-and-control tasks without needing to
become conscious. In either event, they will be opaque in their
decision-making. It is not God in which future fools will trust.