fwd: Electronics vs Paper

Doug Yanega dyanega at MONO.ICB.UFMG.BR
Thu Sep 18 20:43:18 CDT 1997


Daniel Barker wrote:

>Permanent free WWW sites will not solve the problem. In 50 years, the
>Internet as we know it will not even exist. It will vanish with 7-bit
>ASCII, Fortran-IV and C89 compatibility, BSD, DOS, X11, MacOS, the Win32
>API, POSIX 4 and every computer we have ever known.

You're probably right about the inevitable changes, but this does not
*have* to be a problem, really. I'm usually on the pessimistic side of the
argument, but at least in the case of retro-compatibility (and restricting
the discussion to technical feasibility alone), I'm not. I can think of three
points to keep in mind when evaluating whether our goal is attainable:

(1) if all of the institutions who took on the responsibility for archives
ALSO made a commitment to upgrade, there is no reason data cannot be
passed from one medium to another as improvements are made. Especially
given the lessons learned from the examples you cited, where some folks did
NOT work to upgrade, people need not make the same mistakes over and over
for the next century. Most of all, if we as a community *demand* that our
archival institutions be designed this way - if the people who control the
funds support only those who *have* a forward-thinking plan, instead of one
that won't outlive the present administration - then things should be
manageable. Some degree of centralization and community
commitment (meaning funds will need to be made available explicitly to help
archival centers upgrade) would appear to be essential, and probably
sufficient. Of course, this also means that folks are going to have to iron
out property rights so that little if any scientific information is
considered *proprietary* (and thus resides only in one place). If an
electronic "journal" insists that only they can have and control archives
of their data, for example, and they then go bankrupt, THAT would be a
disaster.

(2) as the storage media improve, it gets easier and easier to pack more
data into less space in less time - this facilitates centralization AND
speeds the upgrading process. It takes a lot of man-hours to convert 100 MB
of data on punch cards, but copying 100 MB from a CD-ROM to some denser
medium is only a matter of seconds. The amount of data transferable per
unit of effort is increasing rapidly, and if improvements continue,
we might have a situation where an entire university library's worth of
data can be upgraded in a matter of minutes, and the problem might vanish
that way, too.
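The punch-card versus CD-ROM comparison can be put in rough numbers. A
back-of-envelope sketch in Python, assuming the standard 80 bytes per
punched card and 1x CD-ROM rate of 150 KB/s, plus two figures chosen for
illustration (a reader speed of 1000 cards/minute, and a mid-1990s 12x
CD-ROM drive):

```python
# Back-of-envelope comparison of media-transfer effort for 100 MB.
# Card capacity and the 1x CD-ROM rate are standard constants; the
# reader speed and 12x drive are illustrative assumptions.

DATA_BYTES = 100 * 10**6          # the 100 MB example from the text

# Punched cards: 80 columns per card, one character per column.
BYTES_PER_CARD = 80
CARDS_PER_MINUTE = 1000           # assumed reader speed

cards = DATA_BYTES // BYTES_PER_CARD
card_hours = cards / CARDS_PER_MINUTE / 60

# CD-ROM: 1x = 150 KB/s, so a 12x drive reads ~1.8 MB/s.
CD_BYTES_PER_SECOND = 12 * 150 * 1000
cd_seconds = DATA_BYTES / CD_BYTES_PER_SECOND

print(f"{cards:,} cards, ~{card_hours:.0f} hours of reader time")
# -> 1,250,000 cards, ~21 hours of reader time
print(f"CD-ROM copy: ~{cd_seconds:.0f} seconds")
# -> CD-ROM copy: ~56 seconds
```

Even granting generous handling speed for the cards, the gap is three
orders of magnitude, and it widens with each generation of media.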

(3) people who are designing software and hardware are getting more
conscientious about cross- and retro-compatibility, anyway - we DO learn
from our mistakes.

        On the whole, if people can face up to point #1, and get over their
urge to keep everything proprietary and "in-house", there should be no real
obstacles to effective permanence of electronic information storage. If
we're willing to fight for it and pay for it, we can have it.

Sincerely,

Doug Yanega    Depto. de Biologia Geral, Instituto de Ciencias Biologicas,
Univ. Fed. de Minas Gerais, Cx.P. 486, 30.161-970 Belo Horizonte, MG   BRAZIL
phone: 031-448-1223, fax: 031-44-5481  (from U.S., prefix 011-55)
                  http://www.icb.ufmg.br/~dyanega/
  "There are some enterprises in which a careful disorderliness
        is the true method" - Herman Melville, Moby Dick, Chap. 82

More information about the Taxacom mailing list