Page 125 - DLIS105_REFERENCE_SOURCES_AND_SERVICES
Reference Sources and Services
Introduction
Libraries and documentation centres are the backbone of any research and training institution.
Rapid technological development worldwide has greatly increased the value of information
dissemination, and no research programme, training programme, or institutional development
anywhere can be considered pragmatic and complete unless libraries have a role in it.
11.1 Concept of Information
Information is viewed as a type of input to an organism or designed device. Inputs are of two kinds.
Some inputs are important to the function of the organism or device by themselves. In his book
Sensory Ecology, Dusenbery called these causal inputs. Other inputs are important only because
they are associated with causal inputs and can be used to predict the occurrence of a causal input at
a later time. Some information is important because of association with other information but
eventually there must be a connection to a causal input. In practice, information is usually carried
by weak stimuli that must be detected by specialized sensory systems and amplified by energy
inputs before they can be functional to the organism or device. For example, light is often a causal
input to plants but provides information to animals. The coloured light reflected from a flower is
too weak to do much photosynthetic work but the visual system of the bee detects it and the bee’s
nervous system uses the information to guide the bee to the flower, where the bee often finds nectar
or pollen, which are causal inputs serving a nutritional function.
11.1.1 Characteristics of Information
The view of information as a message came into prominence with the publication in 1948 of an
influential paper by Claude Shannon, “A Mathematical Theory of Communication”. This paper
provides the foundations of information theory and endows the word information not only with a
technical meaning but also a measure. If the sending device is equally likely to send any one of a
set of N messages, then the preferred measure of “the information produced when one message is
chosen from the set” is the base-two logarithm of N. In this paper, Shannon continues:
The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If
the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested
by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one
bit of information.
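Shannon's measure above can be illustrated with a minimal sketch in Python (the function name `information_bits` is illustrative, not from the source): the information produced by choosing one message from a set of N equally likely messages is log base 2 of N.

```python
import math

def information_bits(n_messages: int) -> float:
    """Shannon's measure: bits of information produced when one message
    is chosen from a set of n_messages equally likely messages."""
    if n_messages < 1:
        raise ValueError("need at least one possible message")
    return math.log2(n_messages)

# Choosing one of 8 equally likely messages yields log2(8) = 3 bits.
print(information_bits(8))  # -> 3.0

# A device with two stable positions (a relay or flip-flop) stores
# log2(2) = 1 bit, as in Shannon's example.
print(information_bits(2))  # -> 1.0
```

Note that a single message (N = 1) carries log2(1) = 0 bits: when the receiver already knows what will be sent, no information is produced.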
Information as a message:
Information is a message, something to be communicated from the sender to the
receiver, as opposed to noise, which is something that inhibits the flow of
communication or creates misunderstanding.
If information is viewed merely as a message, it does not have to be accurate: it may be a lie, or
simply the sound of a kiss. This model assumes a sender and a receiver, and does not attach any significance
to the idea that information is something that can be extracted from an environment, e.g., through
observation or measurement. Information in this sense is simply any message the sender chooses to
create.