This section presents examples of content analysis using various types of coding. The first two examples
demonstrate content questioning. Example 3 shows how multi-coding can be done, while Example 4
covers the use of software in automatic content analysis.
Example 1: TV violence
An “interview” with a violent episode in a TV program might “ask” it questions such as:
How long did you last, in minutes and seconds?
What program were you shown on?
On what channel, what date, and what time?
Was that program local or imported? Series or one-off?
What was the nature of the violent action?
How graphic or realistic was the violence?
What were the effects of the violence on the victim(s)?
Who committed the violent act: heroes or villains? Men or women? Young or old? People of high or low
social status?
Who was the victim (or victims): heroes or villains? Men or women? (etc.)
To what extent did the violent action seem provoked or justified? And so on.
All of these questions can be answered by watching the program. Notice that some of the
criteria are subjective (e.g. the last one). Rather than relying on a single person’s opinion on such
criteria, it is usual to have several “judges” and record their average rating, often on a scale out of 10,
as in the sketch below.
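As a minimal sketch of how such a coding sheet and the averaging of judges’ ratings might be
represented in Python, the field names and values below are purely illustrative; they do not come
from any standard coding scheme.

from dataclasses import dataclass
from statistics import mean

@dataclass
class ViolentEpisode:
    """One coded violent episode; fields mirror the 'interview' questions."""
    duration_seconds: int
    program: str
    channel: str
    date: str
    imported: bool        # local vs. imported program
    perpetrator: str      # e.g. "hero" or "villain"
    victim: str
    judge_ratings: list   # each judge's 'provoked/justified?' rating out of 10

    def average_rating(self) -> float:
        # Averaging across several judges reduces single-coder subjectivity.
        return mean(self.judge_ratings)

episode = ViolentEpisode(
    duration_seconds=95, program="Example Show", channel="Channel 7",
    date="2024-05-01", imported=False, perpetrator="villain", victim="hero",
    judge_ratings=[3, 5, 4],
)
print(episode.average_rating())  # mean of 3, 5, 4 -> 4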
Example 2: Newspaper coverage of asylum seekers
Suppose you are working on a project that involves media content analysis, without transcription. The project’s
purpose is to evaluate the success of a public relations campaign designed to improve public attitudes
towards asylum seekers. The evaluation is done by “questioning” stories in the news media: mainly
newspapers, radio, and TV. For newspaper articles, six sets of questions are asked of each story:
1. Media details
The name of the newspaper, the date, and the day of the week. This information can later be linked
to data on circulation and readership, which is available from public sources.
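As a minimal sketch of that later linking step, assuming hypothetical newspaper names and
circulation figures:

# Hypothetical circulation figures; real data would come from the public
# sources the text describes.
circulation = {
    "The Daily Example": 250_000,
    "The Weekly Sample": 120_000,
}

# Media details recorded for one coded story.
story = {"newspaper": "The Daily Example", "date": "2024-05-01",
         "weekday": "Wednesday"}

# Link the story to its potential audience size.
story["circulation"] = circulation.get(story["newspaper"])
print(story["circulation"])  # 250000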
2. Exact topic of the news story
Recorded in two forms: a one-line summary (averaging about 10 words) and a code chosen from a
list of about 15 main topic types for this issue. The codes are used to count the number of stories
on each main topic type.
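As a minimal sketch of that counting step, using Python’s collections.Counter with hypothetical
topic codes (the real scheme would have about 15):

from collections import Counter

# Topic code assigned to each coded story; codes are illustrative only.
story_codes = ["POLICY", "DETENTION", "POLICY",
               "HUMAN_INTEREST", "DETENTION", "POLICY"]

topic_counts = Counter(story_codes)
for code, count in topic_counts.most_common():
    print(code, count)
# POLICY 3
# DETENTION 2
# HUMAN_INTEREST 1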
3. Apparent source of the story
This can include anonymous reporting (apparently by a staff reporter), a named staff writer, another
named source, a spokesperson, and unknown sources. If the source is known, it is entered in the
database.
4. Favourability of story towards asylum seekers
To reduce subjectivity, we ask several judges (chosen to cover a wide range of ages, sexes,
occupations, and levels of knowledge of the overall issue) to rate each story on this 6-point scale:
1 = Very favourable
2 = Slightly favourable
3 = Neutral