The design included several means of maintaining civility and the
quality of interactions. Users were prodded toward self-reflection and some
restraint on impulsive commenting by having to identify the comment type and
summarize the content in a subject line. A user was also unable to
disguise her identity, because she had to submit the comment on a form that
was mailed to her email address. In addition,
moderators were used to minimize the posting of low-quality, redundant,
and inappropriate comments.
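To make these submission constraints concrete, the following is a minimal sketch, in Python, of the kind of form validation the design implies. The field names and the comment-type taxonomy are illustrative assumptions, not the system's actual schema; the meeting's discourse grammar defined its own types.

```python
from dataclasses import dataclass

COMMENT_TYPES = {"agree", "disagree", "question", "alternative"}  # assumed taxonomy

@dataclass
class Submission:
    email: str         # the blank form is mailed to this address, tying the comment to an identity
    comment_type: str  # the user must classify the comment before posting
    subject: str       # required one-line summary of the content
    body: str

def form_problems(sub: Submission) -> list[str]:
    """Return a list of problems; an empty list means the form is complete."""
    problems = []
    if sub.comment_type not in COMMENT_TYPES:
        problems.append("comment type must be one of the declared types")
    if not sub.subject.strip():
        problems.append("a subject-line summary is required")
    if "@" not in sub.email:
        problems.append("a deliverable email address is required")
    return problems
```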
The moderators' tools included virtual queues to allocate work, review forms, form letters to users, and a constraint-based view system. These were used to survey all submissions to the meeting, access unreviewed and pending submissions on a topic, rate a submission's quality, and decide its status. Options for the last function included accepting the submission for exposure to users, rejecting it or returning it for revision with a form letter explaining the reason, and deferring a decision to another moderator. The moderators in the on-line meeting, volunteers recruited by NPR and assigned to specific nodes, were trained to reject submissions that were low quality, redundant, obscene, personal attacks, whistleblowing, or commercial solicitations. By checking the reason for a rejection on the review form, a moderator automatically returned the submission to the user with the appropriate letter of explanation. Much of this labor can be further reduced by the development of rule-based and neural-network programs that automatically detect some prohibited content types.
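The review cycle can be sketched as a small state machine. This is a hedged illustration, assuming a quality rating plus an optional rejection reason drawn from the prohibited types named above; the actual review-form fields and letter texts are not specified in the text.

```python
from enum import Enum, auto

class Status(Enum):
    PENDING = auto()   # awaiting review in a moderator's virtual queue
    ACCEPTED = auto()  # exposed in the public view
    RETURNED = auto()  # rejected or sent back for revision, with a form letter
    DEFERRED = auto()  # decision passed to another moderator

# Form letters keyed by the rejection reasons named above; the letter
# texts are placeholders, not the ones the NPR moderators used.
FORM_LETTERS = {
    "low quality":     "Your submission was judged too low in quality to post.",
    "redundant":       "Your submission repeats points already in the discussion.",
    "obscene":         "Your submission contains obscene material.",
    "personal attack": "Your submission attacks a person rather than a position.",
    "whistleblowing":  "Your submission raises matters outside this forum's scope.",
    "commercial":      "Commercial solicitations are not accepted.",
}

def review(submission: dict, rating: int, reason: str | None = None) -> dict:
    """Record a moderator's decision on one queued submission.

    Checking a rejection reason automatically returns the submission to
    the user with the matching form letter, as in the original design.
    """
    submission["rating"] = rating
    if reason is None:
        submission["status"] = Status.ACCEPTED
    elif reason in FORM_LETTERS:
        submission["status"] = Status.RETURNED
        submission["letter"] = FORM_LETTERS[reason]
    else:
        submission["status"] = Status.DEFERRED  # unclear case: hand to another moderator
    return submission
```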
Moderation exploited the database's support for views, since accepting a comment merely included it in the public view. As implemented, moderators could see all submitted documents with their review status and quality ratings, while the general user saw or received only those that had been accepted for public display. More generally, views are constraint-generated displays of the textbase that determine what gets shown to whom, and they can be apportioned by arbitrary criteria, e.g., security classifications or organizational roles. A role, like moderator, can then be defined as a capacity to make certain moves towards others, based on the information the role holder gets or sees. Information is thus the context for interactions and power relations in the group.
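A sketch of how such constraint-generated views might be computed follows, with the two roles from the meeting; the field names are assumptions, as before, and any criterion (security classification, organizational role) could gate a view the same way.

```python
REVIEW_FIELDS = {"status", "rating"}  # moderation metadata hidden from general users

def view_for(role: str, textbase: list[dict]) -> list[dict]:
    """Generate a role's view of the textbase by constraint."""
    if role == "moderator":
        # moderators see every submission with its review status and rating
        return textbase
    # the general user sees only accepted submissions, with the
    # moderation metadata filtered out
    return [
        {k: v for k, v in doc.items() if k not in REVIEW_FIELDS}
        for doc in textbase
        if doc.get("status") == "accepted"
    ]
```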
The discourse grammar and the moderator reviews were constraints on what was said during the meeting, but several factors diminished their potential for arbitrary suppression of voices by the moderators. First, the grammar itself was fairly transparent and was consistent with the announced purpose of the meeting as a discussion. Second, the moderators for the meeting were peers of the users, shared an interest in a meaningful discussion of the recommendations, and had to give reasons for rejecting a submission. Nevertheless, it is worthwhile to compare the risks that rules of order and moderator authority pose for democratic practice with those of other means of control in on-line forums.