Ross F. Collins*

E-Mail Discussion Groups in the College Classroom:
Fired Up—or Fed Up—With Free Discussion?

Volume 11, 1998



Abstract

Among the new internet-based technologies touted for classroom use is humble electronic mail. This easy-to-learn technology has become a common, even standard, means of communication for educators and students alike. Most students obtain free e-mail accounts when they enter a university, and use the technology regularly.

One way to set up e-mail for classroom use is through the discussion group, sometimes nicknamed “Listserv” after the dominant software that controls such lists. During fall semester 1997 the instructor experimented with discussion groups for a large lower-level class as well as a smaller upper-level class. He found, however, that one-fifth of the students never bothered to sign up. Of those who did, about 10 percent seldom looked at messages. And students found the discussion feature more annoying than enlightening.



Introduction
One of the oldest and easiest to learn of the internet-based technologies is electronic mail. Most mail programs offer easy access and a system of posting as fast as hitting the “Send” button. (In fact, too fast sometimes, to judge by the messages some of us later regret having sent.) It requires little “bandwidth” compared with picture-heavy web pages, and most schools, right down to the primary grades, are wired up, or nearly so.

If it’s so fast, so easy, and so ubiquitous, why not use it for teaching? In fact, many of us are thinking about it. An article in the National Communication Association’s teaching publication last fall (Leonard Shedletsky, “A lot of teachers who can, don’t,” The Speech Communication Teacher 12, Fall 1997, 14-15) advocated setting up e-mail class discussions using a discussion group space on a server with common software such as Listserv. The idea behind class-based e-mail discussion groups resembles the concept of chat rooms, but without the “real-time” feature of the mostly adolescent-oriented rooms on AOL. In this case, the owner of the e-mail server, usually a school or university, reserves space for a class and sets up a subscription-based “club” open to any student in the class (or anyone else, for that matter, unless it’s purposely restricted). The class instructor explains the nature of the list to students and asks them to sign themselves up by sending a special message to the server address.
From then on, students and the instructor may post messages to the discussion group address, and they will automatically be distributed, usually within minutes, to everyone who has joined the list. A moderated list allows the list owner (normally the instructor) to read each message before sending it on; an unmoderated list bypasses the list owner to send a message directly from poster to list.
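For readers who haven’t set up such a list, the mechanics are simple. What follows is only a sketch: the list name MCOM112-L and the server address listserv@univ.edu are hypothetical stand-ins, not the actual addresses used in this study, and exact syntax varies among Listserv-style programs. To subscribe, a student mails a one-line command to the server’s administrative address:

     To:   listserv@univ.edu
     Body: SUBSCRIBE MCOM112-L Jane Doe

To post, the student simply mails an ordinary message to the list’s own address, MCOM112-L@univ.edu, and the software redistributes it to every subscriber. Leaving the list takes one more command to the administrative address: SIGNOFF MCOM112-L.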
Pedagogic experts looking at this technology tout it first as a new way to kindle discussion outside the classroom: “an e-mail channel to continue discussion outside of class, an electronic discussion group for the class.” (Shedletsky, 14). Not only that, instructors may pass along class information and handouts quickly, without resorting to wasteful paper handouts. In a state like North Dakota, if we can avoid cutting down more of our trees, it’s certainly a fine thing.

The Pilot Study
In fall 1997 I set up two e-mail-based discussion groups, one for a lower-division and one for an upper-division class. At the lower end was MCOM 112, an introductory media class with 98 mostly freshmen and sophomores, about two-thirds of them mass communication majors. At the advanced end was MCOM 436, a media history class with 25 mostly juniors and seniors, nearly all of them mass communication majors. I spent about a half-hour explaining the idea to the students and providing the server’s e-mail address they needed to sign up. I reminded them several times over the course of two weeks to sign up.
Goals of the list, as I explained to the class, included:

* An avenue for outside-of-class discussion of media-related topics.

* A means to offer class handouts.

* A way to offer regular review material for test preparation.

I considered encouraging students to post comments by offering extra-credit points, but decided against that level of enticement. I also decided to leave the list unmoderated. I made both decisions not only so students would feel unconstrained by a sort of high-pressure Big Brother evaluating each comment, but also to save my own time: running a moderated e-mail list of 123 students, gradebook in hand to count postings for credit, can become a monster.

To make the list more useful, I posted weekly “lecture synopses” of material covered in class and, in the introductory-level class, the weekly in-class quizzes for exam review. I also posted occasional questions to stimulate discussion on media-related topics, but to avoid inhibiting discussion I seldom threw my own oar into the group unless someone asked me a direct question.


Results
At semester’s end, I surveyed each class regarding the discussion group. In the intro class 70 percent completed the survey; in the upper-level class, 68 percent. (The rest of the students were absent on the survey days.) How did the experiment work? Students reacted fairly positively to the concept: on-line statistics provided to me as list owner showed that 80 percent of the students signed on, in both the intro and advanced classes. Some of these sign-ups, however, did not arrive until the class was several weeks old, and a few did not come until after midterm. The remaining 20 percent never signed up at all; according to the survey, the most common reason was lack of time.

As for postings, in the lower-level class, according to the survey, one student posted more than 20 messages throughout the semester; another two posted 10-20; 10 percent posted 5-10; 15 percent posted 2-5; 32 percent posted once; 28 percent did not post a message; and 11 percent did not respond to the question, as they had reported not joining the group at all. (Percentages rounded to the nearest whole.) A check of the on-line list statistics for this class showed a semester total of 200 messages posted. In the upper-level class, no one posted more than 20 messages; one student posted 10-20; no one posted 5-10; two posted 2-5; no one posted once; and 82 percent did not post at all. A check of list statistics for this class showed a semester total of 15 messages posted.
As for reading their mail, 40 percent of the younger group reported logging on at least once a day; 25 percent, several times a week; 17 percent, once a week; and 7 percent, less than once a week. (The same 11 percent of non-joiners did not respond to the question.) Of the older students, 41 percent checked messages once a day; 24 percent, several times a week; 29 percent, once a week; and one student (6 percent), less than once a week.

What did survey participants like most about the on-line discussion group? The number one answer, by a wide margin in both classes, was the lecture synopses, quizzes, and other exam-review material. What did they like least? The e-mail posts from fellow students, responders stridently emphasized. In fact, in the lower-level class, students over and over decried what they believed to be a useless waste of their time by other students posting irrelevant messages. Written comments included: “Some people have nothing better to do than put stupid messages on the Listserv.” “Wish I didn’t sign up.” “Other student comments were extremely annoying.” “More useful stuff, less garbage from other people on the list.” “Maybe some sort of grade penalty threat for stupid e-mails is in order.” On the other hand, one student complained of “Picky-ass people who yelled at my stupid messages. Need teacher to write, ‘hey, don’t pick on this person because he/she writes non-useful messages!’”

At the upper level, only one student added written comments to the survey, noting, “Students should not be encouraged to blab unless you give them topics to write on. It’s a waste of my time and fills up my e-mail.” However, postings in this class averaged only one a week, compared with about 13 a week in the intro-level class. Even that larger volume, though, works out to only a couple of messages a day.


Conclusions
The level of anger over “useless” comments from fellow students surprised me, as it runs counter to the ideal of a free exchange of ideas outside the classroom. Not only that but, frankly, I was not sure what students were complaining about. I analyzed individual postings in the intro class. Students jumped into a lengthy discussion of musicians and the media, a topic that certainly seemed media-related. Several responded to my question concerning media revelations of U.S. military ordnance in Iraq, and others talked about celebrities and the media. Yes, several postings were irrelevant or juvenile, and a couple of four-letter words crept in, but nothing really offensive cycled through the list. I certainly hadn’t regretted my decision to leave it unmoderated—until I tallied the survey results.

So, in this case, it seems my goal of encouraging discussion students would find stimulating and attractive failed, miserably. As for my second goal, disseminating handouts, I tried posting a multi-page guide to writing a historical research paper for students in the upper-level class. Unfortunately I discovered two weeks later that a number of students had not yet seen it, either because they had never signed up for the list (20 percent), had not read or printed the posting, or had not looked at their e-mail lately. In the end I provided about 10 written copies for students who didn’t, or couldn’t, get it on line. It needs to be pointed out that, in addition to the 20 percent who never signed on to the group, another third of this class looked at their mail only once a week or even less. This does not seem to be an effective way to circulate handouts; perhaps trees will still be needed.

As for the third goal, exam review, the discussion group method seemed to work better. Students universally reported this to be their favorite reason to log on. Some even asked for more material. A drawback is that it obliges the instructor to prepare lecture synopses, quizzes, and other review material for distribution by e-mail. This ate up an hour or two of my time each week, although now that the material is written, it would be faster to cut and paste it from a word-processing document into an e-mail message. Inevitably, of course, some material must be rewritten each time the class is taught.

One difference between upper- and lower-level students: the younger class posted much more readily and enthusiastically. Most of the students in the upper-level class likely were too old to have had e-mail in secondary school; the 18- and 19-year-olds, however, are perhaps more comfortable with this method of communication.

In sum, then, this experiment in new technology apparently met only one of the three goals I’d set for it: offering a study aid. But did it even meet that goal? Clearly students appreciated the opportunity to review using e-mail material, but did it improve their performance on exams? In fall 1996, with 74 students, MCOM 112 scores on final exams averaged 69.9. In fall 1997, with 98 students and the e-mail discussion group, MCOM 112 scores on final exams averaged 64.1. The classroom was the same and the material was the same, although the 1997 class was about one-third larger than the year before (98 students versus 74), and larger classes usually mean lower-quality learning. In fall 1995 MCOM 436 (the class was not offered fall 1996), with 23 students, scores on finals averaged 71.5. In fall 1997, with 25 e-mail-juiced students, scores on finals averaged 64.2.

This is glum. It gets worse. In 1996 my standard Student Rating of Instruction for MCOM 112 included ratings at the good or excellent level of no lower than 69 percent on every question. In 1997, that figure was 67 percent. In MCOM 436, comparable figures were 62.5 percent in 1995 and 59 percent in 1997. In both classes, generally, responses in the “excellent” category dropped in 1997.
We can’t prove that the lower student evaluations or test scores are attributable to the e-mail experiment; many factors, of course, influence these figures. As well, an ambitious multi-year study based on many more classes would offer more reliable comparisons than this one author’s limited pilot study. However, when I tried another experiment with new technology in fall 1996, the “web-based research paper,” student evals dropped in that class too. (See “Using the Web for Class Discussions: A Pilot Study,” North Dakota Journal of Speech & Theater 10, 1997, 69-74.)

At the beginning of spring semester 1998, after the previous fall’s experience, I informally surveyed students to assess enthusiasm for e-mail discussion groups in these classes. Results were mixed; a number of students said “no.” I decided to listen to them. Nevertheless, I believe a class-based e-mail discussion group could be more effective if it were moderated, if students were required to post comments relating to class discussions, and if they were graded for their effort. But this goes against what I thought to be the charm of this new technology: a free-spirited, non-censored discussion of topics selected by student interest and enthusiasm. At least in these communication classes, that apparently is just what students do not want.

* Ross F. Collins is an associate professor of communication at North Dakota State University, Fargo.