Experiments in Computer-Based Classroom Learning in

Introductory Astronomy

 

Fran Bagenal

Astrophysical and Planetary Sciences

 

Part A: Spring 2002: Results from Investigation by Kathy Garvin-Doxas

Part B: Spring 2003: Proposed Follow-on Investigation

 

This material will be presented by Fran Bagenal at an invited talk to the American Association of Physics Teachers in January 2003, as part of a discussion of other educational experiments in the Department of Astrophysical and Planetary Sciences.

 

 

Part A: Spring 2002: Results from Investigation by Kathy Garvin-Doxas

 

Background

 

Introductory Astronomy is taught as a course for non-science majors wishing to fulfill their natural sciences core curriculum requirement. The educational goals are science literacy and enhanced awareness of the Earth’s role in the solar system. The APS Department usually teaches 6 sections of 1000-level astronomy to ~1200 students per semester. This experimental section had a limited enrollment of 72 students and was taught in the Duane G125 smart classroom with Ethernet jacks at each seat.

 

In this course, Prof. Bagenal conducted two lectures per week and replaced the Friday lecture period with computer activities.  Students were encouraged to bring their own laptops to class for these activities, but laptops were also provided by the course, and students had access to a computer lab across the hall from the regular lecture hall.  ITS personnel were present each Friday to assist with laptop hookup and computer access in the regular lecture hall, but the teaching team and students participated in set-up as well.  Many of the computer activities were also accessible to students from home. 

 

The course TA and Prof. Bagenal circulated between the two classrooms on Fridays, answering student questions and talking with students about what they were learning throughout the 50-minute class session.  Attendance was taken on these days, and students received 1% credit for each Friday session they attended, but students generally worked through activities without being required to turn anything in for grading.  Activities extended lecture material and concepts, but generally did not provide unique information (simply a unique way of ‘seeing’ and experiencing the information).  Student attendance at these Friday class sessions was consistently high (~40 of 44 students per Friday).  Students were encouraged to work in groups (defined by Prof. Bagenal as 3 students, but most often interpreted by students as 2 or more students) in both computer settings. 

 

Data presented here are based on a small sample, but they give us some interesting directions for future research.  The student response rate to survey questions was high, though it differed for each question.  The survey included numerous open-ended questions, used as exploratory probes to identify important themes rather than to produce definitive results; response rates to these were equally high.  The only questions that revealed no strong pattern of responses were those dealing with student preferences for particular computer activities.

 

Observations

 

Observational data showed that student participation – defined here as peer discussion of the activities as students worked through them (sometimes guided by questions provided on paper by the professor and sometimes open exploration) – was consistently higher and more robust in the smart classroom than in the computer lab.  The computer lab setting discouraged discussion in several ways that were probably due to (or at least contributed to by) the physical layout of the space: the room was small, the computer screens were large and set on top of CPUs, and the machines were all arranged in rows.  It was impossible to see anyone in the row in front of you (let alone the front of the room, where the white board was located) over the high monitors.  It was difficult for even two students (let alone three) to share a computer – not because of the size of the monitors, but because of the spacing around each computer.  These physical elements made discussion and collaboration very difficult. 

 

As documented, students who chose to work in the computer lab rarely worked together.  When they did, they generally asked one another questions when they needed help, but worked on their own computers.  True discussion almost never took place – only 2 groups managed (over the observations conducted during the entire semester) to really work together.  The computer lab was a very silent place to work.  Conversations generally relied on questions posed by the TA or Prof. Bagenal.  In contrast, the smart classroom periods were characterized by student discussion, most of which was focused on concepts being explored in the computer activities.  Talk always increased when Prof. Bagenal entered the room, but it was consistently present during the entire class period.  Some talk moved into interpersonal and relational channels, but the vast majority (80+%) centered on the computer activities.  Students who worked in the smart classroom on laptops engaged in a higher degree of collaboration (discussion, debate, peer learning) than those working in the computer lab.  In other words, it was rare to see a student group (functionally, two or more people) dominated by one member.  About 1-2 students per class period working in this environment in a group of three failed to contribute their share of the work, but the majority of groups actively engaged one another in learning.  Improvements could be made in the quality of the content of their discussions, but the fundamental elements were present. 

 

This effect of the space on learning style was also found in the astronomy lab settings in SBO.  When students worked in a regular computer lab, interaction was limited and subdued.  When students worked on the same activities on flat screen monitors in their regular lab setting, interaction improved in terms of amount and discussion content.

 

Very few students (5 total for the whole semester) actually brought their own computer to class.  Several others did not have the software and/or hardware to connect to the Ethernet jacks in the classroom.  A survey of >1000 intro astro students the previous semester suggested that about 1/3 of students own a laptop.

 

Results of observations led to survey questions – many of which included open-ended portions designed to determine the range of responses.

 

Survey Results

 

 

 

·      Only 11.9% of the respondents have had experience with computer activities in other classes.  None of the computer activities they described are in the same category (learning tools rather than tracking tools, etc.) as those in this course.

 

·      92.7% of the students felt that they learned something by working through the computer activities in class.  83.3% felt that the activities improved their learning in some fashion; there was no pattern among the remaining students who felt they learned something from the activities.  Only one of the small number of students who did not feel they learned anything from the activities explained why (“just not motivating”).  When asked what they liked about the computer activities: 54.1% of the students felt that the activities were good because they assisted with learning; allowed visualization; and/or allowed ‘hands on’ experience with concepts.  24.3% of the students felt they were good because they provided a change of pace; were fun; and/or were interesting.  A small number of students (4/37) felt the activities were good because they provided opportunities to work in groups or alone, as they chose.  When asked what they disliked about the computer activities: 43.3% of the students felt that there was a lack of consistency in the labs in terms of content, visuals, and/or instructions; the remainder of the responses held no strong pattern.

 

·      38.6% of the students had laptops – more students reported that they brought their laptops to class than were observed to have done so.  61.4% of the students did not have a laptop, but felt that if they did, they would bring it to class.  If this trend persists, self-report data are unreliable on this issue, and observation or another independent tracking mechanism will be necessary.

 

·      In spite of the time it took to hook up laptops in the classroom, 66.7% of the students did not feel that it was a lot of trouble. 

 

·      Students do NOT feel that laptops should be required for courses (73.1%), primarily for economic reasons.  Some students felt that sometimes it was appropriate for a course to require a laptop from students, but that laptop use would not make sense for all courses.

 

·      Most students felt that laptops were better than PDAs, citing size and capacity as the primary reasons.  However, quite a few of those who responded to this question did not know what a PDA was. 

 

·      Students had no clear favorite activity.  There were no clear patterns in the reasons why students chose a particular lab.

 

Suggestions for further/future research

 

Students feel that computer activities improved their learning.  There is room for a great deal of exploration here:

1.     perception vs. actual learning that can be traced to computer work;

2.     investigations of the ways that computer activities actually enhance, and are perceived to enhance, student learning, from the perspectives of both teachers and students; and finally

3.     for those students who did not feel that the activities enhanced learning, further exploration of why they did not (responses so far show no pattern on this issue).

 

These results are consistent with findings of a large, multi-year, multi-setting study conducted by the Solar System Collaboratory, which found that students perceive themselves to be hands-on and/or visual learners and that computer simulations provide hands-on and/or visual learning experiences.  How can this be unpacked further so as to inform decision making among both educators and developers of technology-based educational tools?

 

Many students believe that working on a laptop in a smart classroom on these sorts of activities is preferable to working in computer labs.  However, this research suggests that students will not bring their own laptops to class (and more than half of the students do not have laptops).  What are good options for optimizing the physical learning environment if these data hold true across larger student populations?  Should universities plan to provide laptops for smart classrooms – and if so, how many (one per student; one per 2 students; etc.)?  Can the positive elements of the smart classroom environment be used to reconfigure computer lab spaces?

 

If students find computer activities to be helpful (and the activities did not damage class attendance), why are so few faculty members integrating these sorts of tools in their courses?  Are student perceptions skewed in this area?  What are the level, quality, purpose, and goals of the computer activities used in our campus classrooms?  What is the range of uses?

 

Although students offered no consistent reason for preferring one activity to another, several mentioned graphics – the look of the program.  What is the balance needed between functionality in terms of learning and content and attraction in terms of student interest and meeting student expectations?

 

 

 

Part B: Spring 2003: Proposed Follow-on Investigation

 

 

Goals of the Investigation

 

            Having established that the technical challenges of running computer-based activities in the classroom are not overwhelming, the next step is to evaluate whether pedagogical goals are enhanced by the technology. Fran Bagenal, Kathy Garvin-Doxas and Doug Duncan began planning the Spring 2003 investigation by first considering that the primary pedagogical goals should be:

 

·      Enhanced learning of concepts in astronomy (rather than learning of facts)

·      Enhanced understanding of the scientific process

·      Greater appreciation of science and its value to society.

 

These are very broad goals. We aim to determine more specific criteria and methods of evaluation over the coming weeks. The primary hypothesis is that working in groups can enhance learning and enjoyment of learning science. We aim to have a mixture of group activities over the semester, combining computer-based, hands-on and conceptual exercises. Doug Duncan brings extensive experience teaching introductory astronomy with hands-on activities at the University of Chicago. We aim to evaluate the following issues:

 

·      What types of group activities are most effective for learning? What types of group activities do the students enjoy most?

·      Which of the computer-based activities are most effective? What aspects of the activities make them effective?

·      How effective are self-selected groups? What are the pitfalls?

 

We would like to compare learning between different sections of Introductory Astronomy. While an accurate control group is not feasible (since the other sections are of different size and taught by different instructors with different styles), we aim to at least develop some qualitative comparisons and can probably perform pre- and post-course testing of different sections.

 

 

Computer Equipment

 

            With a class of 72 students we need 24 computers for groups of 3 students. The percentage of students owning laptops continues to increase (reported at 50% in Fall 2002). Nevertheless, we will need to rely on supplying some equipment. We propose the following strategies:

 

·      We are requesting that the APS Department purchase a dozen laptops for use in this and other courses.

·      We are enquiring around campus to see whether other laptops are available on loan, or whether funds exist for purchasing them.

·      Fran Bagenal has 2 laptops that she will lend for class periods.

·      The classroom across the hall from the main classroom has been refurbished with computers that take up much less space and are arranged to facilitate group activities. The capacity of this room is ~24 students.

 

The computers need to have a CD reader and an Ethernet card. Current prices are less than $1000 per computer.

________________________________________________________________________

Acknowledgements – We thank Alex Pearson and ITS staff for their support of this project in Spring 2002.