Refining scientific writing skills with feedback that works for students and instructors
Leily S. Kiani, Carrie Menke
Proceedings Volume 9793, Education and Training in Optics and Photonics: ETOP 2015; 97932N (8 October 2015); https://doi.org/10.1117/12.2223230
Event: Education and Training in Optics and Photonics: ETOP 2015, 2015, Bordeaux, France
Abstract
Evaluation of student learning through assessment of communication skills is an important component of undergraduate education, and particularly so for preparing future scientists to conduct interdisciplinary research. To better build these skills, we aim to quantify the effectiveness of feedback on student writing of technical reports in an upper-division physics laboratory course. In one implementation, feedback utilization (viewing commented technical reports, attending office hours, or emailing rough drafts of reports) was monitored and then correlated with improvement in student writing, quantified as the single-student normalized gain. A slight positive relationship was found between the number of times a student utilized feedback and the improvement in that student's writing. A subsequent study correlated two complementary assessments of student work. In the first assessment, students received consistent feedback throughout the semester on all sections of a technical report in the form of highlighted bullet points in a detailed rubric. In the second, students received varying amounts of feedback for each section of the technical paper, with a focus on one section each week and follow-up feedback on previously covered sections. This approach provides focused feedback that scales to larger classes. The number of highlighted bullet points in the rubric clearly decreased once focused feedback was implemented, from which we conclude that student writing improves with the focused feedback method.

1. INTRODUCTION

The Modern Physics Laboratory course is one in which many of the practical skills of professional physicists are learned. It is an upper-division laboratory course in which students are exposed to a variety of modern physics concepts, primarily through guided laboratory experiments. The learning outcomes for this course, such as effective presentation in written and oral formats and the ability to design apparatus to test hypotheses, form a substantial part of the major-level outcomes for the physics major at the University of California, Merced. A cornerstone learning outcome for this course is the ability to write about laboratory activities in a technical report.

The technical reports are formatted to reflect the writing style of academic journal papers. Students cycle through six different experiments throughout the semester and are required to write a technical report in the same style for each experiment. Assessment of student performance on the technical reports is aligned with the program learning objectives in the physics department. The exact language of the Program Learning Outcomes of the UC Merced Physics Undergraduate Major states that "students will be able to clearly explain their mathematical and physical reasoning, both orally and in writing." The rubric and the feedback provided to students on their technical reports reflect this objective.

The work presented here details our approach to providing feedback on the technical reports and our assessment of the efficacy of this feedback. Feedback is provided in a variety of ways, mostly formative, continuously throughout the semester. Students are expected to refine their technical writing by using the feedback from each report to write the next one better. In the first study we monitored student utilization of feedback and observed the correlation of that utilization with improvement in technical report scores. The sample size in this study was limited to eight students; nevertheless, some correlation was observed. In a follow-up study we monitored the correlation between two different forms of feedback, one very dense in commenting and the other brief in comparison. The aim of this study was to demonstrate a method for providing commented feedback on student writing that is scalable to a larger class size. Our results suggest that dense and detailed written feedback is an effective way to refine scientific writing skills, even after a single implementation.

2. METHODS

In the first implementation, student reports were evaluated using a 15-point rubric in which 10 points addressed providing critical information about the experiment and 5 points addressed the clarity of the ideas and messages in the report. We applied a thorough feedback method to the technical reports wherein each paragraph was scrutinized and detailed comments were provided to students. Feedback was aligned with the expectations in the rubric but went beyond an explanation of why students were losing points. To further support the learning outcome of professionalism in their writing, formative feedback was provided in the form of questions posed about vague areas of the report. These questions were intended to call attention to issues with clarity and logical flow and to encourage students to think about how they could improve sections of the report. This questioning was combined with comments containing examples of how to rephrase sentences to achieve a balance of specificity and generality. Summative feedback on the overall impression of the reports was written at the end of each report or addressed to the entire class.

The reports were delivered to instructors through an online platform. The online submission process allowed instructors to grade electronically as well as to monitor whether students accessed the feedback, which made it possible to determine which students were utilizing feedback by reading the comments posted to their submissions. A correlation was drawn between the utilization of feedback and the gain in technical report grades. The number of times each student sought or utilized feedback was compiled: the number of times they viewed comments on their reports, the number of times they sent rough drafts for comment before the assignment was due, and the number of times they received one-on-one feedback outside of regular class time. The gain in technical report grades was calculated for each student with the following equation from Hake (2002).1 The individual student normalized gain, g, is defined as

$$ g = \frac{\%\,\text{gain}}{\%\,\text{gain}_{\max}}, $$

where %gain is the percentage gain of a student's final technical report grade over their initial technical report grade and %gain_max is the maximum possible percentage improvement in technical report grade for that student. This was used as a metric for quantifying student learning of technical writing skills.
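As a concrete illustration, the gain calculation reduces to a few lines of code. The following is a minimal sketch; the function name and the example scores are ours, not part of the study.

```python
def normalized_gain(initial_pct, final_pct):
    """Hake's individual normalized gain for technical report grades.

    %gain is the improvement from the first report to the last;
    %gain_max is the largest improvement still possible, i.e. the
    distance from the first report's score to a perfect score.
    """
    if initial_pct >= 100.0:
        return 0.0  # a perfect first report leaves no room to improve
    return (final_pct - initial_pct) / (100.0 - initial_pct)

# Hypothetical example: a student who scored 70% on the first report
# and 88% on the last realized 60% of the improvement available to them.
g = normalized_gain(70.0, 88.0)
print(f"g = {g:.2f}")  # g = 0.60
```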

The second implementation was motivated by the goal of providing dense and detailed feedback to larger classes. To address the scalability of our method, we implemented a cross-correlated feedback method combining dense and detailed feedback (DDF) in the form of comments, as in the previous study, with a more passive form of feedback: highlighted bullet points in a detailed rubric. The rubric used in the previous study was too brief to provide substantial feedback in the form of highlights, so we adopted a more meticulous rubric based on one published online through MIT OpenCourseWare.2 We provided the highlighted feedback on every section of each technical report throughout the semester; however, we staggered the implementation of the commented feedback to build section by section through the course of the semester. We assumed that the number of rubric highlights in a section would be indicative of student performance on that section of the report. This allowed us to observe the correlation of student performance with the provision of feedback in the form of commenting.
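To picture the bookkeeping behind this assumption, a highlight tally per rubric section can be kept as a simple mapping. This is an illustrative sketch; the section names match the report structure described here, but the data structure and counts are hypothetical.

```python
from collections import defaultdict

SECTIONS = ["Abstract", "Introduction", "Methods", "Results", "Discussion"]

def tally_highlights(graded_rubrics):
    """Sum highlighted rubric bullet points per section across the class.

    graded_rubrics: one dict per student, mapping a section name to the
    number of bullet points highlighted in that section of the rubric.
    Returns section -> total highlights for one report cycle.
    """
    totals = defaultdict(int)
    for rubric in graded_rubrics:
        for section in SECTIONS:
            totals[section] += rubric.get(section, 0)
    return dict(totals)

# Hypothetical report cycle with three students:
report_1 = [
    {"Methods": 4, "Results": 6, "Discussion": 5},
    {"Methods": 2, "Results": 7, "Discussion": 4},
    {"Methods": 3, "Results": 5, "Discussion": 6},
]
print(tally_highlights(report_1))
# {'Abstract': 0, 'Introduction': 0, 'Methods': 9, 'Results': 18, 'Discussion': 15}
```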

3. RESULTS

In the feedback-utilization monitoring study, by mid-semester students' improvement in their writing was visibly correlated with the amount of teaching assistant (TA) feedback they were actually utilizing. We presented this correlation to the students, and many in the class responded positively to this new information: there was a surge in students seeking one-on-one explanations of feedback on their technical reports. The average grade of the following technical report improved by fourteen percent compared to the previous report and by thirty-four percent compared to the first technical reports. To characterize this improvement over the semester, a plot of the individual student normalized gain vs. the number of times a student utilized feedback is shown below. In general, students who utilized the provided feedback also displayed improvement in their technical report scores, and students who utilized feedback more than six times over the course showed the greatest gains. By contrast, students who utilized feedback minimally achieved a wide range of gain values. This suggests either that there is some threshold of feedback utilization before the benefits show in the technical report scores, or that some students received high-quality feedback that was not observable to the TA. An example of feedback that would not be recorded as utilization of TA feedback is peer review of a student's technical report, which likely occurs between lab partners in this course. Students may also enter the course with a variety of writing skill levels, and by this measure their score improvement appears larger when they achieve a high score on their first report.

Figure 1.

(a) Plot of the normalized gain for each individual student vs. the number of times the student utilized feedback. Each point represents a student enrolled in the course. Students who display the highest improvement in technical report grades by this metric are utilizing feedback; however, there appears to be a threshold of more than six utilizations before the correlation trend becomes clear. This preliminary result suggests that the trend would be clearer in a course with a larger sample size. (b) An example of the dense and detailed feedback provided to students by the teaching assistant.

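A plot along the lines of Fig. 1(a) could be produced as follows. The eight (utilization, gain) pairs below are placeholders shaped like the reported trend, not the study's data.

```python
import matplotlib.pyplot as plt

# Placeholder data: feedback utilizations and normalized gain per student.
# The actual values for the eight students appear in Fig. 1(a).
utilizations = [1, 2, 2, 4, 5, 7, 8, 10]
gains = [0.10, 0.45, -0.05, 0.20, 0.25, 0.40, 0.55, 0.70]

plt.scatter(utilizations, gains)
plt.xlabel("Number of times feedback was utilized")
plt.ylabel("Individual normalized gain, g")
plt.title("Technical report gain vs. feedback utilization")
plt.show()
```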

In the scaled feedback study we examined the total number of highlights per section for each iteration of the technical report. The number of highlights showed a general decrease over the semester, indicating improvement in writing as assessed by the rubric, and was therefore used as a metric to assess the efficacy of the commented feedback. We chose the Methods section for our first implementation of commented feedback because it is often the section that students are most prepared to write at the beginning of the course. The Results and Discussion sections were then commented on in the second and third technical reports. There was a fifty percent decrease in the number of highlights post-commenting for the Results section and a forty percent decrease for the Discussion section. The Abstract and Introduction sections were commented on later in the semester, in the fourth and fifth technical reports. The number of highlights in these sections also decreased over time, but less so in response to the dense and detailed commented feedback. This is likely because students had already received substantial feedback from the highlights and from summative feedback discussed during regular class time, so their writing improved before the commented feedback was provided.
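The quoted decreases follow from a straightforward before-and-after comparison of highlight totals. A sketch with hypothetical counts:

```python
def percent_decrease(before, after):
    """Percentage drop in a section's highlight count after commented feedback."""
    return 100.0 * (before - after) / before

# Hypothetical totals consistent with the decreases reported above:
print(percent_decrease(18, 9))   # Results section: 50.0
print(percent_decrease(15, 9))   # Discussion section: 40.0
```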

To further illustrate the effect of commented feedback, we show three plots, Fig. 2 (a)-(c), of the total number of highlights in a given section of the lab report, summed over all students in the class, versus the technical report number, from the first to the sixth. The plots show that for a given section there is almost no change in the number of highlights per report before the commented feedback is given, followed by a dramatic decrease after the commented feedback is provided. This implies that students improve their written work more when given dense and detailed feedback. In Fig. 2 (d) we show results for all sections of the report, demonstrating the overall improvement of student writing over time.
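A Fig. 2-style plot of highlight totals per report could be generated as below. The six counts per section are placeholders shaped like the observed pattern (flat before DDF, dropping sharply after), not the measured values.

```python
import matplotlib.pyplot as plt

reports = range(1, 7)  # technical reports 1 through 6

# Placeholder totals; DDF is applied to Methods in report 1,
# Results in report 2, and Discussion in report 3.
highlights = {
    "Methods":    [9, 5, 4, 4, 3, 3],
    "Results":    [18, 18, 9, 8, 7, 6],
    "Discussion": [15, 16, 15, 9, 8, 7],
}

for section, counts in highlights.items():
    plt.plot(reports, counts, marker="o", label=section)
plt.xlabel("Technical report number")
plt.ylabel("Total rubric highlights across the class")
plt.legend()
plt.show()
```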

Figure 2.

Plots of the total number of highlights in the (a) Methods, (b) Results, and (c) Discussion sections across the population of the class for each iteration of the technical report. (d) Plot of the highlights per section for all sections. Fewer highlighted bullet points in the rubric imply improved student writing. Dense and detailed feedback (DDF) is given in the first technical report for the Methods section, in the second for the Results section, and in the third for the Discussion section. Rubric highlights decrease after DDF is implemented in a selected section of the technical report, and highlighted bullet points decrease for all sections over the course of the semester.


4. DISCUSSION

The Modern Physics Lab course enables students to learn and practice scientific writing skills. This requires the instructors' careful attention to the needs of each student in order to develop these professional skills to their full potential. Here, we have provided students with dense, detailed, and specific feedback from which they have advanced their skills. Student work clearly improved more for students who utilized our dense and detailed commented feedback than for those who did not. Additionally, student work can be improved with a more gradual application of feedback, which offers a scalable method for applying dense and detailed feedback. Another approach is to apply commented feedback to a full report early in the semester, then follow up with the highlighted-rubric feedback and adjust the commenting as needed. This time-dependent application of feedback may also be useful for measuring the efficacy of other pedagogical activities.

ACKNOWLEDGMENTS

The authors would like to acknowledge support from the Council of Graduate Schools (CGS) through the Undergraduate Learning Outcomes Assessment Certificate Program and from the University of California, Merced.

REFERENCES

1. R. R. Hake, "Relationship of individual student normalized learning gains in mechanics with gender, high-school physics, and pretest scores on mathematics and spatial visualization," in Physics Education Research Conference (PERC), 2002.

2. A. Elby, "Laboratory Fundamentals in Biological Engineering," MIT OpenCourseWare, 2007. http://ocw.mit.edu