Name                                  File Type     Size      Last Modified
ChatGPT course-level analysis.do      text/plain    5.8 KB    03/13/2025 10:19 AM
ChatGPT course-level data.csv         text/csv      56.3 KB   03/13/2025 10:37 AM
ChatGPT student-level analyses.R      text/x-rsrc   7.6 KB    03/13/2025 10:09 AM
ChatGPT student-level data.csv        text/csv      47.2 KB   03/13/2025 10:09 AM

Project Citation: 

Arum, Richard. ChatGPT Early Adoption in Higher Education: Variation in Student Usage, Instructional Support and Educational Equity. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2025-03-13. https://doi.org/10.3886/E222781V1

Project Description

Summary: Data for this study were collected at the University of California – Irvine (UCI) as part of the UCI-MUST (Measuring Undergraduate Success Trajectories) Project, a larger longitudinal measurement project aimed at improving understanding of undergraduate experiences, trajectories, and outcomes while supporting campus efforts to improve institutional performance and enhance educational equity (Arum et al. 2021). The project focuses on student educational experience at a selective, large, research-oriented public university on the quarter system, where half of the students are first-generation and 85 percent are Hispanic, Asian, African-American, Pacific Islander, or Native American. Since Fall 2019, the project has annually tracked new cohorts of freshmen and juniors with longitudinal surveys administered at the end of every academic quarter. Data from the Winter 2023 end-of-term assessment, administered in the first week of April, were pooled across the four longitudinal study cohorts (Fall 2019-2022) for this study; the overall response rate for that assessment was 42.5 percent. This allowed us to consider responses from students ranging from freshmen to seniors, enrolled in courses throughout the university. Students completed questionnaire items about their knowledge and use of ChatGPT in and out of the classroom during the Winter 2023 academic term. In total, 1,129 students completed the questionnaire, which asked about knowledge of ChatGPT ("Do you know what ChatGPT is?"), general use ("Have you used ChatGPT before?"), and instructor attitude ("What was the attitude of the instructor for [a specific course the student was enrolled in] regarding the use of ChatGPT?"). Of those 1,129 students, 191 had missing data for at least one variable of interest and were dropped from the analysis, resulting in a final sample of 938 students.
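As a minimal illustration of the complete-case restriction described above, the following R sketch loads the deposited student-level file and keeps only respondents with no missing values on the analysis variables. The column names used here are assumptions for illustration; the deposited ChatGPT student-level analyses.R script and data file define the actual variable names.

    # Sketch of the complete-case restriction (column names are assumed,
    # not the deposited variable names).
    students <- read.csv("ChatGPT student-level data.csv")

    vars_of_interest <- c("knows_chatgpt", "used_chatgpt", "gender", "urm",
                          "international", "first_gen", "standing", "gpa",
                          "field_of_study")

    # Drop respondents with missing data on any variable of interest
    # (191 of 1,129 in the deposited data, leaving 938 students).
    students_complete <- students[complete.cases(students[, vars_of_interest]), ]
    nrow(students_complete)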
In addition, for this study we merged our survey data with campus administrative data on student background, including gender, race, first-generation college-going status, and international student status. Campus administrative data also provide course-level characteristics, including whether a particular class is a lower- or upper-division course and the academic unit offering the course. We further used administrative data on all students enrolled at the university to generate classroom composition measures for every individual course taken by students in our sample: the proportion of underrepresented minority students, the proportion of international students, and the proportion of female students in each class.

For our student-level analysis [R1], we used binary logistic regressions on the student-level data of 938 students to examine the association between individual characteristics and (1) individual awareness and (2) individual academic use of ChatGPT. Individual characteristics include gender, underrepresented minority student status, international student status, first-generation college-going status, student standing (i.e., underclassman or upperclassman), cumulative grade point average, and field of study. Field of study was based on student major, assigned to the broad categories of physical sciences (physical sciences, engineering, and information and computer science), health sciences (pharmacy, biological sciences, public health, and nursing), humanities, social sciences (business, education, and social sciences), the arts, or undeclared. We defined awareness of ChatGPT as an affirmative response to the question "Do you know what ChatGPT is?" Regarding ChatGPT use, we focused on academic use, defined as a response of either "Yes, for academic use" or "Yes, for academic and personal use" to the question "Have you used ChatGPT before?"
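The deposited R script contains the actual model specifications; the sketch below simply illustrates the kind of binary logistic regressions described above, reusing the assumed column names from the previous sketch and coding academic use from the quoted response options.

    # Awareness: affirmative response to "Do you know what ChatGPT is?"
    students_complete$aware <- students_complete$knows_chatgpt == "Yes"

    # Academic use: either of the two affirmative academic-use responses.
    students_complete$academic_use <- students_complete$used_chatgpt %in%
      c("Yes, for academic use", "Yes, for academic and personal use")

    # Binary logistic regressions of awareness and academic use on
    # individual characteristics (illustrative specification only).
    m_aware <- glm(aware ~ gender + urm + international + first_gen +
                     standing + gpa + field_of_study,
                   data = students_complete, family = binomial)

    m_use <- glm(academic_use ~ gender + urm + international + first_gen +
                   standing + gpa + field_of_study,
                 data = students_complete, family = binomial)

    summary(m_aware)
    exp(coef(m_use))   # odds ratios for academic use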
For our course-level analysis [R2], we constructed a measure of course-level instructor encouragement of ChatGPT use based on student responses to the end-of-term survey conducted at the completion of the Winter 2023 term. In the survey, students were asked to indicate the extent to which their instructors encouraged them to use ChatGPT in each of their enrolled courses. The response options were: (1) "very much discouraged"; (2) "somewhat discouraged"; (3) "neither discouraged nor encouraged"; (4) "somewhat encouraged"; and (5) "very much encouraged." Our study generated data on 57 percent of all standard course sections offered in Winter 2023, excluding those without a classroom component, such as independent studies or study abroad. For course sections mentioned by multiple students in the sample, we aggregated student responses at the course level (courses in our analysis had, on average, 10.4 students reporting on instructor encouragement or discouragement of ChatGPT use). As a result, our analysis encompassed 1,047 distinct course sections reported by 915 students, with an average encouragement score of 2.7 (out of 5). For the course-level analysis, we conducted simple bivariate regressions to examine the relationships between average student-reported instructor encouragement in a course and the proportion of underrepresented minority students in the class, as well as the proportion of international students in the class. Scatterplots were created to visualize these relationships, with fitted regression lines included to highlight general trends. Both the scatterplot markers and the regression models were weighted by classroom size, so that larger classes were more influential than smaller classes in our course-level statistical analyses.
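The deposited course-level analysis is a Stata .do file; as a rough R equivalent of the weighted approach described above, and again under assumed column names, the size-weighted bivariate regressions and weighted scatterplots could be sketched as follows.

    # One row per course section: mean student-reported encouragement (1-5),
    # classroom composition, and enrollment. Column names are assumptions.
    courses <- read.csv("ChatGPT course-level data.csv")

    # Bivariate regressions of mean encouragement on class composition,
    # weighted by classroom size so larger classes carry more weight.
    m_urm  <- lm(mean_encouragement ~ prop_urm,
                 data = courses, weights = class_size)
    m_intl <- lm(mean_encouragement ~ prop_international,
                 data = courses, weights = class_size)

    # Scatterplot with markers scaled by class size and a fitted line.
    plot(courses$prop_urm, courses$mean_encouragement,
         cex  = sqrt(courses$class_size) / 5,
         xlab = "Proportion URM students in course",
         ylab = "Mean instructor encouragement (1-5)")
    abline(m_urm, lwd = 2)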

Citation:
Arum, Richard, Eccles, Jacquelynne, Heckhausen, Jutta, Von Keyserlingk, Luise, Li, XunFei, Yu, Renzhe, Orona, G., Mathew, D., Chang, D. UCI Measuring Undergraduate Success Trajectories (UCI-MUST) Project Data, [United States], 2019-2025.  




This material is distributed exactly as it arrived from the data depositor. ICPSR has not checked or processed this material. Users should consult the investigator(s) if further information is desired.