Test Scoring Instructions
- General Instructions
- Instructions for Students
- Instructions for Faculty
- Answer Keys
- Test Analysis Options
- Email Output
- Answer Key Quick Reference
Test Scoring Services uses National Computer Systems (NCS) optical scanning equipment. The general-purpose scan forms are available at the Auburn University Bookstore in Haley Center.
Test drop off is available on the Foy Concourse at the Biggio Center/Testing Services entrance (Foy Suite 116) and at the OIT Building (300 Lem Morrison Dr.).
For 24-hour turnaround, tests (scantrons) may be dropped off at the Biggio Center, Monday through Friday, 8:00 a.m. - 3:00 p.m. Scored tests will be available for pick-up at the Biggio Center the next business day. Tests dropped off on Friday will not be available for pick-up until the following Monday. All tests not picked up from the Biggio Center within 2 business days will be available for pick-up at the OIT Building, 300 Lem Morrison Drive.
For immediate turnaround, tests may be dropped off at the OIT building front desk, located at 300 Lem Morrison Drive, from 7:45 a.m. until 4:45 p.m., Monday through Friday. Scored tests will be available for pick-up as soon as they can be scanned - usually in 10 to 15 minutes, depending on current volume.
Scan sheets received through Campus Mail will not be processed. Scan sheets should be placed in an envelope labeled with the name, telephone number and email address of the person to contact for pick-up.
Examination scan sheets and computer printouts will be returned only to the instructor listed on the answer key as having given the examination or to their authorized representative. An instructor may designate a student or fellow faculty member as an authorized representative by lending them the instructor's ID card.
In addition to the printed output, instructors may obtain a copy of the raw scanned data and the simplified output on CD or via email. Instructors who want the files on CD should include a blank writeable CD-R in the envelope with the scan sheets.
All scan sheets must be marked with a #2 Pencil only.
All response positions must be fully marked. Do not mark outside the designated area of any response position. All stray marks must be completely erased. Incomplete erasures may be read as incorrect answers and/or cause the test not to be scored.
Scan sheets must not be stapled, folded or mutilated.
All scan sheets must face in the same direction: make sure that the notched edge is at the top right. Place the answer key(s) on top.
The student should print his/her name in the Name boxes with the last name first and mark the corresponding bubbles. The student should then print his/her Banner number in the Identification boxes and mark the corresponding bubbles. The Banner number is a nine digit identifying number that begins with 902. This field should not be left blank. The only restriction is that 9999 in the first four boxes (A-D) of the Identification field is reserved to indicate an answer key for the test.
If more than one test version is given, the student must print the correct test version number in box J and mark the corresponding bubble or the test will NOT score. If only one test version is given, then box J can be left blank. The student should then mark the bubbles corresponding to his/her responses to the test questions. Any question left blank within the range of test questions will be scored as an incorrect response.
Multiple marks, including incomplete erasures, may result in an inaccurate scoring of the test. Because the scanner reads the back of the answer sheet through the paper, on tests with more than 100 questions, stray marks on the front of the sheet can result in inaccurate scoring of questions on the back of the sheet.
A test may contain up to a maximum of 200 questions.
The AUSCAN scoring program can accommodate a maximum of nine test versions. All versions of the test must have the same number of questions. At least one answer key must be completed for each test version. Test versions must be numbered consecutively starting with version 1. Any test version that is not actually used will still need an answer key (but all answers can be left blank).
Each test question can have up to five correct answers. Additional answer keys must be completed when test questions have more than one correct answer.
The number of answer keys required will be the number of versions times the maximum number of correct answers.
For each version of the test, the answer keys must be ordered as follows:
- Main Key with correct answer for all questions.
- Second Key with the first alternative correct answer for those questions having 2 or more correct answers. Questions with only 1 correct answer are left blank.
- Third Key with the second alternative correct answer for those questions having 3 or more correct answers. Questions with fewer than 3 correct answers are left blank.
- Fourth Key with the third alternative correct answer for those questions having 4 or more correct answers. Questions with fewer than 4 correct answers are left blank.
- Fifth Key with the fourth alternative correct answer for those questions having 5 correct answers. Questions with fewer than 5 correct answers are left blank.
For example, if a test has three versions and some questions have two correct answers, six answer keys must be supplied in the following order:
Version 1, Main Key
Version 1, Second Key
Version 2, Main Key
Version 2, Second Key
Version 3, Main Key
Version 3, Second Key
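The required key count and ordering above can be sketched as a small enumeration. This is an illustrative sketch, not part of the AUSCAN program; the `key_order` function and its arguments are made up for the example:

```python
# Sketch: enumerate the required answer-key order for a multi-version test.
# Keys required = number of versions x maximum correct answers per question,
# ordered with all keys for version 1 first, then version 2, and so on.

def key_order(versions, max_correct):
    """Return answer-key labels in the order described above."""
    names = ["Main", "Second", "Third", "Fourth", "Fifth"]
    return [
        f"Version {v}, {names[k]} Key"
        for v in range(1, versions + 1)
        for k in range(max_correct)
    ]

# Three versions, some questions with two correct answers -> six keys:
for label in key_order(versions=3, max_correct=2):
    print(label)
```

Running the example reproduces the six-key ordering listed above for three versions with at most two correct answers per question.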
All required information must be marked for the first answer key in a set of answer keys. Only the first 4 digits of the Identification field, the Test Version field and the appropriate question answer fields need to be marked for subsequent answer keys.
Specific instructions for marking answer keys follow:
|1.||NAME||Print the instructor's name in the boxes provided and mark the corresponding bubbles. The instructor's name will appear on all computer printouts.|
|2.||DEPT, COURSE and SECTION||Print the department id, course number and section id in the boxes provided and mark the corresponding bubbles.|
|3.||TIME||Mark the bubble corresponding to the nearest time of day when the test was administered.|
|Boxes ABCD||Print "9" in each of the four boxes and mark the corresponding bubbles to identify the scan sheet as an answer key. This is REQUIRED on all answer keys.|
|Box E||Output option #1- Print "1" in the box and mark the corresponding bubble to request student summary tear-off sheets (see the Test Analysis Options section for more details). Leave the box blank if this option is not desired. This option will not be available during Final Exam processing. Effective Fall Semester 2000, all tests will be run as final exams without student tear-off sheets unless the instructor specifically requests them in writing on the delivery envelope.|
|Box F||Output option #2 - Print "1" in the box and mark the corresponding bubble to request instructor summary output. Leave the box blank if this option is not desired.|
|Box G||Output option #3 - Print "1" in the box and mark the corresponding bubble to request question analysis output. Leave the box blank if this option is not desired.|
|Box H||Output option #4 - Print "1" in the box and mark the corresponding bubble to request test statistics output. Leave the box blank if this option is not desired.|
|Box I||Output option #5 - Print "1" in the box and mark the corresponding bubble to request question statistics output. Leave the box blank if this option is not desired.|
|Box J||Print the test version number in this box and mark the corresponding bubble. This correlates with box V (Number of Versions) and is REQUIRED on all answer keys.|
|Boxes KLM||Print the number of questions in the three boxes and mark the corresponding bubbles. This is REQUIRED on the first answer key.|
|Boxes N-S||Print the test date here: boxes N and O are for the month, boxes P and Q are for the day, and boxes R and S are for the year. Mark the corresponding bubbles.|
|Box T||Print the maximum number of correct answers for any test question in the box and mark the corresponding bubble. If more than "1" is marked, additional keys will be needed.|
|Box U||Output sort order - Print "0" in the box and mark the corresponding bubble to sort into alphabetical order by student name. Print "1" in the box and mark the corresponding bubble to sort by student identifying number. Print "2" in the box and mark the corresponding bubble for alphabetical order on the Instructor Summary, but leave the student summary tear-off sheets in the same order as the scan sheets were processed. Print "3" in the box and mark the corresponding bubble to sort first by section number and then alphabetically by student name. Option 3 is useful when the tests for multiple class sections are processed together. If the box and bubble are left blank, the output will be printed in the same order as the scan sheets were processed.|
|Box V||Print the total number of test versions in the box and mark the corresponding bubble. This is REQUIRED on the first answer key.|
|Boxes WX||The AUSCAN program can delete test questions from scoring based on an instructor-supplied value for the maximum allowable percentage of students who respond INCORRECTLY to the question. To use this feature, print the desired percentage in boxes W and X, and mark the corresponding bubbles. For example, to drop a question that more than 3 out of 4 students miss, code 75 for this option. Test scores will be based only on the non-deleted questions and deleted questions will be identified on all output.|
|6.||QUESTIONS||Mark the bubble corresponding to the correct answer for each question.|
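The question-deletion rule coded in Boxes W and X can be sketched as follows. The answer key, responses, and 75% threshold below are illustrative values, and the function is a sketch of the rule as described above, not the AUSCAN implementation:

```python
# Sketch of the Boxes W/X deletion rule: drop any question that more than
# the coded percentage of students answer incorrectly, then score each
# student on the remaining (non-deleted) questions only.

def score_with_deletion(key, responses, max_pct_incorrect):
    """Return (scores, deleted_question_indexes) for one test version."""
    n_students = len(responses)
    deleted = []
    for q in range(len(key)):
        wrong = sum(1 for r in responses if r[q] != key[q])
        if 100 * wrong / n_students > max_pct_incorrect:
            deleted.append(q)
    scores = [
        sum(1 for q in range(len(key))
            if q not in deleted and r[q] == key[q])
        for r in responses
    ]
    return scores, deleted

# Illustrative data: four students, four questions; everyone misses Q4.
key = "ABCD"
responses = ["ABCA", "ABDA", "ABAA", "ABCA"]
scores, deleted = score_with_deletion(key, responses, 75)
# Q4 (index 3) is missed by 4 of 4 students (100% > 75%), so it is deleted
# and scores are computed over the remaining three questions.
```

This matches the "3 out of 4 students" example: coding 75 drops any question with more than 75% incorrect responses.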
The AUSCAN program can provide the following information:
- A listing of those students whose exams were not processed because of an invalid test version number in box J.
- OPTION #1 - Student Summary Tear-Off Sheets
This option provides tear-off sheets for distribution to students. The sheets are printed in the requested sort order and contain the following:
- Student Name
- Student Identifying Number
- Test Date
- Test Time
- Course Name
- Instructor's Name
- Test Version Number
- Raw Score
- Percent Score
- Student Responses to all test questions
- An indication of which questions were answered incorrectly
- Correct Answers for all test questions
- An indication of those test questions that were deleted
- OPTION #2 - Instructor's Summary
This option provides a sorted listing of the following information for all students scored for the test:
- Student Name
- Student Identifying Number
- Test Version
- Raw Score - the number of non-deleted questions answered correctly
- Percent Score
- Number of Valid Questions (those NOT deleted)
- Number of Deleted Questions
- Number of Blank Questions (valid but not answered)
- List of questions answered incorrectly
- List of student responses to questions answered incorrectly
- OPTION #3 - Question Analysis
This option provides a frequency table of student responses to the test questions. A separate table is created for each test version.
- OPTION #4 - Test Statistics
This option provides statistics about the test. The statistics are calculated separately for each test version and include:
- Number of students - the number of students scored for the test version.
- Number of questions deleted from scoring. Where there are multiple test versions and a relatively small class, this number may vary widely from test version to test version.
- Test High Score - the greatest number of correct responses from any student on the test version.
- Test Low Score - the smallest number of correct responses from any student on the test version.
- Mean Score - the sum of the scores for students taking the test version divided by the number of students. Zero scores are considered valid scores.
- Median Score - the numerical value of the middle score once all scores have been rank-ordered from lowest to highest. If there is an even number of scores, the median is taken to be the score midway between the two middle values.
- Test Variance - a measure of the dispersion of the test scores about the test mean. This is one measure of how closely the individual scores cluster about the mean. Mathematically, it is the mean of the squared deviations of the individual scores from the mean score.
- Test Standard Deviation - the square root of the test variance. If the test scores are normally distributed, then the range from the mean minus the standard deviation to the mean plus the standard deviation contains approximately 68% of the student scores.
- Skewness - a measure of the degree to which the distribution of test scores departs from a normal distribution. A negative skew occurs when the majority of scores are high; this can be indicative of an easy test or of well-prepared students. A positive skew occurs when the majority of scores are low; this can be indicative of a difficult test, poorly prepared students or poor teaching.
- Kurtosis - a measure of the peakedness or flatness of the distribution of test scores in relation to that of a normal distribution. A value greater than 0 indicates the distribution is more peaked than a normal distribution. A kurtosis less than 0 indicates a relatively flat distribution of scores.
- Reliability - a measure of the internal consistency of the test. A high value indicates that student performance is consistent across the entire test. A low value indicates that student performances on different questions are inconsistent.
- Standard error of measurement - the standard deviation multiplied by the square root of 1 minus the reliability. This is an indication of the amount (in raw score points) that an individual score can be expected to vary on repeated administration of the test.
- Standard error of test mean - the standard deviation divided by the square root of the number of students minus 1. This is the amount (in raw score points) that the test mean can be expected to vary in repeated administrations of the test.
- OPTION #5 - Question Statistics
This option provides various item statistics as well as frequency distributions for the raw and percent scores. These statistics include:
- Difficulty Index - the number of students responding correctly to a question divided by the total number of students answering the question. High values for this statistic are indicative of "easy" questions.
- Discrimination Index - ranges from -1 to +1 with positive values indicating a positive correlation between success on the question and score on the test. Negative values indicate that students who scored well on the test scored poorly on this question.
- Biserial Correlation - an alternative to the discrimination index
- Mean of scores for students who responded correctly to the question.
- Mean of scores for students who responded incorrectly to the question.
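The difficulty and discrimination indices above can be sketched from their definitions. The data below is made up, and the discrimination index shown is a simple upper-group/lower-group formulation; the document does not specify which formulation AUSCAN actually uses:

```python
def difficulty_index(correct_flags):
    """Fraction of responding students who answered the question
    correctly (1 = correct, 0 = incorrect). High values mean 'easy'."""
    return sum(correct_flags) / len(correct_flags)

def discrimination_index(correct_flags, total_scores):
    """Upper/lower-group discrimination index: proportion correct in
    the top-scoring half minus proportion correct in the bottom half.
    Ranges from -1 to +1; this is one common formulation and is an
    assumption here, not necessarily the AUSCAN formula."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    half = len(order) // 2
    lower, upper = order[:half], order[-half:]
    p_upper = sum(correct_flags[i] for i in upper) / half
    p_lower = sum(correct_flags[i] for i in lower) / half
    return p_upper - p_lower

# Illustrative data: flags for one question, total test scores alongside.
flags  = [1, 1, 1, 0, 0, 0]
scores = [95, 90, 85, 60, 55, 50]
# Here only high scorers got the question right, so the question
# discriminates strongly in the positive direction.
```

A negative result would indicate the pattern described above: students who scored well on the test overall tended to miss this particular question.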
Only the first eight items are included in the basic Instructor Summary file which can be made available on CD or through email.
Test Scoring email output includes four attachments for each test (where xxxxxxx is the unique identifier for the test):
xxxxxxxS.txt - basic instructor summary in standard text format
xxxxxxxS.xls - basic instructor summary in Excel format
xxxxxxxL.txt - full instructor summary in standard text format
xxxxxxxQ.txt - question and test statistics in standard text format
Effective Spring Semester 2013, an instructor may request an additional output file that can be used to import grades into Canvas (a video of instructions for using this file is available at http://www.auburn.edu/img/canvas/help/teachers/f_import_from_spreadsheet/f_import_from_spreadsheet.htm ). This file will only be provided if there is a written request for the CANVAS FILE on the delivery envelope. All students must enter their Banner ID in the Identification field. If this field is blank or incorrect, the CANVAS FILE cannot be created.
Email users should click on an attached text file to bring the file up in a Notepad window. (If your system automatically brings up .txt files in your word processor you will need to make sure that it uses a mono-spaced font, such as Courier, to properly display the columns.) Clicking on the Excel attachment should start Excel and bring up a spreadsheet with the summary information for the test.
Pine users should see the files displayed as part of the mail message. The files can be saved for downloading by using the View attachment command.
Last Updated: September 24, 2015