March 1990

MANAGING POLICE BASIC TRAINING CURRICULUM

By

Rene A. Browett
Curriculum Manager
Northern Virginia Criminal Justice Academy
Arlington, Virginia

The scenario might go something like the following. It's test day at the police academy. The recruits have just completed their first major examination and are anxious to find out how they performed. The training staff is sequestered in a large room, seated at long tables covered with stacks of exams. Calculators are in clear evidence. The grueling task of hand grading the exams begins. Question by question, they trudge through the exam. To validate test questions (to find out if more than half the class has missed a certain question), the leader calls out the questions by number to see how many of the graders have papers in which a student has missed a specific question. Hands go up, a count is taken, numbers are recorded, calculators figure averages--some right, some wrong. The process goes on for hours, even days. Meanwhile, the students wait.

Does this often-repeated scene have to be? No--not if the recruit curriculum testing-and-evaluation vehicle involves some computer assistance. Such a system exists at the Northern Virginia Criminal Justice Academy (NVCJA). The NVCJA basic training staff has developed a systematic process whereby days of effort by the training staff are reduced to less than 2 hours of work by one or two people.

This article will discuss how a basic police training curriculum can be quickly and efficiently managed with an effective, programmatic approach. Using the NVCJA as a case study, the article will first provide some background about the academy and then discuss the hardware, software, and process involved in managing the basic recruit curriculum.

ACADEMY BACKGROUND

Established in 1965, the Northern Virginia Criminal Justice Academy provides training for over 25 criminal justice jurisdictions in the Northern Virginia area. Staffed with 32 full-time employees, the academy has a leadership cadre of six executives, including the director, all of whom are former police officers. Along with a permanent support staff, the academy is augmented with officers from the participating jurisdictions, which provide them as instructors on assignment for up to 3 years. As one of nine regional academies in the State of Virginia, the academy is governed by a Board of Directors composed of the chiefs of police, sheriffs, and city/county managers from the larger participating jurisdictions.

The academy provides both recruit and inservice training for each of the participating police departments and sheriff's offices in the region. Consequently, recruit training consists of subject matter leading to three certifications mandated by the Commonwealth of Virginia: Basic Law Enforcement, Basic Civil Process-Court Security, and Basic Jailors. Each year the academy graduates approximately 300 students after completion of a 14- to 18-week course of instruction.

In order to graduate, each student must successfully complete all State- and academy-mandated tests and related requirements. Developed by the Department of Criminal Justice Services (DCJS), headquartered in Richmond, VA, the State-mandated requirements are commonly known as performance objectives (POs). These State mandates (POs) are the end result of a formal job task analysis, commissioned by DCJS, in which the various functions of police officers and sheriffs were identified. Developed from this study were over 400 performance objectives, which form the basis for State-mandated police training. Each training academy must teach and test every performance objective and also retest any objective missed by students.

The academy (NVCJA) also provides State-mandated minimum inservice requirements (MIR) training. The inservice staff coordinates with DCJS to ensure that every 2 years all officers receive at least 40 hours of State-mandated training, to include instruction in the law. The academy offers over 100 inservice training classes, taught almost exclusively by outside instructors and coordinated by a professional staff. While inservice training is not the focus of this article, this information is provided to give a more complete profile of the academy.

DEVELOPING THE "STAR" SYSTEM

Because the State requires that every student successfully pass all of the performance objectives, several questions immediately became evident when the academy staff first approached the task of more efficient curriculum management. First, how could the academy successfully track each objective through the training process to ensure accountability? Second, what type of test construction would be needed to assure the administration that mandated objectives would be adequately tested and validated? Third, how could performance-based tests be graded within a few hours, not a few days?

With these basic questions in mind, the recruit staff concluded that a computer application might offer a workable solution. After preliminary analysis and over 5 years of refinement, the Student Testing and Records (STAR) System was developed and is presently used at the academy. With this program as the basic software package, the testing system incorporates several components:

* An optical mark reader (SCANTRON), which automatically scores each exam and feeds the raw data directly into a computer

* An additional software program which provides both database and spreadsheet capabilities

* An IBM-compatible personal computer with 640K of RAM and a hard drive

* A laser printer for letter-quality reports

This system costs less than $5,000. By using this relatively simple but highly effective system, the curriculum manager is now able to better manage the basic training curriculum from tracking to testing to validation of each State-mandated performance objective.

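The article does not describe the STAR program's internal design, so the following Python fragment is only a minimal sketch, under assumed names, of the kind of records such a system might keep: each test question tied to a State-mandated performance objective so that teaching, testing, and validation can be tracked. None of these identifiers or fields come from the actual software.

    # Hypothetical sketch of a PO-tracking data model; not the actual STAR software.
    from dataclasses import dataclass

    @dataclass
    class PerformanceObjective:
        po_id: str                 # illustrative identifier, e.g. "PO-101"
        description: str
        state_mandated: bool = True
        taught: bool = False       # set when the instructor confirms coverage
        tested: bool = False       # set when the PO appears on a graded exam

    @dataclass
    class Question:
        question_id: int
        po_id: str                 # every test question traces back to one PO
        text: str
        correct: str               # letter of the correct choice

    # A question bank keyed by PO lets the curriculum manager confirm that
    # every mandated objective is covered by at least one test item.
    def untested_objectives(objectives, questions_used):
        covered = {q.po_id for q in questions_used}
        return [po for po in objectives
                if po.state_mandated and po.po_id not in covered]
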
MANAGING THE PROCESS

At the core of the academy's curriculum and testing system are the POs. Simply put, the tests must ensure that each student masters each State-mandated PO. Thus, test construction and administration are vital to the integrity of the academy's curriculum management process. Performance-objective accountability and the testing process are the primary responsibilities of the academy's curriculum manager. From lesson plan review to test construction and administration, the curriculum manager is the academy's point man with regard to accuracy and accountability.

Constructing Tests

All basic training examinations are constructed by the curriculum manager. It is also his responsibility to analyze and validate all test results for each recruit. The testing process begins with the basic lesson plan, which is reviewed and approved by academy management. Written by staff instructors, each lesson plan must be revised and updated at the end of each training session. Each plan must also contain the specific POs mandated by the State and appropriate for that block of instruction. Test questions flow from, and can be directly tracked to, the POs found in each lesson plan, thus assuring test accountability. Staff instructors, accountable to both the students and the curriculum manager, are responsible for ensuring that each PO is adequately taught. Student performance at test time usually will reflect whether this has, in fact, occurred.

At the end of each specified testing period, the curriculum manager begins to prepare a test that spans several disciplines and many instructors. Mechanically, test construction is simple and is coordinated by the curriculum manager. First, he speaks with all instructors to verify that the material behind their test questions, which are based on the mandated POs, has been taught and that the questions are part of a pre-existing database.

Second, the curriculum manager constructs a rough-draft test based on pertinent subject area questions stored in the database. The draft exam is then reviewed by all the respective instructors for their final updates and edits. This phase of the process is accomplished with a high degree of attention to exam security. At this point, if an instructor did not teach a specific objective, the instructor advises the curriculum manager, who deletes that PO from the current exam. It will, however, be tested on a later exam to comply with State mandates.

Third, the draft exam is then edited by the curriculum manager based on specific verbal and written feedback from each instructor. The final exam is then constructed, with the rough-draft copy kept on file for documentation and accountability.

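As a rough illustration of the assembly step just described--drawing questions for the taught POs from a pre-existing question bank and setting aside any PO an instructor reports as not yet taught--the logic might be sketched as follows. The function and data names are assumptions made for this example, not the academy's actual software.

    # Illustrative only: assembling a draft exam from a PO-keyed question bank.
    def build_draft_exam(question_bank, pos_scheduled, pos_not_taught):
        """question_bank: dict mapping po_id -> list of question records.
        pos_scheduled: POs covered in this testing period.
        pos_not_taught: POs an instructor reports as not yet taught; these
        are deferred to a later exam, as the State mandates require."""
        draft, deferred = [], []
        for po_id in pos_scheduled:
            if po_id in pos_not_taught:
                deferred.append(po_id)      # will be tested on a later exam
                continue
            draft.extend(question_bank.get(po_id, []))
        return draft, deferred
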
Administering Tests

To ensure uniformity and test security, all exams are passed out simultaneously to proctoring staff members, who immediately take them to the test sites. Proctors physically remain at each test site for the exam's duration to ensure test integrity. Once the tests are passed out, a staff member reads a test cover sheet containing complete test instructions. After the instructions have been read, the students begin their exams.

The tests are primarily multiple choice, with very few true-false questions. Students fill in their answers on an answer sheet with a #2 pencil so that it can easily be read by the optical mark reader.

When each recruit section is finished, all exams are returned to and accounted for by the curriculum manager. A single missing exam is treated as a compromise of the test's integrity, and the results are then deemed invalid. This has yet to happen at the academy while using this system.

Scoring Tests

To score each examination, the assistant director for basic training and the curriculum manager work as a team. First, the curriculum manager determines whether each answer sheet is properly completed. If an answer sheet is incomplete, the recruit officer is called in and asked to make the required corrections. This pertains only to basic identification information on the form and not to incomplete test answers. Should a defective answer sheet get into the system, the computer will automatically reject it when it is scanned. Multiple answers, unclear erasures, or answer spaces left blank can cause an answer sheet to be defective. In each case, the computer indicates the nature and location of the problem and asks what the user wishes to do about the defect.

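The defect checks described above can be imagined roughly as in the sketch below. This is an illustration of the logic only (blank answer spaces, multiple marks); it is not the SCANTRON reader's actual behavior, and the field names are assumptions.

    # Rough sketch of defect detection on a scanned answer sheet (illustrative).
    def find_defects(marks, num_questions):
        """marks: dict mapping question number -> list of bubbled letters."""
        defects = []
        for q in range(1, num_questions + 1):
            bubbled = marks.get(q, [])
            if not bubbled:
                defects.append((q, "answer space left blank"))
            elif len(bubbled) > 1:
                defects.append((q, "multiple answers"))
        return defects    # each entry names the problem and where it occurred
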
To prepare the system for the grading mode, a master answer sheet (previously prepared by the curriculum manager from the master exam) is scanned. This sheet provides the basis against which all student answer sheets will be graded. Each student's answer sheet is then quickly fed into the scanner, with the data automatically stored on the recruit section's information disk. The scanning process takes approximately 3-5 minutes for a section of 25 to 30 recruits. After the scanning step is completed, the computer produces a raw scores average sheet, a rank-order listing of each class member based on the computer-generated averages of correct responses. This immediately shows the curriculum manager the statistical range of the class and whether he has any individual failures to review.

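The arithmetic behind the raw scores average sheet can be expressed in a short sketch--grade each sheet against the master key, compute the percentage of correct responses, and rank the section. This is an illustration under assumed data structures, not the STAR program itself.

    # Illustrative grading pass: compare each sheet to the master answer key,
    # compute the percentage of correct responses, and rank the section.
    def grade_section(master_key, sheets):
        """master_key: dict of question number -> correct letter.
        sheets: dict of student name -> dict of question number -> answer."""
        averages = {}
        for student, answers in sheets.items():
            correct = sum(1 for q, key in master_key.items()
                          if answers.get(q) == key)
            averages[student] = 100.0 * correct / len(master_key)
        # rank-order listing of the class, highest average first
        return sorted(averages.items(), key=lambda item: item[1], reverse=True)
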
The next step in the grading process is item analysis and test validation. To determine the relative fairness of each exam question, the computer automatically produces a distractor analysis document. Each question is reviewed, and where 50% or more of the class got a question wrong, the question is examined and the instructor consulted. If he or she feels the material was adequately covered, the question remains; if the question is tough but fair, it stays in the exam. However, where it appears the students were genuinely confused by the question, it is eliminated from the overall test score. The benefit of the doubt is always given to the student. After all the eliminated questions are determined, they are subtracted from the original total to establish a net basis for computing the final test scores. Students must score 70% or better to pass.

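The validation arithmetic stated above--flag any item missed by 50% or more of the class, drop items judged unfair, then recompute each score against the reduced (net) number of questions with 70% as the passing mark--can be sketched as follows. This illustrates the rule as described in the text; the distractor analysis report itself is not reproduced here, and the function names are hypothetical.

    # Illustrative item analysis and net rescoring with a 70% passing mark.
    def flagged_items(master_key, sheets, threshold=0.5):
        """Return question numbers missed by 50% or more of the class."""
        flagged = []
        for q, key in master_key.items():
            missed = sum(1 for answers in sheets.values() if answers.get(q) != key)
            if missed / len(sheets) >= threshold:
                flagged.append(q)
        return flagged

    def net_scores(master_key, sheets, eliminated):
        """Rescore each student over the questions that survive review."""
        kept = {q: k for q, k in master_key.items() if q not in eliminated}
        results = {}
        for student, answers in sheets.items():
            correct = sum(1 for q, k in kept.items() if answers.get(q) == k)
            pct = 100.0 * correct / len(kept)
            results[student] = (round(pct, 1), "PASS" if pct >= 70.0 else "FAIL")
        return results
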
Generating Reports

As an integral part of the process, the computer generates several other reports. First, it produces a subject mastery report (a report card) for each student, which tells them how they did in each subject area. Second, it produces the same report in a cumulative format, which is used by staff members for counseling and remedial training. Third, a test answer sheet is generated that tells the recruits not only the correct answer but also what answer they put on their answer sheet. This report also indicates which questions are State-mandated and require retesting if missed. This capability provides a vehicle that allows the curriculum manager to identify missed POs and retest them on a timely basis, a procedure required by the State.

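A minimal sketch of that third report--a per-student answer listing showing the correct answer, the answer given, and whether a missed question is State-mandated and so must be retested--might look like the following. The record layout and names are assumed for illustration only.

    # Illustrative per-student answer report with State-mandate retest flags.
    def answer_report(student, answers, master_key, state_mandated):
        """state_mandated: set of question numbers tied to State-mandated POs."""
        lines = [f"Answer report for {student}"]
        for q in sorted(master_key):
            key, given = master_key[q], answers.get(q, "-")
            flag = ""
            if given != key and q in state_mandated:
                flag = "  *State-mandated: retest required"
            lines.append(f"Q{q:03d}  correct: {key}  answered: {given}{flag}")
        return "\n".join(lines)
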
In short, using the software in the STAR program, the computer can produce a hard copy of any aspect of the testing process for a student or staff member within an average of 2 hours--from start to finish.

Maintaining Security

The testing, grading, question database, lesson plan, and section data records are maintained on both floppy disk and hard drive. As an added security feature, the curriculum manager's office is locked during non-office hours, and backup disks and access codes are secured. Hard copies of exams, rough drafts, and actual lesson plans are likewise kept secure.

CONCLUSION

A well-managed curriculum begins with a good job task analysis and performance objectives that arise from that analysis. In turn, lesson plans and student activities should be based on those performance objectives and must be tested accurately. Unfortunately, all too often, testing and tracking of such objectives become so cumbersome that administrators of an academy or educational institution do what they can, not what they should. However, if applied meaningfully to the task, the computer offers welcome relief to training administrators. The NVCJA has, over the years, worked to develop and refine a process that adequately manages a complex curriculum without hamstringing the staff--a compromise that is working very well.