Board of Forensic Document Examiners
TEST DEVELOPMENT
Forensic Document Examiner Certification Test
The Association of Forensic Document Examiners contracted with a professional test development agency, Occupational Research and Assessment, Inc. (ORA), to construct a certification test for document examiners that met the professional standards required for cognitive testing and was based on criteria that could be demonstrated to be objective, valid, and reliable.

Test Development Agency
Occupational Research and Assessment, Inc.
Steven C. Clark, Ph.D., Director
124 Elm Street
Big Rapids, MI 49307

Test Development Period
Fall 2000 to present

Test Development Committee
The test development committee consisted of ten forensic document examiners, each having at least twenty years of professional experience. These seasoned examiners work in law enforcement and in private practice and are from various geographical locations, including one international member.

Purpose of Test
To verify the knowledge, skills and abilities of forensic document examiners pursuant to a specified domain of knowledge.

Domain of Knowledge
The committee developed a test curriculum based on the job description of a forensic document examiner as published in the ASTM Standard E 444-98, and essential laboratory equipment used by document
examiners.

Development Process
The following chronicles the development and validation of the curriculum and cognitive testing for forensic document examiner certification.

The following was written by Steven C. Clark, Ph.D.1

The project consisted of the following eight phases: (1) needs analysis/occupational analysis, (2) job and task analysis, (3) curriculum validation, (4) objective test item writing and coding, (5) item validation, (6) test construction, (7) pilot testing, and (8) standard setting.

Phase 1 — Needs Analysis/Occupational Analysis
Phase 1 had two objectives: first, to assess the need for the research, and second, to develop a conceptual framework for the field of forensic document examination. This was accomplished through a literature review and interviews with incumbent members of the Association of Forensic Document Examiners.

Phase 2 — Job and Task Analysis
Using the DACUM (Developing A Curriculum) method, a “development team” of document examiners from various geographic locations and employment circumstances (public/private) participated in a two-day DACUM workshop. The workshop produced a draft “national curriculum outline” for forensic document examiners. The outline, or job profile, consisted of all duties and tasks considered essential for the professional document examiner.

Phase 3 — Curriculum Validation
The drafted job profile, or task list, was used to develop a survey instrument for international distribution. Respondents were identified by the development team as experts in the field of forensic document examination. The survey had two dependent objectives: (1) validating the content, and (2) expanding and/or contracting the curriculum. The first was accomplished by asking respondents to rate their level of agreement or disagreement that each task belonged in the curriculum. The second was simply the addition of tasks (based on offerings from respondents) and the subtraction of tasks (based on low agreement ratings).
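The two survey objectives can be sketched as a filter-and-extend step. This is an illustrative sketch only: the 1–4 rating scale, the agreement threshold, and the task names are assumptions, since the report does not specify the cutoff the development team used.

```python
# Illustrative only: the 1-4 rating scale and the 3.0 agreement
# threshold are assumptions, not figures from the report.

def revise_curriculum(ratings, suggested_tasks, threshold=3.0):
    """Keep tasks whose mean agreement rating meets the threshold,
    then append tasks offered by survey respondents.

    ratings: dict mapping task name -> list of respondent ratings
    suggested_tasks: tasks added based on respondents' offerings
    """
    retained = [
        task for task, scores in ratings.items()
        if sum(scores) / len(scores) >= threshold
    ]
    return retained + list(suggested_tasks)

# Hypothetical tasks and ratings for illustration.
ratings = {
    "Examine handwriting": [4, 4, 3, 4],        # high agreement: kept
    "Restore charred documents": [2, 1, 2, 3],  # low agreement: dropped
}
revised = revise_curriculum(ratings, ["Examine electronic alterations"])
```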

Phase 4 — Objective Test Item Writing and Coding
Using a set of previously validated test items, each member of the development team was asked to “code” items to one of the validated document examiner tasks. Then, based on data collected during the DACUM, areas without enough test items and new areas of knowledge were reviewed, and new test items were written and coded to specific tasks.

Phase 5 — Item Validation
All drafted test items were reviewed for bias and edited before the development team was reconvened to review each test item in the document examiner “bank.” To quantify each test item as content valid, all items were then run through a rating process to establish a point value for: importance (for success as a document examiner), frequency (of performance on the job), difficulty (compared to other questions in the “bank”), and level (of education required to answer the item successfully).
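The rating process described above can be sketched as follows. The 1–5 scale and the use of a per-dimension mean across raters are assumptions for illustration; only the four dimension names come from the text.

```python
from statistics import mean

# The four rating dimensions named in Phase 5. The 1-5 scale and the
# mean-across-raters rule below are illustrative assumptions.
DIMENSIONS = ("importance", "frequency", "difficulty", "level")

def rate_item(rater_scores):
    """rater_scores: one dict per rater, mapping each dimension to a
    1-5 score. Returns the point value (mean) per dimension."""
    return {d: mean(r[d] for r in rater_scores) for d in DIMENSIONS}

# Two hypothetical raters scoring one item from the "bank".
scores = rate_item([
    {"importance": 5, "frequency": 4, "difficulty": 3, "level": 2},
    {"importance": 4, "frequency": 4, "difficulty": 2, "level": 2},
])
```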

Phase 6 — Test Construction
After national validation of the curriculum and the writing, rating, and review of test items, the development team was directed through an item selection process. The DACUM research identified the percentage of test items required (by duty and task) on the test; the sorting process identified the highest-rated (by importance) items within each duty area, as well as the “easy” to “difficult” item progression. Using this data, the development team had the difficult task of granting “final approval” to each item (with acceptable importance ratings). Once the final list of items was identified, each went through a final edit, and the correct answer was verified.
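The selection process amounts to a quota-then-sort procedure. In this sketch, the field names are invented and the per-duty item counts stand in for the DACUM percentages; the report does not specify these details.

```python
# Illustrative sketch: 'quota_by_duty' stands in for the DACUM
# percentages translated into item counts; field names are invented.

def build_test(items, quota_by_duty):
    """Pick the highest-importance items within each duty, then
    arrange the whole selection from "easy" to "difficult"."""
    selected = []
    for duty, quota in quota_by_duty.items():
        pool = [i for i in items if i["duty"] == duty]
        pool.sort(key=lambda i: i["importance"], reverse=True)
        selected.extend(pool[:quota])
    selected.sort(key=lambda i: i["difficulty"])  # easy -> difficult
    return selected

# Hypothetical item bank.
bank = [
    {"duty": "handwriting", "importance": 5, "difficulty": 3},
    {"duty": "handwriting", "importance": 3, "difficulty": 1},
    {"duty": "equipment", "importance": 4, "difficulty": 2},
]
exam = build_test(bank, {"handwriting": 1, "equipment": 1})
```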

Phase 7 — Pilot Testing
During the national conference a final draft of the Forensic Document Examiner’s Certification Examination was pilot-tested. Individuals submitted complete demographic information, years of experience as a forensic document examiner, specific duties and tasks performed on the job, as well as educational background and training programs completed/certifications held. This data was used to verify pilot test results.

Phase 8 — Standards Setting
The forensic document examiner certification examination is a “criterion-referenced” test (as opposed to “norm-referenced”). “Passing” each section of the test is achieved by meeting or exceeding a criterion set for that section. In other words, a number of correct responses within each section indicates a “competent” level of knowledge and skill (each section may have a different criterion, or cutting score). Using data from the pilot testing and their own ratings of each item on the test, the development team established and set criterion levels for each section of the test.
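The criterion-referenced approach amounts to fixed per-section cut scores rather than ranking candidates against one another. A minimal sketch, with invented section names and cut scores:

```python
# Criterion-referenced scoring: passing requires meeting or exceeding
# each section's cutting score. Section names and cut scores below
# are invented for illustration; the actual criteria were set by the
# development team from pilot data and item ratings.

def passes(section_scores, cut_scores):
    """Pass overall only if every section meets its criterion."""
    return all(section_scores[s] >= cut for s, cut in cut_scores.items())

cuts = {"handwriting": 18, "equipment": 12}
result_pass = passes({"handwriting": 20, "equipment": 12}, cuts)
result_fail = passes({"handwriting": 25, "equipment": 11}, cuts)
```

Note that a high score in one section cannot compensate for falling below the criterion in another, which is the defining difference from a single overall cut score.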

Conclusion
This process provides a basis for focusing continuing educational efforts on the actual needs of practitioners.
The function of the evaluation instruments is two-fold: (1) to identify individuals who appear to possess the skills, knowledge, and abilities necessary to be a "competent" document examiner, and (2) to identify areas of training needed for individuals who fall short of established standards. In addition, the validated test specifications will allow training efforts to focus on essential areas of knowledge. Training will follow a standardized curriculum for agencies (public and private), educational institutions, training organizations, and professional associations that wish to prepare individuals as professional forensic document examiners. The testing will ensure adherence to a standard set of skills and knowledge.

Performance Examination
The test development committee designed performance examinations to measure the applicant’s skills and practical ability in actual case assignments. Developing the performance examination consisted of four phases: (1) daily work analysis, (2) case construction, (3) case validation and pilot testing, and (4) setting an administration standard.

Phase 1 — Daily Work Analysis
The committee determined the kinds of cases handled routinely by forensic document examiners. Such cases involve handwriting identification, hand printing identification, signature comparison, altered documents (manual and electronic), paper comparison, ink comparison, and fraudulently constructed documents.

Phase 2 — Case Construction
The cases were constructed by the development team so that the correct answer was predetermined. Each case consisted of an assignment in one of the areas determined in Phase 1. The case packet distributed for examination includes a letter of assignment with the typical information received from counsel or an investigating officer; original documents (where practical); and high-quality scanned or photocopied documents or photographs.

Phase 3 — Case Validation and Pilot Test
Each test case in the “bank” was administered to four qualified document examiners. To validate the case as typical of routine assignments handled by forensic document examiners, and to ensure that it provided sufficient information for a competent examiner to reach a conclusion, all four examiners had to agree on its suitability and reach the same opinion/conclusion before the case was accepted for the performance examination.
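The acceptance rule for pilot cases reduces to a unanimity check over the four pilot examiners. A minimal sketch, with the data shapes assumed:

```python
# A case enters the performance-exam bank only if all pilot examiners
# judge it suitable AND reach the same conclusion. The tuple layout
# and conclusion labels are assumptions for illustration.

def accept_case(pilot_results):
    """pilot_results: list of (suitable, conclusion) pairs, one per
    pilot examiner."""
    all_suitable = all(ok for ok, _ in pilot_results)
    conclusions = {c for _, c in pilot_results}
    return all_suitable and len(conclusions) == 1

unanimous = accept_case([(True, "identification")] * 4)
split = accept_case([(True, "identification")] * 3
                    + [(True, "inconclusive")])
```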

Phase 4 — Setting an Administration Standard
The committee had to choose whether to administer the performance test on-site in a proctored setting or to allow the case to be examined off-site in the test taker’s office/laboratory. The advantage of allowing the test taker to perform the examination in his/her own laboratory is that there is no time constraint and his/her own laboratory equipment is available. That method, however, was judged to lack security in two areas. First, it was not possible to ensure that the test case was not copied and/or distributed to other individuals; second, it was not possible to ensure that the applicant did not consult with another individual before rendering his/her final opinion. Although the committee believes that most examiners would be trustworthy and would follow the guidelines established for an unproctored off-site performance test, a breach of security would not be detected using this method. If a breach occurred, it would give future test takers an advantage, and the certification testing would no longer be a reliable means of determining the competency of its participants. The off-site method was rejected for these reasons, and because ensuring security would require that each performance test case be used only once; generating four cases (each of which would have to be pilot tested) for the exclusive use of one test taker is not realistic.

The committee decided to administer the performance examination on-site in a proctored setting. The disadvantage of an on-site examination is the lack of laboratory equipment. Therefore, if laboratory equipment is necessary for a thorough examination, the test taker would be required to state this, list the equipment he/she would opt to use, and define the parameters of the examination using this equipment, the procedure to be followed, and the potential results (both positive and negative). The majority of the examinations would be designed so that basic equipment, such as grids, a light box, magnifiers, and a microscope, would be sufficient to reach an opinion.

The test taker would receive four cases for examination. These would be assigned from the “bank” of cases and would include two cases requiring a handwriting comparison and two cases from other areas.

The acceptable range of conclusions/opinions for the performance examination is:
(a) high probability bordering on identification
(b) probable to the positive
(c) inconclusive
(d) probable to the negative
(e) high probability bordering on exclusion

The test taker must arrive at the predetermined opinion to pass each examination. All four examinations must be passed to pass the performance test.

Test Administration Agency
The Association of Forensic Document Examiners decided that it would be in the best interest of the profession and those using the services of forensic document examiners for the testing program to be administered by an agency not associated with the organization. Therefore, the Board of Forensic Document Examiners was incorporated as an agency to administer the testing. The Board is entrusted with the independent and impartial administration of the forensic document examiner certification test. It has sole authority to accept an applicant for testing, administer the test, and award certification. The Board will lease the test developed by the Association of Forensic Document Examiners for a period of years, after which it will become the owner of the testing program.

In a further effort to distinguish the testing program and ensure that it remains free from even the appearance of preferential bias in awarding certification, the Board has contracted with Occupational Research and
Assessment, Inc. (ORA) to administer, proctor and grade the cognitive examinations, and to maintain the security of the “bank” of test questions. ORA is a professional test development and administration agency and will provide an additional security feature by following accepted professional procedures in generating a new test from the bank of questions each time the test is offered.