Date of Award

Spring 2015

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Psychology and Behavioral Sciences

First Advisor

Donna Thomas

Abstract

Organizations rely on job analysis to provide information about the work performed and the requirements needed for a position. The use of inaccurate information may have negative outcomes, such as the misallocation of human resources or inefficient training programs. Many job analysis techniques rely on averaging responses, which may oversimplify the results. Preserving idiosyncratic variance, which reflects differences in the ways respondents experience and evaluate the job, may increase job analysis accuracy. To assess overall accuracy, the job analysis data in the present study were examined using a practical model of accuracy (Prien, Prien, & Wooten, 2003). To detect idiosyncratic variance, subject matter experts (SMEs) responded to the job analysis, and SME respondents were categorized according to job performance, job experience, work unit, and work role. To compare ratings within and between each group, reliability estimates were converted to standard values using Fisher's r-to-Z transformation and then averaged; an illustrative sketch of this procedure follows the abstract. Differences in the rating consistency of the groups were compared using one-way between-groups ANOVAs conducted for each position under analysis. Overall, subgroup rating consistency was not found to be higher than whole-group rating consistency, thus failing to support three of the four hypotheses. Global SMEs and incumbents were found to offer similar levels of rating consistency, indicating that small groups of experts may be capable of providing job analysis results similar to those of large groups of incumbents. Limitations and suggestions for future research are discussed, as are implications for applying these techniques to other samples and to other human resources applications.
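
The following is a minimal sketch of the consistency comparison described in the abstract, assuming hypothetical reliability values and group labels (high_performers, low_performers, whole_group); it is not the dissertation's actual data or analysis code. Each correlation-based reliability estimate is converted with Fisher's r-to-Z transformation (z = arctanh(r)) before averaging or comparison, and a one-way between-groups ANOVA tests whether rating consistency differs across groups.

    # Illustrative sketch only; variable names and values are hypothetical.
    import numpy as np
    from scipy import stats

    def mean_reliability(r_values):
        """Average reliability estimates via Fisher's r-to-Z transformation.

        Each r is transformed with z = arctanh(r), the z values are averaged,
        and the mean is back-transformed with tanh.
        """
        z = np.arctanh(np.asarray(r_values, dtype=float))
        return np.tanh(z.mean())

    # Hypothetical per-SME rating-consistency estimates for one position.
    high_performers = [0.72, 0.68, 0.75]
    low_performers = [0.61, 0.55, 0.64]
    whole_group = [0.66, 0.70, 0.58, 0.63]

    # One-way between-groups ANOVA comparing rating consistency across groups,
    # performed on the Fisher-transformed values.
    groups_z = [np.arctanh(np.asarray(g, dtype=float))
                for g in (high_performers, low_performers, whole_group)]
    f_stat, p_value = stats.f_oneway(*groups_z)

    print(f"Mean reliability (whole group): {mean_reliability(whole_group):.3f}")
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")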
