Access Restriction

Author Smith, Jeffrey ♦ Whalley, Alexander ♦ Wilcox, Nathaniel T.
Source CiteSeerX
Content type Text
File Format PDF
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Program Impact ♦ Program Participant Good Evaluator ♦ Self-reported Program Effectiveness ♦ Random Assignment ♦ Little Evidence ♦ Experimental Evaluation ♦ Labor Market Outcome ♦ Inexpensive Potential Proxy ♦ Before-after Difference ♦ Untreated Outcome ♦ Information Content ♦ Individual Program Impact ♦ Counterfactual Outcome ♦ Experimental Impact ♦ National Job Training Partnership Act ♦ Individual Self-reports ♦ Improved Participant Evaluation Measure ♦ Program Participant ♦ Rank Preservation ♦ Jtpa Program ♦ Subgroup Variation ♦ Program Effectiveness ♦ Survey-based Participant Evaluation ♦ Individual Impact
Abstract Participants, like econometricians, may have difficulty constructing the counterfactual outcome required to estimate the impact of a program. In this paper, we directly assess the extent to which program participants are able to estimate their individual program impacts ex post. More generally, we examine the information content of survey-based participant evaluations. Using data from the National Job Training Partnership Act (JTPA) Study (NJS), an experimental evaluation of the JTPA program, we compare estimated program impacts to individual self-reports of program effectiveness after the completion of the program. We estimate individual impacts in two ways: (1) using subgroup variation in experimental impacts; and (2) under the assumption of rank preservation between the treated and untreated outcomes following random assignment. We find little evidence of a relationship between these estimated program impacts and self-reported program effectiveness. We do find evidence that cognitively inexpensive potential proxies for program impacts, such as before-after differences in earnings, the type of service (and thus the value of inputs) received, and labor market outcomes, are correlated with self-reported program effectiveness. Based on our findings, we suggest an improved participant evaluation measure.
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study
Publisher Date 2006-01-01