Friday, September 11, 2009

Assignment 1: ECUR 809
September 11, 2009
For: Jay Wilson
By: Joanne LaBrash

Lessons Learned from a State-Funded Workplace Literacy Program

Project Summary/Background

An evaluation was completed on a set of ten workplace literacy pilot programs publicly funded by a state government (the Indiana Department of Workforce Development). One identified purpose for the projects stemmed from “the perception that Indiana’s competitiveness and growth may be constrained by a mature, incumbent workforce that has deficient basic skills” (p. 2). The pilot programs were developed to improve those perceived deficient skills.

This evaluation is both formative and summative: formative because it examines a pilot project whose future improvement, adaptation, or termination depends on the evaluation results, and summative because it quantified test scores to measure learner gains (1,800 participant surveys and test scores were collected and analyzed).

The evaluation showed elements of Provus’s Discrepancy Model in that it compared intentions with actual results, to varying degrees, across the model’s five areas: design and content, installation, process, product, and cost-benefit analysis. Both qualitative and quantitative data were used, and I think the combination of methods adds valuable layers of perspective.

Qualitative Focus

Two site visits occurred at ‘most’ of the ten funded projects; I would have preferred that all ten sites be visited, which would have allowed a more thorough analysis and evaluation. The first visit focused on planning and implementation (design, installation, and process), and the second, near the end of each project, focused on outcomes and lessons learned (product and cost-benefit analysis). The visits examined administrative challenges, instructional delivery, contextualization, the motivations of both employee participants and businesses, and a general cost-benefit analysis for workers, companies, literacy providers, and government.

The struggle with implementation was highlighted. None of the sites was familiar with the pre- and post-test requirements, which strained relationships with workers and employers when it was determined that a third assessment was needed. This confusion led to inconsistency across sites: only three sites administered the pre-assessment, and only one-third completed post-tests. Overall, the results showed higher scores on the pre-tests than on the post-tests! For these reasons, I tended to discount the validity and usefulness of the data.

Another implementation failure was that the computer course required internet access, which was not always available. The resource materials also revealed problems, especially in the computer course, including “pushback from students (and employers) who felt it was too technical” (p. 4).

Quantitative Focus

The data analyzed quantitatively were a participant survey, adult student assessment test scores and the resulting learning gains, and earnings histories. I noticed a failure to draw any strong connection between the earnings histories and the intended impact; the state hoped to reduce the use of public assistance benefits. Possible confounding factors and a longer time frame were neglected. The information gathered therefore seemed irrelevant and a bit misguided, although it probably looked good in a graph. Statistics tend to hold more appeal and appear objective, but to what purpose?

Overall, I found the evaluation very effective, with a strong examination of design, content, installation, and process. Its weakest area was the evaluation of the product: in my view, the attempts to measure outcomes and changes were riddled with testing and resource-material problems that undermined their intended validity. All of the qualitative data and most of the quantitative data were directly connected to the findings and were purposefully and thoughtfully examined.

Reference:

Hollenbeck, K., & Timmeney, B. (2009). Lessons learned from a state-funded workplace literacy program (Upjohn Institute Working Paper No. 09-146). W.E. Upjohn Institute for Employment Research. http://ssrn.com/abstract=1359182

1 comment:

  1. Well done, Joanne.

    A thorough analysis of the program. The breakdown of qualitative and quantitative methods helps me form a clear picture of how the study was implemented. You have identified an appropriate model that was applied to this study. There does seem to be a raft of issues around the implementation of the study and a lack of agreement among stakeholders. The quantitative data does not appear to have been useful in generating any valid or reliable outcomes.
