    [Federal Register Volume 60, Number 61 (Thursday, March 30, 1995)]
    [Notices]
    [Pages 16507-16515]
    From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
    [FR Doc No: 95-7743]
    
    
    
    -----------------------------------------------------------------------
    
    
    
    DEPARTMENT OF LABOR
    
    Evaluation of the Summer Youth Employment and Training Program
    
    AGENCY: Office of the Secretary, Labor.
    
    ACTION: Expedited review under the Paperwork Reduction Act.
    
    -----------------------------------------------------------------------
    
    SUMMARY: The Employment and Training Administration, Department of 
    Labor, in carrying out its responsibilities under the Paperwork 
    Reduction Act (44 U.S.C. Chapter 35, 5 CFR 1320 (53 FR 16618, May 10, 
is submitting a study to examine the range of practices 
currently being used in the Summer Youth Employment and Training 
Program (SYETP) to deliver educational services. The study will assess 
the quality of training and evaluate its contributions to remedying 
the educational deficiencies of participants.
    
    DATES: The Employment and Training Administration has requested an 
    expedited review of this submission under the Paperwork Reduction Act; 
    this Office of Management and Budget (OMB) review has been requested to 
    be completed by April 14, 1995.
    
    FOR FURTHER INFORMATION CONTACT:
    Comments and questions regarding the Evaluation of the SYETP should be 
    directed to Mr. Kenneth A. Mills, Departmental Clearance Officer, 
    Office of Information Resource Management Policy, U.S. Department of 
    Labor, 200 Constitution Avenue NW., Room N-1301, Washington, DC 20210, 
    (202) 219-5095.
        Comments should also be sent to OMB, Office of Information and 
    Regulatory Affairs, Attn: OMB Desk Officer for ETA, NEOB Room 10235, 
    Washington, DC 20503, (202) 395-7316.
        Any member of the public who wants to comment on the information 
    collection request which has been submitted to OMB should advise Mr. 
    Mills of this intent at the earliest possible date.
    
    Average Burden Hours/Minutes Per Response: 113 minutes
    Frequency of Response: One time
    Number of Respondents: 9,115
Total Annual Burden Hours: 17,120 hours
    Total Annual Responses: 9,115
Affected Public: Individuals or households; Not-for-profit 
institutions; State, Local or Tribal Government
Respondents' Obligation to Reply: Voluntary
    
        Signed at Washington, DC, this 24th day of March 1995.
    Kenneth A. Mills,
    Departmental Clearance Officer.
    
    I. Introduction
    
        This document represents a request for approval of the data 
    collection protocols to be used in the Evaluation of the Summer Youth 
    Employment and Training Program, being conducted by Social Policy 
    Research Associates (SPR) and Brandeis University's Center for Human 
    Resources, under contract to the U.S. Department of Labor (DOL). The 
    study uses qualitative (case study) and quantitative data collection 
    and analysis methods to examine training practices being used in the 
    Summer Youth Employment and Training Program (SYETP), Title II-B of 
    JTPA. The Introduction to this document provides a brief overview of 
    the study and its purposes, and it discusses the data collection 
    procedures and analysis plans. Subsequent sections respond to the 
    Office of Management and Budget's (OMB) specific instructions for 
    justification and address issues related to the collection of 
    information using statistical methods.
    
    Background
    
        Funded under Title II-B of the Job Training Partnership Act, SYETP 
    has its origins in a thirty-year federal commitment to create summer 
jobs for disadvantaged youth. Developments in recent years, however, 
have also affirmed an emphasis on providing educational services. 
For example, amendments to Title II-B enacted in 1986 enumerated the 
enhancement of basic educational skills and encouragement of school 
completion as explicit goals of the program. Further, service delivery 
areas (SDAs) were required to assess the reading and mathematics skill 
levels of SYETP participants and to provide remedial and basic 
education services where appropriate. Subsequent DOL issuances 
reinforced the educational emphasis of the Summer Youth program and 
encouraged efforts to link work and learning.
    
    Purposes of the Study
    
        The changing focus of SYETP raises questions about the proper role 
    of the program's educational component and the feasibility of 
    diagnosing and meaningfully redressing the basic skills deficiencies of 
    large numbers of youth within the compressed time frame of the summer 
    program. Thus, the objectives of the evaluation are to examine the 
    range of practices currently being used to deliver educational 
    services, explain variation in service designs, assess the quality of 
training being provided, and evaluate its ability to meet the needs of 
participants and to make significant progress in remedying their 
educational deficiencies. Ultimately, the study will enable DOL to gauge the 
    adequacy of services currently being provided, identify areas of 
    weakness and, conversely, service designs that appear especially 
    efficacious, and provide leadership and technical assistance to improve 
    training practices.
    
    Conceptual Framework
    
        Guiding the data collection and analysis efforts are a client-level 
    model of high quality educational services and a system-level model of 
    factors that determine training practices. The client-level model of 
    training quality, presented in Exhibit I, depicts how clients flow 
    through the SYETP program, the quality indicators for each type of 
    service that the program provides, and the intended consequences of 
    high-quality services for youth. Steps identified in this model are:
      • Recruitment, assessment, and service planning practices. 
    Quality indicators associated with this phase of service delivery 
    include whether programs have a clear strategy for which youth should 
    be targeted and effective procedures to recruit them, whether they 
    conduct a comprehensive assessment of youths' skills and interests, and 
    whether the assessment results are used to develop an individualized 
    service strategy tailored to the skills and interests of each 
    participant.
      • Providing effective educational services, either through 
    classroom or work-based instruction. Quality indicators for both 
    training content and instructional methods are identified in the 
    exhibit, including whether the training objectives are well-specified, 
    whether they promote the educational skills needed in the workplace, 
    whether training is provided in a functional context, whether 
    participants' progress is documented, whether there are ample 
    opportunities to learn, and whether the style of instruction promotes 
    active learning that is adaptive to the needs of individual 
    participants.
      • Providing linkages with continuing educational activities, 
    to sustain and build on learning gains.
        In contrast to the client-level model, the system-level model, 
shown in Exhibit II, is intended as a causal (rather than a temporal) 
    model and identifies factors that influence service delivery, including 
    those that facilitate or impede the development of high-quality 
    educational services. The far right box of this model contains the 
    elements of high-quality SYETP educational services that were described 
    in the client-level model. The exhibit schematically identifies aspects 
    of Federal and State policies and the local environment that can affect 
    an SDA's program design, and it shows how design decisions and 
    educational provider characteristics, in turn, affect the quality of 
    educational services provided. Specifically, it identifies:
      • Federal, State, and local influences on programs' designs, 
    including federal Title II-B policies, other Federal initiatives and 
    policies, State JTPA and educational policies, and characteristics of 
    local youth and of the local area.
      • SDA design factors, including program goals, target 
    groups, and service delivery arrangements.
      • Attributes of the service providers who deliver 
    educational services to participants, including the types of 
    institutions, their history, objectives, and funding sources.
    
    Questions for the Evaluation
    
        The preceding conceptual frameworks give rise to a number of 
    specific questions to be investigated in the project. These include 
    issues relating to the design of services at the SDA level, the design 
    of services at the level of the educational provider, and the quality 
    and impact of educational services.
      • The design of SYETP at the SDA level.
    
    --What general objectives have SDAs established for their Title II-B 
    programs? What specific objectives (in terms of skills to be conveyed, 
    benchmarks to be achieved) have been established for the Title II-B 
    educational components?
    --Do programs identify priority client groups? If so, what target 
    groups have they established? Who makes those decisions, and how and 
    why were they made?
    --What types of providers are used by the SDA for educational 
    instruction? How were these providers selected and why were they 
    selected?
    --How are other services, including supportive services and stipends, 
    used in the summer youth program? How are these services used to 
    support educational and other goals for the program?
    --What linkages has the SDA established between its Titles II-B and II-
    C programs?
    --What ``front-end'' and ``back-end'' linkages has the SDA established 
    with public schools? Who instigated these linkages and who maintains 
    them? Are the linkages formal or informal?
    --What role have federal and state policies and local influences played 
    in the SDA's design decisions? How have these policies been perceived 
    and implemented?
    
      • The design of SYETP at the provider level.
    
    --What types of organizations provide educational instruction? What 
    objectives have been established for their programs?
    --Why did the provider decide to participate in the summer youth 
    program and how was it selected?
    --What objectives has the provider established for its educational 
    program? What skills (e.g., basic skills, SCANS skills) is it 
    endeavoring to teach? Is it attempting to link learning and work?
    --What service design is it using to meet these objectives? Who 
    developed the design and why? Was the design established explicitly for 
    the summer youth educational program?
    --How are educational services sequenced? How was the curriculum 
    developed?
    --How did it recruit and train its staff?
    --How have the SDA's objectives for the summer youth educational 
    program been communicated to the service provider and how have they 
    been acted upon?
    --What role has the SDA played in designing the provider's educational 
    services, including its content and method of delivery? How does the 
    SDA monitor the services that are being provided and how does it 
    suggest changes?
    --How does the provider's design reflect other elements of the local 
    context, including the needs of the community, the characteristics of 
    youth in the area, and the characteristics of the school district?
    
      • The quality and impact of educational services.
    
    --What procedures are used by SDAs to recruit youth for the summer 
    program? Do recruitment methods correspond to their targeting goals?
--How is the participant's initial assessment conducted and how is 
subsequent progress assessed and documented?
    --How are individual service plans developed? Are the service plans 
    truly individualized to the needs, skills, and interests of each 
    participant? Does the youth play an active role in formulating the 
    service plan? Are the resulting goals clear and ambitious (without 
    being unrealistic)?
    
    --Does the educational instruction that is being provided have well-
    specified objectives? Do the objectives indicate skills to be acquired 
    (rather than knowledge to be learned)?
    --Does the instruction emphasize skills needed in the workplace? Are 
    the skills taught in a functional context?
    --Does the instruction promote active learning and training for 
    transfer?
    --Is instruction adaptive and provided by capable and caring adults who 
    view their role as a ``facilitator''?
    --Have linkages been established to provide feedback to schools or 
    other programs serving the youth?
    --What implications do service provider characteristics and design 
    decisions have for the quality of educational services?
    --What implications does participation in SYETP, in general, have for 
    stabilizing or improving academic or other achievements, promoting 
    school completion, and increasing the motivation to learn?
    --What implications do alternative designs for delivering educational 
services (e.g., provider characteristics, the locus of instruction) 
    have for these same youth outcomes?
    
    Study Design
    
        To address the research questions identified above, this study uses 
    two evaluation components--a process study and a client-level study of 
    outcomes.
    
    The Process Study
    
        The process study uses a series of ``nested'' qualitative case 
    studies to examine the design and operation of SYETP services at 30 
    SDAs nationwide and approximately 3 educational providers at each of 
    these SDAs (up to 90 providers total). The data collection activities 
    will consist of a review of plans for the Summer Program as well as 4-
    day on-site visits to each selected SDA and its associated providers, 
    during which time researchers will meet with SDA and provider 
    administrators and staff, classroom instructors, and worksite 
    supervisors, and they will observe educational instruction. While on 
    site, researchers also will conduct 1 focus group with approximately 5-
    6 youth participants at each educational provider visited and review 
    case files for 2 youths at each provider. Follow-up telephone 
    discussions also will be conducted with youths selected for the case 
    file reviews and with their parents and school counselors to learn of 
    retrospective impressions of and satisfaction with the SYETP experience 
    and perceived impacts on subsequent achievements and behaviors in 
    school.
    Selecting the Sample
        As part of the study, samples are being drawn of 30 SDAs and up to 
3 educational activities in each SDA (up to 90 total). To ensure that the 
    resulting sample will be nationally representative, the 30 SDAs are 
    being selected using stratified random sampling. In selecting the 
    sample, all SDAs nationwide are being assigned to one of 4 strata. The 
    first 3 of these groups are defined according to the percent of their 
    Summer Youth participants who receive educational instruction, with the 
    first stratum consisting of those SDAs with percents between 1% and 
41%, the second between 42% and 73%, and the third between 74% and 
    100%. These cutoffs were chosen so that approximately equal numbers of 
    youths receiving educational instruction are in each of the three 
    strata. The 4th stratum consists of those SDAs for whom information on 
    the number of participants in educational instruction is not available.
    Within each stratum, SDAs were sampled with the odds of selection 
    proportionate to the number of participants being served,\1\ so that 
    the resulting sample of SDAs is approximately self-weighting.
    
        \1\Dollar allocations for the Summer Youth program were used in 
    the fourth stratum, because the number of participants receiving 
    educational instruction is not available for these SDAs.
    ---------------------------------------------------------------------------
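    To make the mechanics of this design concrete, the following 
sketch implements probability-proportionate-to-size (PPS) systematic 
selection within strata in Python. The SDA records and the 8/8/8/7 
allocation are illustrative (the allocation appears in Section B 
below); the size measure is the count of participants in educational 
instruction or, for the fourth stratum, the dollar allocation, as 
footnote 1 notes.

```python
import random

# Illustrative SDA records: (sda_id, stratum, size measure). The size
# measure is the number of participants in educational instruction,
# or the dollar allocation for the fourth stratum.
sdas = [
    ("SDA-001", "low", 420), ("SDA-002", "low", 95),
    ("SDA-003", "med", 310), ("SDA-004", "high", 180),
    # ... one record per SDA nationwide
]

def pps_systematic(units, n):
    """Select n units with probability proportionate to size, using
    systematic selection from a random start along the cumulative
    size measure. A unit larger than the interval can be drawn more
    than once; such 'certainty' SDAs would enter the sample once in
    practice."""
    total = sum(size for _, _, size in units)
    interval = total / n
    point = random.uniform(0, interval)
    chosen, cumulative = [], 0.0
    for unit in units:
        cumulative += unit[2]
        while len(chosen) < n and cumulative > point:
            chosen.append(unit)
            point += interval
    return chosen

# Roughly equal allocation across the four strata (8/8/8/7 in Section B).
for stratum, n in [("low", 8), ("med", 8), ("high", 8), ("missing", 7)]:
    frame = [u for u in sdas if u[1] == stratum]
    draws = min(n, len(frame))  # toy frame is smaller than the real one
    if draws:
        print(stratum, [sid for sid, _, _ in pps_systematic(frame, draws)])
```

    Because each SDA's odds of selection are proportionate to the 
participants it serves, every sampled SDA represents roughly the same 
number of youths, which is what makes the resulting sample 
approximately self-weighting.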
    
        Because this study is intended to describe and compare the 
    effectiveness of a wide variety of approaches to building the 
    educational skills of SYETP participants, educational providers are 
    selected within each of the 30 SDAs using purposive selection methods. 
    Specifically, all educational providers used by these SDAs are to be 
    categorized according to their:
      • Content emphasis (e.g., basic skills only; SCANS 
    foundation skills and/or competencies; or other academic subjects, such 
    as science, history, or art).
      • Locus of educational instruction (e.g., classroom-based, 
    work-based, or both).
      • Type of provider (e.g., SDA; secondary school, other 
    educational institution such as community college or technical college, 
    or other).
      • Targeted participants (e.g., 14-15 year olds, 16-18 year 
    olds, other target groups).
        Providers are being selected to ensure the diversity of the sample 
    (both within the SDA and across all 30 SDAs) with respect to these 
    dimensions.
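    Purposive selection is a judgmental rather than statistical 
procedure, but its bookkeeping can be sketched in code. The fragment 
below, with hypothetical provider profiles, greedily favors providers 
that add dimension values not yet represented in the selection; it 
illustrates the diversity objective only and is not the study's 
actual procedure.

```python
# Hypothetical provider profiles along the four dimensions above.
providers = {
    "Provider A": {"content": "basic skills", "locus": "classroom",
                   "type": "secondary school", "ages": "14-15"},
    "Provider B": {"content": "SCANS", "locus": "work-based",
                   "type": "SDA", "ages": "16-18"},
    "Provider C": {"content": "other academic", "locus": "both",
                   "type": "community college", "ages": "16-18"},
    "Provider D": {"content": "basic skills", "locus": "classroom",
                   "type": "SDA", "ages": "14-15"},
}

def pick_diverse(profiles, k=3):
    """Greedily choose up to k providers, preferring those whose
    dimension values are not yet represented in the selection."""
    chosen, seen = [], set()
    for _ in range(min(k, len(profiles))):
        def novelty(name):
            return sum((dim, val) not in seen
                       for dim, val in profiles[name].items())
        remaining = [p for p in profiles if p not in chosen]
        best = max(remaining, key=novelty)
        chosen.append(best)
        seen.update(profiles[best].items())
    return chosen

print(pick_diverse(providers))  # ['Provider A', 'Provider B', 'Provider C']
```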
    Data Collection
        The field protocols, or topic guides, developed for this process 
    study are designed to guide the data collection activities. These 
    protocols will permit site visitors to tailor discussions and 
    observations on a standardized set of issues to the particular context 
    of each case study SDA and sampled educational activity. The following 
    topic guides have been developed and are submitted for OMB's review:
      • SDA Guide #1 includes the topics to be covered in 
    discussions with SDA policy, planning, and administrative staff, 
    including those relating to the goals, design, and management of the 
    Summer Program.
      • SDA Guide #2 includes the discussion topics to be used 
    with SDA staff responsible for direct operation or oversight of client 
    recruitment, assessment, service planning, and case management services 
    for Title II-B participants.
      • Program Guide #1 includes the topics to be covered in 
    discussions with administrators of the selected educational activity, 
    staff that participated in the planning and development of the detailed 
    curriculum, and supervisors responsible for hiring, training, and 
    overseeing instructors/work site supervisors involved in educational 
    activities.
      • Program Guide #2 includes the topics to be used in 
    discussions about the classroom-based learning approach with classroom 
    instructors or other staff whose primary responsibility is to support 
    learning in a classroom or individual study setting (e.g., tutors, 
    educational resource staff).
      • Program Guide #3 includes the topics to be used in 
    discussions with work project coordinators and worksite supervisors who 
    are involved in work-based learning. This guide includes topics for 
    projects using the 100% work-based learning approach as well as topics 
    for staff involved in work activities that are closely coordinated with 
    classroom-based learning.
      • Program Guide #4 is a guide for structured observations of 
    educational activities.
      • Program Guide #5 is a guide for structured review of 
    curriculum materials.
      • Client Guide #1 describes the topics to be addressed in 
focus group discussions with approximately 5 youth 
participating in each selected educational activity.
      • Client Guide #2 will be used to extract relevant 
    information for the case history sample from the participants' written 
    case files at the SDA or provider.
      • Client Follow-Up Guide describes the topics to be 
    discussed with selected youth participants (i.e., those selected for 
    the case file review) several months after their Summer Program's 
    participation has ended.
      • Parent/Guardian Guide describes the issues to be 
addressed with these youths' parent or guardian during the follow-up 
period.
      • Counselor Guide describes the topics to be addressed with 
    the youths' secondary school counselors during the school year 
    following the youths' summer participation.
    Data Analysis
        The analysis of the case studies will begin with a within-site 
explanatory analysis. This task will consist of bringing to bear the 
data that have been collected to arrive at a comprehensive picture of 
the practices in each of the SDAs and service providers that were 
selected for the study and of how those practices have contributed to 
meeting the needs of the participants. A particular objective will be to uncover especially 
    innovative practices, with an eye to understanding how they were 
    implemented and what makes them work so well.
        The next step will consist of cross-site comparisons to synthesize 
    the findings. This analysis will clarify further the unique procedures 
    that programs adopt to deliver high quality training in a variety of 
    environments and arrive at an understanding of commonalities and 
    differences between programs and how these are related to effective 
    practices.
    
    Client-Level Study of Outcomes
    
        In addition to collecting and analyzing information from the case 
    studies about program practices, this study also will gather and 
    analyze quantitative information for a sample of approximately 1,800 
    youths who participate in the Summer Program in 1994 and an additional 
4,000 youths who participate in 1995. By compiling and analyzing 
    information for a sample of participants on the services that were 
    received and the outcomes that were obtained, the study will draw 
    inferences regarding the relative efficacy of various service design 
    and delivery methods.
    Selecting Participants for the Study
        A key element of the overall research design is to tie the results 
    from the case study observations of classroom instruction and 
    appraisals of training quality to the analysis of participants' 
    outcomes. In this way, inferences can be drawn regarding the 
relationship between the training practices observed in the field and 
    the consequences of those practices for the youth who receive them. For 
    this reason, participants selected for the client-level study of 
    outcomes will be those whose SYETP instruction was delivered by the 
    service providers whose practices were observed on site. Specifically, 
    participants will be selected in the following ways:
      • Preliminary information received from the 30 SDAs has led 
    us to determine that approximately 15 of them require pre-tests and 
    post-tests of basic skills for all participants receiving academic 
    instruction. All youth served in the summer of 1994 by the 3 selected 
    providers in these SDAs will be included in the study. This will yield 
    an expected sample of approximately 1,800 respondents.
      • The service providers visited for the case studies in the 
    30 SDAs included in the study will each be asked to administer a common 
    pre-test/post-test in the summer of 1995, as well as a brief instrument 
    measuring self-esteem. All youth served by these providers will be 
    included in the study. This will yield approximately an additional 
    3,600 respondents.
      • A randomly chosen sample of 400 youths not receiving 
    educational instruction also will be included in the study as a 
    comparison group.
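    Summing the three components gives the overall analysis sample. 
The sketch below is a back-of-envelope check only; the figure of 
roughly 40 youths per provider is an assumption made for 
illustration, not a figure from the study.

```python
# Back-of-envelope check of the sample composition described above.
sample_1994 = 15 * 3 * 40   # 15 SDAs x 3 providers x ~40 youths each
sample_1995 = 30 * 3 * 40   # all 90 providers tested in 1995
comparison = 400            # randomly selected comparison youths

print(sample_1994, sample_1995)                # 1800 3600
print(sample_1994 + sample_1995 + comparison)  # 5800 youths in all
```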
    Data Collection
    The plan for the client-level study of outcomes is based on the 
analysis of information for the sampled participants drawn from a 
variety of sources, much of which already exists. Thus, compiling 
these data for analysis entails gathering existing data at least as 
much as new data collection. Specific data sources to be used are 
these:
      • The SDA's MIS. Although the specific types of information 
    doubtless will vary from one SDA to the next, most SDAs' MIS will 
    include: participant's demographic and background characteristics 
    (e.g., race, school status, gender), barriers to employment (e.g., 
    whether the youth is a limited-English speaker or has a disability), 
    and summary information about services received. SDAs will be requested 
    to transmit these data to the contractor electronically (e.g., on data 
    diskette).
      • SDA's Client Files. Those SDAs able to provide pre-test/
    post-test scores for youth served in the summer of 1994 will forward 
    those scores to the contractor for data entry. All participating SDAs 
    will forward the hard-copy pre-test/post-tests and self-esteem surveys 
    of sampled participants served in 1995 to the contractor for scoring 
    and keypunching.
      • School Records. An important objective of the study is to 
    learn how participants (at least those who are students) fare in their 
    subsequent schooling. Outcomes of interest include measures of academic 
    achievement (e.g., grade-point-average), but also evidence of 
    behavioral problems (e.g., as evidenced by absenteeism, suspensions/
    expulsions). Thus, school record information will be abstracted for 
    sampled summer 1995 youths who sign and have their parents/guardians 
    sign a consent form.
    Exhibit III summarizes the variables to be measured from these 
    sources.
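    As a concrete sketch of how these sources might be assembled into 
one analysis file, the fragment below (in Python, with hypothetical 
file and column names) merges MIS extracts, test scores, and school 
records on a participant identifier; school-record fields simply 
remain missing for youths without signed consent.

```python
import pandas as pd

# Hypothetical extracts, one row per sampled participant.
mis = pd.read_csv("sda_mis.csv")            # demographics, services received
tests = pd.read_csv("test_scores.csv")      # pre-/post-test, self-esteem
school = pd.read_csv("school_records.csv")  # GPA, absences (consented youths)

# Left joins keep every MIS record even when a source is unavailable.
analysis = (mis.merge(tests, on="participant_id", how="left")
               .merge(school, on="participant_id", how="left"))

print(analysis.isna().mean())  # share of missing values, by variable
```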
    Data Analysis
        A preliminary analysis will be conducted using SDA MIS data and 
    pre-test/post-test scores for the 1,800 youth who participated in the 
    summer program in 1994, selected as described above. A more 
    comprehensive analysis for a larger sample will be conducted when 
    school record data are collected, for youths who participated in the 
    summer of 1995. Additional outcomes to be examined with these data 
    include: self-esteem, school attendance, grade completion, grade-point 
    average, and absenteeism. Two types of analysis will be conducted:
      • Descriptive analyses, which will paint a picture of the 
    characteristics of persons receiving educational instruction in the 
    sampled programs, the types of services received, and the outcomes 
    obtained.
      • Explanatory analyses that will examine the efficacy of 
    alternative service designs and delivery mechanisms for subsequent 
    outcomes.
    
    Reporting
    
        The project's major deliverables include:
      • An Interim Report. This report will detail the results of 
    the process analysis, describing results from the case studies 
    regarding how services are designed and delivered. It also will include 
    the preliminary results from the study of outcomes based on the data 
collected for youth who participated in the summer of 1994.
      • A Technical Assistance Guide (TAG). The TAG will be a 
    practitioner's guide describing effective practices in the delivery of 
    educational services, focusing especially on how educational 
    instruction can be delivered in a functional, work-related context.
      • A Final Report. This report will represent a summation of 
    the study's findings and recommendations. As such, it will include the 
    content of the Interim Report, combined with the comprehensive results 
    of the study of outcomes.
    
    II. Supporting Statement
    
    A. Justification
    
    1. Circumstances Making the Data Collection Necessary
        The Department of Labor (DOL) is considering ways of improving the 
    educational component of the JTPA Title II-B Summer Youth Employment 
    and Training program (SYETP), in keeping with Secretary Reich's ``First 
    Jobs/New Jobs/Better Jobs'' initiative. Its objectives for SYETP are to 
improve the program's effectiveness in helping young people acquire 
    strong workplace foundation skills (including basic skills, thinking 
    skills, and interpersonal skills) and gain an appreciation of the 
    inextricable connection between learning and success in the workplace. 
    As part of its effort to foster program improvements, DOL needs to 
    obtain a thorough understanding of educational services currently being 
    provided to summer youth participants--including how participants are 
    assessed, the curriculum being used, and how the educational and work 
    components of SYETP are integrated--and identify particularly 
    efficacious practices.
        As part of its response to Executive Order No. 12862 requiring all 
    Federal agencies to develop customer service standards, DOL also needs 
    to know participants' views about the services they received in SYETP, 
    including their service needs and how well the program responded to 
    those needs. This information is critical to implementing changes that 
    can improve program responsiveness.
    2. Use of Information and Consequences if Not Collected
        The information being collected in this study will be used to 
    address these objectives:
        1. Describe variation in the design of SYETP educational services 
    across service delivery areas (SDAs) and their service providers, with 
    respect to general goals and objectives they have established for the 
    program, their targeting decisions, assessment procedures, specific 
    skills being taught, the locus of instruction, linkages between work 
and learning, and linkages with public schools and year-round Title 
    II-C JTPA services.
        2. Describe variation in the quality of educational services, 
    including whether assessments are comprehensive, whether service 
    strategies are individualized to the needs and interests of 
    participants, whether the participants are actively involved in 
    formulating the service plan, whether educational instruction has well-
    specified objectives relating to skills to be acquired, whether skills 
    are taught in a functional context and emphasize skills needed in the 
    workplace, and whether instruction is adaptive.
        3. Identify factors that explain variation in how educational 
    services are being designed and delivered, such as federal policies, 
    opportunities for technical assistance and training, state-level 
    partnerships between JTPA and the school system, and other local 
    influences.
        4. Document consequences of participation in SYETP educational 
    services, especially high quality services, for participants' skill 
    levels, subsequent academic achievement, and school attendance and 
    performance.
        5. Document participants' satisfaction with the program, including 
    their assessment of the helpfulness of the services they received.
        If this information is not collected, DOL will not have the 
    information it needs to evaluate how educational services are being 
    delivered or their effectiveness, and thus it will not have the 
    necessary foundation for implementing program improvements.
    3. Considerations to Reduce Burden
        The data collection activities have been designed to minimize the 
    burden on respondents in four major ways. First, pre-existing 
    information will be utilized wherever possible to minimize the need for 
    new data collection. These pre-existing sources will include SDAs' 
    plans for their Title II-B programs, RFPs and contracts written by SDAs 
    to secure the services of the direct providers of educational 
    instruction, data collected as part of last summer's study of SYETP, 
    test scores, information from school records, and existing MIS data 
    compiled by SDAs about their participants' characteristics, services, 
    and outcomes. These data sources can be forwarded to the contractor 
    with minimal burden to SDA or school or provider staff and to program 
    participants. Where data abstraction requires hand-coding (e.g., from 
    school records), abstractors will be compensated by the contractor.
        Second, where feasible (and at least with respect to the MIS data), 
    information will be transferred to the contractor electronically (i.e., 
    via modem or data diskette), greatly facilitating the data transmission 
    process.
        Third, only data of direct relevance to the goals of the study will 
    be collected.
        Fourth, much time on site will be devoted to the unobtrusive 
    observation of educational instruction and the review of written 
    documents and participants' case files, and this too should be 
    minimally burdensome to SDA and service provider staff or participants.
    4. Efforts to Identify Duplication
        A study conducted of SYETP during the summer of 1993 included 50 
    on-site visits and a mail survey of all SDAs (1205-0327, expired 12/
    93). However, this data collection focused on general operational 
    issues, did not entail on-site observations of classroom instruction to 
    characterize its quality and did not attempt to study youth at any 
    point beyond their period of participation.
    Additional information available about SYETP comes from the SDAs' 
    plans for their summer's activities. However, these documents provide 
    no information about how or how well the plans are implemented, nor do 
    they allow an assessment of the instruction's quality or effectiveness, 
    nor do they speak to the participants' satisfaction with the services 
    they received.
        Finally, states are required to submit annual reports providing 
    aggregate counts of participants served and their characteristics 
    (1205-0200, expires 7/97). However, these simple summary reports are 
    useful for little more than identifying the numbers of persons of 
    different ages and education levels who were served.
    5. Why Similar Information Cannot be Used
        Information from the sources described above will be used to the 
    fullest extent possible in the study being planned. Indeed, these data 
    provide a strong foundation to support the study by providing essential 
    background and other information. However, DOL has concluded, on the 
    basis of the effort to identify duplication, that these pre-existing 
sources are not adequate to characterize the quality of educational 
    services, support an analysis of the factors associated with high 
    quality services, describe the consequences of participation in SYETP 
    for subsequent achievements, or document participants' satisfaction 
    with the program. Nor are they adequate, consequently, to support DOL's 
    efforts to foster program improvements.
    6. Burden on Small Businesses
        Some activities associated with this study will involve the 
    collection of data from the administrators or staff of organizations 
    providing educational instruction as part of SYETP, and some of these 
    entities may be small businesses. However, as described under #3, 
    ``Considerations to Reduce Burden,'' only information of direct 
    relevance to the study's objectives will be collected while on site. 
Second, much of the on-site data collection to be conducted at service 
    providers will involve the unobtrusive observation of classroom 
    instruction and the review of client case files, and it will thus 
    entail minimal burden on the providers' administrators or staff. 
    Finally, as part of the agreement allowing them to deliver services 
    under JTPA, providers acknowledge DOL's right to evaluate and/or 
    monitor their activities and services.
    7. Consequences of Less Frequent Data Collection
        The data collection activities associated with this study will be 
    conducted one time only.
    8. Collection Inconsistent With 5 CFR 1320.6
        Data collection will be consistent with 5 CFR 1320.6.
    9. Efforts to Consult With Persons Outside the Agency
        Responsibility for devising and carrying out the data collection 
    rests with DOL's contractor, Social Policy Research Associates (SPR), 
    and its subcontractor, Brandeis University's Center for Human 
    Resources. Key personnel associated with these institutions are 
    nationally known experts in evaluation research and have in-depth 
    knowledge of employment and training programs in general and Summer 
    Youth programs in particular.
        Additionally, the study team has enlisted the aid of additional 
    experts, who are serving as consultants on the project. Their advice 
    was solicited regarding the usefulness of the data elements to be 
    collected, the feasibility of the data collection plan, and the clarity 
    of instructions. These consultants are:
    
    Ms. Nancy Bross, Public Policy Support, 1377 McLendon Ave., N.E., 
    Atlanta, Georgia 30307, (404) 581-9895
    Ms. Lee Bruno, Consultant, 3106 Old Largo Road, Upper Marlborough, 
    Maryland 20772, (301) 627-1415
    Ms. Janice Hendrix, North Central Indiana PIC, 36 West Fifth St., Suite 
    102-B, Peru, Indiana 46970, (317) 473-5571
    Mr. Gill Ritt, Career Resource Associates, 2932 Sumac Drive, Atlanta, 
    Georgia 30360, (404) 698-8427
    Mr. Kip Stottlemyer, Consultant, 1408 Milestone Drive, Collierville, 
    Tennessee 38017, (901) 854-1438
    
        In addition, all the protocols guiding the conversations with key 
    respondents have been pre-tested on not more than 9 respondents, and 
    modifications to the protocols were made on this basis where it seemed 
    appropriate.
    10. Assurances of Confidentiality
        The information to be collected will be held strictly confidential 
    and will be used for research purposes only. To ensure confidentiality, 
    DOL will require that the study team take the following measures:
      • Access to the data will be limited to the contractor's 
    project team members only.
      • Reports to DOL will focus on describing and analyzing the 
    range of service designs and training practices that were observed and 
    will not associate a design or process with any specific SDA or service 
    provider, except by way of providing an example of exemplary practices, 
    and then only with the SDA's approval.
      • Reports to DOL that contain individual vignettes based on 
    the experiences of participants will not contain individual names or 
    any other identifying information.
      • The contractor's project team members will be trained in 
    the confidentiality requirements and cautioned to use the data for 
    research purposes only.
    11. Justification of Questions of a Sensitive Nature
        Two sources of data are potentially sensitive. First, pre-tests and 
    post-tests will be administered to youth included in the study. Second, 
    school record information will be abstracted for those youth in the 
    sample who participated in the summer program of 1995 and who return to 
    school in the fall. These data elements are imperative to examine 
    learning gains for those who receive educational services and to 
    examine if SYETP participation is associated with improved school 
    performance.
        However, for the most part these data elements do not represent new 
    data collection activities. JTPA currently requires that all summer 
    youth be administered a test of basic skills to determine their need 
    for basic skills remediation, and many SDAs also administer post-tests 
    to document learning gains.
        Similarly, information about school performance will be abstracted 
    from existing student files. Moreover, youths and their parents/
    guardians will be asked to sign a consent form before the abstraction 
    will be conducted. This form will outline the objectives of the study 
    and ask the youth and his/her parent to allow access to student records 
    for purposes of the evaluation. It will be explained that participation 
    in the study is completely voluntary and that a refusal to participate 
    will not jeopardize the youth's receiving SYETP services.
    12. Cost to the Federal Government and to Respondents
    The total estimated cost to the federal government for the 
    collection and analysis of these data is $849,543. Because the study 
    will be conducted over 3 years,\2\ the average per annum cost is 
    approximately $283,000. This amount includes the costs of designing the 
    field protocols, performing the on-site visits and telephone follow-up, 
    recording observations from the site visits, collecting the client-
    level data for the study of outcomes, analyzing the data, and preparing 
    two reports on the results (i.e., an Interim Report and a Final Report) 
    and a Technical Assistance Guide (to disseminate information on 
    effective practices). The method used to derive this figure entailed a 
    quantification of hours of effort involved by each study team member 
    and included expenses for materials and services (e.g., photocopying 
    expenses and expenses involved in binding the report).
        The costs to respondents result only from the time spent answering 
the questions. Estimates of the time to respond are presented 
below.
    
    \2\A separate PRA package will be submitted for any burden 
    associated with follow-up work done after the first year.
    ---------------------------------------------------------------------------
    
    13. Estimate of Burden
    Below is the estimate of the respondent burden. Time estimates are 
based on the pretest of the instruments (for the topic guides to be 
used in the process study) or on the use of the instruments in 
previous studies (for the pre-test/post-test and self-esteem 
scales). It is anticipated that these collections will be conducted 
within the next year, and the burden hours represent the first-year 
burden claimed. If follow-up activities extend beyond one year, an 
Inventory Correction Worksheet will be submitted.
    
------------------------------------------------------------------------
                                    Number of   Minutes per             
           Instruments             respondents   respondent  Total hours
------------------------------------------------------------------------
 SDA Guides for Discussions With:                                       
    1. Policy, Planning, and                                            
     Administrative Staff........           60           60           60
    2. Recruitment, Service                                             
     Planning, and Case                                                 
     Management Staff............           60           45           45
Program Guides for Discussions                                          
 With:                                                                  
    1. Program Administrators....           90           90          135
    2. Classroom Instructors.....           85           20           28
    3. Work Project Coordinators.           10           20            3
Client Guides for:                                                      
    1. Focus Group with                                                 
     Participants................          450           15          112
    2. Participants, at Follow-up          120           15           30
Guide for Discussions with                                              
 Parents.........................          120            5           10
Guide for Discussions with                                              
 Regular School Counselors.......          120           15           30
Pre-test and Post-test of                                               
 Participant's Basic Skills                                             
 (Workplace Literacy Test).......        4,000          240       16,000
Self-Esteem Instrument (Rosenberg                                       
 Self-Esteem Scale)..............        4,000           10          667
------------------------------------------------------------------------
    
        Additional data collection to be used in the project represents the 
    abstraction or review of existing information. There are no respondents 
    for these guides and thus they entail minimal burden on SDA or provider 
    personnel or program participants beyond copying documents or data 
files and shipping them to the contractor. Where hand-extraction of 
information is required (e.g., from student records), abstractors (SDA 
or service provider personnel) will be compensated.
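    The burden figures follow directly from the table: hours per row 
are respondents times minutes, divided by 60 and rounded. The sketch 
below reproduces that arithmetic, matching the 17,120-hour first-year 
total cited in Item 14 below and the roughly 113-minute average 
response time cited in the preamble (the exact rounding rule is 
inferred from the table).

```python
# (instrument, respondents, minutes per respondent), from the table above.
rows = [
    ("SDA Guide 1: policy/planning/administrative staff",   60,  60),
    ("SDA Guide 2: recruitment/case management staff",      60,  45),
    ("Program Guide 1: administrators",                     90,  90),
    ("Program Guide 2: classroom instructors",              85,  20),
    ("Program Guide 3: work project coordinators",          10,  20),
    ("Client Guide 1: focus groups",                       450,  15),
    ("Client Guide 2: participants at follow-up",          120,  15),
    ("Parent/guardian discussions",                        120,   5),
    ("School counselor discussions",                       120,  15),
    ("Workplace Literacy pre-test/post-test",             4000, 240),
    ("Rosenberg Self-Esteem Scale",                       4000,  10),
]

total_respondents = sum(n for _, n, _ in rows)
total_hours = sum(round(n * minutes / 60) for _, n, minutes in rows)
avg_minutes = sum(n * minutes for _, n, minutes in rows) / total_respondents

print(total_respondents)   # 9115
print(total_hours)         # 17120, the first-year burden in Item 14
print(round(avg_minutes))  # 113, the average burden per response
```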
    14. Reason for Change in Burden
        This is a new collection as reported in ETA's ICB (Information 
    Collection Budget). The first year's burden of 17,120 hours is being 
    submitted now. An Inventory Correction Worksheet will be submitted for 
any follow-up activities in the out years.
    15. Plans for Statistical Analysis
    Data to be collected for the process study generally will not be 
analyzed using statistical methods; rather, qualitative research 
methods will be used. Findings will be detailed in a narrative, and 
their implications for improving program quality will be discussed.
        Data collected for the study of outcomes will be analyzed using 
    statistical methods to address these research issues:
      • What are the characteristics of persons receiving 
    educational services in the Summer Youth program? Are educational 
    services targeted to those who have a greater need for remediation?
      • What types of educational services were provided? 
    Specifically, in what subject areas (e.g., math, reading, other 
    academic subjects, SCANS skills) was instruction provided? With what 
    intensity?
      • How do the intensity and nature of the training received 
relate to outcomes, including learning gains, school attendance rates, 
    grades in school, rates of absenteeism, and suspensions and expulsions? 
    How do the outcomes for youth who received educational instruction 
compare to those in the comparison group?
    Data for this component of the project will be compiled in 
various phases:
      • Phase I: Collect MIS and pre-test/post-test data for 
    sample members who participated in the summer program in 1994. 
    Collection of this information will occur during the winter of 1995.
      • Phase II: Collect MIS, pre-test/post-test, and self-esteem 
    data for sample members who participated in the summer program in 1995. 
    Collection of this information will occur during the winter of 1996.
      • Phase III: Collect school record information. These data 
will be collected during the summer of 1996, after the conclusion of 
    the 1995-96 school year, so that school outcomes measured for those who 
    participated in the 1995 summer program will reflect a full school 
    year.
        Methods to be used in analyzing these data will include univariate 
    and multivariate statistics. Specifically, univariate distributions 
    will be calculated to describe the characteristics of participants, 
    their services, and their outcomes. Cross-tabulations will be used to 
    examine the relationship between variables. Multivariate analyses, 
    primarily regression analysis, will be used to examine how various 
    participant characteristics and measures of services received relate to 
    outcomes.
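    A minimal sketch of this analysis plan in Python follows, using 
pandas and statsmodels; the file and variable names are hypothetical 
stand-ins for the MIS, test-score, and school-record measures 
described earlier.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("syetp_analysis_file.csv")  # hypothetical merged file

# Univariate distributions of characteristics, services, and outcomes.
print(df["age"].describe())
print(df["service_type"].value_counts(normalize=True))

# Cross-tabulation relating a service measure to an outcome.
print(pd.crosstab(df["service_type"], df["completed_grade"],
                  normalize="index"))

# Regression of an outcome on participant characteristics and services.
fit = smf.ols("post_test ~ pre_test + age + C(gender) + C(service_type)",
              data=df).fit()
print(fit.summary())
```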
        The project's major deliverables include:
      • An Interim Report. This report will detail the results of 
    the process analysis, describing results from the case studies 
    regarding how services are designed and delivered. It also will include 
    the preliminary results from the study of outcomes based on the data 
    collected for youth who participated in the summer of 1994. This report 
    will be completed at the end of the Summer of 1995.
      • A Technical Assistance Guide (TAG). The TAG will be a 
    practitioner's guide describing effective practices in the delivery of 
    educational services, focusing especially on how educational 
    instruction can be delivered in a functional, work-related context.
        The TAG will be prepared in the Spring of 1996.
      • A Final Report. This report will represent a summation of 
    the study's findings and recommendations. As such, it will include the 
    content of the Interim Report, combined with the comprehensive results 
    of the study of outcomes. This report will be prepared in the Spring of 
    1997.
    
    B. Collection of Information Employing Statistical Methods
    
        This process study utilizes qualitative case study data collection 
    and analysis methods. In terms of identifying appropriate respondents 
    in each local site and analyzing case study data, qualitative rather 
    than statistical methods will be used. Discussions of estimation 
    procedures and degree of accuracy (power analysis) in generalizing 
    sample findings to the universe of all potential respondents are not 
    applicable to the process study, because findings will not be expressed 
    in quantitative terms.
    The study of outcomes will employ statistical methods, however. 
These methods are described in the rest of this section.
    1. Potential Respondent Universe and Sampling Methods
        Approximately 625,000 youths can be expected to participate in the 
    Title II-B program during each of the summers of 1994 and 1995, if 
    current levels of funding are maintained. Of these, about 40%, or 
    250,000, will be receiving educational services. These youth are served 
    by the nation's approximately 640 service delivery areas (SDAs).
        The youths to be included in the study will be served by 
    approximately 90 service providers used for educational instruction in 
    30 SDAs that were selected for examination in the process study. To 
    ensure that the sample of SDAs is nationally representative, the 30 
    SDAs are selected using stratified random sampling. In selecting the 
    sample, all SDAs nationwide are assigned to one of 4 strata. The first 
    3 of these groups are defined according to the percent of their Summer 
    Youth participants that receive educational instruction, with the first 
stratum consisting of those SDAs with percents between 1% and 41%, the 
    second between 42% and 73%, and the third between 74% and 100%. These 
    cutoffs were chosen so that approximately equal numbers of youths 
    receiving educational instruction are in each of these three strata. 
The 4th stratum consists of those SDAs for whom information on the 
    number of participants in educational instruction is not available.
        Approximately an equal number of SDAs were drawn from each stratum 
    and, within each stratum, SDAs were sampled with the odds of selection 
    proportionate to the number of participants being served,\3\ so that 
    the resulting sample would be approximately self-weighting. The number 
    of SDAs in each stratum and the number selected for the study are shown 
    below.
    
        \3\Dollar allocations for the Summer Youth program were used in 
    the fourth stratum, because the number of participants receiving 
educational instruction is not available for these SDAs.
    
    ------------------------------------------------------------------------
                                                                      Number
       Percent of the SDA's youths receiving educational     Total   of SDAs
                          instruction                        number   in the
                                                            of SDAs   sample
    ------------------------------------------------------------------------
    Low: 1% to 41%........................................      273        8
Med.: 42% to 73%......................................      132        8
    High: 74% to 100%.....................................      101        8
    Information missing...................................      117        7
    ------------------------------------------------------------------------
    
        Because this study is intended to describe and compare the 
    effectiveness of a wide variety of approaches to building the 
    educational skills of SYETP participants, educational providers are 
    selected within each of the 30 SDAs using purposive selection methods. 
    Specifically, all educational providers used by these SDAs are to be 
    categorized according to their:
      • Content emphasis (e.g., basic skills only; SCANS 
    foundation skills and/or competencies; or other academic subjects, such 
    as science, history, or art).
      • Locus of educational instruction (e.g., classroom-based, 
    work-based, or both).
      • Type of provider (e.g., SDA; secondary school, other 
    educational institution such as community college or technical college, 
    or other).
      • Targeted participants (e.g., 14-15 year olds, 16-18 year 
    olds, other target groups).
        Providers are being selected to ensure the diversity of the sample 
    (both within the SDA and across all 30 SDAs) with respect to these 
    dimensions.
        Because most providers serve fairly few youths, all youths served 
    by the selected providers during the summer of 1995 will generally be 
    selected for the study of outcomes. However, for large providers (those 
serving more than approximately 70 participants), only those youths 
who attended classes taught by the instructors observed by the site 
visitors will be selected. The 400 youths in the comparison group will 
be selected randomly, approximately 15 from each SDA.
        Following the above procedures, an approximately equal number of 
youths will be selected from each SDA. MIS data for each of these 
youths should be available without exception, as will pre-test and 
    post-test scores and the measure of self-esteem. Similarly, because 
    SDAs typically require access to school records as part of the 
    assessment process, we anticipate that high percentages of sampled 
    youths and their parents/guardians will sign the consent forms allowing 
    the researchers' access to this information. At least an 80% rate of 
    cooperation is anticipated.
    2. Procedures for the Collection of Information
        Sample Selection. As discussed above, the sample has been drawn in 
    a two-stage process. First, a sample of SDAs and their providers was 
chosen; next, participants served by these providers are selected, 
    along with a randomly chosen sample of participants for the comparison 
    group.
        Degree of Accuracy. To meet DOL's objectives for the survey, the 
    sample size must be sufficient to allow reliable estimation of 
    relatively small differences in outcomes across various service 
    strategies. Let us suppose that the outcome variable is a percentage 
    (e.g., the percentage of participants who complete their next grade 
    level), that the average of the outcome is 50%, that we can explain 25% 
    of the outcome's variation with all predictor variables combined, and 
    that 10% of the variance in the educational components can be explained 
    by other control variables. Under these circumstances, the sample size 
to be used for this study would be able to detect a difference of 
approximately 3 percentage points in outcomes across a dichotomous 
measure of service (e.g., whether instruction is provided in a 
functional context). Note that these are generally fairly conservative 
assumptions. 
    For example, sample size requirements would be less stringent if the 
    average of the outcome were either higher or lower than 50%.
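    The detectable-difference claim can be reproduced with a standard 
minimum-detectable-effect calculation for a regression coefficient on 
a dichotomous regressor. In the sketch below, the sample size (roughly 
1,800 plus 4,000 youths), a 50/50 split on the service measure, a 5% 
two-sided test, and 80% power are assumptions supplied for 
illustration; the variance figures come from the passage above.

```python
from math import sqrt
from scipy.stats import norm

N = 5800            # assumed: about 1,800 (1994) plus 4,000 (1995) youths
p_outcome = 0.50    # mean of the binary outcome (from the passage)
r2_full = 0.25      # outcome variance explained by all predictors
r2_service = 0.10   # service-measure variance explained by other controls
p_service = 0.50    # assumed share receiving the service design

alpha, power = 0.05, 0.80
multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 2.80

mde = multiplier * sqrt(
    p_outcome * (1 - p_outcome) * (1 - r2_full)
    / (N * p_service * (1 - p_service) * (1 - r2_service))
)
print(f"minimum detectable difference: {mde:.3f}")  # about 0.034
```

    The result, about 0.034, agrees with the roughly 3-percentage-point 
figure cited above.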
        Estimation Procedures. As described in Section A, Item 15, several 
    estimation techniques will be used. First, means and univariate 
    distributions will be calculated to describe the sample. Second, t-
    tests of means (for outcomes measured on a continuous scale) and chi-
    square tests (for categorical variables) will be calculated to 
    determine whether outcomes vary significantly for participants with 
    different characteristics (e.g., age) or who received different 
    services. Third, multivariate analysis methods will be used, with 
    various measures of outcomes as the dependent variable. Independent 
    variables will include participant characteristics (e.g., age, gender, 
    pre-test scores) and measures of the types of services received.
        Ordinary least squares (OLS) will be used as the estimation 
    technique for most of these multivariate models, because of its 
    desirable properties. However, OLS is inefficient when the dependent 
    variable is categorical (e.g., whether the next school grade was 
    completed). In these cases, logit analysis will be used.
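    A sketch of these estimation steps in Python (again with 
hypothetical variable names) might look as follows: t-tests and 
chi-square tests for simple comparisons, then OLS for continuous 
outcomes and logit for categorical ones.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("syetp_analysis_file.csv")  # hypothetical merged file

# t-test of a continuous outcome across a dichotomous service measure.
a = df.loc[df["functional_context"] == 1, "post_test"]
b = df.loc[df["functional_context"] == 0, "post_test"]
print(stats.ttest_ind(a, b, equal_var=False))

# Chi-square test for a categorical outcome.
print(stats.chi2_contingency(
    pd.crosstab(df["functional_context"], df["completed_grade"])))

# OLS for continuous outcomes; logit when the outcome is categorical.
ols_fit = smf.ols("post_test ~ pre_test + age + functional_context",
                  data=df).fit()
logit_fit = smf.logit("completed_grade ~ pre_test + age + functional_context",
                      data=df).fit()
print(ols_fit.params["functional_context"],
      logit_fit.params["functional_context"])
```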
    3. Methods to Maximize Response Rates
        Most data to be used for the client study represent pre-existing 
    records collected by SDAs and schools. For this reason, response rates 
    should be quite high for all components of the data collection. 
    Potentially, however, some participants or their parents may deny the 
    researchers access to school records. To minimize this possibility, 
    SDAs and their service providers will be contacted far in advance of 
    the start of the 1995 summer program, and their cooperation will be 
    enlisted. Thus, when youth are first enrolled in the program they can 
be told immediately that they are being asked to 
participate in the study, and the study's importance can be explained 
carefully to them.
    4. Tests of Procedures
        The data collection for the client-level study involves no new 
    survey or other instruments. Therefore, no test of procedures is deemed 
    necessary.
    5. Contractor and Individuals Consulted
        The Department of Labor has contracted with Social Policy Research 
    Associates (SPR) to design, conduct, and analyze the study of outcomes. 
Key personnel at SPR are Dr. Ronald D'Amico, Dr. Katherine Dickinson, 
    and Mr. Richard West. They may be contacted at: Social Policy Research 
Associates, 200 Middlefield Road, Suite 100, Menlo Park, CA 94025. Their 
    phone is (415) 617-8625.
    
    [FR Doc. 95-7743 Filed 3-29-95; 8:45 am]
    BILLING CODE 4510-22-M
    
    
