SOCW 6311 WK 9: Drafting a Process Evaluation

The steps for process evaluation outlined by Bliss and Emshoff (2002) may seem very similar to those for conducting other types of evaluation that you have learned about in this course; in fact, it is the purpose and timing of a process evaluation that most distinguish it from other types of evaluation. A process evaluation is conducted during the implementation of the program to evaluate whether the program has been implemented as intended and how the delivery of a program can be improved. A process evaluation can also be useful in supporting an outcome evaluation by helping to determine the reason behind program outcomes.


There are several reasons for conducting process evaluation throughout the implementation of a program. Chief among them is to compare the program that is being delivered to the original program plan, in order to identify gaps and make improvements. Therefore, documentation from the planning stage may prove useful when planning a process evaluation.

For this Assignment, you will either build on the work that you completed in Weeks 6, 7, and 8 related to a support group for caregivers or draw on your knowledge of a program with which you are familiar. Review the resource “Workbook for Designing a Process Evaluation.”

Submit a 4- to 5-page plan for a process evaluation. Include, at minimum, the following information:

  • A description of the key program elements
  • A description of the strategies that the program uses to produce change
  • A description of the needs of the target population
  • An explanation of why a process evaluation is important for the program
  • A plan for building relationships with the staff and management
  • Broad questions to be answered by the process evaluation
  • Specific questions to be answered by the process evaluation
  • A plan for gathering and analyzing the information

Role of Stakeholders in a Program

Stephan A. Bell

SOCW 6311 WK 6

Stakeholders are individuals or organizations that invest in a program and have an interest in the results of its evaluation. They help ensure that the goals of the program are achieved and that the intended population or society benefits from the program. They represent their needs and interests throughout the period of the program, which is fundamental to a good program evaluation, and they communicate with the program manager to support the overall success of the program. Stakeholders fall into different categories depending on the degree to which they are influenced by the program: primary stakeholders, secondary stakeholders, and key stakeholders (Gilliam et al., 2002).

Primary stakeholders are directly affected, either positively or negatively, by the running program. For instance, in housing construction, the architects and investors are primary stakeholders because they contribute directly to the construction of the houses; the builders and the beneficiaries of the houses are also primary stakeholders, as the constructed houses affect them directly. Secondary stakeholders are affected indirectly: in the same example, an engineer who receives guidelines from the constructors would be a secondary stakeholder. Key stakeholders, such as the surveyors, are neither directly nor indirectly affected by the program but offer technical roles in the house building.

Stakeholders have legal rights in decision making; they can manage the scheduling of programs and control budget issues (Gilliam et al., 2002). It is their responsibility to finance projects, educate developers, set milestone dates, approve program changes, and create scheduling parameters. Stakeholders may sit on the board of directors, where they help make decisions along with other board members and can also disagree with the decisions other members make. Concerns that can arise from stakeholders include problems that may occur during the program period and how resources have been utilized during that period. Stakeholders will also be eager to know how the programs they are funding or receiving are progressing and whether they are producing the intended effects.

References

Gilliam, A., Davis, D., Barrington, T., Lacson, R., Uhl, G., & Phoenix, U. (2002). The value of engaging stakeholders in planning and implementing evaluations. Retrieved January 4, 2021, from https://pubmed.ncbi.nlm.nih.gov/12092937/


Social Work Practice

Stephan A. Bell

SOCW 6311 WK 7

The success of social work programs depends on the capacity to identify the needs of target groups and the corresponding interventions. Social workers need to understand the opportunities and strengths of their activities. In supporting low-income young mothers, social workers need to identify the group's most basic needs. Notably, low-income mothers lack proper housing, feeding, and health services. The group requires assistance to ensure quality of life and access to opportunities. Vulnerable communities do not easily access assistance programs and resources, and this missing link undermines efforts to realize positive outcomes. Therefore, social workers need to use a rational approach to identify the needs of clients and execute effective program activities that help address the challenges facing vulnerable communities.

The components captured in the model include inputs, activities, resources, and expected outcomes. The choice of a program to assist low-income young mothers should target activities that are not available from the government. In the case study, one social worker argues for the importance of providing information on community resources to young mothers. The objective is to create awareness of the availability of social assistance programs targeting vulnerable communities. The social worker also asserts that the activities should promote assistance programs such as medical insurance, food stamps, and income support (Plummer, Makris, & Brocksen, 2014). Nonetheless, some clients already know about the program activities; the fact that public information on social programs is available while many people remain in need of assistance points to a gap. When assisting young mothers, the desired outcomes include parenting resources, feeding programs, and health services (Randolph, 2010). The program connections should persuade social workers to devise an appropriate mechanism and scheme of activities targeting the vulnerable group. This should help in optimizing the outcomes and benefits.

A logic model helps identify the link between a need and the actions in a program. Social work activities necessitate recognizing effective and reasonable actions to facilitate the attainment of positive program outcomes. The model provides a concise and clear picture of operations from beginning to end. The objective is to allow rational choice and execution of activities according to the stipulated objectives (Randolph, 2010). The vulnerability and instability of low-income young mothers jeopardize their capacity to enjoy a quality life. The group is susceptible to challenges such as unemployment, drug abuse, and domestic violence, among others. Therefore, the provision of social support through non-state or state-sponsored programs is necessary. This support offers protection from economic hardships, psychological distress, and poor child outcomes.

The logic model identifies the specific needs and their relation to the expected benefits and outcomes. Low-income young mothers will require information on the available programs targeting education, feeding, housing, and health services. Some clients have information from government sources, although they lack the means to benefit from the programs. Therefore, using a logic model should help link the operations of a program to client change. The program connections identify the causal relationships across the program components and the changes that foster the attainment of expected outcomes (Dudley, 2014). Low-income young mothers are vulnerable and unstable due to limitations in their individual capacity, and their plight contributes to adverse effects on their children. The model should offer a framework to help social workers evaluate the process by stipulating the important links and variables in the program activities. It is also helpful in planning, monitoring progress, and measuring outcomes for future improvements (Randolph, 2010). Working with low-income young mothers should target initiatives that address not only the individual mothers but also their immediate families.

In summary, social workers should identify measures to foster the benefits arising from programs intended for low-income young mothers. The promotion of support and subsequent benefits to families requires advanced knowledge of the efficacy and rationality of the program activities. Social workers should define instruments that capture the diverse circumstances in which support is appropriate. The intervention design and execution should also strengthen vulnerable young mothers to improve individual and family outcomes. A logic model is effective in identifying the right approaches and programs to assist vulnerable communities, and social workers require it in their decision-making process to ensure the program activities match the intended outcomes and goals.

References

Dudley, J. R. (2014). Social work evaluation: Enhancing what we do. (2nd ed.) Chicago, IL: Lyceum Books.

Plummer, S.-B., Makris, S., & Brocksen, S. (Eds.). (2014). Sessions: Case histories. Baltimore, MD: Laureate International Universities Publishing.

Randolph, K. A. (2010). Logic models. In B. Thyer (Ed.), The handbook of social work research methods (2nd ed., pp. 547–562). Thousand Oaks, CA: Sage.


Needs Assessment Planning

Stephan A. Bell

SOCW 6311 WK 8

Cases of dementia have been on the rise, and it is important for society and family members to learn how to accept, handle, and care for dementia patients at both the family and societal levels. The Family Alliance Group is a caregiver support group specializing in dementia that serves families, partners, and caregivers. The needs of patients with dementia are diverse, and addressing them in order of priority would help patients live a better life compared with handling less impactful issues first. Because the Family Alliance Group is just being launched, a set of requirements must also be put in place to ensure that its dementia caregiving service is professional and makes a positive impact. The following outline provides a needs assessment for a dementia caregiving support group program meant to help families.

The resources needed to operate this service

To offer the Family Alliance Group dementia services, a number of resources are needed. The major resource is human: a team of caregivers familiar with the needs of dementia patients, especially how to relate to them. The second resource is physical materials, such as wheelchairs to prevent falls and to help patients move easily from one location to another (Tutty & Rothery, 2010). Physical resources also include healthy foods, medication, and basic needs such as a healthy and safe environment. Technological resources are also necessary, for instance digital devices that keep the caregiver connected with medical professionals and family members at all times.

The program activities

· Massaging the patients

· Keeping the patients engaged with interesting stories

· Taking a nature walk

· Physical exercises

· Helping the patient reminisce about their life

· Reading favorite books and watching interesting movies

· Animal therapy

· Music therapy

· Art and crafts

· Engaging in simple home chores that the patients find fun

It is important to note that all these activities would be carried out in a systematic manner, meaning that all of them must be covered within a week. The activities a patient engages in today would differ from tomorrow’s activities to avoid monotony and boredom (Tutty & Rothery, 2010). This would also be an effective way of making the day interesting, inspiring, and creative for both the dementia patient and the caregiver, with the main focus remaining on attaining desirable outcomes.

The desired outcomes

The overall intention of engaging patients in all these activities is to give them a clear flashback of their lives, which might help stimulate the brain and memory. It is also desired that, through the activities, patients would stay clear of other ailments that might make the dementia more severe. According to Dudley (2014), a desired outcome is that the patient’s condition stops worsening and memory improves, thereby stabilizing the condition.

A plan for gathering information about the population to serve

Information would be gathered first from family members, since they have the highest level of contact with dementia patients. Family members can shed light on the signs and symptoms of dementia they have noticed in the patient, the different ways they feel their patient is happy and contented, and the things the patient does not appreciate (Stewart et al., 2011). Apart from family members, close friends and relatives who are familiar with the condition can also help in understanding the needs of dementia patients. Medical professionals with knowledge of dementia would provide professional and reliable information. Lastly, members of the society are also critical stakeholders in the program.

Justifications for your plans and decisions

The above plan and decisions would come as a great aid to families with dementia patients, as they would have a program through which the best, most professional services are offered to their kin. Dementia affects aged individuals who are also vulnerable in many other ways. Through the caregiving support group program, families would have reliable caregivers to whom they can entrust their loved ones and attain the most desirable outcomes with the help of the provided resources.

Conclusion

Once the program starts, it will be possible to conduct a follow-up to the needs assessment at the implementation stage of the program evaluation. This would involve visiting the families and partners registered under the program to understand which needs they consider priorities and having the caregivers address those needs according to the priority list provided (Dudley, 2014). From time to time, engaging the family members and the caregivers, together with observing the patient, would help determine whether the program is working on the patient as expected or whether the needs assessment should be redone.

References

Dudley, J. R. (2014). Social work evaluation: Enhancing what we do. (2nd ed.) Chicago, IL: Lyceum Books.

Tutty, L. M., & Rothery, M. A. (2010). Needs assessments. In B. Thyer (Ed.), The handbook of social work research methods (2nd ed., pp. 149–162). Thousand Oaks, CA: Sage.

Stewart, K. E., Phillips, M. M., Walker, J. F., Harvey, S. A., & Porter, A. (2011). Social services utilization and need among a community sample of persons living with HIV in the rural south. AIDS Care, 23(3), 340–347.

Workbook for Designing a Process Evaluation

Produced for the

Georgia Department of Human
Resources

Division of Public Health

By

Melanie J. Bliss, M.A.
James G. Emshoff, Ph.D.

Department of Psychology
Georgia State University

July 2002


What is process evaluation?

Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks “what,” and outcome evaluation asks “so what?”

When conducting a process evaluation, keep in mind these three
questions:

1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?

This workbook will serve as a guide for designing your own process
evaluation for a program of your choosing. There are many steps involved
in the implementation of a process evaluation, and this workbook will
attempt to direct you through some of the main stages. It will be helpful to
think of a delivery service program that you can use as your example as
you complete these activities.

Why is process evaluation important?

1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is an intervention.


Stages of Process Evaluation

1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**

Also included in this workbook:

a. Logic Model Template
b. Pitfalls to avoid
c. References

Evaluation can be an exciting,
challenging, and fun experience

Enjoy!

* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.


Forming collaborative relationships

A strong, collaborative relationship with program delivery staff and management will likely result in the following:

• Feedback regarding evaluation design and implementation
• Ease in conducting the evaluation due to increased cooperation
• Participation in interviews, panel discussions, meetings, etc.
• Increased utilization of findings

Seek to establish a mutually respectful relationship characterized by trust, commitment,
and flexibility.

Key points in establishing a collaborative relationship:

• Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.

• Emphasize that THEY are the experts, and that you will be utilizing their knowledge and information to inform your evaluation development and implementation.

• Be respectful of their time, both in person and on the telephone. Set up meeting places that are geographically accessible to all parties involved in the evaluation process.

• Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection.

• Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders. Set agendas for meetings and provide an overview of the goals of the meeting before beginning. Obtain their feedback and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook.

• Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible.

• Consider establishing a resource or expert “panel” or advisory board: an official group of people willing to be contacted when you need feedback or have questions.


Determining Program Components

Program components are identified by answering the questions who, what, when, where,
and how as they pertain to your program.

Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention

BRIEF EXAMPLE:

Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classroom
How: group administered intervention, small group practice

1. Instruct students what to do in case of fire (stop, drop and roll).
2. Educate students on calling 911 and have them practice on play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take it home to share with their parents. Request parental signature to indicate compliance and target a 75% return rate.
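The five questions above lend themselves to a simple structured record if you plan to track components across sites or delivery cycles. The following is a minimal sketch in Python; the class and field names are this adaptation's own invention, not a format the workbook prescribes. It encodes the fire safety example:

```python
# A minimal sketch (assumed structure, not from the workbook) capturing
# the who/what/when/where/how of a program component so that planned
# delivery can later be compared against actual delivery records.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProgramComponent:
    who: str                              # clients/recipients and staff
    what: str                             # activities, behaviors, materials
    when: str                             # frequency and length of contact
    where: str                            # community context and physical setting
    how: str                              # strategies for operating the intervention
    activities: List[str] = field(default_factory=list)

fire_safety = ProgramComponent(
    who="elementary school students",
    what="fire safety intervention",
    when="2 times per year",
    where="in students' classroom",
    how="group administered intervention, small group practice",
    activities=[
        "Instruct students what to do in case of fire (stop, drop and roll)",
        "Educate students on calling 911; practice on play telephones",
        "Educate students on fire alarms; have students practice",
        "Send written information home; request parental signature (75% return target)",
    ],
)

for activity in fire_safety.activities:
    print("-", activity)
```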

Points to keep in mind when determining program components

• Specify activities as behaviors that can be observed.
• If you have a logic model, use the “activities” column as a starting point.
• Ensure that each component is separate and distinguishable from others.
• Include all activities and materials intended for use in the intervention.
• Identify the aspects of the intervention that may need to be adapted, and those that should always be delivered as designed.
• Consult with program staff, mission statements, and program materials as needed.


Your Program Components

After you have identified your program components, create a logic model that graphically
portrays the link between program components and outcomes expected from these
components.

Now, write out a succinct list of the components of your program.

WHO:

WHAT:

WHEN:

WHERE:

HOW:


What is a Logic Model?

A logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and what the expected results are (immediate and intermediate outcomes, long-term goals).

Benefits of the logic model include:

helps develop clarity about a project or program,
helps to develop consensus among people,
helps to identify gaps or redundancies in a plan,
helps to identify core hypotheses,
helps to succinctly communicate what your project or program is about.

When do you use a logic model?

Use…

– During any work to clarify what is being done, why, and with what intended results

– During project or program planning to make sure that the project or program is logical and
complete

– During evaluation planning to focus the evaluation

– During project or program implementation as a template for comparing to the actual program
and as a filter to determine whether proposed changes fit or not.

This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys
Corporation for the Evaluation Planning Workshop Training. Please see the Evaluation Planning Workshop
materials for more information. Appendix A has a sample template of the tabular format.
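If you keep your logic model in electronic form, even a plain data structure makes the if/then chain explicit and easy to revise. Below is a hypothetical sketch in Python whose keys mirror the Appendix A columns; the entries are invented caregiver-program placeholders, not content from the workbook, and the tabular template in Appendix A remains the canonical format.

```python
# A hypothetical, minimal encoding of a logic model's columns.
# Keys mirror the Appendix A template; entries are invented placeholders.
logic_model = {
    "conditions": ["Caregivers report high stress and isolation"],
    "activities": ["Weekly support group sessions", "Referral to respite services"],
    "short_term_outcomes": ["Increased caregiver knowledge of resources"],
    "long_term_outcomes": ["Reduced caregiver burnout"],
    "long_term_goals": ["Improved quality of care for clients"],
}

# Reading the model left to right reproduces the if/then chain.
for stage, items in logic_model.items():
    print(f"{stage.replace('_', ' ').upper()}:")
    for item in items:
        print(f"  - {item}")
```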


Determining Evaluation Questions

As you design your process evaluation, consider what questions you would like to answer. It is only after
your questions are specified that you can begin to develop your methodology. Considering the importance
and purpose of each question is critical.

BROADLY….

What questions do you hope to answer? You may wish to turn the program components that you have just identified
into questions assessing:

Was the component completed as indicated?
What were the strengths in implementation?
What were the barriers or challenges in implementation?
What were the apparent strengths and weaknesses of each step of the intervention?
Did the recipient understand the intervention?
Were resources available to sustain project activities?
What were staff perceptions?
What were community perceptions?
What was the nature of the interaction between staff and clients?

These are examples. Check off what is applicable to you, and use the space below to write additional broad,
overarching questions that you wish to answer.


SPECIFICALLY …

Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your
list of questions will likely be much longer than your list of program components. This step of developing your
evaluation will inform your methodologies and instrument choice.

Remember that you must collect information on what the program is intended to be and what it is in reality, so you
may need to ask some questions in 2 formats.

For example:

“How many people are intended to complete this intervention per week?”
“How many actually go through the intervention during an average week?”

Consider what specific questions you have. The questions below are only examples! Some may not be appropriate
for your evaluation, and you will most likely need to add additional questions. Check off the questions that are
applicable to you, and add your own questions in the space provided.

WHO (regarding client):
Who is the target audience, client, or recipient?
How many people have participated?
How many people have dropped out?
How many people have declined participation?
What are the demographic characteristics of clients?

Race
Ethnicity
National Origin
Age
Gender
Sexual Orientation
Religion
Marital Status
Employment
Income Sources
Education
Socio-Economic Status

What factors do the clients have in common?
What risk factors do clients have?
Who is eligible for participation?
How are people referred to the program? How are they screened?
How satisfied are the clients?

YOUR QUESTIONS:


WHO (Regarding staff):
Who delivers the services?
How are they hired?
How supportive are staff and management of each other?
What qualifications do staff have?
How are staff trained?
How congruent are staff and recipients with one another?
What are staff demographics? (see client demographic list for specifics.)

YOUR QUESTIONS:

WHAT:

What happens during the intervention?
What is being delivered?
What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
What are the standard operating procedures?
What technologies are in use?
What types of communication techniques are implemented?
What type of organization delivers the program?
How many years has the organization existed? How many years has the program been operating?
What type of reputation does the agency have in the community? What about the program?
What are the methods of service delivery?
How is the intervention structured?
How is confidentiality maintained?

YOUR QUESTIONS:

WHEN:
When is the intervention conducted?
How frequently is the intervention conducted?
At what intervals?
At what time of day, week, month, year?
What is the length and/or duration of each service?


YOUR QUESTIONS:

WHERE:
Where does the intervention occur?
What type of facility is used?
What is the age and condition of the facility?
In what part of town is the facility? Is it accessible to the target audience? Does public transportation access the facility? Is parking available?
Is child care provided on site?

YOUR QUESTIONS:

WHY:

Why are these activities or strategies implemented and why not others?
Why has the intervention varied in ability to maintain interest?
Why are clients not participating?
Why is the intervention conducted at a certain time or at a certain frequency?

YOUR QUESTIONS:


Validating Your Evaluation Questions

Even though all of your questions may be interesting, it is important to narrow your list to questions that
will be particularly helpful to the evaluation and that can be answered given your specific resources, staff,
and time.

Go through each of your questions and consider it with respect to the questions below, which may be helpful in
streamlining your final list of questions.

Revise your worksheet/list of questions until you can answer “yes” to all of these questions. If you cannot answer
“yes” to your question, consider omitting the question from your evaluation.

Validation (answer Yes or No for each):

• Will I use the data that will stem from these questions?
• Do I know why each question is important and/or valuable?
• Is someone interested in each of these questions?
• Have I ensured that no questions are omitted that may be important to someone else?
• Is the wording of each question sufficiently clear and unambiguous?
• Do I have a hypothesis about what the “correct” answer will be for each question?
• Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?
• Do the questions constitute a sufficient set to achieve the purpose(s) of the evaluation?
• Is it feasible to answer each question, given what I know about the resources for evaluation?
• Is each question worth the expense of answering it?

Derived from “A Design Manual” Checklist, page 51.
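The rule above (keep a question only if you can answer “yes” to everything) is mechanical enough to automate when the question list grows long. The sketch below is a hypothetical Python illustration, with invented questions and a reduced subset of the checklist as boolean checks:

```python
# Hypothetical illustration: retain only evaluation questions whose
# validation checks (a reduced subset of the checklist above) are all
# answered "yes" (True). Questions and check names are invented.
candidate_questions = {
    "How many clients complete the intervention per week?":
        {"will_use_data": True, "clear_wording": True, "feasible": True, "worth_expense": True},
    "What do staff eat for lunch?":
        {"will_use_data": False, "clear_wording": True, "feasible": True, "worth_expense": False},
}

validated = [q for q, checks in candidate_questions.items() if all(checks.values())]
print(validated)  # only the first question survives
```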


Determining Methodology

Process evaluation is characterized by collection of data primarily through two formats:

1) Quantitative, archival, recorded data that may be managed by a computerized tracking or management system, and

2) Qualitative data that may be obtained through a variety of formats, such as surveys or focus groups.

When considering what methods to use, it is critical to have a thorough
understanding and knowledge of the questions you want answered. Your
questions will inform your choice of methods. After this section on types of
methodologies, you will complete an exercise in which you consider what method
of data collection is most appropriate for each question.

Do you have a thorough understanding of your
questions?

Furthermore, it is essential to consider what data the organization you are
evaluating already has. Data may exist in the form of an existing computerized
management information system, records, or a tracking system of some other
sort. Using this data may provide the best reflection of what is “going on,” and it
will also save you time, money, and energy because you will not have to devise
your own data collection method! However, keep in mind that you may have to
adapt this data to meet your own needs – you may need to add or replace fields,
records, or variables.

What data does your organization already have?

Will you need to adapt it?

If the organization does not already have existing data, consider devising a
method for the organizational staff to collect their own data. This process will
ultimately be helpful for them so that they can continue to self-evaluate, track
their activities, and assess progress and change. It will be helpful for the
evaluation process because, again, it will save you time, money, and energy that
you can better devote towards other aspects of the evaluation. Management
information systems will be described more fully in a later section of this
workbook.

Do you have the capacity and resources to devise
such a system? (You may need to refer to a later
section of this workbook before answering.)


Who should collect the data?

Given all of this, what thoughts do you have on who should collect data for your
evaluation? Program staff, evaluation staff, or some combination?

Program staff: May collect data from activities such as attendance, demographics, participation, characteristics of participants, dispositions, etc.; may conduct intake interviews, note changes regarding service delivery, and monitor program implementation.

Advantages: Cost-efficient, accessible, resourceful, available, time-efficient, and increased understanding of the program.

Disadvantages: May exhibit bias and/or social desirability, may use data for critical judgment, may compromise the validity of the program; may put staff in an uncomfortable or inappropriate position; also, if staff collect data, an increased burden and responsibility may be placed upon them outside of their usual or typical job responsibilities. If you utilize staff for data collection, provide frequent reminders as well as messages of gratitude.

Evaluation staff: May collect qualitative information regarding implementation, general characteristics of program participants, and other information that may otherwise be subject to bias or distortion.

Advantages: Data collected in a manner consistent with the overall goals and timeline of the evaluation; prevents bias and inappropriate use of information; promotes overall fidelity and validity of data.

Disadvantages: May be costly and take extensive time; may require additional training on the part of the evaluator; the presence of the evaluator in the organization may be intrusive, inconvenient, or burdensome.


When should data be collected?

Conducting the evaluation according to your timeline can be challenging. Consider how
much time you have for data collection, and make decisions regarding what to collect
and how much based on your timeline.

In many cases, outcome evaluation is not considered appropriate until the program has
stabilized. However, when conducting a process evaluation, it can be important to start
the evaluation at the beginning so that a story may be told regarding how the program
was developed, information may be provided on refinements, and program growth and
progress may be noted.

If you have the luxury of collecting data from the start of the intervention to the end of
the intervention, space out data collection as appropriate. If you are evaluating an
ongoing intervention that is fairly quick (e.g., an 8-week educational group), you may
choose to evaluate one or more “cycles.”

How much time do you have to conduct your evaluation?

How much time do you have for data collection (as opposed to designing the evaluation, training, organizing and analyzing results, and writing the report)?

Is the program you are evaluating time specific?

How long does the program or intervention last?

At what stages do you think you will most likely collect data?

Soon after a program has begun

Descriptive information on program characteristics that will not change; information
requiring baseline information

During the intervention
Ongoing process information such as recruitment, program implementation

After the intervention
Demographics, attendance ratings, satisfaction ratings


Before you consider methods

A list of various methods follows this section. Before choosing what methods are
most appropriate for your evaluation, review the following questions. (Some may
already be answered in another section of this workbook.)

• What questions do I want answered? (see previous section)
• Does the organization already have existing data, and if so, what kind?
• Does the organization have staff to collect data?
• What data can the organization staff collect?
• Must I maintain anonymity (participant is not identified at all) or confidentiality (participant is identified but responses remain private)? This consideration pertains to existing archival data as well as original data collection.
• How much time do I have to conduct the evaluation?
• How much money do I have in my budget?
• How many evaluation staff do I have to manage the data collection activities?
• Can I (and/or members of my evaluation staff) travel on site?
• What time of day is best for collecting data? For example, if you plan to conduct focus groups or interviews, remember that your population may work during the day and need evening times.


Types of methods

A number of different methods exist that can be used to collect process
information. Consider each of the following, and check those that you think would
be helpful in addressing the specific questions in your evaluation. When “see
sample” is indicated, refer to the pages that follow this table.

Activity, participation, or client tracking log: Brief record completed on site at frequent intervals by participant or deliverer. May use a form developed by the evaluator if none previously exists. Examples: sign-in log, daily records of food consumption, medication management.

Case studies: Collection of in-depth information regarding a small number of intervention recipients; uses multiple methods of data collection.

Ethnographic analysis: Obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.

Expert judgment: Convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.

Focus groups: Small group discussion among program delivery staff or recipients, focusing on their thoughts and opinions regarding their experiences with the intervention.

Meeting minutes (see sample): Qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention, recorded on a consistent basis.

Observation (see sample): Observe actual delivery in vivo or on video; record findings using a check sheet or make qualitative observations.

Open-ended interviews (telephone or in person): Evaluator asks open questions (i.e., who, what, when, where, why, how) of delivery staff or recipients, using an interview protocol without preset response options.

Questionnaire: Written survey with structured questions. May be administered in individual, group, or mail format. May be anonymous or confidential.

Record review: Obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records specific to the service delivery.

Structured interviews (telephone or in person): Interviewer asks direct questions using an interview protocol with preset response options.


Sample activity log

This is a common process evaluation methodology because it systematically records exactly what is happening during
implementation. You may wish to devise a log such as the one below and alter it to meet your specific needs. Consider
computerizing such a log for efficiency. Your program may already have existing logs that you can utilize and adapt for your
evaluation purposes.

Site: ______________    Recorder: ______________

Code | Service | Date | Location | # People | # Hours | Notes
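Since the workbook suggests computerizing such a log, one low-cost option is a flat CSV file whose columns mirror the form above. This is an assumed implementation sketch in Python using only the standard library, not a tool the workbook prescribes; the example record is invented.

```python
# A minimal computerized activity log using only the standard library.
# Column names mirror the sample log above; the example record is invented.
import csv
import os

FIELDS = ["site", "recorder", "code", "service", "date",
          "location", "n_people", "n_hours", "notes"]

def append_entry(path: str, entry: dict) -> None:
    """Append one activity record, writing a header row if the file is new."""
    write_header = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(entry)

append_entry("activity_log.csv", {
    "site": "Clinic A", "recorder": "J. Doe", "code": "ED01",
    "service": "Group education session", "date": "2002-07-16",
    "location": "Community room", "n_people": 12, "n_hours": 1.5,
    "notes": "Two new participants",
})
```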


Meeting Minutes

Taking notes at meetings may provide extensive and invaluable process information that
can later be organized and structured into a comprehensive report. Minutes may be taken
by program staff or by the evaluator if necessary. You may find it helpful to use a
structured form, such as the one below that is derived from Evaluating Collaboratives,
University of Wisconsin-Cooperative Extension, 1998.

Meeting Place: __________________ Start time: ____________
Date: _____________________________ End time: ____________

Attendance (names):

Agenda topic: _________________________________________________

Discussion: _____________________________________________________

Decision Related Tasks Who responsible Deadline

1.

2.

3.

Agenda topic: _________________________________________________

Discussion: _____________________________________________________

Decision Related Tasks Who responsible Deadline

1.

2.

3.

Sample observation log


Observation may occur via various methods, but one of the most common is hand-recording specific details during a small time period. The following shows several rows from an observation log utilized during an evaluation examining school classrooms.

CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30 minutes of observation)

Time began observation: _________Time ended observation:_________
Subjects were taught during observation period: ___________________

PHYSICAL ENVIRONMENT

1. Number of students: ______
2. Number of adults in room: a. Teachers ____ b. Para-pros ____ c. Parents ____ Total: ____
3. Desks/tables: a. Number of desks ____ b. Number of tables for students’ use ____ c. Any other furniture, include number ____ (Arrangement of desks/tables/other furniture)
4. Number of computers, type: ______
5. How are computers being used?
6. What is the general classroom setup? (are there walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.)
7. Other technology (overhead projector, PowerPoint, VCR, etc.)
8. Are books and other materials accessible for students?
9. Is there adequate space for whole-class instruction?
12. What type of lighting is used?
13. Are there animals or fish in the room?
14. Is there background music playing?
15. Rate the classroom condition: Poor / Average / Excellent
16. Are rules/discipline procedures posted? If so, where?
17. Is the classroom noisy or quiet? Very Quiet / Very Noisy

Choosing or designing measurement instruments
Consider using a resource panel, advisory panel, or focus group to offer feedback regarding your instrument. This group may be composed of any of the people listed below. You may also wish to consult with one or more of these individuals throughout the development of your overall methodology.

Who should be involved in the design of your instrument(s) and/or provide feedback?

Program service delivery staff / volunteers
Project director
Recipients of the program
Board of directors
Community leader
Collaborating organizations
Experts on the program or service being evaluated
Evaluation experts
_________________________
_________________________
_________________________

Conduct a pilot study: administer the instrument to a group of recipients, and then obtain feedback regarding their experience. This is a critical component of the development of your instruments, as it will help ensure clarity of questions and reduce the degree of discomfort or burden that questions or processes (e.g., intakes or computerized data entry) elicit.

How can you ensure that you pilot your methods? When will you do it, and whom will you use
as participants in the study?

Ensure that written materials are at an appropriate reading level for the population.

Ensure that verbal information is at an appropriate terminology level for the population.
A third or sixth-grade reading level is often utilized.

Remember that you are probably collecting data that is program-specific. This may increase the difficulty of finding previously constructed instruments to use for questionnaires, etc. However, instruments used for conducting process evaluations of other programs may provide you with ideas for how to structure your own instruments.


Linking program components and methods (an example)
Now that you have identified your program components, broad questions, specific
questions, and possible measures, it is time to link them together. Let’s start with your
program components. Here is an example of 3 program components of an intervention.

Program Components and Essential Elements:

There are six program components to M2M. There
are essential elements in each component that must
be present for the program to achieve its intended
results and outcomes, and for the program to be
identified as a program of the American Cancer
Society.

1) Man to Man Self-Help and/or Support Groups

The essential elements within this component are:

• Offer information and support to all men with prostate cancer at all points along the cancer care continuum
• Directly, or through collaboration and referral, offer community access to prostate cancer self-help and/or support groups
• Provide recruitment and on-going training and monitoring for M2M leaders and volunteers
• Monitor, track and report program activities

Possible process measures:

• Descriptions of attempts to schedule and advertise group meetings
• Documented efforts to establish the program
• Documented local needs assessments
• # of meetings held per independent group
• Documented meetings held
• # of people who attended different topics and speakers
• Perceptions of need of survey participants for additional groups and current satisfaction levels
• # of new and # of continuing group members
• Documented sign-up sheets for group meetings
• Documented attempts to contact program dropouts
• # of referrals to other PC groups documented
• # of times corresponding with other PC groups
• # of training sessions for new leaders
• # of continuing education sessions for experienced leaders
• # and types of other on-going support activities for volunteer leaders
• # of volunteers trained as group facilitators
• Perceptions of trained volunteers for readiness to function as group facilitators


2) One-to-One Contacts

The essential elements within this component are:

• Offer one-to-one contact to provide information and support to all men with prostate cancer, including those in the diagnostic process
• Provide recruitment and on-going training and monitoring for M2M leaders and volunteers
• Monitor, track and report program activities

Possible process measures:

• # of contact pairings
• Frequency and duration of contact pairings
• Types of information shared during contact pairings
• # of volunteers trained
• Perception of readiness by trained volunteers
• Documented attempts for recruiting volunteers
• Documented on-going training activities for volunteers
• Documented support activities

3) Community Education and Awareness

The essential elements within this component are:

• Conduct public awareness activities to inform the public about prostate cancer and M2M
• Monitor, track and report program activities

Possible process measures:

• # of screenings provided by various health care providers/agencies over the assessment period
• Documented ACS staff and volunteer efforts to publicize the availability and importance of PC and screenings, including health fairs, public service announcements, billboard advertising, etc.
• # of addresses to which newsletters are mailed
• Documented efforts to increase newsletter mailing list


Linking YOUR program components, questions, and methods

Consider each of your program components and questions that you have devised in an earlier section of this workbook, and the
methods that you checked off on the “types of methods” table. Now ask yourself, how will I use the information I have
obtained from this question? And, what method is most appropriate for obtaining this information?

Program Component | Specific questions that go with this component | How will I use this information? | Best method?



Data Collection Plan
Now let’s put your data collection activities on one sheet: what you’re collecting, how you’re doing it, when, your sample, and who will collect it. Identifying the methods you have just picked, your instruments, and your data collection techniques in a structured manner will facilitate this process.

Method | Type of data (questions, briefly indicated) | Instrument used | When implemented | Sample | Who collects

E.g.: Patient interviews in health dept clinics | Qualitative: what services they are using, length of visit, why they came in, how long they waited; some quantitative satisfaction ratings | Interview created by evaluation team and piloted with patients | Oct–Dec; days and hrs randomly selected | 10 interviews in each clinic | Trained interviewers


Consider a Management Information System

Process data is frequently collected through a management information system (MIS) that
is designed to record characteristics of participants, participation of participants, and
characteristics of activities and services provided. An MIS is a computerized record
system that enables service providers and evaluators to accumulate and display data
quickly and efficiently in various ways.

Will your evaluation be enhanced by periodic data presentations in tables or other structured formats? For example, should the evaluation utilize a monthly print-out of services utilized, or monitor recipient tracking data (such as date, time, and length of service)?   YES / NO

Does the agency create monthly (or other periodic) print-outs reflecting services rendered or clients served?   YES / NO

Will the evaluation be conducted in a more efficient manner if program delivery staff enter data on a consistent basis?   YES / NO

Does the agency already have hard copies of files or records that would be better utilized if computerized?   YES / NO

Does the agency already have an MIS or a similar computerized database?   YES / NO

If the answer to any of these questions is YES, consider using an MIS for your evaluation.

If an MIS does not already exist, you may wish to design a database in which you can enter information from records obtained by the agency. This process decreases missing data and is generally efficient.

If you do create a database that can be used on an ongoing basis by the agency, you may consider offering it to them for future use.
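An MIS need not be elaborate. For a small program, a single-file database can hold participants and service contacts in a queryable form. The sketch below uses Python's built-in sqlite3 module with an assumed two-table schema and invented records; a real MIS would be designed around the agency's own files and fields.

```python
# A minimal MIS sketch using Python's standard sqlite3 module.
# The two-table schema (clients and service contacts) is an assumption
# for illustration, not a schema prescribed by the workbook.
import sqlite3

conn = sqlite3.connect("program_mis.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS clients (
    client_id INTEGER PRIMARY KEY,
    enrolled_on TEXT,
    age INTEGER,
    gender TEXT,
    referral_source TEXT
);
CREATE TABLE IF NOT EXISTS contacts (
    contact_id INTEGER PRIMARY KEY,
    client_id INTEGER REFERENCES clients(client_id),
    service TEXT,
    contact_date TEXT,
    duration_minutes INTEGER,
    staff_notes TEXT
);
""")

# Record a hypothetical client and one service contact.
cur = conn.execute(
    "INSERT INTO clients (enrolled_on, age, gender, referral_source) VALUES (?, ?, ?, ?)",
    ("2002-07-16", 34, "F", "community health fair"),
)
conn.execute(
    "INSERT INTO contacts (client_id, service, contact_date, duration_minutes, staff_notes) "
    "VALUES (?, ?, ?, ?, ?)",
    (cur.lastrowid, "support group", "2002-07-20", 90, "first session attended"),
)
conn.commit()

# A periodic print-out: services rendered per month.
for row in conn.execute(
    "SELECT substr(contact_date, 1, 7) AS month, service, COUNT(*) "
    "FROM contacts GROUP BY month, service"
):
    print(row)
```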


Information to be included in your MIS

Examples include:
Client demographics
Client contacts
Client services
Referrals offered
Client outcomes
Program activities
Staff notes

Jot down the important data you would like to be included in your MIS.

Managing your MIS

What software do you wish to utilize to manage your data?

What type of data do you have?

How much information will you need to enter?

How will you ultimately analyze the data? You may wish to create a database directly in the program you will eventually use, such as SPSS. (One analysis option is sketched after these questions.)

Will you be utilizing laptops?


If so, will you be taking them onsite and directly entering your data into them?

How will you download or transfer the information, if applicable?

What will the impact be on your audience if you have a laptop?
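As an illustration of that analysis step (assuming the CSV activity log sketched earlier in this workbook adaptation), the pandas library can produce the kind of periodic summary an MIS print-out would provide; SPSS or any comparable package would serve the same role:

```python
# A sketch of a periodic summary from the activity-log CSV sketched
# earlier. pandas is one option; SPSS or similar would also work.
import pandas as pd

log = pd.read_csv("activity_log.csv", parse_dates=["date"])

# People served and hours delivered, per service per month.
summary = (
    log.assign(month=log["date"].dt.to_period("M"))
       .groupby(["month", "service"])[["n_people", "n_hours"]]
       .sum()
)
print(summary)
```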

Tips on using an MIS

• If service delivery personnel will be collecting and/or entering information into the MIS for the evaluator’s use, it is generally a good idea to provide frequent reminders of the importance of entering the appropriate information in a timely, consistent, and regular manner.

• For example, if an MIS depends upon patient data collected during public health officers’ daily activities, the officers should be entering data on at least a daily basis. Otherwise, important data is lost and the database will only reflect what was salient enough to be remembered and entered at the end of the week.

• Don’t forget that this may be burdensome and/or inconvenient for the program staff. Provide them with frequent thank-yous.

• Remember that your database is only as good as you make it. It must be organized and arranged so that it is most helpful in answering your questions.

• If you are collecting from existing records, at what level is the data currently available? For example, is it state, county, or city information? How is it defined? Consider whether adaptations need to be made or additions need to be included for your evaluation.

• Back up your data frequently and in at least one additional format (e.g., zip disk, server).

• Consider file security. Will you be saving data on a network server? You may need to consider password protection.


• Allocate time for data entry and checking.

• Allow additional time to contemplate the meaning of the data before writing the report.


Implement Data Collection and Analysis

Data collection cannot be fully reviewed in this workbook, but this page offers a few tips
regarding the process.

General reminders:

• THANK everyone who helps you, directs you, or participates in any way.

• Obtain clear directions and give yourself plenty of time, especially if you are traveling a long distance (e.g., several hours away).

• Bring all of your own materials; do not expect the program to provide you with writing utensils, paper, a clipboard, etc.

• Address each person that you meet with respect and attempt to make your meeting as conducive to their schedule as possible.

Most process evaluation will be in the form of routine record keeping (e.g., MIS). However, you
may wish to interview clients and staff. If so:

• Ensure that you have sufficient time to train evaluation staff, data collectors, and/or organization staff who will be collecting data. After they have been trained in the data collection materials and procedure, require that they practice the technique, whether it is an interview or entering a sample record in an MIS.

• If planning to use a tape recorder during interviews or focus groups, request permission from participants before beginning. You may need to turn the tape recorder off on occasion if it will facilitate increased comfort by participants.

• If planning to use laptop computers, attempt to make consistent eye contact and spend time establishing rapport before beginning. Some participants may be uncomfortable with technology, and you may need to provide education regarding the process of data collection and how the information will be utilized.

• If planning to hand write responses, warn the participant that you may move slowly and may need to ask them to repeat themselves. However, prepare for this process by developing shorthand specific to the evaluation. A sample shorthand page follows.


Annual Evaluation Reports

The ultimate aim of all the Branch’s evaluation efforts is to increase the intelligent use of
information in Branch decision-making in order to improve health outcomes. Because we
understand that many evaluation efforts fail because the data are never collected and that even
more fail because the data are collected but never used in decision-making, we have struggled to
find a way to institutionalize the use of evaluation results in Branch decision-making.

These reports will serve multiple purposes:

The need to complete the report will increase the likelihood that evaluation is done and
data are collected.

The need to review reports from lower levels in order to complete one’s own report
hopefully will cause managers at all levels to consciously consider, at least once a year,
the effectiveness of their activities and how evaluation results suggest that effectiveness
can be improved.

The summaries of evaluation findings in the reports should simplify preparation of other
reports to funders including the General Assembly.

Each evaluation report forms the basis of the evaluation report at the next level. The contents
and length of the report should be determined by what is most helpful to the manager who is
receiving the report. Rather than simply reporting every possible piece of data, these reports
should present summary data, summarize important conclusions, and suggest recommendations
based on the evaluation findings. A program-level annual evaluation report should be ten pages
or less; many may be less than five pages. Population team and Branch-level annual evaluation
reports may be longer than ten pages, depending on how many findings are being reported.
However, reports that go beyond ten pages should also contain a shorter executive summary, to
ensure that those with the power to make decisions actually read the findings.

The initial reports, especially, may reflect formative work and consist primarily of updates on
the progress of evaluation planning and implementation. This is fine and to be expected.
However, within a year or two the reports should begin to include process data and, later,
actual outcome findings.

This information was extracted from the FHB Evaluation Framework developed by Monica Herk and Rebekah Hudgins.


Suggested shorthand – a sample
The list below was derived for a process evaluation regarding charter schools. Note the use of general shorthand as
well as shorthand derived specifically for the evaluation.

CS        Charter school
Sch       School
Tch       Teacher, teach
P         Principal
VP        Vice principal
Admin     Administration, administrators
DOE       Dept. of Education
BOE       Board of Education
Comm      Community
Stud      Students, pupils
Kids      Students, children, teenagers
K         Kindergarten
Cl        Class
CR        Classroom
W         White
B         Black
AA        African American
SES       Socio-economic status
Lib       Library, librarian
Caf       Cafeteria
Ch        Charter
Conv      Conversion (school)
S-up      Start-up school
App       Application, applied
ITBS      Iowa Test of Basic Skills
LA        Language arts
SS        Social studies
QCC       Quality Core Curriculum
Pol       Policy, politics
Curr      Curriculum
LP        Lesson plans
Disc      Discipline
F         Father, dad
P         Parent
M         Mom, mother
♀         Girls, women, female
♂         Boys, men, male
mst       Most
b/c       Because
st        Something
b         Be
c         See
r         Are
w/        When
@         At
~         About
=         Is, equals, equivalent
≠         Does not equal, is not the same
Sone      Someone
#         Number
$         Money, finances, financial, funding, expenses, etc.
+         Add, added, in addition
<         Less than
>         Greater/more than
???       What does this mean? Get more info on, I’m confused…
DWA       Don’t worry about (e.g., if you wrote something unnecessary)
Ψ         Psychology, psychologist
∴         Therefore
∆         Change, is changing
mm        Movement
↑         Increases, up, promotes
↓         Decreases, down, inhibits
X         Times (e.g., many x we laugh)
÷         Divided (we ÷ up the classrooms)
C         With
(symbol)  Home, house
♥         Love, adore (e.g., the kids ♥ this)
(symbol)  Church, religious activity
O         No, doesn’t, not
1/2       Half (e.g., we took 1/2)
2         To
c/out     Without
2B        To be
e.g.      For example
i.e.      That is
(symbol)  If the person trails off, you missed information
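If handwritten notes are later typed up, the expansion of shorthand can be partly scripted. The sketch below is illustrative only and not part of the original workbook; its dictionary copies a few entries from the sample list above and can be extended to a full set:

# Illustrative sketch: expand shorthand tokens when typing up field notes.
SHORTHAND = {
    "CS": "charter school",
    "Tch": "teacher",
    "mst": "most",
    "b/c": "because",
    "↑": "increases",
}

def expand_notes(raw: str) -> str:
    """Replace each known shorthand token with its full-text meaning."""
    return " ".join(SHORTHAND.get(word, word) for word in raw.split())

print(expand_notes("mst Tch said CS funding ↑ b/c enrollment grew"))
# -> most teacher said charter school funding increases because enrollment grew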

Appendix A

Logic Model Worksheet

Population Team/Program Name __________________________ Date _______________________

If the following CONDITIONS AND ASSUMPTIONS exist…

And if the following ACTIVITIES are implemented to address these conditions and assumptions…

Then these SHORT-TERM OUTCOMES may be achieved…

And these LONG-TERM OUTCOMES may be achieved…

And these LONG-TERM GOALS can be reached….
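Teams that keep logic models electronically can mirror the worksheet’s chain in a simple data structure. The sketch below is our own illustration, not part of the workbook; the class fields follow the worksheet’s columns, and the example program is hypothetical:

from dataclasses import dataclass
from typing import List

@dataclass
class LogicModel:
    """One worksheet row: conditions feed activities, which feed outcomes and goals."""
    conditions_and_assumptions: List[str]
    activities: List[str]
    short_term_outcomes: List[str]
    long_term_outcomes: List[str]
    long_term_goals: List[str]

# Hypothetical example program, for illustration only.
tutoring = LogicModel(
    conditions_and_assumptions=["Many students read below grade level"],
    activities=["After-school tutoring twice a week"],
    short_term_outcomes=["Improved reading-test scores"],
    long_term_outcomes=["Higher on-time graduation rates"],
    long_term_goals=["Stronger educational attainment in the community"],
)
print(tutoring.short_term_outcomes)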


Appendix B

Pitfalls To Avoid

Avoid heightening the expectations of delivery staff, program recipients, policy makers, or community members. Make clear that feedback will be provided as appropriate but may or may not be acted upon.

Avoid any implication that you are evaluating the impact or outcome. Stress that you are evaluating “what is happening,” not how well any one person is performing or what the outcomes of the intervention are.

Make sure that the right information gets to the right people – it is most likely to be used in a constructive and effective manner if you ensure that your final report does not end up on the desk of someone who has little motivation or interest in using your findings.

Ensure that data collection and entry are managed on a consistent basis – avoid developing an evaluation design and then having the contract lapse because staff did not enter the data.


Appendix C

References

The following references were used in completing this workbook and/or may be helpful for
additional information.

Centers for Disease Control and Prevention. 1995. Evaluating Community Efforts to Prevent
Cardiovascular Diseases. Atlanta, GA.

Centers for Disease Control and Prevention. 2001. Introduction to Program Evaluation for
Comprehensive Tobacco Control Programs. Atlanta, GA.

Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook for evaluation: A systematic
approach. Sage Publications: Newbury Park, CA.

Georgia Policy Council for Children and Families; The Family Connection; Metis Associates,
Inc. 1997. Pathways for assessing change: Strategies for community partners.

Grembowski, D. 2001. The practice of health program evaluation. Sage Publications: Thousand
Oaks, CA.

Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating Drug and Alcohol Prevention
Programs. U.S. Department of Health and Human Services; Public Health Service; Alcohol,
Drug Abuse, and Mental Health Administration: Washington, D. C.

Muraskin, L. D. 1993. Understanding evaluation: The way to better prevention programs.
Westat, Inc.

National Community AIDS Partnership 1993. Evaluating HIV/AIDS Prevention Programs in
Community-based Organizations. Washington, D.C.

NIMH Overview of Needs Assessment. Chapter 3: Selecting the needs assessment approach.

Patton, M. Q. 1982. Practical Evaluation. Sage Publications, Inc.: Beverly Hills, CA.


Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods and Case Studies.
Prentice-Hall, Inc.: Englewood Cliffs, N.J.

Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation: A Systematic Approach. (6th
edition). Sage Publications, Inc.: Thousand Oaks, CA.

Scheirer, M. A. 1994. Designing and using process evaluation. In: J. S. Wholey, H. P. Hatry, &
K. E. Newcomer (eds) Handbook of practical program evaluation. Jossey-Bass Publishers: San
Francisco.

Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating Collaboratives: Reaching the
potential. Program Development and Evaluation: Madison, WI.

U.S. Department of Health and Human Services; Administration for Children and Families;
Office of Community Services. 1994. Evaluation Guidebook: Demonstration partnership
program projects.

W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation Evaluation Handbook.

Websites:
www.cdc.gov/eval/resources
www.eval.org (has online textbooks)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org

When conducting literature reviews or searching for additional information, consider using
alternative names for “process evaluation,” including the following (a small query-building sketch appears after the list):
formative evaluation
program fidelity
implementation assessment
implementation evaluation
program monitoring
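When searching electronic databases, those alternative names can be combined into one boolean query; the sketch below simply joins the terms listed above:

# Illustrative sketch: build a single OR-combined search string from the synonyms above.
terms = [
    "process evaluation",
    "formative evaluation",
    "program fidelity",
    "implementation assessment",
    "implementation evaluation",
    "program monitoring",
]
query = " OR ".join(f'"{term}"' for term in terms)
print(query)  # "process evaluation" OR "formative evaluation" OR ...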
