Introduction to Program Evaluation
ECUR 809.3, Fall 2009
Professor: Dr. Jay Wilson

Planning an Evaluation Program: An Improvement Approach
The Case of a Community Center in Ottawa

By Nelson Dordelly Rosales
Graduate Student, Educational Communications and Technology
E-mail: [email protected]
December 6th, 2009
OUTLINE

Introduction
I. Review of Literature
- What is Evaluation?
- Evaluation Models, Approaches and Methods
- Practical Steps of Getting a Program Evaluation Planned
II. Description of the Case Study: the Community Center in Ottawa
- Description of the Center: Structure
- Diversified Programs and Values
- Vision
- Mission, Goal or Purpose
Description of the Selected Program: the Spanish Intermediate Program (SIP)
- Outline and Main Goal
- Program Structure: Content, Activities, Evaluation and Materials
- Pre-requisite and Clientele or Participants
III. Outlining the Program Evaluation Plan for the Assessment of SIP
- A Suitable Approach: The Improvement Model
- Stage 1 - Preparation: Focusing the Evaluation on the Improvement of SIP
- Stage 2 - Assessment: Collecting and Using the Information
- Stage 3 - Data Collection
- Stage 4 - Performing an Evaluation Assessment Process
  Work Plan
  Objective (Intervention Objective)
  Users of Evaluation
  Plan Development: Questions and Indicators
  Data Sources and Data Collection Method
  Budget, Resources and Timelines
  Data Analysis, Communicating Results and Designating Staff Responsibility
  Using the Information
- Stage 5 - Evaluation: Role of the Evaluator
- Stage 6 - Reflection
Summary
Conclusion
Bibliography
Appendices

Introduction

Careful planning of an evaluation program helps to start the whole process successfully. This paper is a report on four months of work (September to December 2009) as planner of an evaluation program. The purpose is to explain the design of the evaluation program to assess the success of, and possible improvements to, the Spanish Intermediate Program (SIP), one of the "General Interests Programs" offered this past Spring/Summer of 2009 by the non-profit Community Center in the City of Ottawa. The plan is a theoretical paper that outlines the program to be evaluated, integrates the different tools and theories addressed recently (in the course ECUR 809) into an evaluation plan, explains why it is a suitable evaluation plan to assess the level of satisfaction of clients, and proposes a survey instrument to conduct the analysis. Essentially, the purpose of this evaluation plan is to evaluate the level of clients' satisfaction with the organization, design and teaching of SIP, and to convince the Coordinator of the "General Interests Programs" of the Community Center that an internal evaluator, with the help of the clients or students, the teachers and the Coordinator, should be "the evaluator for the evaluation" of the program. An important piece of this evaluation plan is to describe, or elaborate upon, the main reasons for selecting the improvement approach and the logic model, which is useful for describing group work, teamwork, community-based collaboration and other complex organizational processes as it seeks to promote improvement (University of Wisconsin, 2009). Through a case study, this paper will lend insight into how a logic-based "improvement model" can facilitate a "holistic" approach to the evaluation.
This paper is organized in three parts: a review of the literature, a description of the case and an outline of the selected program, and an explanation of the process of developing the program evaluation plan. A survey questionnaire was developed and applied to a sample of students or clients of the program. The paper includes appendices with the preliminary and final versions of the survey questionnaire. Other appendices, such as the flowchart that shows the logic model and the preliminary program evaluation plan, are also included. These documents were posted at www.researchphilosophy.blogspot.com.
Data analysis and three specific suggestions for improvement of SIP were collected and published electronically on the web site: https://survey.usask.ca/results.php?sid=17783.

Review of Literature
Making a careful search of the literature before designing or developing new instruments is important. According to Posavac (1990), evaluators can learn from the successes and failures of others and get a picture of the methodological, political and practical difficulties that must be overcome. The focus of this review of literature is on the conception of evaluation and the different models, methods and approaches for evaluating and assessing the effectiveness of programs. Below I explain each of these important aspects.

What is Evaluation? Evaluation is a term used in many different ways. Scriven (1996)
identified sixty different ways to define the evaluation of a program, ranging from "appraise, analyze, assess, critique, examine, grade, inspect, judge, review, study, testing, among others" (pp. 151-162).
Talmage (1982) notes that “three purposes appear most frequently in definitions of
evaluation: (1) to render judgments on the worth of a program; (2) to assist decision makers responsible for deciding policy; and (3) to serve a political function” (p. 594).
Indeed, there is only one overall purpose for program evaluation activities, which is "contributing to the provision of quality services to people in need" (Posavac et al., 2004, pp. 13-14). Evaluation work can be defined as a decision-making process whose focus is on "envisioning and enacting a good educational journey for all students" (Wiggins, 1996, p. 20). It requires the practice of sophisticated judgments and suggestions by the students themselves in order to continually improve those programs. As Dewey (1966 edn.) once suggested, when we make educational decisions, we must think broadly about the consequences of our actions. Program evaluation serves "to assess academic success" (Stephenie M. Hewett, 2008, p. 3204). In this sense, evaluation must be inclusive and generous; in other words, evaluation aims to value success, not failure. Hewett (2008) says that evaluation is a multi-faceted process. The author suggests evaluation is a continuous and ongoing process, providing both formative (ongoing) and summative (culminating) opportunities to monitor progress toward achieving essential outcomes. Thus, no longer is assessment perceived as a single event. For example, Sherry Y. Chen (2008) recently evaluated students' learning performance and their perceptions in a Web-based instructional program used to teach students how to use HTML at Brunel University. Students' task achievements were affected by the levels of their previous system experience, and their post-test and gain scores were positively influenced by their perceptions of and attitudes toward the instructional program. Chen's study has shown the importance of understanding individual differences in the development of a program evaluation. The author suggests testing and modification of the tests used in her research to build better instruments that can accommodate individual differences. Thus, evaluation is a never-ending cycle of educational effort. Evaluation uses inquiry and judgment methods, including, among others, "1) determining standards for judging quality and deciding whether those standards should be relative or absolute,
2) collecting relevant information, and 3) applying the standards to determine value, quality, utility, effectiveness, or significance" (Fitzpatrick et al., 2004, p. 5). Through these methods it will be possible to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, and between expectations of the target population and the services actually delivered (Posavac et al., 2004, p. 29). These methods lead to recommendations intended to optimize the evaluation object in relation to its intended purpose(s), or to help stakeholders determine whether the evaluation object is worthy of adoption, continuation, or expansion. For example, Krista Breithaupt and Colla J. McDonald (2008) described the development and pilot study of a survey measure and proposed their survey as a means of assessing the quality of e-learning programs against this standard. Their findings provide practical insights into how to support adult learners with varying needs and capabilities as they engage in learning programs, and offer insights for the improvement of e-programs to meet the needs of clients. This type of program evaluation contributes to quality services by providing feedback from program activities and outcomes to those who can make changes in programs or who decide which services are to be offered. Indeed, without feedback, human service programs (or any activity) cannot perform effectively. In this sense, Chelimsky (1997) uses the terms 'evaluation for development' and 'evaluation for accountability' to refer to the formative and summative purposes, respectively. James Henderson and Kathleen Kesson (2004) talk about evaluation as "wisdom." The evaluation worker must consciously inquire into "the quality of educational experiences in a comprehensive, penetrating, and far-sighted way" (p. 2). Evaluation involves both personal soul-searching and discerning criticism for the improvement of education. It is an affirmation of hope and aspiration. "Evaluation is necessarily grounded in a humble, pragmatic openness; it takes
boldness and a deep sense of responsibility to translate our visions into action" (p. 3). During evaluation we must act with integrity. This way of looking at evaluation places an enormous challenge on our capacity to exercise good judgment. Evaluation is a never-ending cycle of educational effort. In this sense, program evaluation is important because "information is needed to meet the obligation of providing effective services, verify that planned programs do provide services, examine the results, determine which services produce the best results, select programs that offer the most needed types of services, survey clients' reactions and judgments to maintain and improve quality and watch for unplanned side effects, among others" (Posavac and Carey, 2003, p. 3). These are primary goals of program evaluation that can be met using a number of different types of program evaluations: "the evaluation of need, the evaluation of process, the evaluation of outcome, and the evaluation of efficiency" (Posavac and Carey, 2003, p. 10).

Different Models, Approaches and Methods. Throughout the history of evaluation, a number
of different models and approaches to evaluation have been put forward to guide the design and implementation of program evaluation. According to Posavac and Carey (2003), these include the following: traditional evaluation (made informally by supervisors or as self-evaluations, without disciplined analysis), social science research (experimental approaches to determine a program's degree of success, which introduced rigor and objectivity), industrial inspection (depends on inspecting the product at the end of the production line, reminiscent of using a single final examination), black box evaluation (examines the output of a program without examining its internal operation, reminiscent of consumer or product evaluation), objective-based evaluation (examines goals, objectives and the particular structure), goal-free evaluation (the focus is on the program as administered: the staff, the clients, the setting, the records, and the impacts of the program), fiscal evaluation (the focus is on calculation of the financial investment and increased output), the accountability model (the focus is on compliance with regulations), the expert opinion model (uses art and literary criticism to examine a work and render a judgment about its quality), the naturalist model (uses qualitative methods and data-gathering instruments other than surveys or records to gain a rich understanding of the program), empowerment evaluation (requires close contact with community stakeholders, inviting clients to participate actively to improve their own community), theory-driven evaluation (carefully controlled research in which the analysis consists of calculating the correlations among variables), and the improvement-focused model (which aims to bridge the gap between what is or can be observed and what was planned).

Jody L. Fitzpatrick, James R. Sanders and Blaine R. Worthen (2004) provide an explanation of alternative approaches and practical guidelines for program evaluation. The authors identified five approaches: objectives-oriented (the purposes are specified, and then evaluation focuses on the extent to which those purposes are achieved; it uses logic models, and the information can be used to reformulate the program or a part of it), management-oriented (a systems approach in which decisions are made about inputs, processes, and outputs to serve decision makers, much like the logic models, highlighting levels of decisions), consumer-oriented (predominantly a summative evaluation that uses checklists and the criteria of the consumer), expertise-oriented (evaluation administered by a team of professional experts or members of committees to produce a sound evaluation), and participant-oriented (participants' opinions; clients are the focus and orientation of evaluation, and they organize and perform evaluation activities).

An adequate way of reporting the "success and failure" of a program seems to be, according to Stake (1975), the responsive approach or "clock" model, which reflects the prominent recurring events in a responsive evaluation: "talk with clients, program staff, audiences; identify program scope; overview program activities; discover purposes, concerns; conceptualize issues, problems; identify data needs re issues; select observers, judges, instruments, if any; observe designated
antecedents, transactions and outcomes; thematize: prepare portrayals, case studies; validate, confirm, attempt to disconfirm; winnow, for audience use; and assemble formal reports, if any" (p. 19).

Practical steps of getting a program evaluation planned: The Student Evaluation: A Teacher Handbook by Saskatchewan Education (1991) explains five stages: a) preparation, b) assessment, c) performing an evaluation assessment process, d) evaluation, and e) reflection. Posavac and Carey (2003) agree that the steps in preparing to conduct a program evaluation are the following: identify the program and its stakeholders, become familiar with information needs, and plan the evaluation. More specifically, the authors highly value the importance of the criteria and standards that are chosen for specific programs. To learn how to improve a program, staff members need to find out the extent to which its purposes are achieved, so that the information can be used to reformulate the program or a part of it. As an evaluator, one needs to know what is not occurring as expected: "Do the clients have needs that were not anticipated? Has it been more difficult than expected to teach the staff needed skills? Has less support materialized than was promised? Has experience led the staff to question the conceptual basis of the program?" (p. 12). According to these authors, objective information is needed, but such information should be interpreted using qualitative information as well. In this sense, the primary goals of program evaluation can be met using a number of different types of program evaluations, which in logical sequence are: "the evaluation of need, the evaluation of process, the evaluation of outcome, and the evaluation of efficiency" (p. 10). The authors have found that personal observations provide direction in selecting what to measure and in forming an integrated understanding of the program and its effects; that is, does the program or plan match the values of the stakeholders? Does the program or plan match the needs of the people to be served? Does the program as implemented fulfill the plans? Do the
outcomes achieved match the goals? Are the resources devoted to the program being expended appropriately? In short, planning a program evaluation focuses on improvement, and implementing the plan means being wise in managing students, time, materials, budget and resources.

II. Description of the Case Study: The Community Center in Ottawa
The Glebe Neighborhood Activities Group (GNAG) was selected as the organization to use as a model for the work of program evaluation. This section will focus on the description, goals or objectives, stakeholders, philosophy and analysis of the programs' structure of this selected Community Center in the City of Ottawa. Below is the description of the Center, based on existing materials and its web site: http://www.gnag.ca/index.php

Structure: GNAG is a non-profit community group working in partnership with the City of Ottawa and other community organizations to deliver social, cultural and recreational activities in the heart of the Glebe. Their mission is to enhance and enrich life in our community by creating opportunities through dynamic, innovative and affordable activities and services. GNAG is organized hierarchically in the following manner: a) Ex-Officio Board Members; b) the Chair, Vice-Chair, Treasurer, and elected Board Members, who attend monthly board meetings and the annual general meeting to oversee the financial, personnel and operational management of the Group; c) Committee Members and special committees of the Board dealing with such issues as personnel management, strategic planning, and special projects or events; d) the Special Event Coordinator, who takes the lead in coordinating a special event (e.g., the Glebe House Tour or Taste of the Glebe); this involves planning, organizing and coordinating the tasks of other members who help out at special events such as the Glebe House Tour, Glebe Fall Craft and Artisan Fair, Snow Flake Special, Taste of the Glebe and Family Dances, among many others; and e)
Secretaries and Volunteers, who attend Committee meetings or help to complete assignments or tasks between meetings. GNAG invites the community to its Annual General Meeting at the Glebe Community Centre on a periodic basis.

Diversified Programs: The Glebe Community Centre opened its doors on October 2, 2004, almost thirty years after the first official opening of the Centre was held on November 28, 1974. GNAG has provided feedback and input to the City of Ottawa in the development of a new set of principles to guide the delivery of recreation services for the next 10 to 20 years, specifically about how cultural and recreation services are provided and the challenges in meeting the demands of the future, focusing on Service Delivery, Accessibility and Inclusion, Tax Support, Subsidization and Revenue Generation. The resulting responses were incorporated into a final strategic direction presented to City Council for consideration and approval (available on the web site at www.gnag.ca). Under the leadership of GNAG, in partnership with the city's Recreation and Parks Department, the centre became a hub of activity offering special events and a full slate of cultural, educational and recreational programs for all ages. It currently offers the following diversified programs: Community Theater, Craft & Artisan, Jewelry Show, Lobster Kitchen Party, Exercise with Baby, Infants, Preschooler, Parents & Caregivers, Breakfast Club, Family Children and Youth, Birthday Party Central, Dance Studio, Pottery Studio, Workshops for All Ages, Health, Wellness & Group Fitness, and Adults General Interests, which includes the following specific programs: Spanish Beginner, Spanish Intermediate/Conversational, Spanish Advanced, Photography, Sports, and Painting/Drawing.

Values: The aim of GNAG is the inclusion of all. To GNAG, a community is better off when
its members: a) care for each other, b) participate and contribute, c) share their skills and talents, and d) celebrate together. The purpose is serving the community with compassion, caring and commitment through a hands-on approach by volunteers. GNAG offers creative and innovative programming that keeps up with trends and demographic changes. Overall, the goal is to have a rich cultural environment within the community.

Vision: In ten years' time, GNAG (Strategic Plan of September 25th, 2008) visualizes its vision related to people, community, programs and organization. Below is GNAG's vision for each:
People:
- Staff members are happy, fulfilled, challenged and well-rewarded.
- Visitors and program participants consistently report high levels of satisfaction with our centre, personnel and programs.
- Volunteers are growing in number and are integral to the spirit of the Community Centre.
- The Board of Directors is empowered, well-qualified, involved and dynamic.

Community:
- Our clients live in our community, the city at large and beyond.
- All members of our community feel a connection to and ownership of their community centre.
- Our community centre is the cultural, social, and recreational heart of the Glebe.

Programs:
- GNAG will be Ottawa's flagship centre, offering the most innovative, responsive and wide-ranging programs in the City.
- It will be the most frequented facility, which now includes satellite centres in partnership with local schools, churches and seniors' residences.
- It will create a gathering place for all ages, especially for children and youth.

Organization: GNAG has facilitated all the achievements in the People, Community and Programs categories by:
- having efficient, up-to-date business tools;
- ensuring that GNAG programming reflects the needs of our community by closely monitoring demographic change in the neighborhood;
- the ongoing recruitment, training and development of excellent staff;
- having effective and engaging partnerships with local businesses and government;
- ensuring strong and stable financial operations; and
- an ongoing program evaluation and review process to ensure that courses are relevant, have the best possible quality, are cost-effective and meet the goals of our strategic plan.
Mission: GNAG’s mission is to enhance and enrich life in our community by creating
opportunities through dynamic, innovative, and affordable activities and services. GNAG achieves this by engaging highly competent, experienced and friendly staff and dedicated and committed volunteers in alliances and partnerships with the City of Ottawa, local businesses, churches, schools and other community organizations. Goal or Purpose: GNAG is a community-driven, not-for-profit, volunteer organization
working in the heart of the Glebe in the City of Ottawa to deliver social, cultural and recreational activities in cooperation with other groups in the community.

Outline of a Selected Program: Spanish Intermediate Program (SIP)
GNAG considers the study of Spanish to be an essential component in the acquisition of a liberal arts education for community members, right at the heart of the City of Ottawa. At
present, Spanish is the third most widely spoken language in the world. The Spanish-speaking population of Europe and North and South America is estimated at 500 million people. These numbers increasingly include the thriving Latino communities in the United States. These are demographic realities that Canada cannot ignore. Relatively few Canadians know Spanish and have an understanding of Hispanic culture. The mission of GNAG is to lay a conversational foundation and promote knowledge of the literature and cultures of Spanish and Latin American societies, to which many Canadians have strong historic ties.

SIP's Main Goal: The purpose is to achieve Spanish conversation at the intermediate level.
With this program the student will acquire fluency and precision in speaking Spanish in a very friendly atmosphere. The student will develop both spoken and written communication skills, as well as listening and reading comprehension.

Program Structure: The Spanish Intermediate Program (SIP) is designed to be used not only
at the computer and in the Community Center classrooms, but also on the commute, at the gym, at tennis lessons, on a walk—anywhere the student can take the guiding materials, a CD player or an iPod. Because the program materials are so convenient to use, the student will find himself/herself connected with the program.

Content: Arts, language and literature; it is a step-by-step, systematic method to get the participant speaking Spanish conversationally:
1. Listen and Repeat
2. Substitution Drills
3. Replacement Drills
4. Translation Drills
5. Variation Drills

Activities: Students learn real-life Spanish. Always, the goal is learning to have meaningful conversations. The vocabulary, selected very carefully, only includes high-frequency
words that are going to be very useful. Rather than distracting the participant with lots of words, the focus is more on structures—structures that can be expanded and used in a variety of situations. The sequence is as follows:
1. Practice with the written transcript.
2. Practice without the transcript with Spanish-speaking people.
3. Speak out loud, at normal conversational volume.

Continuous Evaluation and Improvement: The program covers a relatively small body of material so
well that it becomes easy for the student to reproduce it. The focus is on pronunciation, vocabulary, grammar, verb drills, and conversational dialogues. Materials: illustrative materials, guides, games, outdoor activities, music, and audio CDs with the
interactivity of the Internet, and Spanish-speaking instructors.

Pre-requisite: Spanish Level 1 (Beginner). Other criteria: authenticity, that is, the Spanish language as it is spoken in actual conversations; things students can really use when they travel and interact with native speakers. The program offers guided conversation and discussions, and varied practical exercises with emphasis on vocabulary and grammar (theory and practice).

Clientele, Participants or Students: Relatively few Canadians know Spanish and have an understanding of Hispanic culture. The clientele of the Spanish Conversational Programs are mostly Canadians who live in the community.

III. Outlining the Program Evaluation Plan

This past year, 2008/2009, the Spanish Intermediate Program (SIP) was successfully implemented. There is a need to evaluate whether the clientele (students) are satisfied with its organization, design and implementation. The idea is to know what should be improved for next
year. The stakeholders are the Coordinator of the 'Adults General Interests Programs,' the two Spanish teachers or instructors, and the participants or students. This section will (1) explain the best approach to carry out a program evaluation for the Community Center, particularly the Spanish Intermediate Program, in order to determine the level of clients' satisfaction; (2) describe a suitable outline for planning the program evaluation, which explains how to develop a program evaluation plan, including: a) preparation: focusing the evaluation on the main three things that should be improved, b) assessment: collecting and using the information, c) performing an evaluation assessment process, d) evaluation: the role of the evaluator, and e) reflection; and (3) summarize the whole process with the specific case of SIP. As a community member of the non-profit Community Center in the City of Ottawa, and as a volunteer, my contribution in this preliminary program evaluation plan is to explain a suitable approach for program evaluation, the improvement-oriented evaluation, and why and how it can be applied to SIP, which is expected to be a novelty with a positive impact on participants' satisfaction and achievement.

A Suitable Approach: Outlining an evaluation program plan means firstly having clear
reasons for initiating evaluation. Sometimes the 'evaluation' client can help us find out: Whose need is it? What does s/he want to know? Why? What is its purpose? So, listening closely to the client's reasons for initiating the evaluation and talking to other stakeholders is important to determine the reasons for initiating the evaluation and its purpose (Fitzpatrick et al., 2004, p. 175). In this sense, Chen (1996) proposed 'needs assessment, process and outcome' to refer to the types of questions the evaluation program should focus on. Thus, questions are concerned with a) establishing whether a problem or need exists and describing that problem, and b) making recommendations for ways to reduce the problem. Process or monitoring studies typically describe
how the program is delivered. Such studies may focus on whether the program is being delivered according to some delineated plan or model, or may be more open-ended, simply describing the nature of delivery and the successes and problems encountered. "Outcome studies are concerned with describing, exploring, or determining changes that occur in program recipients, secondary audiences, or communities as a result of a program" (Fitzpatrick et al., 2004, p. 21). It is important to have a clear evaluation philosophy, theory or model. What approach will be taken to accomplish an effective program evaluation? To Posavac and Carey (2003), to assess the "evaluability" of the program we need to review the literature, examine previous experiences, and select a model of evaluation that suits the needs of the case. It is also necessary to define the role of the evaluator, and the purpose of and steps in preparing to conduct the evaluation. For the purposes of this work, as previously stated in the review of literature, the "improvement-oriented approach" is the best way to start evaluation of the Community Center programs. This model takes insights from different theories, such as the improvement-focused approach of Posavac and Carey (2003), the 'objectives-oriented evaluation' model of Fitzpatrick et al. (2004), and the approach of "State Program Evaluation Guides: Developing an Evaluation Plan" by the CDC/Department of Health and Human Services (2009) and The Student Evaluation: A Teacher Handbook (Saskatchewan Education, 1991), which consider that the main phases are the following: preparation, assessment, evaluation (formative, diagnostic, and summative) and reflection. Another important source is the University of Wisconsin (Program Development and Evaluation). These models provide insights for outlining a holistic program evaluation focusing on the "improvement-oriented approach." The focus of evaluation is on the organization, design and teaching of the program: on how well the program is done, its strengths and weaknesses, or how well specific aspects are done (e.g., inputs, activities, and outcomes), or to discover some possible
discrepancies between the design of program objectives and its practice, that is, between the plan and program implementation, and between expectations of the target population and the services actually delivered. The focus is also on what improvements can be made to the specific program. This approach takes a more formative orientation than a summative one. The primary purpose is to provide information for program improvement. Because formative evaluations are designed to improve programs, it is critical that the primary audience be the clients or students. The target is also the people who are in a position to make changes in the program and its day-to-day operations, e.g., coordinators, teachers and students. So, the focus is not on potential consumers (although at the end of the day they will benefit from better programs) or on policy makers or administrators (Fitzpatrick et al., 2004, p. 18). In sum, this paper is concerned with theory but primarily with process: how the program was organized, designed and delivered. The use of an "improvement-oriented approach" for program evaluation is important because information is needed to meet the obligation of providing effective services; to verify that the objectives, contents, strategies and activities of evaluation, and the resources, are properly organized and designed, "devoted to meeting unmet needs, verify that planned programs do provide services, examine the results, determine which services produce the best results, provide information needed to maintain and improve quality and watch for unplanned side effects, among others" (Posavac and Carey, 2003, p. 3). In this sense, the key is to conduct or manage the evaluation successfully. The stages and tasks that are necessary in order to succeed in this endeavour comprise: a) preparation: focusing the evaluation and defining the objectives, b) assessment: collecting and using the information, c) performing an evaluation assessment process, d) evaluation, in which the role of the evaluator is explained, and e) reflection. These stages enlighten the program evaluation plan and
help in the assessment process of the Spanish Intermediate Program (SIP). Below, each stage is briefly described.

Stage 1 - Preparation: Focusing the Evaluation. According to Ellen Taylor-Powell, Sara Steele and Mohammad Douglah (1996) of the University of Wisconsin, this phase requires answering questions such as: What is to be evaluated? What is the purpose of the evaluation? Who will use the evaluation, and how will they use it? What questions will the evaluation seek to answer? What information is needed to answer them (the questions, indicators, and kinds of information, qualitative or quantitative)? When is the evaluation needed? What resources are needed (time, money and people)? How does one determine whether a program is evaluable? To Fitzpatrick et al. (2004), this means that, as evaluator, one should clarify "the intended program model or theory, examine the program in implementation, and explore approaches and priorities" (p. 183). In The Student Evaluation: A Teacher Handbook (Saskatchewan Education, 1991), the preparation phase requires focusing on the type of evaluation (formative, summative, or diagnostic) to be used, the criteria against which student learning outcomes will be judged, and the most appropriate assessment strategies with which to gather information on student progress. Decisions made during this phase form the basis for planning during the remaining phases. Evaluations are carried out to learn about programs; Chelimsky and Shadish (1997) call this 'evaluation for knowledge.' In order to determine the overall scope and design of this evaluation plan, one must first obtain a complete program description, meet with stakeholders, and become familiar with information needs and previous evaluation reports (Who wants an evaluation? What should be its focus? Why is an evaluation wanted? When is it wanted? What resources are available?). To Rouda (1995), focusing the evaluation means performing an initial "gap analysis" of the current skills, abilities, and knowledge of the major stakeholders and comparing it to the desired
skill levels and knowledge base of all stakeholders. The main differences, or "gaps," between the current and the desired determine the nature and direction of future evaluation. In this sense, the Improvement-Focused Model seems to be an adequate way of reporting the "success and failure" of a program; it will help us in reporting the evaluation assessment of the Spanish Intermediate Program. It is important to note that survey questionnaires are not the only source of feedback on the quality of the program; they can be complemented with other activities such as specific tests and sample work portfolios, among others. After assessing the "evaluability" of the program, reviewing the literature and completing the 'gap analysis,' the evaluator should be ready "to make some methodological decisions regarding sampling procedures, research design, data collection, and statistical analysis" (Posavac and Carey, 2004, p. 39).

Stage 2 - Assessment: Collecting and Using the Information. Once one has selected a program, in this case the Spanish Intermediate and Conversation Program (SIP) of the "Adults General Interest Program" of the Community Center, City of Ottawa, and once one has identified the focus of evaluation, which in this case is to assess the clients' satisfaction with the organization or design and teaching of the selected program, one can proceed to identify information-gathering strategies and to construct or select instruments, administer them, and collect the information on a sample of clients to evaluate the instruments.

Instruments to collect the information: The Planning Evaluation Worksheets by Taylor-Powell et al. (1996) of the University of Wisconsin are excellent examples of instruments for focusing the main aspects of evaluation. These instruments were used as models for constructing our instrument. Thus, a new instrument, a survey questionnaire, was developed specifically for the evaluation of the SIP.
Constructing the survey questionnaire: According to Ellen Taylor-Powell et al. (1998) of the University of Wisconsin-Extension, four types of information may be distinguished: (a) knowledge, (b) beliefs-attitudes-opinions, (c) behaviours (what people do) and (d) attributes (what people are or have); I add a fifth one, (e) reactions (impacts of programs on the participants and their suggestions to improve them). These types of information were taken into consideration in designing the first version of the survey questionnaire (preliminary version in Appendix # 3); a small illustrative sketch of these categories appears after Stage 3 below. The questionnaire was designed to evaluate students' or participants' reactions. The purpose was to obtain information regarding the level of satisfaction with the program and the aspects that, in their own views, need to be improved.

Identifying and selecting the evaluation questions and criteria: Generally, according to Fitzpatrick et al. (2004, p. 234), the single most important source of evaluation questions is the program's stakeholders. To obtain such input or information, the evaluator needs to identify individuals and groups who are affected by the program, e.g., policy makers, administrators or managers, practitioners, primary and secondary consumers, the clients, participants, students, teachers or members of the community. Once stakeholders are identified, they should be interviewed to find out what kinds of concerns or questions they have: How do they perceive the program to be evaluated? The information gathered during the preliminary meetings or interviews helped to design a suitable survey questionnaire. This first version, or list of possible questions, was then corrected with a sample of participants. An improved version (see Appendix 4) was then evaluated by instructors and participants. Indeed, a survey questionnaire serves as a major source of information. The improvement-oriented approach guides the evaluator to ask whether specific objectives are clearly stated and connected to contents, teaching strategies and activities of evaluation, what would prevent their achievement, and what specific aspects should be improved
to lead to program success (see third version in Appendix # 5). The information gathered during the assessment phase with the final version of the instrument (see Appendix # 6 and the web site: www.researchphilosophy.blogspot.com) was used to make judgments about what things to improve.

Stage 3 - Data Collection means that the evaluator should pay attention to important issues, such as: What sources will you use? What data collection methods will you use? What collection procedures will you use? (Taylor-Powell et al., 1996). Also important are the identification and elimination of bias (such as gender and culture bias) from the assessment strategies and instruments, and the determination of where, when, and how assessments will be conducted (Saskatchewan Education, 1991). These criteria were taken into account in the process of collecting information regarding the evaluation of the Spanish Intermediate Program. Collecting information during the month of November and the first days of December made it possible to write this report.
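As a minimal illustration of the five information types used to construct the questionnaire, the sketch below tags sample items with their categories. The items shown are hypothetical stand-ins written for this example, not the actual SIP survey questions.

    # Hypothetical questionnaire items tagged with the five information
    # types: knowledge, beliefs-attitudes-opinions, behaviours, attributes
    # and (the added fifth category) reactions.
    survey_items = [
        ("knowledge", "Which Spanish tenses can you use confidently?"),
        ("attitude", "How satisfied are you with the program organization?"),
        ("behaviour", "How often did you practice Spanish outside of class?"),
        ("attribute", "What is your level of study?"),
        ("reaction", "What should be improved in the program?"),
    ]

    for category, question in survey_items:
        print(f"[{category:9}] {question}")

Tagging each item in this way helps verify that a draft questionnaire covers all the intended information types before it is piloted.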
It has been brought to this program evaluator’s attention that the current Coordinator of the “Interest Programs” had some concerns regarding the improvement of the Programs. In the case of Spanish Intermediate Program (SIP), no previous evaluation was made to this specific program. There There were were some some concer concerns ns about about whethe whetherr its its organi organizat zation ion and design design of the progra program m was sati satisf sfact actor ory, y, whet whether her the the stude student ntss are are sati satisf sfie ied d with with the the progr program am and and whet whether her ther theree were were improvement improvementss to make. Following consultations consultations with the major stakeholder stakeholderss of this program (mainly the Coordinator, the teachers and the students), it was determined that, indeed there was a need to perform an evaluation assessment process: to find out whether or not the program requires certain improvements, or whether or not students’ satisfaction was being met, and whether or not
the program was doing what it had set out to do. The purpose, then, of performing the evaluation assessment process of SIP is to find reasons for improving the program and, therefore, to increase clients' satisfaction.

Intervention Objective
This paper uses the following SMART objective to develop the evaluation plan:
By December 5th, 2009, identify from one to three improvements that should be made to the organization, design and teaching of the "Spanish Intermediate Program," according to the Community Center's members and clients. Overall, the purpose is to find out whether the clientele is satisfied with the quality of the program's organization, design and teaching, and what should be improved, that is, what changes or improvements should be made to satisfy the clientele's needs. At the end of the day, the idea is to find a number of possible improvements to be made to the current program.

Users of Evaluation are the Coordinator of the Adults' General Interests Programs, the two teachers of the Spanish Intermediate Program, and the students, participants or clients. For this report, only 5 of a total of 15 students were active contributors, responding to the survey questionnaire. The survey questionnaire remains open electronically on the web site to collect information until January 2010.

Plan Development: Once the intervention objective and the use and users of the evaluation were clarified, developing the plan included these steps:
(a) Develop evaluation questions. Different versions were developed during the process of designing and testing the survey questionnaire. Each one included a sample checklist and a variety of question types, such as scale rating, short answer, and open-ended. Two preliminary versions were designed and evaluated together with four students. The students answered the questionnaire, but their concern was with correcting some discrepancies between items and the characteristics of the actual program. They provided suggestions regarding the following issues: clarity of questions, wording, style, and importance. These aspects were taken into account in developing a third version of the survey questionnaire, which was posted on the web site of the University of Saskatchewan. The respondents made suggestions to improve clarity in the organization and design of SIP. All three versions are posted on the web site of the ECUR 809 course: www.researchphilosophy.blogspot.com. The fourth version is the final one, which is now at https://survey.usask.ca/survey.php?sid=17783 as part of the web site of the University of Saskatchewan. See the appendices and www.researchphilosophy.blogspot.com.
(b) Determine indicators. See the following graphic.
Graphic: Questions and Indicators

Organization
Questions: Are the outcomes or objectives clearly stated? Is the program content up-to-date? Is the program level appropriate for most students? Are competencies or tasks satisfactorily stated? Is the program presented in a logical sequence? Do performance checklists match the objectives?
Indicator: Level of participant satisfaction (scale: Excellent, Good, Fair, Poor).
Data source: Participant satisfaction survey.

Design
Questions: Are directions or instructions for how students are to proceed through the materials clearly explained? Are the content map and competencies covered by the program? Is prerequisite knowledge applied? Are the materials and academic guides useful to students? Does the academic strategy fit the knowledge base and the program? Are the visuals, videos, games, experiences and practices meaningful? Is the overall design of the learning activities satisfactory for individualized instruction? Do learning activities and objectives match? Do the tests and rubrics match the objectives?
Indicator: Level of participant satisfaction (scale: Excellent, Good, Fair, Poor).
Data source: Participant satisfaction survey.

Teaching
Questions: How are the preparation and punctuality? Overall, does the program meet your expectations?
Indicator: Level of participant satisfaction (scale: Excellent, Good, Fair, Poor).
Data source: Participant satisfaction survey.

Registration Process
Questions: Would you register again for this program or recommend it to your friends? Do you find our customer service staff knowledgeable and courteous?
Indicator: Number of yes/no responses.
Data source: Follow-up written survey of attendees.

Demographic
Questions: How many years/months have you participated in Community Center program activities? How old are you now? In which county do you live? What is your gender? What is your level of study (elementary, secondary, university, post-graduate)? How would you describe the area where you live? What is your ethnic background?
Indicator: Participant survey demographic questions.
Data source: Participants' demographic answers.

Suggestions/Improvements
Questions: In which program or activity would you like to participate in the future? Please feel free to make any suggestions or comments that can help us improve our program(s). If you would like us to respond to your comments, please write your name, phone number and/or e-mail.
Indicator: List of suggestions or recommendations provided by the participants.
Data source: Follow-up written survey of attendees.
(c) Identify data sources: existing information or the Program Kit, and the survey questionnaire. People: teachers or instructors and students or participants (clientele) of the Neighborhood or Community Center in Ottawa. What sources of information will I use? Existing information: the web site, the programs, written materials provided by the Community Center, teachers' materials, and samples of students' work and/or experiences (videos, photos, etc.).
(d) Determine the data collection method. How was the data gathered? Using a final survey questionnaire (https://survey.usask.ca/survey.php?sid=17783). This process implied conducting the survey questionnaire firstly with the teacher(s) of SIP, and secondly with a sample of participants of the Spanish Intermediate and Conversation Programs, part of the "Adults General Interest Program" of the Community Center, City of Ottawa, to get their feedback. This required an appropriate methodology to evaluate the program.
(e) Budget, Resources and Timelines: Specify the time frame for data collection. When will the data be collected? The budget comes from students' payment of tuition or registration, contributions of community members, and donations from the City of Ottawa. Time frame for the evaluation: the preliminary version of this program evaluation was developed on a voluntary basis; this is my contribution to the Community Center. However, continuing the ongoing process of evaluation, or implementing it fully, requires a specific budget to pay an evaluator (at least $22.50 per hour as a graduate student). Number of required hours: 40 hours of work over four months (40 h x $22.50/h = $900). Preliminary versions of the survey questionnaire are presented on the web site: www.researchphilosophy.blogspot.com.
(f) Plan the data analysis. How will data be analyzed and interpreted? Preliminary data was collected during the month of November 2009 (deadline: December 6, 2009). The analysis counts the number of answers (multiple choices on a scale of level of satisfaction) and uses percentages together with analysis of demographic information. Data was analyzed electronically by percentages (see Appendix # 7 and the results of the survey questionnaire at https://survey.usask.ca/survey.php?sid=17783); a minimal tabulation sketch is given after this list. Data will continue to be collected until January 2010. The collected data will be used to improve parts of the program, or the whole program if necessary. Corrections and changes will be made in accordance with students' reactions and suggestions. Based on the judgments (evaluations), changes and decisions to improve the program, a new program will be offered to students, parents, and appropriate community center personnel.
(g) Communicate results. With whom and how will results be shared? Results were shared with the Coordinator of General Interest Programs, with the other two teachers of Spanish, and with some participants and members of the community.
(h) Designate staff responsibility. Who will oversee the completion of this evaluation? The voluntary evaluator, Nelson Dordelly Rosales, will oversee the completion of this evaluation.
In general, it seems the survey questionnaire really helped the program evaluator "to discover discrepancies and expectations of the target population and the services actually delivered" (Posavac and Carey, 2003, p. 29). Once the instrument was applied to the selected group of participants and instructors, the information was tabulated and the data presented to the Coordinator.
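To make the percentage analysis in step (f) concrete, below is a minimal tabulation sketch, assuming the Likert-scale answers for one survey item are available as a simple list. The sample responses are hypothetical illustrations, not the actual survey results.

    from collections import Counter

    SCALE = ["Excellent", "Good", "Fair", "Poor"]

    def tabulate(responses):
        """Count each satisfaction rating and express it as a percentage."""
        counts = Counter(responses)
        total = len(responses)
        return {level: round(100 * counts[level] / total, 1) for level in SCALE}

    # Hypothetical answers from five respondents to one survey item.
    answers = ["Good", "Excellent", "Good", "Fair", "Good"]
    print(tabulate(answers))
    # {'Excellent': 20.0, 'Good': 60.0, 'Fair': 20.0, 'Poor': 0.0}

The same tabulation can be repeated per survey item and broken down by the demographic questions to support the interpretation described above.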
Using the information: How will the information be interpreted, and by whom? How will the evaluation be communicated and shared? How will data be analyzed? (Taylor-Powell et al., 1996). Information collected from a sample of participants (pre-testing the questionnaire) was used to make appropriate corrections to the instrument, and a final version emerged: https://survey.usask.ca/survey.php?sid=17783. The questionnaire in its final version is now ready for application to the whole group of students or participants. The information collected will be used to improve the SIP for next year.
Stage 5 - Evaluation and the role of the evaluator: According to Fitzpatrick et al. (2004), the evaluator's primary responsibility is to interpret information that can help key individuals and groups improve their efforts, make enlightened decisions, and provide credible information to the public. These authors distinguish between internal evaluators (program employees) and external evaluators (outsiders). Among the advantages of internal evaluators: they have "more knowledge of the program model and its history; they are more familiar with the various stakeholders and their interests, concerns and influence; his/her knowledge can help increase the use of the evaluation; and they will remain with the organization after the evaluation and can continue to serve as advocates for use of its findings" (p. 187). In terms of knowledge about a program, "internal evaluators have an advantage since they have better access to program directors and to the administrators of the organization" (Posavac and Carey, 2004, p. 17). A person who is physically present on a regular basis is likely to see the program in action, to know its staff, and to hear about its reputation from other people in the organization. Such information is unavailable to an external evaluator. The more that is known about the actual workings of the program, the easier it is to ask relevant questions during evaluation planning and interpretation. An internal evaluator often works with a small group of two or three; some evaluators work alone.
In this work, I consider myself an insider rather than an outsider, because I am one of the instructors of the Community Center, and I volunteered to evaluate the program, Spanish Intermediate, which is the program that I taught this past summer. This work was done alone, but the whole process was developed with the help of five students or participants who responded to the Survey Questionnaire and provided some suggestions. The other teacher and the Coordinator of the "General Interests Programs for Adults" of the Community Center were also of great help. In this work, I took on a variety of roles. Evaluators often take on many roles, including facilitator, collaborator, teacher, management consultant, specialist, etc. (Patton, 1996). In this sense, Ryan and Schwandt (2002) describe the evaluator's role as a teacher, helping practitioners develop critical judgment. Technical expertise (skills in sampling, qualitative analysis, or statistics) and experience with the services being evaluated are also assets. In addition to technical competence, an evaluator's personal qualities are important, among them "objectivity, fairness, trust, credibility. Having these qualities increases the likelihood that people are more willing to devote time to the evaluation, to admit problems, to share confidences and to participate in the improvement of programs" (Patton, 1996, p. 18). Overall, according to Fitzpatrick et al. (2004), the key role for evaluators is helping policy makers and managers select the evaluation model and the performance dimensions to be measured, as well as the tools to use in measuring those dimensions. There are two general ways an evaluator can relate to an organization needing an evaluation: a) evaluators can work for the organization and do many evaluations, or b) they can come to the organization from a research firm or a university to work on a specific project. In particular, evaluators in program planning play an important role in helping articulate theories or logic models (Fitzpatrick et al., 2004, p. 13). A logic model starts with the long-term vision of how program participants will be changed, or satisfied, by the quality of the program. In this work, as evaluator I had to
apply different roles. A preliminary plan and a logic model, as a way of planning how to conduct the evaluation, were outlined as part of this paper (see Appendices # 1 and # 2) and at the following link: www.researchphilosophy.blogspot.com.
Stage 6 - Reflection allows pondering the successes and shortfalls of the previous phases. Specifically, reflection helps to evaluate the utility and appropriateness of the assessment strategies used, and helps to make decisions concerning improvements or modifications to subsequent teaching and assessment. The instruments contain questions that encourage reflection on student assessment, teachers' planning, and the structure of the curriculum.
Reflection on the strengths and potential limitations: Indeed, careful planning of an evaluation program helps to start the whole process successfully. This program evaluation plan is useful, but it is obviously not the ultimate solution to every problem. Evaluation, in this case, served to identify the level of satisfaction of a sample of students with the organization, design and implementation of the SIP, along with its strengths and weaknesses; it highlights the good and exposes the faulty, but the results of the survey questionnaire obviously cannot correct the problems, for that is the role of management and other stakeholders, using evaluation findings as one tool that will help them in that process of change (Fitzpatrick et al., 2004, p. 27). The main strength of this work is the quality of the whole process, method, people and sources used. The results of this planning can be one of many influences on improving the policies, organizational practices and decisions of the Community Center. Perhaps the main constraint or weakness nowadays is money.
Summary
Overall, the main aspects of evaluating the "Spanish Intermediate Program" are listed below:
What questions does the evaluation seek to answer? What information was needed to answer the questions? See the attached survey questionnaire or click here: https://survey.usask.ca/survey.php?sid=17783.
Indicators: the students' responses to the survey questionnaire and testimonials of their experience with the Spanish Intermediate Program (SIP), gathered in order to improve it (see the graphic on page 24). Knowledge, beliefs, opinions, reactions. How will I know it? Answers to multiple-choice and scale-rating questions.
When was the evaluation needed? December 2009.
What evaluation approach was used? A survey questionnaire to assess the level of clients' satisfaction with the organization, design and teaching of SIP, which helps in identifying suggestions for the improvement of the program.
Collection of the information: What sources of information were used? Existing information: the Program Kit and the Survey Questionnaire; web site – programs – written materials provided by the Community Center – teachers' materials – samples of students' work and/or experiences (videos, photos, etc.). People: teachers or instructors and students or participants (clientele) of the Neighbourhood or Community Center in Ottawa.
What data collection method(s) was used? Mainly a survey questionnaire administered to a sample of students. Corrections and improvements were made to the previous drafts of the instrument. Data was collected electronically from a small sample of students and teachers: https://survey.usask.ca/survey.php?sid=17783
Who was involved, or who should be involved? Stakeholders: teachers, administrators or coordinators, and the students or participants. How were they engaged? Staff meetings, email correspondence, volunteering, survey questionnaires.
Focus of the evaluation: the description, organization, design and teaching of the Program, the level of satisfaction of clients, and suggestions for the improvement of SIP (see page 14 of this paper and the attached logic model); participants' reactions or answers to the survey questionnaire and written suggestions for the Program's improvement.
Goals or objectives to be evaluated: What was the purpose of the evaluation? The purpose was to evaluate the extent or level of satisfaction of students of the Spanish Intermediate Program with the organization, design and implementation of the program. In other words: by December 5th, 2009, identify one to three improvements that should be made to the organization, design and teaching of the "Spanish Intermediate Program," according to the Community Center's members and clients.
Responsibility: Who will use the evaluation? How will they use the information? Administrators, coordinators, and teachers might use the information to assess the level of satisfaction of students and to identify their suggestions for improvement of SIP. This evaluation provided insights for assessing the quality of the program's organization, design and teaching: a list of three main suggestions was created on the basis of the participants' responses (a sketch of how such a ranking could be derived from the scale ratings follows this list). This list will be very helpful to coordinators and teachers in making changes to the organization and re-design of the program and in meeting its goals. Specific suggestions or recommendations provided by respondents (https://survey.usask.ca/results.php?sid=17783):
1. Improve clarity regarding the statement and match of objectives, updating the content map, and tests/rubrics.
2. Enhance materials and academic guides: better directions or instructions for how students are to proceed through the materials and the logical sequence (pre-requisites).
3. More practice or application of an academic strategy: improving fit with the knowledge base and program; making meaningful use of the visuals, videos, games, experiences and practices of competencies, tasks, and activities.
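To illustrate how a short priority list like the one above could be derived from the survey's +, 0, - scale-rating items, here is a minimal sketch. The item names, ratings, and the cut-off of three are hypothetical assumptions for illustration; they do not reproduce the actual SIP tabulation.

```python
# Hedged sketch: converting the survey's +/0/- scale ratings into numeric
# scores and ranking the lowest-scoring aspects as improvement priorities.
# Item names and ratings are hypothetical, not the actual SIP results.
SCORE = {"+": 1, "0": 0, "-": -1}

ratings = {
    "match of objectives and activities": ["+", "0", "-", "0", "-"],
    "tests and rubrics":                  ["0", "-", "-", "0", "+"],
    "directions through the materials":   ["0", "0", "-", "+", "-"],
    "visuals, videos and games":          ["+", "+", "0", "+", "0"],
}

def mean_score(marks):
    """Average the symbolic ratings as numbers between -1 and +1."""
    return sum(SCORE[m] for m in marks) / len(marks)

# Sort aspects from weakest to strongest and keep the three weakest,
# mirroring the "list of three main suggestions" described above.
priorities = sorted(ratings, key=lambda item: mean_score(ratings[item]))[:3]
for rank, item in enumerate(priorities, start=1):
    print(f"{rank}. improve: {item} (mean score {mean_score(ratings[item]):+.2f})")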
Conclusion
The paper explains how to perform an evaluation to assess the merit, worth, quality and significance of a program. To that end, the paper integrates different tools and theories of program evaluation into an evaluation plan for a selected program. Evaluations are conducted to answer questions concerning program adoption, continuation, or improvement; this study focuses on the last. Specifically, this paper dealt with planning an evaluation project for the improvement of the Spanish Intermediate Program (SIP), which is one of the 'interest programs for adults' offered by the Community Center in Ottawa. To that end, it took a theoretical approach, the 'improvement model.' The focus was on the organization, design and teaching of SIP; in this sense, the evaluation was undertaken, using a survey questionnaire, to identify and apply defensible criteria to determine worth, merit or quality (Scriven, 1991) and to list a number of suggestions for the further improvement of SIP. In this work, program evaluation meant disciplined searching and caring imagination by
the evaluator, envisioning a better educational journey for clients. This approach best meets the criteria for effective evaluation, which requires inquiring of clients to learn their judgements and engaging them in the process of enhancing the quality of the Program.
Bibliography
CDC. "State Program Evaluation Guides: Developing an Evaluation Plan". Retrieved December 2, 2009 from: http://www.cdc.gov/DHDSP/state_program/evaluation_guides/evaluation_plan.htm
Chelimsky, E., & Shadish, W. R. (1997). Evaluation for the 21st Century: A Handbook. Thousand Oaks, CA: Sage.
Dewey, J. (1966 edn.). Democracy and Education: An Introduction to the Philosophy of Education. New York: Free Press.
Fitzpatrick, Jody L., Sanders, James R., & Worthen, Blaine R. (2004). Program Evaluation: Alternative Approaches and Practical Guidelines. Boston: Allyn and Bacon.
Hewett, Stephanie M. (2008). Electronic Portfolios and Education: A Different Way to Assess Academic Success. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., v. 6, pp. 3200-3213). Hershey, PA: Information Science Reference.
Henderson, James G., & Kesson, Kathleen R. (2004). Curriculum Wisdom: Educational Decisions in Democratic Societies. New Jersey: Merrill Prentice Hall.
http://www.managementhelp.org/evaluatn/chklist.htm
Breithaupt, Krista, & MacDonald, Colla J. (2008). Qualitative Standards for E-Learning: The Demand Driven Learning Model. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., v. 2, pp. 1165-1177). Hershey, PA: Information Science Reference.
Patton, M. Q. (1996). Utilization-Focused Evaluation: The New Century Text (3rd ed.). Thousand Oaks, CA: Sage.
Patton, M. Q. (1987). How to Use Qualitative Methods in Evaluation. Newbury Park, CA: Sage.
Posavac, Emil J., & Carey, Raymond G. (1990). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.
Posavac, Emil J., & Carey, Raymond G. (2003). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.
Posavac, Emil J., & Carey, Raymond G. (2004). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.
Rouda, Merrill. (1995). "Needs Assessment--The First Step". Retrieved October 12, 2009, from http://alumnus.caltech.edu/~rouda/T2_NA.html
Ryan, Katherine E., & Schwandt, Thomas A. (2002). Exploring Evaluator Role and Identity. Greenwich, CT: Information Age Publishing.
Saskatchewan Education (Ed.). (2009). "Student Evaluation: A Teacher Handbook". Retrieved October 24, 2009 from: http://www.sasked.gov.sk.ca/docs/policy/studeval/index.html
Scriven, M. (1996). Types of evaluation and types of evaluator. Evaluation Practice, 17, 151-162.
Chen, Sherry Y. (2008). Evaluating the Learning Effectiveness of Using Web-Based Instruction: An Individual Differences Approach. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., v. 3, pp. 1740-1751). Hershey, PA: Information Science Reference.
Stake, R. E. (1975). Evaluating the Arts in Education: A Responsive Approach. Columbus, OH.
Talmage, H. (1982). Evaluation of Programs. In H. E. Mitzel (Ed.), Encyclopedia of Educational Research (5th ed.). New York: Free Press.
Taylor-Powell, E., Steele, S., & Douglah, M. (1996). Planning a Program Evaluation. University of Wisconsin-Extension-Cooperative Extension, Program Development and Evaluation Unit. Retrieved November 27, 2009 from: http://learningstore.uwex.edu/Planning-a-Program-Evaluation--P1033C0.aspx
University of Wisconsin. "Program Development and Evaluation". Retrieved September 2, 2009 from: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Wiggins, G. (1996). Practicing what we preach in designing authentic assessments. Educational Leadership, 55(1), 18-21.
Example: http://www.geegees.ca/forms/program_evaluation
Appendix # 1 Logic Model: www.researchphilosophy.blogspot.com
Appendix # 2 Preliminary Plan of Program Evaluation: www.researchphilosophy.blogspot.com
Appendix # 3 First Version Survey Questionnaire
Short Survey: design and test a short survey that includes a sample checklist and a variety of question types, such as scale rating, short answer, and open-ended.
A. Original version
Short answer: yes or no
1. Are objectives, competencies, or tasks stated in the student materials?
2. Does the content cover a significant portion of the program competencies?
3. Is the content up-to-date?
4. Is the course level appropriate for most students?
5. Is a student's guide included that explains how to manage and perform the course theory and practice?
6. Is the material presented in a logical sequence?
7. Are performance checklists included?
8. Are tests included in the materials?
9. Is evaluation an integral part of (a) the development and (b) the implementation of the program?
10. Are the resources devoted to the program being expended appropriately?
Scale rating: quality and satisfaction judgments. Use +, 0, - to rate the degree of quality or your satisfaction with specific aspects of the course:
1. Quality of and satisfaction with objectives, competencies, and/or tasks_____
2. Degree of match between learning activities and objectives______
3. Quality of tests and degree of match with objectives________
4. Quality of and satisfaction with performance checklists and degree of match with objectives________
5. Quality of and satisfaction with directions for how students are to proceed through the materials_______
6. Quality of visuals, videos, games, experiences, practices_______
7. Overall design of the learning activities for individualized instruction_____
8. Quality of and satisfaction with safety practices_____
9. Satisfaction with degree of freedom from bias with respect to sex, race, origin, age, religion, etc.________
10. Quality of and satisfaction with the content list or course content-map and the competencies covered by the course_________
Short answer: brief comment. Does the program have basic elements, such as those listed below? Please mark with an "x" and make a comment if necessary:
1. Clearly stated outcome objectives____
2. Sufficient directions_____
3. Prerequisite knowledge base____
4. Fit with knowledge base and existing programs___
5. Materials required____
6. Relevance and frequency of interactions among participants: do they help to achieve the goals? Have these interactions been evaluated?_______
Open-ended questions: please explain or illustrate.
- What aspects of the program require improvement?
- Do the outcomes achieved match the goals?
- Is there evidence of effectiveness available regarding the program?
- Does the program or plan match the values of the stakeholders?
Reflection: the internal evaluator should reflect.
- Does the program or plan match the needs of the people to be served?
- Does the program as implemented fulfill the plans?
(Adapted from Fitzpatrick, Sanders & Worthen, 2004, p. 100)
Appendix # 4 Second Version: Modified Survey Questionnaire (after sampling application)
A. Knowledge: short answers - yes or no
1. Is the program content of Intermediate Spanish up-to-date?_____
2. Is the program level appropriate for most students?_____
3. Are objectives, competencies, or tasks satisfactorily stated?____
4. Is the program presented in a logical sequence?_____
5. Are you satisfied that the program has basic elements, such as those listed below?
B. Judgments/Opinions: please write the appropriate letter in each space below: Very Good (VG), Good (G) or Bad (B), and make a comment if necessary:
a) Outcomes, objectives, competencies or tasks____
b) Directions or instructions for how students are to proceed through the materials___
c) Materials____
d) Prerequisite knowledge base___
e) Performance checklists____
f) Student's guide_____
g) Fit with knowledge base and program___
h) Tests and rubrics___________
C. Behaviors/Reactions - Scale rating: use +, 0, - to rate the degree of quality or your satisfaction with specific aspects of the course:
- Degree of match between learning activities and objectives______
- Quality of tests and degree of match with objectives________
- Quality of and satisfaction with performance checklists and degree of match with objectives_____
- Quality of visuals, videos, games, experiences, practices_______
- Overall design of the learning activities for individualized instruction_____
- Quality of and satisfaction with safety practices_____
- Satisfaction with degree of freedom from bias with respect to sex, race, origin, age, religion, etc.___
- Quality of and satisfaction with the content list or course content-map and the competencies covered by the program___
Open-ended questions: please feel free to make any suggestions or comments that can help us to improve our Program on Spanish Intermediate:
Appendix # 5
Third Version Survey Questionnaire
Thank you for taking the time to complete this evaluation. Your input will help us continue to offer quality programs, making changes as we can to better serve your needs. Please mark an "x" or fill in the blanks wherever necessary.
Program Name: __________________________________________
Instructor Name: _________________________________________
I am: - Alumni - Teacher - Staff - Student - Member of Community
Session: - Fall - Winter
- Spring/Summer
PROGRAM DESIGN
Objectives - Scale rating: use +, 0, - to rate the degree of quality or your satisfaction with specific aspects of the course:
- Degree of match between learning activities and objectives______
- Quality of tests and degree of match with objectives________
- Quality of and satisfaction with performance checklists and degree of match with objectives_____
Content/Knowledge: short answers - yes or no
1. Is the content up-to-date?_____
2. Is the program level appropriate for most students?_____
3. Are competencies or tasks satisfactorily stated?____
4. Is the program presented in a logical sequence?_____
Materials/Guidelines: please write the appropriate letter in each space below: Very Good (VG), Good (G) or Bad (B)
a) Outcomes, objectives, competencies or tasks____
b) Directions or instructions for how students are to proceed through the materials___
c) Materials____
d) Prerequisite knowledge base___
e) Performance checklists____
f) Student's guide_____
g) Fit with knowledge base and program___
PROGRAM DEVELOPMENT (registered or taken in a previous session)
_____________________________________________________________________
Organization: -excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Content: -excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Program Quality: -excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Visuals, videos, games, experiences, practices:
-excellent -good -adequate -unsatisfactory
Degree of freedom from bias with respect to sex, race, origin, age, religion, etc.:
-excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Content-map and competencies covered by the program:
-excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Activities: overall design of the learning activities for individualized instruction:
-excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Tests and Rubrics:
-excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
INSTRUCTOR:
Rapport with participants:
-excellent -good -adequate -unsatisfactory
Comments:____________________________________________________________
Professionalism:
-excellent -good -adequate -unsatisfactory
Comments:
Organization:
-excellent -good
-adequate -unsatisfactory
Comments:_____________________________________________________________
Preparation and punctuality:
-excellent -good -adequate -unsatisfactory
Comments:
Open-ended questions:
Did the program meet your expectations? Yes___ No_____ Undecided_____
Comments:____________________________________________________________
Would you register again for this program or recommend it to your friends? Yes__ No__
Program: Comments:______________________________________________________________
Please feel free to make any suggestions or comments that can help us to improve our Program on Spanish Intermediate: _______________________________________________________
ABOUT YOU
How did you hear about this program? - Activity Brochure - Internet/Website - Friend/Family - Live in the Neighbourhood - Students - Other:______________
Are you a first-time participant in this program? Yes____ No____
REGISTRATION PROCESS
Did you find our customer service staff knowledgeable and courteous? Yes____ No____
Were the times offered for this program/class convenient? Yes_____ No_____
What other programs/classes would you like us (Languages/Sports Services) to run? ________________________, _______________________, ______________________
When would you like them to run? ___________, _____________, _________________
If you would like us to respond to your comments please complete below:
Name:__________________________________________________________________
Day-time phone number:___________________________________________________
E-mail:_________________________________________________________________
Appendix # 6 Final Version of Survey Questionnaire
www.researchphilosophy.blogspot.com
Appendix # 7 Data Analysis: www.researchphilosophy.blogspot.com