Name
Institution
Evaluation and Communication Plan – Online Education for Bennett College
Evaluation Plan
            For any initiative to be considered valid, it must pass through all the stages of implementation, one of which is the evaluation phase. Program evaluation serves several objectives: (1) deciding whether or not to develop the program (needs assessment); (2) understanding how best to implement the program (formative evaluation); and (3) deciding whether to modify the current program or continue with it as is (Ball, 2011). With these objectives in mind, this paper provides a compact evaluation plan for the online education program proposed for Bennett College (BC).
According to Ball (2011), evaluation activities span a wide range of educational skills and knowledge, drawn from numerous domains that include “educational psychology, developmental psychology, psychometrics, sociology, statistics, anthropology, educational administration” (p. 4). Evaluators must be conversant with the issues that affect the education system and how those issues apply to myriad situations. Most importantly, the evaluation of educational programs must never be treated as a one-size-fits-all exercise; each online learning program must be evaluated on its own terms, since comparisons between programs tend to miss the issues particular to each situation (Sener, 2005; Diaz, 2000). All the basic tenets of program evaluation must be described and tested for consistency.
According to Wall (2014), the model program evaluation process has nine sequential steps that must be implemented for objective reporting of findings. The steps are:

  1. Developing explicit goals that target specific elements of the program
  2. Setting the evaluation questions
  3. Developing the evaluation design
  4. Preparing a data collection plan
  5. Actual data collection
  6. Data analysis
  7. Documentation of findings
  8. Communicating findings
  9. Providing feedback for improvement

The nine steps detailed above can be summarized into five broader stages: (1) goals, questions, and stakeholders; (2) program description; (3) evaluation design; (4) collection of credible evidence; and (5) analysis and interpretation of data.
Goals, Questions and Stakeholders
Evaluation goal. This program evaluation aims to determine whether the newly proposed system will improve learning outcomes at Bennett College. The findings from this evaluation will be used to advise the management of the college on how to implement online degree programs.
Questions. The following questions sum up the overall objective of the evaluation process:

  • In what ways will this program bring about the desired change?
  • How will this program affect other learning programs in the college?
  • What effect will the program have on regular admissions in the college?
  • How effective will the current program be ten years from now?

Stakeholders. This evaluation program will be useful to the school management, lecturers, students, and the education officers whose approval is required for the intended program. The table below is a template that will be used to assess the need to bring particular stakeholders on board.
Table F.1. Stakeholder Assessment sheet

Name/title of stakeholder | Category (Primary – P; Secondary – S; Tertiary – T) | Designation (Participant – P; Other – O) | Role played

Program description
Need. This program aims to make learning easier for busy students who join Bennett College. Students should be able to find all lecture material in their online accounts whether or not they attended class.
Context. This program is supported by the availability of complementary resources such as computing devices, the internet, and computer-savvy teaching staff. The modern student lives in a fast-paced world where technology is essential to make work easier.
Population addressed. This intervention program addresses the school management, students and lecturers, and other interested stakeholders.
Stage of development. This program is in the pilot stage. Tests are being done concurrently with partial implementation.
Resources/Inputs. Available resources include well-trained staff, Bring Your Own Device (BYOD) capacity, internet access, funding, and possible partnerships with technology companies.
Activities. Development of curriculum and learning material; creation of a website and ongoing content updating; in-service training of teaching staff; and modification of lecture halls to accommodate the new technology.
Outcomes. Short-term goals include: (1) improved student grades and (2) increased enrollment through online degree courses. Long-term goals include: (1) a reduced number of teaching staff and (2) expanded online degree enrollments.
Table F.2. Program Description Template

Inputs | Initial activity | Follow-up activity | Short-term outcome | Long-term outcome

Timeline. The table below is a summary of the timeline of all the activities in the evaluation program.
Table F.3. Program Timeline

Event | Start | End
Planning and administrative work | |
Data collection exercise | |
Pilot exercise for testing data collection tools | |
Analysis and interpretation of data | |
Communication in the interim period | |
Communication at the end of the evaluation | |

Evaluation Design
Evaluation Questions. The following evaluation questions will be asked:

  1. Is this program needed?
  2. Is this program technologically feasible?
  3. Is this program financially sustainable?
  4. Is the teaching staff ready to support this program?
  5. Will this program improve grades of students?

Evaluation Design. This evaluation is designed as a pre-test/post-test study. This design was chosen because a quick decision must be made on the basis of empirically tested outcomes.
Collection of credible evidence
Data collection methods. In this evaluation program, new data will be collected to help understand the new system. A representative sample of 30 students taking online courses will be studied to analyze the efficiency of the new system. The parameters that will be measured include hours spent in the online class, the number of assignments submitted on time, lessons missed, and grades before and after enrolling. Data will be collected through self-evaluation questionnaires handed to students, who will be promised anonymity and asked to answer in utmost good faith. Additionally, personal interviews with the same sample of students will be conducted to corroborate the questionnaire responses.
Table F.4: Evaluation Questions and Respective Data Collection Methods

Question | Data collection method | Data source
1. | |
2. | |

Analysis and interpretation of data
Indicators and Standards. The indicators are the parameters identified above: hours spent in the online class, the number of assignments submitted on time, lessons missed, and grades before and after enrolling. Success will be indicated by: (1) more hours spent studying online, (2) all assignments submitted without delay, (3) very few missed lessons, and (4) improved overall grades for continuing students and above-average grades for new admissions.
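As a rough illustration, the success indicators above could be encoded as a simple per-student check. The field names and the threshold for "very few missed lessons" are hypothetical assumptions, since the plan does not fix numeric cut-offs:

```python
# Hypothetical per-student record and success check for the four
# indicators; the field names and thresholds are illustrative only.
def meets_success_criteria(record):
    """Return one boolean per success indicator for a student record."""
    return [
        record["online_hours_after"] > record["online_hours_before"],  # (1) more online study hours
        record["assignments_on_time"] == record["assignments_total"],  # (2) no late assignments
        record["lessons_missed"] <= 2,                                 # (3) very few missed lessons
        record["grade_after"] > record["grade_before"],                # (4) improved grades
    ]

student = {
    "online_hours_before": 5, "online_hours_after": 9,
    "assignments_on_time": 8, "assignments_total": 8,
    "lessons_missed": 1,
    "grade_before": 64, "grade_after": 71,
}
print(all(meets_success_criteria(student)))
```

A record would count toward program success only when all four indicator checks hold.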
Table F.5. Indicators and Success

Question | Criteria | Standard
1. | |
2. | |

Analysis. Data collected in this evaluation will be analyzed using descriptive statistics. The measures of central tendency and spread will be calculated to evaluate the outcomes.
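A minimal sketch of this descriptive analysis, using Python's standard statistics module; the grade values and the `describe` helper are hypothetical, purely illustrative of how the measures of central tendency and spread would be computed from the collected data:

```python
from statistics import mean, median, stdev

def describe(scores):
    """Descriptive statistics for the analysis: measures of central
    tendency (mean, median) and spread (standard deviation, range)."""
    return {
        "mean": mean(scores),
        "median": median(scores),
        "stdev": stdev(scores),
        "range": max(scores) - min(scores),
    }

# Hypothetical grades for five sampled students, before and after
# enrolling in the online program (illustrative values only).
grades_before = [62, 70, 58, 75, 68]
grades_after = [68, 74, 63, 80, 71]

print("before:", describe(grades_before))
print("after:", describe(grades_after))
```

Comparing the before and after summaries in this way supports the pre-test/post-test comparison described in the evaluation design.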
Interpretation. The analyzed data will be interpreted to give justifiable conclusions as to whether the program is viable or not. All the stakeholders in this program will be part of the interpretation with emphasis placed on management decisions about implementation of the program.
Communication Plan
Communication is another important element of the evaluation program. For the program to be implemented properly, the evaluation team must endeavor to disseminate information to the right people at the right time, always. Communication must not merely take place; it must be effective (Posavac, 2015; Thoumaian, Frank, Stark & Aguirre, 2015). This section details how the evaluation team will inform stakeholders of the findings at each stage of the evaluation program through to the end.
Utility of the Report
The final evaluation report will be shared with the management, who wish to know whether the program is feasible in the long term. Apart from the management, there are several other intended users of the final evaluation report, including students and lecturers.
The appropriate time for reporting these outcomes is at a partner meeting, where the management will brainstorm on the possible benefits of the program as well as the preparedness of the college to implement it. The partner meeting is significant in the sense that it is when the team that will monitor the action plan will be constituted, should the plan be adopted. Moreover, lessons about the evaluation and the implementation proposal will be shared to build an understanding of the common purpose.
The Communication Process
            At various times during the implementation process, the students and lecturers will be invited to give advice on how the program should be implemented. The interim evaluation outcomes will be shared with all stakeholders, while the final report will be submitted to the management, who will decide with whom to share the results. During the communication process, the evaluation team will communicate with stakeholders by email and other forms of written reports. Email is preferred since it is easy to reproduce and allows all respondents to be reached with a single message. Moreover, respondents can reply within a thread, which makes feedback easy to evaluate.
Communicating and Reporting Management
            As already determined, the audience for the progress reports and the final outcome is the stakeholders, who include the management, students, and lecturers. The purpose of communicating with the audience during the interim periods is to ensure that the evaluation team reports on the level of achievement of the interim goals. In turn, the audience will provide meaningful feedback that will inform modifications to the current program. For this purpose, email will be used to communicate progress and seek responses. In this arrangement, communication will take place monthly.
Table F.8. Communication Diary

Name of audience: …………………………………..

Appropriate (√) | Reason for contact | Potential formats | Potential courier | Dates | Remarks
 | To participate in making a particular decision | | | |
 | Alert of a particular event that is going to happen in the near future | | | |
 | Update on the progress of the evaluation process | | | |
 | The first report on the findings | | | |
 | Communication of the final outcomes | | | |
 | Documenting evaluations and findings | | | |
 | Proposed implementation program | | | |

Conclusion
The evaluation and communication plan is essential for securing management approval of the program. While the two look independent in their approaches, they are in fact corollaries, and neither can operate without the other. A good evaluation program may be useless unless it is communicated. On the other hand, communication cannot be bare; it must have substance that is supported by factual evidence obtained from the evaluation.
References
Ball, S. (2011). Evaluating Educational Programs. ETS Research Report Series, 2011(1), i-23.
Diaz, D. (2000). Carving a new path for distance education research. The Technology Source. Retrieved November 22, 2005, from http://technologysource.org/article/carving_a_new_path_for_distance_education_research/
Posavac, E. (2015). Program evaluation: Methods and case studies. Routledge.
Sener, J. (2005). Escaping the comparison trap: Evaluating online learning on its own terms. Innovate: Journal of Online Education, 1(2), 5.
Thoumaian, C. A. H., Frank, U. C., Stark, D., & Aguirre, M. C. (2015). Reporting Program Evaluation Findings: How to Demonstrate Effectiveness to Stakeholders.
Wall, J. E. (2014). Program evaluation model: 9-step process. Sage Solutions. Retrieved from http://region11s4.lacoe.edu/attachments/article/34