Continuous Improvement System on Outcome Planning and Assessment: Case Study of UFE
Abstract
Purpose of Research: The move to outcome-based education necessitates a process of continuous improvement for academic programs, including outcome-based planning, implementation, and assessment. UFE has demonstrated a commitment to academic excellence by emphasizing outcome-based education since 2009. To this end, the university developed undergraduate program policy and regulations that define learning outcomes at both the course and program levels; that is, they specify how course learning outcomes support the program learning outcomes, and both are continuously measured and evaluated. Our learning outcomes model is based on a three-level, hierarchically structured definition of learning outcomes that applies consistently to the entire undergraduate program as well as to each individual course, and faculty use this model to design, monitor, and revise both the curriculum and individual courses on an ongoing basis. As part of this development, the program learning outcomes are being integrated into the information system and linked to each course's learning outcomes. The new system will make it possible to perform detailed data analysis to assess the program learning outcomes directly. The overall process guides the university in planning, improving, implementing, and monitoring based on stakeholder satisfaction and performance analysis. This research focuses on quantitative and qualitative analysis of program learning outcomes based on student performance and stakeholder assessment to determine educational achievement, which can then inform decisions about developing or updating programs.
1. During the planning process, we developed a curriculum matrix mapping program learning outcomes onto each individual course and its tasks, a process automated by our information system. These data are used to ensure that all outcomes are covered in at least one course, and preferably more than one.
2. The assessment results of the defined learning outcomes and the associated student performance are reviewed. Student self-assessment is also used in this investigation.
3. Stakeholders' assessments are designed and used as significant indicators of educational achievement, and a gap analysis is employed to examine differences between student performance and the stakeholders' assessments.
4. External stakeholders' assessments are reviewed to determine which learning outcomes are preferred and whether their expectations are met.
Methods of Research: The research employs a mix of quantitative and qualitative data collection methods to comprehensively assess program educational objectives (PEO) and program learning outcomes (PLO) at UFE. The following research methods are used.
• Quantitative Analysis: Quantitative data from assessments and student performance are statistically analyzed to measure the achievement of program learning outcomes. This includes descriptive statistics, inferential statistics, and potentially regression analysis to identify significant factors influencing outcomes.
• Qualitative Analysis: Qualitative data, including stakeholder assessments and student self-assessment narratives, are analyzed using thematic or content analysis to identify recurring themes, patterns, and insights.
• Gap Analysis: The gap analysis assesses the differences between student performance and stakeholder assessments, highlighting areas where perceptions may differ.
• External Stakeholder Assessment Review: The preferences and expectations of external stakeholders are analyzed to determine which learning outcomes are preferred and whether they align with UFE's educational objectives.
By employing these research methods, this study aims to contribute valuable insights into the continuous improvement system of UFE and provide evidence-based recommendations for enhancing the outcome planning and assessment processes within the institution.
Introduction
The University of Finance and Economics (UFE), established in 1924, is one of the oldest universities in Mongolia. Its original focus was to prepare professionals who would sustain Mongolian socio-economic and customs tax policy. Today, it has become a university that implements policy in the higher education sector with modern 21st-century management.
UFE approved a new strategic plan to be implemented from 2021 to 2031. Within this strategic policy, UFE aims to become an internationally recognized, digital, and entrepreneurial university. Consequently, UFE emphasizes an interdisciplinary approach in its academic and research activities to seek solutions to challenges in economics, society, and the environment. In brief, its strategy is to serve society and to act as a connector among stakeholders such as government, business, and the local community, as well as international organizations and universities. Owing to its efforts and dedication toward this goal, UFE became the first Mongolian higher education institution to be ranked in the Times Higher Education Impact Rankings in 2022.
This paper presents a case study of the continuous improvement effort to develop and implement learning outcomes for the undergraduate programs offered by UFE. The university has continuously revised and redesigned its outcome-based approaches in response to rapid changes since the first initiative in 2009. The latest learning outcomes framework includes implementation and evaluation guidelines that enable faculty to apply learning outcomes at both the course and program levels, as well as information-system integration supporting this process.
The paper begins, in Section I, with how we initially developed program educational objectives and program learning outcomes to provide policy and guidance for the development of undergraduate academic programs. Section II follows with the integration of learning outcomes into course plans and the curriculum. The next part covers a data analysis of implementation based on each course's plan and its mapping to program learning outcomes. The assessment methods and results are described in Section III. The remainder of the paper summarizes the results, concludes on how our learning outcomes framework works, and discusses lessons learned.
Theoretical Framework
The rise of informatization and technologization of society, along with increased competition both in the labor market and in learning services, has caused dramatic changes in higher education and led to its reorientation from a knowledge-centric to a competency-based learning paradigm (Andrii Vitchenko, 2022). Originally associated with vocational learning, this paradigm has represented an educational shift in higher education, with external accreditation bodies now "requiring units of higher education to document how they know that graduating students have obtained the necessary competencies that support their respective degrees" (Blaine T. Garfolo, 2016). A university can, in principle, use any acceptable instructional method to move a student toward mastery. When utilizing a competency-based education (CBE) approach, a university must ensure that assessment is the foundation of every step in the learning process, so that the student progresses successfully toward mastery of the competency. Additionally, this allows competencies to be mastered in real time (Blaine T. Garfolo, 2016).
The outcome-based learning approach was used in vocational curricula before four-year programs began developing and assessing learning outcomes. The term "learning outcomes" has been used in educational settings to refer to the competencies the student is supposed to develop, as well as the assessment process that provides evidence of improvement in competencies, capabilities, or knowledge as a result of education (Filipp, 2001). A clear articulation of learning outcomes serves as the foundation for evaluating the effectiveness of the teaching and learning process (Osters Sandi, 2006). Osters clarifies that learning outcomes describe what students can demonstrate in terms of knowledge, skills, and values upon completion of a course, a span of several courses, or a program (Joelle Ducrot, 2008).
Gopal Chandra Saha (2023) identified two paradigm shifts in education influenced by the outcome-based education approach.
1. First Paradigm Shift: From a Teacher-Focused to a Learner-Focused Approach
A change in mindset among educators and educational leaders is necessary for the evolution of outcome-based education, and this change requires a shift in training strategy from trainer-focused to learner-focused. Instead of transmitting knowledge, the instructor's responsibility is becoming more learner-centered, and instructors are expected to be more imaginative and creative in building their programs, with learning programs viewed as guides.
2. Second Paradigm Shift: Shifting to an Outcomes-Oriented Approach
Everything in an outcomes-focused education framework must be founded on those outcomes. Therefore, evaluation methods and tactics should be consistent with the desired training outcomes. Industry expects institutions to educate and train students in line with its requirements, which is why outcome-based education is more important than traditional schooling (Pradhan, 2021).
The Framework of Outcome-Based Education
The outcome-based education framework is shown in Figure 1 (Gopal Chandra Saha, 2023). Outcome-based education (OBE) implementation uses the following components.
• Program educational objectives: Characteristics or particular objectives that graduates should pursue in their employment and life after graduation. These goals are aligned with the branch's vision and mission statement and are created in collaboration with enterprise partners, learners, parents, alumni, the university, and administration stakeholders.
• Program outcomes: The cornerstone of outcome-based education is the graduate attributes, the fundamental soft skills every student gains by the end of their study.
• Program-specific outcomes: With business knowledge as a fundamental educational goal, learners who approach learning from a scientific perspective are more likely to be interested and open-minded.
• Course outcomes: Statements that outline what is expected of learners throughout the entire instructional process.
Outcome-based education assessment can provide direct or indirect measures of student learning (Pradhan, 2021):
• Direct measures require students to demonstrate their achievement and often involve quantitative measurement techniques.
• Indirect evaluation is primarily based on opinions and feedback.
According to the Accreditation Council for Business Schools and Programs (ACBSP), a student learning outcome is one that measures a specific competency attainment. The measurement instruments are described as follows:
• Direct - Assessing student performance by examining samples of student work
• Indirect - Assessing indicators other than student work such as getting feedback from the student or other persons who may provide relevant information.
• Formative – An assessment conducted during the student’s education.
• Summative – An assessment conducted at the end of the student’s education.
• Internal – An assessment instrument that was developed within the business unit.
• External – An assessment instrument that was developed outside the business unit.
• Comparative – Compare results to external students using data from the U.S. Department of Education Research and Statistics, or results from a vendor providing comparable data. Internal comparative data may be between classes, online and on-ground classes, professors, programs, campuses, etc. (ACBSP, 2022).
Our university's learning outcomes have been developed since 2009 according to ACBSP requirements. The ACBSP standard defines that academic programs must have a systematic student learning outcomes assessment process and plan that leads to continuous improvement. Student learning outcomes must be developed and implemented for each accredited program, and the results must be communicated to stakeholders. The fundamental framework for our latest redesign of learning outcomes is based on a continuous loop of planning, doing, checking, and acting (PDCA). A key principle of the PDCA cycle is that it promotes learning through iteration; the findings from one cycle generate a new cycle, extending knowledge even further (Nicholas Loyd, 2016). The Conceiving-Designing-Implementing-Operating (CDIO) Initiative, although originally designed for engineering education, has also been adopted as a framework for our curricular planning and outcome-based assessment. CDIO Standard 2 requires that learning outcomes be reviewed and validated by key stakeholders, that is, groups who share an interest in the graduates of engineering programs, for consistency with program goals and relevance to engineering practice (CDIO office, n.d.).
At UFE, the assessment of the undergraduate program's learning outcomes is carried out in the following steps.
1. Planning - Planning the goals and objectives of the undergraduate program.
2. Implementation - Implement the goals and objectives of the bachelor's program according to the plan.
3. Monitoring - Monitor the implementation of the goals and objectives of the bachelor's program, develop, and discuss the evaluation report of the learning outcomes.
4. Improvement - Develop suggestions, recommendations, and action plans for further improvement based on the results of the evaluation.
This process is a crucial part of curriculum development, and the following procedures are used to ensure the continuous development of the curricula (Figure 2). The professional departments and teams develop the curriculum, which is discussed by the professional council with the program committee. Changes in the curriculum are discussed by the sub-committee based on the suggestions, recommendations, and conclusions made to the head of department. The need for change is identified based on feedback from stakeholders to ensure continuous improvement of the curricula.
Continuous Improvement System on Outcome Planning and Assessment
1. Planning and Implementation
In connection with the approval of the new strategy, changes were made to ensure the continuity of the undergraduate policy and its learning outcome development as part of the implementation of the objectives set out in UFE's strategic plan. In line with UFE's vision and mission, the bachelor's program aims to "prepare professionals who contribute to the sustainable development of their country, who are professionally competitive, who have an entrepreneurial mindset to work in the digital age, and who are industry leaders and ethical professionals".
In line with the revision of UFE's strategic plan, an undergraduate policy document and its guidelines were developed and approved to set the management and organizational direction for undergraduate program admission, study, research, and graduation activities. As part of the next phase of the reform of the UFE information system, the program learning outcomes are being integrated into the system and linked to each course's learning outcomes. In addition, the system is being developed to record each criterion of the course assessment rubric. The new system will make it possible to perform detailed data analysis to assess the program learning outcomes directly.
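The paper does not describe the exact schema of the UFE information system, so the following is only a minimal sketch, assuming hypothetical class and field names, of how course learning outcomes, rubric criteria, and program learning outcomes could be linked so that graded work can be traced up to a PLO.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative data model only; class and field names are assumptions,
# not the actual UFE information-system schema.

@dataclass
class RubricCriterion:
    description: str
    max_score: float               # maximum score attainable on this criterion

@dataclass
class CourseLearningOutcome:
    code: str                      # e.g. "CLO3" within a course
    description: str
    plo_code: str                  # program learning outcome it maps to, e.g. "PLO3"
    level: str                     # "I" (Introduced), "P" (Practiced), or "D" (Demonstrated)
    criteria: List[RubricCriterion] = field(default_factory=list)

@dataclass
class Course:
    code: str                      # e.g. "IS301" (hypothetical)
    title: str
    department: str
    clos: List[CourseLearningOutcome] = field(default_factory=list)

# With such a structure, every graded rubric criterion can be traced from a
# student's score up to a course learning outcome and then to a program
# learning outcome, which is what enables direct PLO assessment.
```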
The policy defines the program educational objectives (PEO) of UFE, which describe the future goals of graduates, while the policy documents of each program define program-specific objectives. The PEO of UFE is "to train ethical entrepreneurs and professionals who lead others in their areas to contribute to the sustainable development of the country and have an entrepreneurial mindset to work and compete professionally in the digital world." At the undergraduate level, the aim is to prepare active citizens in sustainable development who are capable of identifying and implementing innovative solutions to pressing challenges. To align with the SDGs1, learning outcomes were redesigned based on international initiatives such as the World Economic Forum (2020) framework and the United Nations' global citizen concept. Embedding a global perspective into the academic program by relating learning outcomes to concepts such as entrepreneurship, morality, and sustainability is crucial, and a framework for shaping graduate attributes promotes volunteering, self-development, and career growth. Ultimately, UFE's initiatives move students towards greater social responsibility in a quest to develop global citizens.
Program learning outcomes are based on the knowledge, skills, attitudes, and professional abilities developed throughout the program. Basic knowledge and skills are oriented toward developing the individual, whereas professional skills and abilities are defined differently according to each profession. Program learning outcomes (PLO) aim to meet the PEO and describe the knowledge, skills, and attitudes a student gains by the time of graduation. All course learning outcomes are mapped to program learning outcomes and are presented in every course syllabus. PLOs are defined at three levels (Introductory, Practiced, Demonstrated). For each PLO, a relevant course and a measurable student action within that course are identified for assessment purposes. As shown in Figure 3, the PLOs of the undergraduate program of UFE consist of a common skill (soft skill) set, defined as PLO1-7, and a professional skill set, defined as PLO8.1-8.6, which varies by profession. The following is the list of the common skill and knowledge set, which means all courses can be mapped to these learning outcomes:
• PLO1 – Moral maturity: illustrates professional and student ethics, applies them to problem-solving, and develops personal ethical practices.
• PLO2 – Communication skills: demonstrates communication etiquette and the ability to communicate effectively orally and in writing.
• PLO3 – Basic theoretical knowledge and skills: interprets general knowledge of the theoretical and methodological aspects of the major science and develops an understanding of their interrelationships and connections.
• PLO4 – Research and analysis skills: identifies the causes of problems based on evidence, analyzes alternatives using appropriate research methods, proposes optimal solutions, and draws logical conclusions.
• PLO5 – Digital knowledge and skills: gathers and processes information using digital tools accurately and acquires the ability to apply professional and common software and applications effectively.
• PLO6 – Foreign language knowledge and skills: demonstrates the ability to read, speak, communicate, and write using professional sources in a foreign language.
• PLO7 – Ability to think critically and creatively: analyzes specific scientific issues from many angles, creates logical connections, produces innovative ideas and initiatives, and summarizes them.
• PLO8 – Professional knowledge and skills: interprets the theory and methodology of a field of science, applies it to decision-making, and combines the knowledge of different fields of science.
As an example, the professional learning outcomes of the Information Systems undergraduate program are:
• PLO8-1: Knowledge of business information systems - to gain the skills to recognize information systems that support sustainable business activities, administrative decision-making, and competitive advantage, and to understand their ethical application.
• PLO8-2: Knowledge and skills to manage information and data - to gain the skills to develop valuable information that improves organizational efficiency through data collection, organization, unification, processing, and description.
• PLO8-3: Knowledge of information technology infrastructure - to gain the skills to recognize and understand organizational information technology infrastructure, including data, network, and software architecture and infrastructure.
• PLO8-4: Skills to analyze and design information systems - to gain the knowledge and skills to understand, compare, and use different methodologies and tools of system development to analyze problems faced by businesses and to determine users' requirements optimally.
• PLO8-5: Skills in information system development and programming - to gain the knowledge and skills to understand and compare various methodologies and tools for software coding and to write efficient and applicable code in accordance with the system design.
• PLO8-6: Skills to work on and supervise information system projects - to gain an understanding of project management tools and techniques and to apply them to information system projects.
During the planning process, we developed a curriculum matrix mapping program learning outcomes onto each of the individual courses. Using the lesson-planning database records, we ensure that all outcomes are covered in at least one course, and preferably more than one.
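As an illustration of this coverage check, the sketch below uses a small, hypothetical extract of the curriculum matrix; the real matrix would be queried from the lesson-planning database, and the course codes shown are assumptions.

```python
# Minimal coverage check for the curriculum matrix (hypothetical extract).
curriculum_matrix = {
    "IS301": ["PLO3", "PLO8-1"],
    "MGT210": ["PLO1", "PLO2", "PLO3"],
    "ENG101": ["PLO6"],
    # ... one entry per approved course plan
}

PLOS = [f"PLO{i}" for i in range(1, 8)]   # common skill set PLO1-7

# For each PLO, collect the courses whose plans are mapped to it.
coverage = {plo: [c for c, plos in curriculum_matrix.items() if plo in plos]
            for plo in PLOS}

for plo, courses in coverage.items():
    if not courses:
        print(f"{plo}: NOT covered by any course - revise the matrix")
    elif len(courses) == 1:
        print(f"{plo}: covered by only one course ({courses[0]})")
    else:
        print(f"{plo}: covered by {len(courses)} courses")
```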
2. Data analysis on Implementation
In the 2022-2023 academic year, our redesigned curriculum and updated learning outcomes became available for implementation, in parallel with work on the current information system to automate this process. In this academic year, 246 different undergraduate courses were offered and 1,176 course learning outcomes (CLO) were defined, which means each course has 4-12 CLOs across the three levels. The analysis focuses on how courses support the common set of PLOs. We maintain a matrix of which courses contribute to which outcomes.
For this analysis, the following steps were taken to prepare the data (a minimal filtering sketch follows the list).
• Graduate and vocational-level course plans were removed before data analysis.
• The course plan had to be approved and used in the 2022-2023 academic year.
• The research focused only on common PLO assessment and measurement; the selected course plan had to be mapped to at least one common PLO (PLO1-7), even though each profession also has its own specific set of professional PLOs (PLO8.1-8.6).
• For the I, P, D level analysis, the chosen courses were those taught across all academic programs and levels.
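The following is a minimal sketch of this filtering in Python/pandas; the file name and column names are illustrative assumptions, not the actual database fields.

```python
import pandas as pd

# Hypothetical export of the lesson-planning database: one row per CLO mapping.
clo = pd.read_csv("course_learning_outcomes.csv")

clo = clo[clo["program_level"] == "undergraduate"]        # drop graduate/vocational plans
clo = clo[clo["academic_year"] == "2022-2023"]            # approved and used this year
clo = clo[clo["plo_code"].isin([f"PLO{i}" for i in range(1, 8)])]  # common PLO1-7 only

# For the I/P/D level analysis, keep only courses taught across all programs.
courses_per_program = clo.groupby("course_code")["program"].nunique()
all_programs = clo["program"].nunique()
level_sample = clo[clo["course_code"].isin(
    courses_per_program[courses_per_program == all_programs].index)]
```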
As Table 1 shows, 81.3% of all courses were mapped to the common set of PLOs, while the rest supported professional PLOs. Table 1 lists the number and percentage of courses for each PLO. Overall, every defined PLO was covered by more than nineteen courses. The highest coverage is PLO3 at 65%, compared with about 10% for PLO5 and PLO6.
PLOs are defined at three levels (Introduced, Practiced, Demonstrated):
• Introduced (Level I) – Get acquainted with and study the concepts related to the subject, remember, and refresh the knowledge and understanding.
• Practiced (Level P) – Apply the obtained knowledge and skills in practical situations and make a habit of using them.
• Demonstrated (Level D) – Solve problems professionally based on habitual skills and express them correctly to others.
For a more in-depth analysis of the PLO levels (Introduced, Practiced, Demonstrated), Table 2 shows how the chosen 75 courses defined each PLO and its levels. One particularly interesting fact highlighted by the table is that 81% of the learning outcomes for PLO3 are tied to Level I. In stark contrast, the other PLOs mostly focus on the Practiced or Demonstrated level.
Based on the course planning records, Table 3 reveals how departments differ in their curriculum mapping and courses, in other words, how each learning outcome is defined across the courses of the various departments. PLO3 is designed into courses from all departments, whereas PLO6 is barely covered outside the courses of the Institute of Foreign Languages.
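Crosstabs of the kind behind Table 2 and Table 3 can be produced directly from the prepared course-plan data; the sketch below assumes the same hypothetical column names as the filtering sketch above.

```python
import pandas as pd

# Prepared CLO mappings with (assumed) columns: plo_code, level, department, course_code.
clo = pd.read_csv("course_learning_outcomes_prepared.csv")

# Share of I / P / D levels within each PLO (cf. Table 2), as percentages.
plo_by_level = (pd.crosstab(clo["plo_code"], clo["level"], normalize="index")
                  .round(3) * 100)

# How each department's course plans cover the PLOs (cf. Table 3).
dept_by_plo = pd.crosstab(clo["department"], clo["plo_code"])

print(plo_by_level)
print(dept_by_plo)
```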
Based on the curriculum mapping analysis, the data showed that the learning outcomes mapping is unbalanced. Most courses (65%) were mapped to theoretical learning outcomes, but employers prefer soft skills such as communication and research skills. Redesigning and updating the curriculum mapping and course learning outcomes is therefore necessary. Balanced coverage of learning outcomes across the entire program of study will be significant for each department.
Assessment and Evaluation
Learning outcomes are evaluated at the following three levels according to the stages of planning, implementation, monitoring, and improvement, and the results are used to improve the educational process. The purpose of the assessment is to evaluate the curriculum, to ensure coherence, to measure compliance with participants' needs, changes in the environment, and trends, and to continuously improve it. The PLOs of UFE are evaluated using direct and indirect methods based on the results of student performance, self-assessment, and stakeholders' assessment upon completion of an academic year, and reports are prepared.
1. Direct assessment result
Faculty members responsible for a given course offering may create measurements for their courses; the measurements are collected for assessment through the university-wide information system, which takes care of combining the scores when reports are generated. During the semester, faculty enter the scores achieved by students on each of these measurements, and the data are then used for analysis from the course level up to the program level. For Table 4 below, the direct assessment data were collected through the university-wide information system. The following steps were used to prepare the data (a minimal aggregation sketch follows the list):
1. Student performance grades with anonymized identities were extracted from the database, and graduate and vocational-level grades were removed.
2. The student performance had to be graded in the 2022-2023 academic year.
3. The performance grade had to be identified as measuring a PLO.
4. Student performance data were gathered in time order during the semester.
5. The grade had to be mapped to the common set of PLOs (PLO1-7).
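A minimal sketch of this aggregation, under assumed file and column names, is shown below; it computes the average direct-assessment performance per PLO against the 70% desired-achievement target (cf. Table 5).

```python
import pandas as pd

# Hypothetical export of anonymized student measurement scores; column names
# are assumptions, not the actual information-system fields.
grades = pd.read_csv("student_measurements.csv")
# expected columns: student_id, course_code, plo_code, score, max_score, academic_year

grades = grades[(grades["academic_year"] == "2022-2023") &
                (grades["plo_code"].isin([f"PLO{i}" for i in range(1, 8)]))]

grades["pct"] = grades["score"] / grades["max_score"] * 100

# Average direct-assessment performance per PLO, flagged against the 70% target.
plo_performance = grades.groupby("plo_code")["pct"].mean().round(1)
print(plo_performance)
print(plo_performance[plo_performance < 70])   # PLOs below the target, if any
```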
Students must take a self-assessment survey at the end of each course during the academic year through the information system, and the survey results were collected and recorded in a database. Students were asked to assess each course learning outcome using a five-point, Likert-type scale. Participation was mandatory, the self-assessment survey was implemented online, and respondents were assured of confidentiality. The survey was developed by the academic admission and research board, then reviewed and approved by the board of directors.
For the 2022-2023 academic year, Table 5 shows the average student performance across all course measurements tied to each PLO. The desired achievement level for each outcome is set at 70%. The average performance of PLO1 is the highest.
2. Indirect assessment result
Indirect assessment evaluates learning outcomes using indicators other than student performance, such as feedback from students or other stakeholders who can provide relevant information on the PLOs. UFE conducted a stakeholders' survey to engage them in curriculum development and to reflect their assessment of the learning outcomes in future improvement.
2.1 Survey design
The stakeholders' survey of UFE was conducted in spring 2023, during April and May, via Google Forms. The survey design, shown in Table 6, was developed by the admission board based on previous surveys and then reviewed by the heads of departments and an educational consultant. The final version was approved by the board of directors.
Respondents were asked to rate the importance of and satisfaction with the program learning outcomes using a five-point, Likert-type scale consisting of 13 items covering the common set PLO1-7 and the corresponding professional PLOs 8.1-8.6. For this research, only the satisfaction section for the common set of PLOs was used in the analysis: a five-point, Likert-type scale consisting of 7 items covering PLO1-7.
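To compare Likert-type satisfaction ratings against the 70% target used elsewhere in this study, mean ratings have to be expressed as percentages. The paper does not state the exact conversion used, so the sketch below shows one common convention purely as an assumption.

```python
# One common conversion of a 5-point Likert mean rating to a 0-100% scale,
# assumed here for comparability with the 70% target; UFE's exact formula may differ.
def likert_to_percent(mean_rating: float, scale_max: int = 5) -> float:
    return mean_rating / scale_max * 100

# Example: a mean satisfaction rating of 4.1 out of 5 for one PLO item.
print(likert_to_percent(4.1))   # 82.0
```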
2.2 Validity and reliability
The number of participants in each survey is sufficient to meet the minimum sample size, and the surveys were prepared and responses gathered using Google Forms (see Table 7). The validity and reliability of the questionnaire were assessed using SPSS software.
SPSS was used to perform a Pearson correlation analysis, finding the correlation between each question in the questionnaire and the total score. Based on the significance value (Sig.):
– If Sig. < 0.05, the question is valid.
– If Sig. > 0.05, the question is not valid and is deleted/removed.
Based on the results of the validity test (shown in Figure 4, Figure 5, and Figure 6), each item's significance value is less than 0.05 for each survey, which shows that each variable is valid.
Cronbach's alpha is a measure of internal consistency, that is, how closely related a set of items is as a group. A reliability coefficient of .70 or higher is considered "reliable" in most social science research situations (UCLA: Statistical Consulting Group, 2023). The table below shows that each survey's Cronbach's alpha is greater than 0.6, and each of them is considered reliable.
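The validity and reliability checks were run in SPSS; for readers without SPSS, the sketch below shows an equivalent computation in Python, assuming a response table with one column per survey item (Likert 1-5) and one row per respondent. The file and column layout are assumptions.

```python
import pandas as pd
from scipy import stats

# Hypothetical export of the survey responses: one column per item, one row per respondent.
responses = pd.read_csv("stakeholder_survey_responses.csv")

total = responses.sum(axis=1)   # total score per respondent

# Item validity: Pearson correlation between each item and the total score.
for item in responses.columns:
    r, p = stats.pearsonr(responses[item], total)
    verdict = "valid" if p < 0.05 else "not valid - remove"
    print(f"{item}: r={r:.2f}, p={p:.3f} -> {verdict}")

# Reliability: Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance of total).
k = responses.shape[1]
item_var = responses.var(axis=0, ddof=1).sum()
total_var = total.var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```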
For students, the PLO satisfaction survey gathered participants randomly. To represent each class and each program, specific requirements were defined for the sample size. The results show that PLO3 has the highest satisfaction rate and PLO6 the lowest, with a margin of about 10% between the top and the bottom. At the end of each course, all students complete a self-assessment for each CLO, which was collected and aggregated into each PLO's self-assessment rate. The student self-assessment report shows that the maximum and minimum scores differ by only 3.6%. Table 8 shows the comparison of these surveys.
The PLO section was included in the alumni survey for the first time, collecting assessments from alumni who graduated in 2018 and 2022. As Table 9 shows, the highest satisfaction rate is for PLO1 (Moral maturity), while PLO6 (Foreign language knowledge and skills) is highlighted as a future concern. There is also only a minor difference between the two graduation years.
According to the employers' survey, PLO5, PLO2, and PLO1 are the learning outcomes with which employers are most satisfied in our graduates. Overall, the survey revealed that all outcomes are rated above the target goal of 70%. The details are shown in Figure 7.
From the employers' survey shown in Table 10, three learning outcomes were rated as "high priority" and are recommended as a baseline for all graduates. In contrast, most courses are tied to theoretical knowledge and skills rather than these soft skills.
3. Comparing direct assessment and indirect assessment results
Table 11 compares the rates of student performance and stakeholders' assessment results across the PLOs. Overall, current student performance and self-assessment are slightly lower than the other stakeholders' assessments.
We defined a threshold of 70% for both performance and stakeholders' assessment; both the direct and indirect assessment results are above the desired threshold. The gap between the direct and indirect assessment of the current learning outcomes is small, which suggests that the faculty's direct assessments are objective (Figure 8).
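A minimal sketch of such a gap analysis is shown below; the figures used are placeholders for illustration only, not the actual UFE results.

```python
import pandas as pd

# Illustrative gap analysis between direct assessment (student performance)
# and indirect assessment (stakeholder satisfaction) per PLO.
data = pd.DataFrame({
    "direct_pct":   [82.0, 78.5, 85.0, 76.0, 80.0, 72.0, 77.0],   # placeholder values
    "indirect_pct": [84.0, 80.0, 86.5, 79.0, 81.0, 75.0, 79.5],   # placeholder values
}, index=[f"PLO{i}" for i in range(1, 8)])

THRESHOLD = 70.0
data["gap"] = data["indirect_pct"] - data["direct_pct"]
data["above_threshold"] = (data[["direct_pct", "indirect_pct"]] >= THRESHOLD).all(axis=1)

print(data)
print("Largest gap:", data["gap"].abs().idxmax(), data["gap"].abs().max())
```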
Findings and Discussions
This paper contributes an effective continuous deployment, measurement, and improvement process that runs systematically from the program level down to each course unit, together with a data analysis of the direct and indirect assessment of learning outcomes. The following findings are used for future development and continuous improvement.
1. Based on the curriculum mapping analysis, the data showed that the learning outcomes mapping is unbalanced. Most courses (65%) were mapped to theoretical learning outcomes, but employers prefer soft skills such as communication and research skills. Redesigning and updating curriculum and course planning is necessary for this implementation.
2. The analysis shows that the gap between the direct and indirect assessment of the current learning outcomes is narrow, which confirms that the direct assessments are objective.
3. The gap analysis demonstrated that stakeholders' assessment results were higher than student performance. To ensure continuous improvement of the curriculum, the results of the survey are used by the professional departments to improve and develop the curriculum.
4. System development for measuring and reporting program learning outcomes is significant for future work. The software requirements will be defined based on the conceptual model. Moreover, an interface that enables each student to get reports about their skill-building experiences throughout their undergraduate studies is the next major development for our software application.
Finally, we believe that this research will continue to drive the ongoing implementation effort to:
• Extend the analysis from the program implementation side to each learner's learning path and skill-building experiences, so that the results give specific guidance to all stakeholders for achieving their individual goals.
• Follow up in more detail to verify the relevance between teaching methods, evaluation techniques, and performance.
• Make process and technology changes that facilitate ease of use and more accurate reports for each stakeholder.
Notes
1. SDGs: Sustainable Development Goals