We joined PISA back in 2018, and we were deeply disappointed with the results in math, science, and reading. We fared so badly that we pointed our fingers at DepEd, holding it responsible for the results. It often bears the biggest chunk of blame from a generally uninformed public.
In 2022, we subjected our Grade 10 (15-year-old) learners to the Creative Thinking assessment, and the results took longer than we expected to be released. That shows how hard it is to evaluate learners’ performance in the said assessment. And as expected, we ranked second to last alongside Uzbekistan with a score of 14, which is 19 points below the average of the OECD (Organisation for Economic Co-operation and Development) and a staggering 27 points behind the top performer, our neighbor in Southeast Asia, Singapore.
Prior to joining PISA on creative thinking (and even in the math, science, and reading areas), our assessment practices were anchored on the revised Bloom’s taxonomy, which measures learning competencies and skills such as remembering, understanding, applying, analyzing, evaluating, and creating. Moreover, our assessment design was multiple choice, wherein our learners’ ability to think creatively was “boxed” within a certain frame of cognitive rules and competencies. Creative thinking is nuanced under “create,” but in actual practice, it is barely observed.
PISA’s method of assessment is through a computer system, administered remotely. The test materials contain open-ended questions, where test-takers are free to think outside any “box” for their answers. In particular, it measures learners’ abilities to generate diverse ideas, generate creative ideas, and evaluate and improve ideas. Clearly, PISA is geared toward imaginative skills, not toward recall, memorization, familiarization, or analysis, and, ironically, not toward chance or luck either.
The results are truly disheartening. But that does not mean we are inherently poor at this kind of international standardized assessment. We have to remember that “firsts” are not always pleasant experiences.
Our learners are not used to PISA’s test mechanics and dynamics. Some test-takers might have had a troublesome encounter with how the test was administered on gadgets, since public schools, unlike private schools, have never employed computer-aided assessments in any setting. They are used to the paper-and-pencil tradition, where they choose a single correct or (worse) “best” answer by shading or encircling the letter of a predetermined option. Our teachers were neither trained in nor even aware of PISA’s assessment framework (prior to joining) or of other similar assessment models. PISA is totally different from the usual local or national achievement tests we used to have.
Seemingly, there was a mismatch of expectations between the test-takers and the kind of test materials, mechanics, and dynamics they faced. If the learners scored low, it does not mean they are “poor” creative thinkers. They must have been tested at a (very) “poor” time instead.
The purpose of joining PISA was “benchmarking.” We wanted to measure our educational outcomes against international standards so we could identify our areas for improvement and adjust our practices to align with those standards. We do not need to worry at this point; at least we are now aware of our flaws, hence we know what to do. We are just starting; we do not expect outstanding results as beginners. But we have to act swiftly!
Rodel Calvez is a Teacher III at the Sta. Rosa Elementary School – Central III.