ISSN: 2782-893X
eISSN: 2799-0664

Data Trap: When School Testing Does Not Show Real Learning

IJAMS Publisher

AUTHOR(S)

MARY ANN F. ORDOÑO



ABSTRACT

—— This study investigated the validity and long-term predictive power of traditional school testing methods by comparing researcher-made test scores with performance on Project-Based Learning (PBL) Assessments and future academic outcomes. The results indicated a moderate correlation (r=0.42) between traditional test scores and authentic task performance, with tests accounting for only 30.3% of the variance in applied skills, confirming that conventional testing fails to capture transferable competencies. Furthermore, a Test-Preparation Curriculum led to significantly lower knowledge retention (75.2% mean score) months later than an Inquiry-Based Curriculum (82.5% mean score), undermining the rationale for teaching to the test. Finally, high-stakes test scores showed limited long-term utility, accounting for only 13.3% of the unique variance in first-year college GWA after controlling for Socioeconomic Status. These findings confirm a data trap where current school assessments provide an inefficient and misleading picture of student readiness for life and higher education. Keywords — Authentic Assessment; Project-Based Learning; Knowledge Retention; Predictive Validity; Educational Testing