The Development of a Cognitive Test on Exercise Physiology for Exercise and Sports Science College Students

Authors

  • Julia Pearl M. Arroyo, University of the Philippines, Quezon City, Philippines

DOI:

https://doi.org/10.69569/jip.2025.517

Keywords:

Exercise physiology, Cognitive test, Test development, Sports science, Knowledge test

Abstract

Future exercise and sports science professionals must master human physiological concepts and their applications to be effective in their fields. Educating aspiring professionals in exercise and sports science requires quality assessment methods whose results serve as a basis for educational and training programs. To ensure coherence with the program objectives of academic institutions and with industry standards, well-developed cognitive tests are imperative. This study aimed to develop a written cognitive test for an introductory Exercise Physiology course within an Exercise and Sports Science program. Eighty sports science students participated in the development of the test. The constructed 40-item multiple-choice test had a Cronbach’s alpha of 0.73, indicating “acceptable” internal consistency. Item analysis showed that 39 of the 40 items had acceptable difficulty levels, with a good mix of easy, desirable, and difficult items. However, only 15 of the 40 items were retained, as the remaining 25 were classified as having negative to poor discrimination. Test development entailed careful planning, pre-construction conceptualization, and a multi-step process, from identifying the test’s purpose and constructing the table of specifications to field testing and item analysis. This study highlights the importance of continual test improvement and regular evaluation of curriculum assessment tools.
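
For readers less familiar with the psychometric indices reported above, the short sketch below illustrates how they are commonly computed. It is not part of the published study; it is a minimal, hypothetical Python example that assumes a students-by-items matrix of dichotomously scored (0/1) responses and uses the upper-lower group method for item discrimination (difficulty is the proportion answering an item correctly; discrimination is the difference in that proportion between the highest- and lowest-scoring groups).

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for a students-by-items matrix of 0/1 scores."""
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def item_difficulty(scores: np.ndarray) -> np.ndarray:
        """Difficulty index: proportion of examinees answering each item correctly."""
        return scores.mean(axis=0)

    def item_discrimination(scores: np.ndarray, group_frac: float = 0.27) -> np.ndarray:
        """Upper-lower discrimination index: p(upper group) minus p(lower group)."""
        totals = scores.sum(axis=1)
        order = np.argsort(totals)
        k = max(1, int(round(group_frac * scores.shape[0])))
        lower, upper = scores[order[:k]], scores[order[-k:]]
        return upper.mean(axis=0) - lower.mean(axis=0)

    # Hypothetical demo: 80 examinees x 40 items of simulated 0/1 responses
    rng = np.random.default_rng(0)
    ability = rng.normal(size=(80, 1))
    difficulty = rng.normal(size=(1, 40))
    responses = (rng.normal(size=(80, 40)) + ability > difficulty).astype(int)

    print("alpha:", round(cronbach_alpha(responses), 2))
    print("difficulty (first 5 items):", item_difficulty(responses)[:5].round(2))
    print("discrimination (first 5 items):", item_discrimination(responses)[:5].round(2))

The 0.27 group fraction is one conventional choice for forming the upper and lower groups; the study itself may have used a different discrimination statistic.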





Published

2025-07-28

How to Cite

Arroyo, J. P. (2025). The Development of a Cognitive Test on Exercise Physiology for Exercise and Sports Science College Students. Journal of Interdisciplinary Perspectives, 3(8), 866–873. https://doi.org/10.69569/jip.2025.517