Examining Domains of Technological Pedagogical Content Knowledge Using Factor Analysis
ARTICLE

V.H. Shinas, S. Yilmaz-Ozden, C. Mouza, R. Karchmer-Klein & J.J. Glutting

Journal of Research on Technology in Education, Volume 45, Number 4, ISSN 1539-1523

Abstract

This study examined the construct validity of the Survey of Preservice Teachers' Knowledge of Teaching and Technology through an exploratory factor analysis using responses from 365 preservice teachers enrolled in an educational technology course in the United States. The participants were completing methods courses and field experience concurrently with the educational technology course, allowing them to contextualize the content they learned during the semester. The survey, grounded in the framework of Technological Pedagogical Content Knowledge (TPACK), is designed to measure seven domains associated with technological, pedagogical, and content knowledge. Although the influence of the TPACK framework on teacher education programs continues to grow, research indicates the need for clearer distinctions among the domains. Results from this study revealed that participants did not always make conceptual distinctions between the TPACK domains. Specifically, factors were congruent only for technological knowledge (TK) and content knowledge (CK), and not for pedagogical knowledge (PK), pedagogical content knowledge (PCK), technological content knowledge (TCK), or TPACK. Additionally, PK and PCK loaded together, indicating that participants did not distinguish PK from PCK. Overall, this study confirms the need to provide more clarity about the TPACK framework and to revisit survey instruments built directly around the framework. (Contains 6 tables and 1 figure.)
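To illustrate the kind of analysis the abstract describes, the sketch below shows a minimal exploratory factor analysis of Likert-type survey items in Python using the factor_analyzer package. This is not the authors' code or data: the file name and item columns are hypothetical placeholders, and the choice of seven factors with an oblique (promax) rotation simply mirrors the seven hypothesized TPACK domains.

    # Minimal EFA sketch (hypothetical data; not the study's actual analysis).
    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import (
        calculate_bartlett_sphericity,
        calculate_kmo,
    )

    # Hypothetical file: one row per respondent, one column per survey item.
    items = pd.read_csv("tpack_survey_items.csv")

    # Check factorability of the correlation matrix before extraction.
    chi_square, p_value = calculate_bartlett_sphericity(items)
    kmo_per_item, kmo_total = calculate_kmo(items)
    print(f"Bartlett chi-square={chi_square:.1f}, p={p_value:.4f}, KMO={kmo_total:.2f}")

    # Extract seven factors (one per hypothesized TPACK domain) with an
    # oblique rotation, since knowledge domains are expected to correlate.
    efa = FactorAnalyzer(n_factors=7, rotation="promax")
    efa.fit(items)

    # Eigenvalues help judge how many factors the data actually support.
    eigenvalues, _ = efa.get_eigenvalues()
    print("Eigenvalues:", eigenvalues.round(2))

    # Pattern loadings: items loading on the same factor suggest respondents
    # treated them as one construct (e.g., PK and PCK items merging).
    loadings = pd.DataFrame(efa.loadings_, index=items.columns)
    print(loadings.round(2))

In a study like this one, the key evidence is the loading matrix: if items written for two nominally distinct domains (such as PK and PCK) load on a single factor, respondents did not treat those domains as separate constructs.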

Citation

Shinas, V.H., Yilmaz-Ozden, S., Mouza, C., Karchmer-Klein, R., & Glutting, J.J. (2013). Examining Domains of Technological Pedagogical Content Knowledge Using Factor Analysis. Journal of Research on Technology in Education, 45(4), 339-360. Retrieved March 19, 2019 from LearnTechLib.

This record was imported from ERIC on December 3, 2015.

ERIC is sponsored by the Institute of Education Sciences (IES) of the U.S. Department of Education.

Copyright for this record is held by the content creator. For more details see ERIC's copyright policy.

Cited By

  1. Unpacking Performance Indicators in the TPACK Levels Rubric to Examine Differences in the Levels of TPACK

    Aleksandra Kaplon-Schilis, The Graduate Center, CUNY, United States; Irina Lyublinskaya, College of Staten Island, CUNY, United States

    Society for Information Technology & Teacher Education International Conference 2018 (Mar 26, 2018) pp. 2074–2083

  2. TPACK Radar Diagrams - A Visual Quantitative Representation for Tracking Growth of Essential Teacher Knowledge

    Julien Corven & Ming Tomayko, Towson University, United States

    Society for Information Technology & Teacher Education International Conference 2017 (Mar 05, 2017) pp. 2296–2301

  3. Putting TPACK on the Radar: A Visual Quantitative Model for Tracking Growth of Essential Teacher Knowledge

    Julien C. Colvin & Ming C. Tomayko, Towson University, United States

    Contemporary Issues in Technology and Teacher Education Vol. 15, No. 1 (March 2015) pp. 68–84

  4. Measuring TPACK... Yes! But how? A working session

    Petra Fisser, National Institute for Curriculum Development, Netherlands; Lara Ervin, San Jose State University, United States; Joke Voogt, University of Amsterdam, Netherlands; Matt Koehler, Michigan State University, United States

    Society for Information Technology & Teacher Education International Conference 2014 (Mar 17, 2014) pp. 907–908

These links are based on references which have been extracted automatically and may have some errors. If you see a mistake, please contact info@learntechlib.org.