
Director


Dr. Kadriye Ercikan is a Professor in the area of Measurement, Evaluation and Research Methodology (MERM) in the Department of Educational and Counselling Psychology, and Special Education (ECPS). She is also the director of the Cross-Cultural Assessment and Research Methods in Education (CARME) research lab at UBC.

Research Interests:

Dr. Ercikan has over 20 years of experience conducting research on assessment design, validity, and fairness in multicultural and multilingual assessments, as well as on psychometric issues in large-scale assessments. In the area of research methods, her work focuses on research generalization and the validity of research claims.

Teaching:

  • Large-scale assessment/measurement
  • Cross-cultural and language issues in measurement
  • Validity of assessment and research generalizations
  • International assessments
  • Assessment of complex thinking
  • Evidence centered assessment design

Advisory service:

Dr. Ercikan has served on numerous national and international advisory boards on large-scale assessments including:

  • 2014 – present: American Institute of Certified Public Accountants, Board of Examiners
  • 2013 – present: Assessment Group on Provincial Assessments (AGPA)
  • 2013: Alberta Learning, Assessment of Student Learning Advisory Committee
  • 2012 – present: American Institute of Certified Public Accountants, Psychometric Oversight Committee
  • 2011 – present: ETS Visiting Panel on Research
  • 2009 – present: National Assessment of Educational Progress (NAEP) – Design and Analysis Committee member
  • 2008 – present: National Assessment of Educational Progress (NAEP) – Quality Assurance Technical Advisory Panel member
  • 2010: Race to the Top Assessment Grants Adjudication Committee
  • 2007 – 2010: South Africa National Education Quality Initiative, International Advisory Committee
  • 2007 – 2008: Center for Assessment and Evaluation in Science Learning (CAESL), Executive Planning Group
  • 2006 – 2011: Vancouver School Board Committee on Assessment
  • 2006: Singapore National Institute of Education Classroom Assessment Project
  • 2005 – 2006: Benchmarks of Historical Thinking Assessment Project
  • 2002 – 2006: Francophone School District, British Columbia
  • 2002 – 2004: Puerto Rico K-12 Assessment Technical Advisory Committee chair and member
  • 2000 – 2005: Alberta Learning Assessment
  • 1999 – 2000: National Assessment of Educational Progress (NAEP) – Foreign Language Assessment

Professional leadership and service:

  • 2016: Program chair and organizer, 10th conference of the International Test Commission
  • 2015 – 2018: Vice President, AERA Division D
  • 2014: Vice President-elect, AERA Division D
  • 2012: Member, International Test Commission Council
  • 2012 – 2014: Chair, AERA Division D International Committee
  • 2011 – 2013: Member, AERA Division D Linn Award Committee
  • 2011 – 2013: Member, AERA Division D Early Career Award Committee
  • 2008 – 2011: Elected member of the Board of Directors of the National Council on Measurement in Education
  • 2010 – 2012: Member, American Educational Research Association, Division D International Issues Committee
  • 2007 – 2009: Member, American Educational Research Association, Division D Bylaws Revision Committee
  • 2007 – 2010: Member, National Council on Measurement in Education Committee on International Testing Standards
  • 2006 – 2007: Chair, National Council on Measurement in Education Committee for the Jason Millman Award
  • 2005 – 2006: Member, National Council on Measurement in Education Committee for the Jason Millman Award
  • 2003 – 2005: Member, American Educational Research Association, Division D, Awards Committee
  • 2003: AERA Division D annual meeting program chair
  • 2002: Member, NCME Award for Technical or Scientific Contributions to the Field of Educational Measurement Committee
  • 2000: Program chair and organizer for the International Meeting of the Psychometric Society
  • 1999 – 2002: Chair, National Council on Measurement in Education (NCME) Committee on International Testing Issues
  • 1999 – 2002: Member, National Council on Measurement in Education (NCME) Dissertation Award Committee
  • 1996 – 1999: Member, National Council on Measurement in Education (NCME) Committee on International Testing Issues

Awards and distinctions:

  • 2014: nominated for Fellowship in the Royal Society of Canada
  • 2013 – present: Fellow of the International Academy of Education
  • 2010: AERA Division D award for “Significant Contribution to Educational Measurement and Research Methodology” for her edited book “Generalizing from Educational Research: Beyond Qualitative and Quantitative Polarization”
  • 2000 – present: Associate of the Peter Wall Institute for Advanced Studies
  • 2000: Peter Wall Institute Early Career Award
  • 1998 – 2001: Member of the US National Academy of Sciences Committee on Foundations of Assessment


Recent talks:

  • Ercikan, K. (October, 2014). Issues in interpreting results from international assessments for informing policy. Invited presentation at the University of Massachusetts, Amherst, MA.
  • Ercikan, K. & Gonzalez, E. (July, 2014). Accuracy of International Scales for individual countries in international assessments. Invited presentation at the 9th Congress of International Test Commission, San Sebastian, Spain.
  • Arim, R.G., Lyons-Thomas, J., & Ercikan, K. (July, 2014). Comparability between English and French Versions of the PIRLS 2011 Reader Test: Results from Canada, the United States, and France. Invited presentation at the 9th Congress of the International Test Commission, San Sebastian, Spain.
  • Ercikan, K. (April, 2014). Assessment of 21st century skills. AERA presidential invited symposium discussant, Philadelphia, PA.
  • Ercikan, K. (April, 2014). Population heterogeneity and interpretation of results from international assessments. AERA invited session for the International Academy of Education, Philadelphia, PA.

Authored book:

Conrad, M., Ercikan, K., Friesen, G., Letourneau, J., Muise, D., & Seixas, P. (2013). Canadians and Their Pasts. University of Toronto Press.

Edited books:

Ercikan, K. & Seixas, P. (Eds.) (in press). New directions in assessing historical thinking. Routledge.

Simon, M., Ercikan, K., & Rousseau, M. (Eds.) (2013). Improving Large Scale Assessment in Education: Theory, Issues, and Practice. Taylor & Francis.

Ercikan, K., & Roth, W-M. (Eds.) (2009). Generalizing from Educational Research: Beyond Qualitative and Quantitative Polarization. New York: Routledge. This book was awarded the AERA Division D Significant Contributions to Measurement and Research Methodology Award.

Committee authored book:

National Academy of Sciences Committee on the Foundations of Assessment (2001). Knowing what students know: The science and design of educational assessments. National Academy of Sciences Press. Ercikan was one of the 13 committee members who authored this book, edited by Pellegrino, Chudowsky, and Glaser.

Refereed journal articles:

  1. Ercikan, K., Roth, W-M., Asil, M. (in press). Cautions about uses of international assessments. Teachers College Record.
  2. Ercikan, K., Yue, M., Lyons-Thomas, J., Sandilands, D., Roth, W-M., & Simon, M. (in press). Reading proficiency and comparability of mathematics and science scores for students from English and Non-English backgrounds: An international perspective. International Journal of Testing special issue on assessment of linguistic minorities.
  3. Oliveri, M. E., Ercikan, K., Lyons-Thomas, J., & Holtzman, S. (in press). Heterogeneity in linguistic groups and measurement comparability research. Applied Measurement in Education.
  4. Sandilands, D., Barclay-McKeown, S., Lyons-Thomas, J., & Ercikan, K. (2014). An investigation of school level factors associated with science performance for minority and majority Francophone students in Canada. Canadian Journal for Science, Mathematics, and Technology Education, 14, 135-153.
  5. Ercikan, K., & Roth, W-M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and users of knowledge claims. Teachers College Record, 116(6), 1-28.
  6. Ercikan, K., Roth, M., Simon, M., Lyons-Thomas, J., & Sandilands, D. (2014). Inconsistencies in DIF detection for sub-groups in heterogeneous language groups. Applied Measurement in Education, 27, 273-285.
  7. Oliveri, M.E., Ercikan, K., Zumbo, B., & Lawless, R. (2014). Uncovering substantive patterns in student responses in international large-scale assessments – Comparing a latent class to a manifest DIF approach. International Journal of Testing, 14, 265-287.
  8. Lyons-Thomas, J., Sandilands, D., & Ercikan, K. (2014). Gender differential item functioning in mathematics in four international jurisdictions. Education and Science, 39(172), 20-32.
  9. Arim, R. G., & Ercikan, K. (2014). Comparability between the American and Turkish Versions of the TIMSS Mathematics Test results [Large Scale Assessment Special Issue]. Education & Science, 39(172), 33-48. http://egitimvebilim.ted.org.tr/index.php/EB/article/view/2847
  10. Ercikan, K. (2013). How Is Testing Supposed to Improve Schooling If We Do Not Evaluate to What Extent It Improves Schooling? Measurement: Interdisciplinary Research and Perspectives, 11(1-2), 60-63.
  11. Seixas, P. & Ercikan, K. (2013). Historical Thinking in Canadian Schools. Canadian Journal of Social Research, 4, 31-41.
  12. Oliveri, M. E., Ercikan, K., & Zumbo, B. (2013). Analysis of sources of latent class DIF in international assessments. International Journal of Testing, 13, 272-293.
  13. Sandilands, D., Oliveri, M. E., Zumbo, B. D., & Ercikan, K. (2013). Investigating sources of differential item functioning in international large-scale assessments using a confirmatory approach. International Journal of Testing, 13, 152-174.
  14. Roth, W-M., Oliveri, M. E., Sandilands, D., Lyons-Thomas, J., & Ercikan, K. (2013). Investigating linguistic sources of differential item functioning using expert think-aloud protocols in science achievement tests. International Journal of Science Education, 35(4), 546-576.
  15. Oliveri, M., Olson, B., Ercikan, K., & Zumbo, B. (2012). Methodologies for Investigating Item- and Test-Level Construct Comparability in International Large-Scale Assessments. International Journal of Testing, 12, 203-223.
  16. Oliveri, M. & Ercikan, K. (2011). Do different approaches to examining construct comparability lead to similar conclusions? Applied Measurement in Education, 24, 349-366.
  17. Ercikan, K., Arim, R. G., Law, D. M., Lacroix, S., Gagnon, F., & Domene, J. F. (2010). Application of think-aloud protocols in examining sources of differential item functioning. Educational Measurement: Issues and Practice, 29, 24-35.
  18. Seixas, P., Ercikan, K., & Gosselin, V. (2010). Cuestionar el pasado: los canadienses ante las polemicas historiograficas [Questioning the past: Canadians confront historiographical controversies]. Ciudadania: Didactica de las Ciencias Sociales, Geografia e Historia, 64, 58-66.
  19. Seixas, P., Ercikan, K., and Gosselin, V. (2009). Canadians Confront the History Wars. Diversity/Diversité, 7, 50-54.
  20. Martinez, Y. & Ercikan, K. (2009). Chronic Illnesses in Canadian Children: What is the Effect of Illness on Academic Achievement, and Anxiety and Emotional Disorders. Child: Care, Health and Development, 35, 391-401.
  21. Ercikan, K., & Alper, N. (2008). Adaptation of instructional materials: A commentary on the research on adaptations of Who Polluted the Potomac. Cultural Studies of Science Education, 4, 141-148.
  22. Ercikan, K., & Barclay-McKeown, S. (2007). Design and development issues in large-scale assessments: Designing assessments to provide useful information to guide policy and practice. Canadian Journal of Program Evaluation, 22, 53-71.
  23. Ercikan, K., & Roth, W-M. (2006). What good is polarizing research into qualitative and quantitative? Educational Researcher, 35, 14-23.
  24. Mendes-Barnett, S., & Ercikan, K. (2006). Examining sources of gender DIF in mathematics assessments using a confirmatory multidimensional model approach. Applied Measurement in Education, 19, 289-304.
  25. Wu, A., & Ercikan, K. (2006). Using multiple-variable matching to identify cultural sources of differential item functioning. International Journal of Testing, 6, 287-300.
  26. Ercikan, K. (2006). Examining guidelines for developing accurate proficiency level scores. Canadian Journal of Education, 29, 823-838.
  27. Ercikan, K. (2006). Adapting educational and psychological tests for cross-cultural assessment: Time for the measurement community to pay attention to research and findings. International Journal of Testing, 6, 105-113.
  28. Ercikan, K., & Koh, K. (2005). Construct comparability of the English and French versions of TIMSS. International Journal of Testing, 5, 23-35.
  29. Ercikan, K., McCreith, T., & Lapointe, V. (2005). Factors associated with mathematics achievement and participation in advanced mathematics courses: An examination of gender differences from an international perspective. School Science and Mathematics Journal, 105, 11-18.
  30. Ercikan, K., Gierl, M. J., McCreith, T., Puhan, G., & Koh, K. (2004). Comparability of bilingual versions of assessments: Sources of incomparability of English and French Versions of Canada’s National Achievement Tests. Applied Measurement in Education, 17, 301-321.
  31. Ercikan, K. (2003). Are the English and French versions of the Third International Mathematics and Science Study Administered in Canada comparable? Effects of adaptations. International Journal of Educational Policy, Research and Practice, 4, 55-76.
  32. Ercikan, K. (2002). Disentangling sources of differential item functioning in multi-language assessments. International Journal of Testing, 2, 199-215.
  33. Ercikan, K. (2002). Scoring examinee responses for multiple inferences: Multiple-scoring in assessments. Educational Measurement: Issues and Practice, 21, 8-15.
  34. Ercikan, K., & Julian, M. (2002). Classification accuracy of assigning student performance to proficiency levels: Guidelines for assessment design. Applied Measurement in Education, 15, 269-294.
  35. Ercikan, K. (2002). Structure of validity evidence. Measurement: Interdisciplinary Research and Perspectives. http://bear.soe.berkeley.edu/measurement/docs/Ercikan_1_1.pdf
  36. Ercikan, K. (1998). Translation effects in international assessments. International Journal of Educational Research, 29, 543-553.
  37. Fitzpatrick, A. R., Ercikan, K., Yen, W., & Ferrara, S. (1998). The consistency between ratings collected in different test years. Applied Measurement in Education, 11, 195-208.
  38. Ercikan, K., Schwarz, R., Julian, M., Burket, G., Weber, M., & Link, V. (1998). Calibration and scoring of tests with multiple-choice and constructed-response item types. Journal of Educational Measurement, 35, 137-155.
  39. Ercikan, K. (1997). Linking statewide tests to the National Assessment of Educational Progress: Accuracy of combining test results across states. Applied Measurement in Education, 10, 145-159.
  40. Sykes, R. S., Ito, K., Fitzpatrick, A. R., & Ercikan, K. (1997). Review of technical issues in large-scale performance assessments. Journal of Educational Measurement, 4, 379-384.
  41. Candell, G., & Ercikan, K. (1994). On the generalizability of school-level performance assessment scores. International Journal of Educational Research, 21, 267-278.
  42. Zwick, R., & Ercikan, K. (1989). Differential item functioning in NAEP. Journal of Educational Measurement, 26, 55-66.

Non-refereed journal articles:

  1. Perry, N. & Ercikan, K. (in press). What did we learn from PISA? Teachers College Record.
  2. Ercikan, K. & Solano-Flores, G. (2014). Levels of Analysis in the Assessment of Linguistic Minority Students. Applied Measurement in Education, 27, 233-235.
  3. Seixas, P., & Ercikan, K. (2010, Fall-Winter). Historical Thinking in Schools in Canada. Education Letter (Queen’s University), pp. 11-14.
  4. Ercikan, K. (1998). Introduction. International Journal of Educational Research, 29, 486-487. Special issue on international assessments.

Refereed chapters:

  1. Ercikan, K., & Roth, W-M. (in press). Qualitative and quantitative evidence in health – the critics’ view. In K. Olson, R. A. Young, and I. Z. Schultz (Eds.), Handbook of Qualitative Research for Evidence-Based Practice. Sage Publishing.
  2. Ercikan, K., Seixas, P., Lyons-Thomas, J., & Gibson, L. (in press). Cognitive validity evidence for validating assessments of historical thinking. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking. Routledge.
  3. Seixas, P., & Ercikan, K. (in press). Introduction: The new shape of history assessments. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking. Routledge.
  4. Seixas, P., Gibson, L., & Ercikan, K. (in press). A design process for assessing historical thinking: The case of a one-hour test. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking. Routledge.
  5. Ercikan, K., & Oliveri, M. E. (2013). Is Fairness Research Doing Justice? A Modest Proposal for an Alternative Validation Approach in Differential Item Functioning (DIF) Investigations. In M. Chatterji (Ed.), Validity, fairness and testing of individuals in high stakes decision-making context (pp. 69-86). Bingley, UK: Emerald Publishing.
  6. Ercikan, K., & Lyons-Thomas, J. (2013). Adapting tests for use in other languages and cultures. In K. Geisinger (Ed.), APA Handbook of testing and assessment in psychology, Volume 3 (pp. 545-569). Washington, DC: American Psychological Association.
  7. Ercikan, K., Oliveri, M. E., & Sandilands, D. (2013). Large scale assessments of achievement in Canada. In J.A.C. Hattie and E.M. Anderman (Eds.), The International handbook of student achievement (pp. 456-459). Routledge.
  8. Oliveri, M. E., Gundersen-Bryden, B., & Ercikan, K. (2013). Scoring issues in large-scale assessments. In M. Simon, K. Ercikan, & M. Rousseau. (Eds.), Improving large-scale assessment in education: Theory, issues and practice. (pp. 143-153). New York: Routledge/Taylor & Francis.
  9. Ercikan, K., Simon, M., & Oliveri, M. E. (2013). Score comparability of multiple language versions of assessments within jurisdictions. In M. Simon, K. Ercikan, & M. Rousseau. (Eds.), Improving large-scale assessment in education: Theory, issues and practice. (pp. 110-124). New York: Routledge/Taylor & Francis.
  10. Ercikan, K., & Seixas, P. (2011). Assessment of higher order thinking: The case of historical thinking. In G. Schraw and D. H. Robinson (Eds.), Assessment of higher order thinking skills (pp. 245-261). Charlotte, NC: Information Age Publishing.
  11. Ercikan, K., & Roth, W-M. (2011). Constructing data. In C. Conrad and R. Serlin (Eds.), SAGE Handbook for research in education, Second Edition (pp. 219-245). Sage Publications.
  12. Ercikan, K. (2009). Limitations in sample to population generalizing. In K. Ercikan & W-M. Roth (Eds.), Generalizing from educational research: Beyond qualitative and quantitative polarization (pp. 211-235). New York: Routledge.
  13. Ercikan, K., & Roth, W-M. (2009). Key issues in generalizing in educational research, with contributions from L. F. Bachman, M. Eisenhart, R. J. Mislevy, P. Moss, G. Solano-Flores, & K. Tobin. In K. Ercikan & W-M. Roth (Eds.), Generalizing from educational research: Beyond qualitative and quantitative polarization (pp. 265-295). New York: Routledge.
  14. Roth, W-M., & Ercikan, K. (2009). Introduction. In K. Ercikan & W-M. Roth (Eds.), Generalizing from educational research: Beyond qualitative and quantitative polarization (pp. 1-9). New York: Routledge.
  15. Ercikan, K. (2006). Developments in assessment of student learning and achievement. In P.A. Alexander and P. H. Winne (Eds.), American Psychological Association, Division 15, Handbook of educational psychology, 2nd edition (pp. 929-953). Mahwah, NJ: Lawrence Erlbaum Associates.
  16. Ercikan, K. & Roth, W-M. (2006). Constructing data. In C. Conrad and R. Serlin (Eds.), Handbook for research in education: Engaging ideas and enriching inquiry (pp. 451-477). Sage Publications.
  17. Ercikan, K., McCreith, T., & Lapointe, T. (2005). How are non-school related factors associated with participation and achievement in science? An examination of gender differences in Canada, the USA and Norway. In S. J. Howie and T. Plomp (Eds.), Contexts of learning mathematics and science: Lessons learned from TIMSS (pp. 211-225). Swets & Zeitlinger International Publishers, the Netherlands.
  18. Ercikan, K., & McCreith, T. (2002). Effects of adaptations on comparability of test items and test scores. In D. Robitaille & A. Beaton (Eds.), Secondary analysis of the TIMSS results: A synthesis of current research (pp. 391-407). Dordrecht, the Netherlands: Kluwer Academic Publishers.
  19. Mislevy, R., Wilson, M., Ercikan, K., & Chudowsky, N. (2002). Psychometric principles in student evaluation. In D. Nevo & D. Stufflebeam (Eds.), International handbook of educational evaluation (pp. 478-520). Dordrecht, the Netherlands: Kluwer Academic Press.
  20. Ercikan, K., & Koh, K. (2002). Construct comparability in international assessments: Comparability of English and French versions of TIMSS. In H. Yanai, A. Okada, and K. Shigemasu (Eds.), New developments on psychometrics: Proceedings of the International Meeting of the Psychometric Society (pp. 223-231). Tokyo: Springer-Verlag.

Non-refereed chapters:

  1. Ercikan, K. (2014). Becoming a Research Methodologist and Psychometrician: Chances, Opportunities, and Influences. In D.C. Phillips and Maria de Ibarrola (Eds.), Making of an Education Researcher (pp. 59-75). Rotterdam, the Netherlands: Sense Publishers.

Evaluation reports:

  1. Ercikan, K., Arim, R., Oliveri, M., & Sandilands, D. (2008). Evaluation of Dimensions of the Work of the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) and of its Programme of Cooperation with the International Institute for Educational Planning (IIEP). UNESCO, Internal Oversight Service, Evaluation Section, IOS/EVS/PI/91 (48 pages).
  2. Ercikan, K., Arim, R., Oliveri, M., & Sandilands, D. (2008). Evaluation of the Literacy Assessment and Monitoring Programme (LAMP) / UNESCO Institute for Statistics (UIS). UNESCO, Internal Oversight Service, Evaluation Section, IOS/EVS/PI/91 (29 pages).
  3. Lichtenstein, G., Weissglass, J., & Ercikan, K. (1998). Final Evaluation Report: Middle School Mathematics Through Applications Project, MMAP II (years 1994-1998) (80 pages).

Technical reports: 

  1. Ercikan, K., & Hillier, F. (1987). Heuristic procedures for 0-1 integer programming. Technical report, Systems Optimization Laboratory, Department of Operations Research, Stanford University. (81 pages)

Commissioned reports: 

  1. Ercikan, K. (1999). Synthesis paper on the redesign of the National Assessment of Educational Progress (NAEP) (30 pages). Commissioned by the American Institutes for Research.
  2. Ercikan, K. (1998). Multi-scoring and integration in assessments (80 pages). Report commissioned by the Council of Chief State School Officers (CCSSO).


CARME
2125 Main Mall,
Vancouver, BC, V6T 1Z4, Canada
