Promoting Enhanced Transparency in Psychometric Studies

Authors

Flores Kanter, P. E., & Mosquera, M.

DOI:

https://doi.org/10.18050/psiquemag.v11i2.2064

Keywords:

Open Science Framework (OSF), HARKing, P-hacking, Standard Report, Reproducibility, Preprint, Pre-reports

Abstract

We are at a pivotal moment for the advancement of psychological science. We now have a wide range of resources for adhering to good research practices, which allow the development of an increasingly reliable, valid, and reproducible discipline. Within responsible research conduct, it is essential to promote adherence to transparency and open science practices. Although it is currently difficult to imagine a researcher who disagrees with these principles and practices, their implementation is not yet widespread, neither across all disciplines and subdisciplines nor equally across all countries. Here we focus on measurement practices in the field of psychometrics. We believe that transparent and open science practices are a viable and fundamental way to counteract questionable research practices and, especially, questionable measurement practices. Focusing on these psychometric or measurement practices is essential, since the broader validity of our scientific findings depends on them. The ultimate goal of this work is to offer a series of resources that facilitate the dissemination and adoption of these responsible research behaviors among institutions and researchers in Latin America.

Published

2022-03-21

How to Cite

Flores Kanter, P. E., & Mosquera, M. (2022). Promoting Enhanced Transparency in Psychometric Studies. PsiqueMag, 11(2), 14–21. https://doi.org/10.18050/psiquemag.v11i2.2064

Issue

Vol. 11 No. 2 (2022)

Section

Research Articles