Call for Papers – Special Issue: “Advancing Reliability in the Social Sciences: Meta‑Science, Transparency, and Crowd Research”
Journal: Journal of Economic Psychology
Publisher: Elsevier (ScienceDirect)
Impact Factor: 2.3 | CiteScore: 4.7
Submission deadline: 31 May 2026
This special issue focuses on improving reliability, transparency, and cumulative knowledge in the social sciences, especially in economics, psychology, and related behavioural fields. It responds to the replication crisis by highlighting meta‑scientific, open‑science, and collaborative research practices that make evidence more robust, reproducible, and policy‑relevant.
Why this issue matters
Replicability, publication bias, and selective reporting remain serious concerns, undermining confidence in social‑science findings used for policy and practice.
New methods—such as preregistration, data/code sharing, many‑lab studies, many‑analyst projects, and prediction markets—have improved transparency and generalisability, but more empirical work is needed on how to scale and institutionalise them.
This special issue explicitly links meta‑science with experimental economics and psychology, asking how studies can be designed, documented, and shared so that results are credible, reusable, and cumulative.
Key themes and research topics
Submissions must be empirical, experimental, methodological, or Registered Reports; purely theoretical or “dataset‑only” papers are not considered.
Replication and robustness
High‑power replication studies and robustness tests of influential economic‑psychology findings (e.g., social‑preference, trust, saving, decision‑making, nudges).
Evidence synthesis
Meta‑analyses and systematic syntheses that map effect‑size heterogeneity across contexts, populations, and methods.
Crowd‑ and team‑science designs
Many‑lab, many‑analyst, many‑designs, and meta‑reproduction projects that test generalisability and identify sources of variability.
Methodological meta‑science
Studies on how preregistration, pre‑analysis plans, multiple‑testing corrections, and reporting standards affect reproducibility and power.
Norms, institutions, and incentives
Analyses of how review practices, funding incentives, and institutional policies shape researchers’ choices on transparency and openness.
Open‑science infrastructures and workflows
Work on data/materials/code sharing, FAIR data principles, and reproducible analysis workflows (e.g., using R/Python, Jupyter notebooks).
Prediction and decision markets
Applications of prediction markets to prioritise replication targets or forecast study outcomes.
Guest editors
Dr. Maja Adena, WZB Berlin Social Science Center & TU Berlin, Germany
Prof. Frank M. Fossen, University of Nevada, Reno, USA
Dr. Levent Neyse, WZB Berlin Social Science Center & SOEP (DIW Berlin), Germany
Submission details
Submission platform: Journal of Economic Psychology, via Elsevier's submission system (https://submit.elsevier.com/JOEP) or via Editorial Manager (https://www.editorialmanager.com/joep).
When submitting, select the article type "VSI: Advancing Reliability in Social Sciences".
Submission window: 1 December 2025 – 31 May 2026
All submissions must:
Include a complete transparency package (reusable data, materials, and code), where data sensitivity permits.
Follow the journal’s Guide for Authors on ethics, non‑deception, and reporting standards.
A limited number of Registered Reports will be accepted, selected for exemplary rigour and relevance.
This special issue is ideal for behavioural‑economics, experimental‑psychology, and meta‑science researchers who wish to strengthen replicability, transparency, and collaborative research practices in the social sciences.