A look at the global shift to open science, exploring how transparency, preregistration, and open data are becoming the new standard for credible research, benefiting scientists, funders, and society.
Conducting systematic literature reviews is traditionally a laborious, manual process involving the extraction of distinct data points from hundreds of academic papers, but this presentation introduces a structured, multi-tool workflow using the …
To err is human, but when it comes to creating research materials, mistakes can be reduced by sharing more of our work and by using some helpful tools. For instance, we can make our research materials FAIRer—that is, more Findable, Accessible, …
Electroencephalography (EEG) has become a cornerstone of neuroscience for understanding the intricate workings of the human brain. However, EEG software and hardware come with their own constraints, particularly in the management of markers, also known as triggers. This article sheds light on these limitations and on the future prospects of marker management in EEG studies, and introduces R functions that help deal with .vmrk files from BrainVision.
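As a minimal sketch of the kind of parsing involved (the function name and file name below are illustrative, not the article's actual functions), a BrainVision .vmrk file is plain text whose marker lines begin with `Mk` and hold comma-separated fields:

```r
# Minimal sketch: extract marker ("Mk") lines from a BrainVision .vmrk file.
# Marker lines follow the layout Mk<n>=<type>,<description>,<position>,...
read_vmrk_markers <- function(path) {
  lines <- readLines(path)
  mk <- grep("^Mk[0-9]+=", lines, value = TRUE)
  # Drop the "MkN=" prefix, then split the comma-separated fields
  fields <- strsplit(sub("^Mk[0-9]+=", "", mk), ",")
  data.frame(
    type        = vapply(fields, `[`, character(1), 1),
    description = vapply(fields, `[`, character(1), 2),
    position    = as.integer(vapply(fields, `[`, character(1), 3)),
    stringsAsFactors = FALSE
  )
}

# Hypothetical usage:
# markers <- read_vmrk_markers("session1.vmrk")
```

The returned data frame keeps the marker type (e.g. "Stimulus"), its description (e.g. "S 1"), and its position in data points, which is typically enough to check or recode triggers.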
Extension of the rscopus R package with functions that manage search quotas, retrieve DOIs for reference managers, search for additional DOIs, compare publication counts across topics, and visualize bibliometric comparisons over time.
An Excel workbook template with conditional formatting to facilitate planning, registration, and tracking of sessions in longitudinal studies involving multiple session conductors.
The best argument for preregistration may be that it doesn't take any extra time: it just requires frontloading an important portion of the work. As a reward, the paper will receive greater trust from reviewers and readers at large. Preregistration is not perfect, but it is a lesser evil that reduces the misuse of statistical analysis in science.
Frequently asked questions about mixed-effects models, covering the necessity of random slopes, appropriate p-value calculation methods, parallelization limitations, convergence issues, and optimizer selection.
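One of those recurring questions is whether random slopes are needed when a predictor varies within grouping units. As a hedged illustration using the lme4 package and its built-in sleepstudy data (not an example drawn from the post itself):

```r
library(lme4)

# Days varies within each Subject, so the random-effects structure
# includes a by-subject random slope for Days, not just an intercept.
fit_max <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)

# If such a model fails to converge, a common fallback is to drop the
# intercept-slope correlation (||) before dropping the slope itself.
fit_zerocorr <- lmer(Reaction ~ Days + (Days || Subject), data = sleepstudy)

summary(fit_max)
```

Dropping the random slope entirely would treat the effect of Days as identical across subjects, which tends to inflate Type I error when that assumption is wrong.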
In the fast-paced world of scientific research, establishing minimum standards for the creation of research materials is essential. Whether it's stimuli, custom software for data collection, or scripts for statistical analysis, the quality and transparency of these materials significantly impact the reproducibility and credibility of research. This blog post explores the importance of adhering to FAIR (Findable, Accessible, Interoperable, Reusable) principles, and offers practical examples for researchers, with a focus on the cognitive sciences.