The State of Open Science Practices in Psychometric Studies of Suicide: A Systematic Review

Abstract

The adoption of open science practices (OSPs) is crucial for promoting transparency and robustness in research. We conducted a systematic review to assess the frequency and trends of OSPs in psychometric studies focusing on measures of suicidal thoughts and behavior. We analyzed publications from two international databases, examining the use of OSPs such as open access publication, preregistration, provision of open materials, and data sharing. Our findings indicate a lack of adherence to OSPs in psychometric studies of suicide. The majority of manuscripts were published under restricted access, and preregistrations were not utilized. The provision of open materials and data was rare, with limited access to instruments and analysis scripts. Open access versions (preprints/postprints) were scarce. The low adoption of OSPs in psychometric studies of suicide calls for urgent action. Embracing a culture of open science will enhance transparency, reproducibility, and the impact of research in suicide prevention efforts.

Keywords: responsible research conduct, questionable research practice, assessment, suicide research.

Suicide is a major public health problem (Flores-Kanter, 2017; Franklin et al., 2017; Naghavi, 2019). Each year, there are an estimated 700,000 suicide deaths (World Health Organization, 2021), 20 suicide attempts for every death (World Health Organization, 2022), and 140 million people worldwide who experience suicidal ideation (Borges et al., 2008). Through the development of theories, the implementation of interventions, and the design of evaluation methods, the central goal of suicide research is the prevention of suicide deaths (Carpenter & Law, 2021; Flores-Kanter et al., 2019). The scientific suicide literature is fundamental in this regard not only for researchers, but also for clinicians and policymakers.
Thus, ensuring the validity and reliability of suicide research has become essential (Bauer et al., 2021). Recently, there has been increasing scrutiny of the scientific quality of suicide research, acknowledging both progress and identified barriers to proper advancement in suicide prevention and treatment (Kirtley et al., 2022). Concerns have been raised regarding methodological limitations, publication bias, and p-hacking, which hinder progress in the field (Carpenter & Law, 2021; Franklin et al., 2017). In response to these challenges, there has been a growing call for suicide researchers to adopt open science practices (OSPs) as a means to enhance research practices (Carpenter & Law, 2021; Kirtley et al., 2022). These OSPs include preregistration, preprints, postprints, and the sharing of study materials, code, and data (Christensen et al., 2019; Gomes et al., 2022; Nosek et al., 2015; Tackett et al., 2019). Embracing OSPs in suicide research aligns with broader initiatives in psychological science that emphasize responsible research practices for valid and reliable scientific inquiry (Nelson et al., 2018; Nosek et al., 2022; Open Science Collaboration, 2015; Simmons et al., 2011; Tijdink et al., 2021). In light of this context, it is crucial to investigate the extent to which OSPs are being utilized in suicide research. To date, no systematic reviews have explored this specific area1, although existing evidence suggests that the use of OSPs in suicide research is still uncommon (Carpenter & Law, 2021; Kirtley et al., 2022). Therefore, our aim is to address this gap through a systematic review focused on a specific aspect of suicide research related to the assessment of suicidal behavior. Specifically, we examine psychometric studies involving measures of suicidal thoughts and behaviors.
Measurement plays a central role in scientific progress and is critical for the valid interpretation and application of research findings (Bosma & Granger, 2022; Flake et al., 2022; Flake & Fried, 2020; Lewis, 2021; Manapat et al., 2023). Improving the quality of suicide research necessitates the use of measures that enable valid inferences about suicidal thoughts and behaviors (Flores-Kanter et al., 2023; Millner et al., 2020). Within psychometric studies, recent research has identified questionable research conduct that can threaten the validity and reliability of psychometric research results (Manapat et al., 2023). To ensure more rigorous and reliable psychometric research, good practices associated with transparency and open science have been recommended (Flake & Fried, 2020; Flores-Kanter & Mosquera, 2023). These practices include the adoption of open science principles such as preregistration, preprinting, and the sharing of study materials, code, and data. Therefore, it is pertinent to investigate the extent to which these open science practices are being utilized in psychometric suicide research. By gathering this information, we can lay the groundwork for raising awareness and promoting responsible research conduct that may be less commonly used or overlooked (Carpenter & Law, 2021).

1 We did not find any previous work on OSPs in suicide research using the following search terms:
- SCOPUS: ( TITLE-ABS-KEY ( suici* ) AND TITLE-ABS-KEY ( "open science" ) )
- Web of Science: suici* (Title) and "open science" (Title) or suici* (Abstract) and "open science" (Abstract)
- Cochrane: Title Abstract Keyword: sui* AND "open science"
A search engine was not available in PROSPERO. These searches were conducted on May 11, 2023. We ran a new search in PROSPERO on May 22, 2023, and obtained no hits for the string "suicide open science".

The report will be structured as follows.
First, we will provide a brief description of OSPs, including their aims and scope. This section will offer a comprehensive understanding of the importance and benefits of implementing OSPs in suicide research. Next, we will examine the existing literature to explore the prevalence of questionable research practices and OSPs in previous suicide research studies. This analysis will shed light on the current landscape and highlight any gaps or challenges in the adoption of OSPs within the field. Moving forward, we will outline the systematic review methodology employed in this study, emphasizing the approach used to identify and evaluate psychometric studies related to suicidal behavior. The main findings obtained from the systematic review will then be presented, providing insights into the extent to which OSPs are being utilized in psychometric suicide research. Finally, we will engage in in-depth discussions and draw conclusions based on the findings, addressing the implications and potential future directions for promoting responsible research practices in this critical area.

Open Science: What and Why?

In this paper, when we refer to open science, we are specifically addressing initiatives that aim to promote transparent conduct in research. Transparent conduct entails meeting the following requirements (Nosek & Bar-Anan, 2012): a) making information accessible, b) explicitly detailing the steps and decision-making involved in each phase of the scientific cycle, and c) providing complete and non-partial information. These transparent behaviors are advocated to advance our science by enhancing rigor, reliability, and equitable access to research (Pownall et al., 2023; Segerstrom et al., 2023; UNESCO, 2021). It is important to note that open science practices alone do not guarantee the veracity or validity of reported information.
To uphold the principles of good science, rigor and adherence to the value of truthfulness in information are equally crucial (Antonakis, 2017; Banks et al., 2019; Chin et al., 2021; Haeffel, 2022; Simmons et al., 2011). In the past decade, there has been increasing emphasis on adopting open science practices as an effective means to counter questionable research practices or behaviors (Banks et al., 2019; Munafò et al., 2017; Segerstrom et al., 2023; Wigboldus & Dotsch, 2016). This call has been made in general terms and has been particularly highlighted in specific fields such as psychometrics (Flake & Fried, 2020; Flores-Kanter & Mosquera, 2023; Manapat et al., 2023) and suicide research (Carpenter & Law, 2021; Kirtley et al., 2022). For instance, preregistration of hypotheses and analytic plans before data collection and open sharing of data can help reduce publication bias, which is characterized by selectively publishing positive results that reach statistical significance. Preregistration also serves as a safeguard against presenting post hoc hypotheses and analyses as if they were planned a priori (a practice known as HARKing: Hypothesizing After the Results are Known) and against p-hacking. In the domain of psychometric studies, preregistration of research projects/plans is proposed as a valuable approach to address specific forms of selective reporting, such as fit-hacking (employing different strategies to achieve an acceptable or optimal model fit) and model-HARKing (constructing the entire document to align with the model that exhibits the best fit) (Flores-Kanter & Mosquera, 2023). By preregistering research projects/plans, psychometric researchers can clearly outline essential aspects, including the measurement and structural models to be estimated, as well as the fit indicators to be considered.
It is important to highlight that open science practices are not solely advocated to reduce questionable research practices but also to foster a better research culture and enhance the openness and comprehensibility of the scientific process underlying research findings (Banks et al., 2018; Chin et al., 2021). This principle also applies to the domain of psychometric studies (Flake & Fried, 2020; Flores-Kanter & Mosquera, 2023). Open access options ensure that all relevant information and materials necessary to fully understand the research process (e.g., design protocols, measures, analytic scripts) and the achieved results (e.g., data) are openly accessible. This promotes reproducible and replicable science and enables evaluation of the correct implementation of study procedures. Furthermore, open-access publishing practices such as preprints and postprints facilitate broader and faster dissemination of research findings.

Questionable Research Conduct and Open Science Practices in Suicide Research

To date, no comprehensive studies have specifically examined the prevalence of questionable research conduct in suicide research. However, it is reasonable to assume that suicide research is not exempt from such issues, and it is hypothesized that questionable behaviors are also present in this domain (Carpenter & Law, 2021). For example, Carpenter and Law (2021) highlight the strong indicators of publication bias evident in Franklin et al.'s (2016) meta-analysis as evidence of a significant problem in the validity of the literature on suicide prediction: "the meta-analysis uncovered that many risk factors reported in extant suicide literature are weak or questionable, and the authors found the data provided little insight on how to advance research and prevention practices."
Carpenter and Law (2021) also provide arguments that support the hypothesis that other questionable research practices are present in suicide research, such as the simultaneous occurrence of p-hacking and HARKing: given that studies on suicide often take a considerable amount of time to design and execute, this can create substantial "researcher degrees of freedom" (Simmons et al., 2011), that is, decision-making possibilities (e.g., considering which aspect of suicide to examine, such as suicidal ideation or suicide attempts, and whether or not to count aborted attempts as suicide attempts). As a result, for suicide researchers "it may be tempting to inadvertently run many analyses and—inadvertently or intentionally—selectively publish those that reach p < 0.05" (Carpenter & Law, 2021). Similarly, open science practices have not been extensively investigated in the field, but it is hypothesized that their adoption is still limited (Kirtley et al., 2022).

The Present Research

While concrete studies assessing questionable research conduct and open science practices in suicide research are lacking, existing evidence from related fields and general research practices suggests the need for investigation in this specific context. By exploring the presence of these behaviors and the extent of open science practices in suicide research, we can gain valuable insights into the current state of the field and identify areas for improvement. This systematic review aims to bridge this gap by examining the frequency of open science practices and the prevalence of questionable research conduct in psychometric studies of measures related to suicidal thoughts and behaviors.
Method

Protocol and Registration

To fulfill the objectives of this research, a systematic review was conducted following the PRISMA-P (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols) guidelines (PRISMA-P Group et al., 2015; Shamseer et al., 2015) and adhering to open science recommendations for more robust and reliable analyses (Moreau & Gamble, 2020). The protocol for this review was registered in accordance with PRISMA and initially submitted to the PROSPERO register. However, due to the prioritization of COVID-19-related submissions, our protocol was automatically rejected. Consequently, we preregistered the protocol on the Open Science Framework (OSF) Registries platform (https://doi.org/10.17605/OSF.IO/2FUQD). The registration of the protocol was completed on March 7, 2023. Our preregistration is associated with an OSF Project (https://doi.org/10.17605/OSF.IO/8FQG5) where all materials and data related to this systematic review are accessible.

Information Source and Search Strategy

The systematic literature search was performed in two electronic databases, Web of Science (title) and Scopus (title), in March 2023. Search terms are shown in Table 1.2

2 The search process was initiated using SCOPUS as the primary database. Initially, the search strategy included the title and abstract, as well as keywords. However, this initial search yielded over four thousand results, many of which were outside the scope of this review. Consequently, the search strategy was refined to focus solely on searching by title.

Eligibility Criteria

These were the inclusion criteria for the present study3:

1. Study Characteristics:
a. Psychometric studies (related to measurement of psychological attributes);
b. Original non-review studies;
c. Studies on the measurement of suicidal behavior (including indicators of suicidal ideation and/or suicide attempt).

2. Report Characteristics:
a. Studies developed since 2010;
b. Written in English, Spanish, or Portuguese.

3 Why these inclusion criteria? We consider original psychometric studies (1.a. and 1.b.) to verify the frequency of open science tool usage in such studies. We include studies on the measurement of suicidal behavior (1.c.) to limit measurements to those that include suicidal ideation and/or attempts (Flores-Kanter et al., 2023). Other relevant constructs, such as knowledge-prejudice about suicide, are beyond the scope of this project. We consider studies from 2010 to date (2.a.) since, from that year on, certain practices, such as questionable research practices, began to be more strongly criticized, and responsible research practices were promoted (Munafò et al., 2017; Nelson et al., 2018). This provides a baseline for comparison with the most current studies. Finally, we consider papers written in English, Spanish, or Portuguese (2.b.) since the authors of this work are proficient in these languages.

The search yielded 290 results. After removing duplicates and retrieving articles, 152 studies were screened against the inclusion and exclusion criteria (see Figure 1 for a detailed flow chart). A total of 71 studies were excluded because they did not meet our inclusion criteria. The most common reasons for exclusion were: a focus on predictors of suicidal behavior (e.g., hopelessness, acquired capability) rather than suicidal behavior itself; a focus on other constructs such as social attitudes toward suicide, suicide literacy, and suicide stigma; and a focus on the health personnel who provide care. Eighty-one articles met all eligibility criteria and were therefore included in the present review. The initial databases as well as the final reduced databases after applying the inclusion criteria can be downloaded from https://doi.org/10.17605/OSF.IO/8FQG5, where one can also access the agreed reasons for the exclusion of reports.

Data Extraction

We extracted information about the authors, year of publication, first author's country of affiliation, and type of publication (i.e., open access or restricted). Moreover, we extracted information about the means by which restricted papers were retrieved, the presence of open science practices (i.e., preprint, preregistration, open material, open data), and the source of open access data/material for each included study.

Assessment of Open Science Practices

We assessed the presence of OSPs in each of the 81 studies identified. To determine the presence of preprints/postprints, we conducted a search using the Google search engine specifically for manuscripts published under restricted access (not open access). The title of each report was entered verbatim, and we checked for any links to preprint/postprint versions within the first two pages of the search results. The presence of other OSPs was assessed by examining relevant sections of each manuscript, including the first page, the end of the introduction, the method section, the results, and any statements related to the inclusion of supplementary materials.

Results

Frequency of Open Science Practices

First, the publication types were examined to determine the number of open access and restricted access manuscripts (Table 2). The majority of manuscripts were found to be published under restricted access (59%). The analysis further explored the frequency of preprints among the restricted access manuscripts, revealing that only 4% of them had a preprint version. Next, the presence of open science practices related to preregistrations and open materials was assessed (Table 2). None of the analyzed manuscripts presented preregistrations. Regarding the use of open materials, 32% of the manuscripts demonstrated this open science practice.
Among the reports with open materials, 100% provided access to the applied version of the instrument used in the psychometric study, while only 4% provided access to the data analysis script. As for the data, only 1% of the manuscripts provided access to the database. Furthermore, when manuscripts utilized open materials and/or open data, the least frequent approach was the use of specialized platforms like the Open Science Framework (OSF) (4%). Most commonly, access to open materials and data was provided within the manuscript itself or through supplementary material (77%).

Frequency of Open Science Practices in relation to Year of Publication, First Authors' Country-Region of Affiliation, and Type of Publication4

The variations in publication type (open access vs. restricted access) were analyzed based on the year of publication (Figure 2) and the region of affiliation of the first author (Figure 3). In most of the years considered, the frequency of publication in restricted format exceeded that of open access. However, this trend was reversed in the last two years, particularly in 2021. Analyzing the trends separately, there was a gradual increase in open access publications over the years, while the trend for restricted access publications was less clear. Specifically, a higher frequency of restricted access publications was observed in 2015, followed by a clear decrease until 2018. An increase then occurred in 2019, followed by a sharp decrease, especially in 2021, and then a marked increase in restricted access publications in 2022. Regarding the variations in publication type according to the region of affiliation, most regions showed a similar frequency of open and restricted access publication, except for North America and South-Central America, which showed clear trends towards restricted and open access publication, respectively (Figure 3).
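The percentages reported in this Results section are simple proportions over the coded studies. A minimal sketch of how such a tabulation might look, using a toy coding sheet with hypothetical field names (not the authors' actual coding scheme or data):

```python
# Toy coding sheet: one record per included study. Field names and values
# are hypothetical, purely for illustration.
studies = [
    {"access": "restricted", "open_materials": True},
    {"access": "open",       "open_materials": False},
    {"access": "restricted", "open_materials": False},
    {"access": "restricted", "open_materials": True},
    {"access": "open",       "open_materials": False},
]

def pct(rows, predicate):
    """Percentage of coded studies satisfying a coding predicate."""
    return 100.0 * sum(predicate(r) for r in rows) / len(rows)

# Analogous to the 59% restricted-access and 32% open-materials figures
print(pct(studies, lambda r: r["access"] == "restricted"))  # 60.0 on this toy sample
print(pct(studies, lambda r: r["open_materials"]))          # 40.0 on this toy sample
```

The same predicate-based helper extends directly to the other coded OSPs (preprints, preregistration, open data).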
Secondly, the variation in open materials practices was analyzed in relation to the year of publication (Figure 4) and the region of affiliation (Figure 5). The analysis did not reveal a discernible trend towards an increase in open materials over the years (Figure 4). Examining the trends individually, the publication of open materials was most frequent in 2012, its peak year; in the following years, the frequency was lower, with some fluctuations. Open material practices were also less frequent in the last three years (2020 to 2022) than in the preceding three years. Regarding the absence of open material practices, there was an increase in frequency in 2015, followed by a downward trend. Interestingly, the non-presence of open material practices showed an increasing trend from 2018 onwards, reaching its maximum frequency in 2022. Regarding the variation in open materials practices by region of affiliation, the publication of open materials was rare in most regions (Figure 5). Finally, no visible relationship was observed between publishing open materials and the type of publication (open access vs. restricted access; Figure 6). Open materials practices were infrequent in both open access and restricted access publications.

4 It was not possible to analyse these associations in the cases of preprints, open scripts, and open data, due to the small number of manuscripts that presented these types of open science practices.

Discussion

The findings of this study highlight the infrequent use of OSPs in psychometric suicide research, aligning with the broader assumption that such practices are the exception rather than the rule (Kirtley et al., 2022). The analysis in the present report revealed a low frequency of OSPs across various dimensions.
Firstly, the majority of manuscripts were published under restricted access, limiting the availability of research findings to a broader audience (Table 2). Concerningly, none of the included manuscripts utilized preregistration, an essential tool in countering questionable research practices and particularly relevant in psychometric research (Christensen et al., 2019; Gomes et al., 2022; Nosek et al., 2015; Tackett et al., 2019; Flores-Kanter & Mosquera, 2023). Furthermore, the provision of open materials and data was also rare among the analyzed manuscripts (Table 2). When open materials were provided, the focus was predominantly on sharing the applied instrument, while access to analysis scripts and data was extremely limited. This lack of transparency and limited access to crucial components of the research hampers error detection and threatens the replicability and reproducibility of findings in psychometric suicide research (Carpenter & Law, 2021; Kirtley et al., 2022). Additionally, there was a minimal presence of preprints or postprints in manuscripts published under restricted access, further restricting the accessibility of research findings (Table 2). This discrepancy between restricted access and open access publications hinders stakeholders, including policymakers and suicide prevention professionals, from accessing the full range of psychometric research on measures of suicidal thoughts and behavior. Promoting the use of open access practices becomes crucial for fostering a broader scope of research in the field and aligning with the values of equity, justice, and collective benefit associated with open science (Flake & Fried, 2020; Flores-Kanter & Mosquera, 2023; Tackett et al., 2019; UNESCO, 2021). When considering the variation of OSPs over the years, by region, and by publication type, the overall picture remains concerning.
While there appears to be a reversal of the trend favoring restricted access publication in the last two years, particularly in 2021, the increase in open access publications does not necessarily indicate a parallel increase in other key OSPs (Figure 2). The observation that South-Central America shows a higher frequency of open access publications may be attributed to circumstantial factors rather than a broader adherence to OSPs. This could be explained by the absence of publication costs in open access options, allowing authors to publish in their native language without the need for costly translation into English. However, the rarity of other OSPs in this region (Figure 5), the infrequent use of specialized platforms for OSPs (i.e., OSF), and the lack of association between open access practices and open materials/data practices (Figure 6) suggest that the situation is not entirely favorable, and the adoption of OSPs remains infrequent and partial. The adoption of OSPs is relevant in several respects. On one hand, we believe that the adoption of OSPs can help mitigate the potential replicability issues present in the field of suicide research. Studies that have investigated the robustness, reproducibility, or replicability of previous findings in suicide research do not appear to be common; we have identified only one paper addressing this aspect (Tello et al., 2020). Consequently, comprehensive evidence regarding the robustness, reproducibility, or replicability of previous findings in suicide research is lacking. However, there is some indirect evidence from recent meta-analyses. For example, the meta-analysis conducted by Franklin et al. (2016) enables us to conclude that there is a high degree of stability among investigations regarding the risk factors identified in suicide research, but it also points out elements that cast doubt on the replicability of the specific results obtained.
Thus, although the average effect has been similar across the periods considered, the effect sizes included in the meta-analysis have shown high heterogeneity among themselves (I² between 86.06% and 98.45%). This high level of heterogeneity has also been observed in a recent meta-analysis that examined whether psychopathology prospectively predicts suicidal thoughts and behaviors (Guzmán et al., 2019), which reported an I² value of 92.26%. Such high values of heterogeneity, as expressed by the I² statistic, indicate that almost all of the effect sizes considered do not overlap with each other and can therefore be considered unique effect sizes (Borenstein et al., 2017). Given that the observed effect sizes can be assumed to vary, these I² values allow us to hypothesize that the range of underlying effect sizes has been highly variable. Although it is not possible to verify this with the information provided in the report by Franklin et al. (2016), the study by Guzmán et al. (2019) reports a 95% prediction interval of 0.92 to 4.77, meaning that predicted effects range from weak negative associations to large positive associations. In other words, this heterogeneity indicates that there is low replicability among the primary studies included in the referenced meta-analyses (Hedges & Schauer, 2019; Landy et al., 2020). On the other hand, it is important to emphasize that the lack of adoption of OSPs limits the ability to avoid an even greater problem that affects the quality of research, beyond replicability: a result can be robust, replicable, and reproducible, but still not valid (Flake et al., 2022; Nosek et al., 2022). In the context of meta-analysis, as Nosek et al. (2022) assert, averaging studies that vary in terms of quality and risk of bias "can lead to a false sense of precision and accuracy." Addressing the issue of validity in psychometric suicide research, the recent meta-analysis by Franklin et al.
(2016) points out: "Outcome specificity was complicated by the tendency for different researchers to use different criteria to determine what did and did not qualify as a specific type of suicidal thought or behavior. These variations reflect the fact that, as with most other psychological constructs, there are no universally accepted definitions of the specific forms of suicidal thoughts and behaviors." What underlies this concern is the uncertainty regarding the construct validity of the measurements used in the primary studies (Flake et al., 2017; McClure et al., 2023). The authors are right to have such concerns because "literatures that seem coherent and rigorous to the casual or even the experienced reader might, in fact, be anything but. For a reasonable synthesis of the evidence, meta-analysts would have to correct for the differences in sample selection, variability, reliability, and any other measurement-driven sources of heterogeneity... Hence, (a) the lack of strong empirical or procedural norms in measurement, (b) the lack of transparency in reporting, and (c) the lack of common referents (i.e., test norms) in measurement are an enormous threat to meaningful evidence cumulation and research synthesis... Not being able to detect that study results are the outcome of a fishing [measurement] expedition can result in a seemingly homogeneous literature that is actually the product of a trawling conglomerate" (Elson et al., 2023). Thus, ignoring construct validity can render replication results uninterpretable or invalid (Flake et al., 2022; Schimmack, 2021) or produce results that are not replicable (Lilienfeld & Strother, 2020). All in all, we believe that in both potential scenarios within psychometric suicide research (i.e., results that are replicable but not valid, and results that are not replicable), OSPs can play a significant and beneficial role.
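To make the heterogeneity statistics cited in this discussion concrete, the following sketch shows how I² and a 95% prediction interval are obtained in a standard random-effects framework. It is illustrative only: the function names and all numerical inputs are hypothetical, not values taken from the cited meta-analyses.

```python
import math

def i_squared(q, df):
    """Higgins' I-squared: the percentage of total variability in observed
    effect sizes attributable to between-study heterogeneity rather than
    within-study sampling error (q is Cochran's Q; df = k - 1 for k studies)."""
    return max(0.0, (q - df) / q) * 100.0

def prediction_interval_or(log_mu, tau2, se_log_mu, t_crit):
    """Approximate 95% prediction interval for an odds ratio: the summary
    log-odds ratio plus/minus t * sqrt(tau^2 + SE^2), back-transformed to
    the odds-ratio scale (Higgins et al.'s formulation; t has k - 2 df)."""
    half_width = t_crit * math.sqrt(tau2 + se_log_mu ** 2)
    return (math.exp(log_mu - half_width), math.exp(log_mu + half_width))

# Hypothetical meta-analysis of 9 studies: Q = 103.7 on 8 df yields an
# I-squared in the 90%+ range discussed in the text.
print(round(i_squared(103.7, 8), 1))

# Hypothetical summary OR = 1.5 with substantial between-study variance:
# the prediction interval crosses 1.0, so individual future studies could
# plausibly find effects in either direction (t_crit of about 2.36 for 7 df).
lo, hi = prediction_interval_or(math.log(1.5), tau2=0.30, se_log_mu=0.10, t_crit=2.36)
print(round(lo, 2), round(hi, 2))
```

A wide prediction interval that spans 1.0 despite a clearly positive average odds ratio is precisely the pattern described above for Guzmán et al. (2019): the pooled effect is stable, but the effects expected in individual settings vary widely.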
Limitations

The present study has several limitations that should be acknowledged. Firstly, our focus was solely on psychometric studies related to measures of suicidal thoughts and behavior. While it is reasonable to assume that the findings may have some relevance to other types of suicide research, systematic reviews are needed to validate and extend our conclusions to other areas of study in the field of suicide research. For example, studies utilizing biological or behavioral measures, experimental designs, and treatment outcome analyses may be particularly susceptible to replicability concerns, making the exploration of OSP adoption in these areas especially relevant. Secondly, our analysis was based on articles retrieved from two international databases, and we did not include other databases or consider grey literature. This narrow scope of data collection may have limited the generalizability of our findings. Future studies should consider expanding the search to include additional databases and sources to obtain a more comprehensive and representative sample of psychometric studies related to suicide. Another limitation is that our analysis focused primarily on the presence or absence of open science practices, without assessing the quality or extent of their implementation. For example, Willroth and Atherton (2023) have recently emphasized that while psychological researchers are increasingly using preregistration as a tool to increase the credibility of their research findings, most preregistration deviations go unreported or are reported in unsystematic ways. Further research should explore not only the frequency but also the quality of open science practices in psychometric studies on suicide. This could involve examining factors such as the comprehensiveness of data sharing, the accessibility of materials and methods, and adherence to transparent reporting guidelines.
Additionally, our study did not delve into the specific reasons why open science practices are infrequently adopted in psychometric suicide research. Understanding the barriers and challenges that researchers face in implementing these practices is crucial for developing targeted interventions and support mechanisms to promote their adoption. Lastly, it is important to note that the field of open science is evolving rapidly, with new practices and standards continually emerging. Our study reflects the state of open science practices up to the date of our literature search in 2023, and it may not capture more recent developments in the field. Future research should consider updating the analysis to account for the evolving landscape of open science practices in psychometric studies related to suicide. Despite these limitations, our study provides valuable insights into the current status of open science practices in psychometric research on suicide. By recognizing these limitations and building upon our findings, future research can contribute to the advancement of open science principles in the field, ultimately improving the quality and impact of research related to suicidal thoughts and behavior.
Conclusion
In summary, the current state of OSPs in psychometric suicide research is critical, with limited utilization and availability of open science practices. The identification of these gaps presents an opportunity to promote responsible research conduct and advocate for the wider adoption of OSPs in the field (Nelson et al., 2018). Further efforts are needed to increase awareness, encourage the implementation of OSPs, and facilitate access to research materials and data to ensure the transparency, reproducibility, replicability, and impact of psychometric research in suicide.
Researchers interested in further adhering to open science practices, as discussed in this manuscript, can draw upon concrete precedents and recommendations put forth by notable contributions in the field. For instance, Kirtley et al. (2022) and Carpenter and Law (2021) have addressed issues related to open science in suicide research, while Flores-Kanter and Mosquera (2023) have focused specifically on psychometric studies. Several key points from these works deserve highlighting. Firstly, promoting the use of preregistration is essential to preventing the questionable research behaviors that have been identified in the field. Preregistration allows for transparent documentation of study protocols, hypotheses, and analysis plans, enhancing research rigor and reducing biases. Secondly, the publication of preprints or postprints is highly relevant to broadening the scope of research and increasing its accessibility. Sharing research findings at an earlier stage through preprints can facilitate timely dissemination and invite valuable feedback from the scientific community. Thirdly, greater access to research materials should be promoted. It is not sufficient to provide information only about the applied scale; researchers should also consider sharing the syntax or steps of the analysis. This comprehensive sharing of materials enables better understanding, scrutiny, and replication of research findings. Fourthly, having sensitive data should not be viewed as an impediment to sharing data. Researchers can explore the option of publishing synthetic data, which ensures privacy and confidentiality while still allowing for the replication and verification of results. Lastly, the use of platforms such as the Open Science Framework (OSF) can greatly facilitate the implementation of open science practices. Researchers are encouraged to utilize these platforms to enhance transparency, collaboration, and reproducibility in their studies.
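To make the synthetic-data recommendation concrete, the following is a minimal, hypothetical sketch (not a procedure described in this review): fit a multivariate normal to a deidentified item-score matrix and sample artificial respondents that preserve the items' means and covariances. Real sensitive data would additionally call for dedicated synthetic-data tools and a disclosure review; this only illustrates the principle.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in for real scale data: 200 respondents x 5 Likert-type items (1-5).
real = rng.integers(1, 6, size=(200, 5)).astype(float)

# Summaries to be preserved: item means and the item covariance matrix.
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Draw synthetic respondents, then round/clip back to the 1-5 response range.
synthetic = rng.multivariate_normal(mean, cov, size=200)
synthetic = np.clip(np.round(synthetic), 1, 5)

# The synthetic sample mimics the covariance structure without reproducing
# any actual respondent's record.
print(synthetic.shape)
```

Sharing such a synthetic matrix alongside the analysis script would let readers rerun factor-analytic or reliability code, even when the raw responses cannot be released.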
Additionally, it is worth noting that OSPs can be implemented retrospectively, even in concluded studies. Sharing materials, code, and deidentified data from completed studies can contribute to transparency and cumulative knowledge in the field. By embracing these recommendations and precedents, researchers can actively contribute to the advancement of open science practices in psychometric suicide research, fostering transparency, reproducibility, and the collective growth of knowledge in the field.
Supplementary Material
The preregistration can be accessed via the OSF Registries platform (https://doi.org/10.17605/OSF.IO/2FUQD). Our preregistration is associated with an OSF project (https://doi.org/10.17605/OSF.IO/8FQG5) where all materials and data related to the present systematic review are accessible.
References
Antonakis, J. (2017). On doing better science: From thrill of discovery to policy implications. The Leadership Quarterly, 28(1), 5–21. https://doi.org/10.1016/j.leaqua.2017.01.006
Banks, G. C., Field, J. G., Oswald, F. L., O’Boyle, E. H., Landis, R. S., Rupp, D. E., & Rogelberg, S. G. (2019). Answers to 18 Questions About Open Science Practices. Journal of Business and Psychology, 34(3), 257–270. https://doi.org/10.1007/s10869-018-9547-8
Bauer, B. W., Law, K. C., Rogers, M. L., Capron, D. W., & Bryan, C. J. (2021). Editorial overview: Analytic and methodological innovations for suicide‐focused research. Suicide and Life-Threatening Behavior, 51(1), 5–7. https://doi.org/10.1111/sltb.12664
Borges, G., Angst, J., Nock, M. K., Ruscio, A. M., & Kessler, R. C. (2008). Risk factors for the incidence and persistence of suicide-related outcomes: A 10-year follow-up study using the National Comorbidity Surveys. Journal of Affective Disorders, 105(1–3), 25–33. https://doi.org/10.1016/j.jad.2007.01.036
Bosma, C. M., & Granger, A. M. (2022). Sharing is caring: Ethical implications of transparent research in psychology. American Psychologist, 77(4), 565–575.
https://doi.org/10.1037/amp0001002
Carpenter, T. P., & Law, K. C. (2021). Optimizing the scientific study of suicide with open and transparent research practices. Suicide and Life-Threatening Behavior, 51(1), 36–46. https://doi.org/10.1111/sltb.12665
Chin, J. M., Pickett, J. T., Vazire, S., & Holcombe, A. O. (2021). Questionable Research Practices and Open Science in Quantitative Criminology. Journal of Quantitative Criminology. https://doi.org/10.1007/s10940-021-09525-6
Christensen, G., Wang, Z., Paluck, E. L., Swanson, N., Birke, D. J., Miguel, E., & Littman, R. (2019). Open Science Practices are on the Rise: The State of Social Science (3S) Survey [Preprint]. MetaArXiv. https://doi.org/10.31222/osf.io/5rksu
Flake, J. K., Davidson, I. J., Wong, O., & Pek, J. (2022). Construct validity and the validity of replication studies: A systematic review. American Psychologist, 77(4), 576–588. https://doi.org/10.1037/amp0001006
Flake, J. K., & Fried, E. I. (2020). Measurement Schmeasurement: Questionable Measurement Practices and How to Avoid Them. Advances in Methods and Practices in Psychological Science, 3(4), 456–465. https://doi.org/10.1177/2515245920952393
Flores Kanter, P. E. (2017). El lugar de la psicología en las investigaciones empíricas del suicidio en Argentina: Un estudio bibliométrico [The place of psychology in empirical research on suicide in Argentina: A bibliometric study]. Interdisciplinaria: Revista de Psicología y Ciencias Afines, 34(1). https://doi.org/10.16888/interd.2017.34.1.2
Flores-Kanter, P. E., Alesandrini, C., & Alvarado, J. M. (2023). Columbia Suicide Severity Rating Scale: Evidence of Construct Validity in Argentinians. Behavioral Sciences, 13(3), 198. https://doi.org/10.3390/bs13030198
Flores-Kanter, P. E., García-Batista, Z. E., Moretti, L. S., & Medrano, L. A. (2019). Towards an Explanatory Model of Suicidal Ideation: The Effects of Cognitive Emotional Regulation Strategies, Affectivity and Hopelessness. The Spanish Journal of Psychology, 22, E43. https://doi.org/10.1017/sjp.2019.45
Flores-Kanter, P. E., & Mosquera, M. (2023).
How do you Behave as a Psychometrician? Research Conduct in the Context of Psychometric Research. The Spanish Journal of Psychology, 26, e13. https://doi.org/10.1017/SJP.2023.14
Franklin, J. C., Ribeiro, J. D., Fox, K. R., Bentley, K. H., Kleiman, E. M., Huang, X., Musacchio, K. M., Jaroszewski, A. C., Chang, B. P., & Nock, M. K. (2017). Risk factors for suicidal thoughts and behaviors: A meta-analysis of 50 years of research. Psychological Bulletin, 143(2), 187–232. https://doi.org/10.1037/bul0000084
Gomes, D. G. E., Pottier, P., Crystal-Ornelas, R., Hudgins, E. J., Foroughirad, V., Sánchez-Reyes, L. L., Turba, R., Martinez, P. A., Moreau, D., Bertram, M. G., Smout, C. A., & Gaynor, K. M. (2022). Why don’t we share data and code? Perceived barriers and benefits to public archiving practices. Proceedings of the Royal Society B: Biological Sciences, 289(1987), 20221113. https://doi.org/10.1098/rspb.2022.1113
Haeffel, G. J. (2022). Psychology needs to get tired of winning. Royal Society Open Science, 9(6), 220099. https://doi.org/10.1098/rsos.220099
Kirtley, O. J., Janssens, J. J., & Kaurin, A. (2022). Open Science in Suicide Research Is Open for Business. Crisis, 43(5), 355–360. https://doi.org/10.1027/0227-5910/a000859
Lewis, N. A. (2021). What counts as good science? How the battle for methodological legitimacy affects public psychology. American Psychologist, 76(8), 1323–1333. https://doi.org/10.1037/amp0000870
Lilienfeld, S. O., & Strother, A. N. (2020). Psychological measurement and the replication crisis: Four sacred cows. Canadian Psychology/Psychologie Canadienne, 61(4), 281–288. https://doi.org/10.1037/cap0000236
Manapat, P. D., Anderson, S. F., & Edwards, M. C. (2023). Evaluating avoidable heterogeneity in exploratory factor analysis results. Psychological Methods. https://doi.org/10.1037/met0000589
Millner, A. J., Robinaugh, D. J., & Nock, M. K. (2020).
Advancing the Understanding of Suicide: The Need for Formal Theory and Rigorous Descriptive Research. Trends in Cognitive Sciences, 24(9), 704–716. https://doi.org/10.1016/j.tics.2020.06.007
Moreau, D., & Gamble, B. (2020). Conducting a meta-analysis in the age of open science: Tools, tips, and practical recommendations. Psychological Methods. https://doi.org/10.1037/met0000351
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
Naghavi, M. (2019). Global, regional, and national burden of suicide mortality 1990 to 2016: Systematic analysis for the Global Burden of Disease Study 2016. BMJ, l94. https://doi.org/10.1136/bmj.l94
Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s Renaissance. Annual Review of Psychology, 69(1), 511–534. https://doi.org/10.1146/annurev-psych-122216-011836
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374
Nosek, B. A., & Bar-Anan, Y. (2012). Scientific Utopia: I. Opening Scientific Communication. Psychological Inquiry, 23(3), 217–243. https://doi.org/10.1080/1047840X.2012.692215
Nosek, B. A., Hardwicke, T. E., Moshontz, H., Allard, A., Corker, K. S., Dreber, A., Fidler, F., Hilgard, J., Kline Struhl, M., Nuijten, M. B., Rohrer, J. M., Romero, F., Scheel, A. M., Scherer, L. D., Schönbrodt, F. D., & Vazire, S. (2022). Replicability, Robustness, and Reproducibility in Psychological Science. Annual Review of Psychology, 73(1), 719–748.
https://doi.org/10.1146/annurev-psych-020821-114157
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
Pownall, M., Azevedo, F., König, L. M., Slack, H. R., Evans, T. R., Flack, Z., Grinschgl, S., Elsherif, M. M., Gilligan-Lee, K. A., De Oliveira, C. M. F., Gjoneska, B., Kalandadze, T., Button, K., Ashcroft-Jones, S., Terry, J., Albayrak-Aydemir, N., Děchtěrenko, F., Alzahawi, S., Baker, B. J., … FORRT. (2023). Teaching open and reproducible scholarship: A critical review of the evidence base for current pedagogical methods and their outcomes. Royal Society Open Science, 10(5), 221255. https://doi.org/10.1098/rsos.221255
Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & the PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1. https://doi.org/10.1186/2046-4053-4-1
Schimmack, U. (2021). The Validation Crisis in Psychology. Meta-Psychology, 5. https://doi.org/10.15626/MP.2019.1645
Segerstrom, S. C., Diefenbach, M. A., Hamilton, K., O’Connor, D. B., Tomiyama, A. J., & Behavioral Medicine Research Council. (2023). Open science in health psychology and behavioral medicine: A statement from the Behavioral Medicine Research Council. Health Psychology, 42(5), 287–298. https://doi.org/10.1037/hea0001236
Shamseer, L., Moher, D., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & the PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. BMJ, 349, g7647. https://doi.org/10.1136/bmj.g7647
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant.
Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632
Tackett, J. L., Brandes, C. M., & Reardon, K. W. (2019). Leveraging the Open Science Framework in clinical psychological assessment research. Psychological Assessment, 31(12), 1386–1394. https://doi.org/10.1037/pas0000583
Tello, N., Harika-Germaneau, G., Serra, W., Jaafari, N., & Chatard, A. (2020). Forecasting a Fatal Decision: Direct Replication of the Predictive Validity of the Suicide–Implicit Association Test. Psychological Science, 31(1), 65–74. https://doi.org/10.1177/0956797619893062
Tijdink, J. K., Horbach, S. P. J. M., Nuijten, M. B., & O’Neill, G. (2021). Towards a Research Agenda for Promoting Responsible Research Practices. Journal of Empirical Research on Human Research Ethics, 16(4), 450–460. https://doi.org/10.1177/15562646211018916
Wigboldus, D. H. J., & Dotsch, R. (2016). Encourage Playing with Data and Discourage Questionable Reporting Practices. Psychometrika, 81(1), Article 1. https://doi.org/10.1007/s11336-015-9445-1
World Health Organization. (2021). Suicide worldwide in 2019: Global health estimates. World Health Organization.
World Health Organization. (2022). World mental health report: Transforming mental health for all. World Health Organization.
Figures
Figure 1. PRISMA flow chart.
Figure 2. Type of publication as a function of year. Note. A dotted vertical line is placed at the year 2015, when the influential manuscript of the Open Science Collaboration on the lack of replicability in psychology studies was published. Since the year 2023 is ongoing, the range of years in the graph ends at 2022.
Figure 3. Access as a function of first author's region of affiliation. Note. Euro = Europe; NortA = North America; Osceani = Oceania; SoutA = South-Central America.
Figure 4. Open materials practices as a function of year. Note.
A dotted vertical line is placed at the year 2015, when the influential manuscript of the Open Science Collaboration on the lack of replicability in psychology studies was published. Since the year 2023 is ongoing, the range of years in the graph ends at 2022.
Figure 5. Open materials practice as a function of first author's region of affiliation. Note. Euro = Europe; NortA = North America; Osceani = Oceania; SoutA = South-Central America.
Figure 6. Open materials practice as a function of type of publication.
Tables
Table 1
Search Terms.

Suicide   Measure         Psychometric
suici*    measure*        psychometric*
          scale           validity
          questionnaire   reliability
                          sensibility
                          specificity
                          development
                          construction

Note. This table summarizes the search terms used in the present systematic review. The columns are combined with the AND operator (i.e., each query includes at least one term from each of the three columns: Suicide, Measure, and Psychometric). Terms within a column are combined with the OR operator (i.e., any one term from a column satisfies that column). To be indexed, studies needed to mention at least one term from each column (e.g., suicide ideation AND scale AND psychometric properties). All duplicates were removed after completion of the search (see Figure 1 for details).
Web of Science search string: (((((TI=(suici*)) AND TI=(measure* OR scale OR questionnaire)) AND TI=(psychometric* OR validity OR reliability OR sensibility OR specificity OR development OR construction))) AND DOP=(2010-01-01/2023-03-07)) AND LA=(English OR Portuguese OR Spanish). Refined by: Document Types: Article.
Scopus search string: ( TITLE ( suici* ) AND TITLE ( measure* ) OR TITLE ( scale ) OR TITLE ( questionnaire ) AND TITLE ( psychometric* ) OR TITLE ( validity ) OR TITLE ( reliability ) OR TITLE ( sensibility ) OR TITLE ( specificity ) OR TITLE ( development ) OR TITLE ( construction ) ) AND PUBYEAR > 2009 AND ( LIMIT-TO ( DOCTYPE,"ar" ) ) AND ( LIMIT-TO ( LANGUAGE,"English" ) OR LIMIT-TO ( LANGUAGE,"Spanish" ) OR LIMIT-TO ( LANGUAGE,"Portuguese" ) )
Table 2
Frequency and Sources of Open Science Practices.

                      No     Yes
Open Access           59%    41%
Pre/Postprint         96%     4%
Open Material         68%    32%
Open Script           96%     4%
Open Data             99%     1%
Source OA
  Institutional Rep.  81%    19%
  Manuscript/SM       23%    77%
  OSF                 96%     4%

Note. Source OA = Source of Open Access; Manuscript/SM = Manuscript/Supplementary Material; OSF = Open Science Framework.
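The column logic described in the note to Table 1 (OR within a column, AND across columns) can be sketched programmatically. The following is an illustrative reconstruction of how the Web of Science title query is assembled from the three term columns; it is not the authors' actual tooling, and the `build_title_query` helper is hypothetical.

```python
# Term columns from Table 1 (taken from the Web of Science search string).
TERMS = {
    "suicide": ["suici*"],
    "measure": ["measure*", "scale", "questionnaire"],
    "psychometric": [
        "psychometric*", "validity", "reliability", "sensibility",
        "specificity", "development", "construction",
    ],
}

def build_title_query(terms):
    """Join terms within a column with OR, and columns with AND."""
    clauses = []
    for column_terms in terms.values():
        clauses.append("TI=(" + " OR ".join(column_terms) + ")")
    return " AND ".join(clauses)

query = build_title_query(TERMS)
print(query)
# TI=(suici*) AND TI=(measure* OR scale OR questionnaire) AND TI=(psychometric* OR validity OR reliability OR sensibility OR specificity OR development OR construction)
```

The same structure, translated into Scopus `TITLE(...)` clauses and combined with the date, document-type, and language limits shown above, yields the second database query.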