Search Strategies

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Brian R Haynes - One of the best experts on this subject based on the ideXlab platform.

  • Cumulative Index to Nursing and Allied Health Literature search strategies for identifying methodologically sound causation and prognosis studies
    Applied Nursing Research, 2008
    Co-Authors: Cindy Walker-Dilks, Nancy L Wilczynski, Brian R Haynes
    Abstract:

    We developed search strategies for detecting sound articles on causation and prognosis in the Cumulative Index to Nursing and Allied Health Literature (CINAHL) in the year 2000. An analytic survey was conducted, comparing hand searches of 75 journals with retrievals from CINAHL for 5,020 search terms and 11,784 combinations for causation and 9,946 combinations for prognosis. For detecting sound causation studies, a three-term strategy maximized sensitivity at 97.0% with a specificity of 52.3%. For detecting sound prognosis studies, a three-term strategy maximized sensitivity at 92.2% with a specificity of 50.0%. These search filters will enhance the searching efforts of clinicians and researchers.
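The operating characteristics quoted throughout these abstracts come from comparing a strategy's retrieval against the hand search (the gold standard). A minimal sketch in Python, with hypothetical confusion-matrix counts chosen to reproduce the causation filter's reported 97.0% / 52.3%:

```python
def sensitivity(tp, fn):
    # fraction of truly sound studies that the search strategy retrieves (recall)
    return tp / (tp + fn)

def specificity(tn, fp):
    # fraction of the remaining articles that the strategy correctly excludes
    return tn / (tn + fp)

def precision(tp, fp):
    # fraction of retrieved articles that are truly sound (positive predictive value)
    return tp / (tp + fp)

# Hypothetical counts: 100 sound causation studies, 1,000 other articles.
tp, fn = 97, 3       # sound studies retrieved / missed
tn, fp = 523, 477    # other articles excluded / retrieved
print(f"sensitivity {sensitivity(tp, fn):.1%}, specificity {specificity(tn, fp):.1%}")
# → sensitivity 97.0%, specificity 52.3%
```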

  • EMBASE search strategies achieved high sensitivity and specificity for retrieving methodologically sound systematic reviews
    Journal of Clinical Epidemiology, 2007
    Co-Authors: Nancy L Wilczynski, Brian R Haynes
    Abstract:

    Objectives: Systematic reviews of the literature are instrumental for bridging research to health care practice and are widely available through databases such as MEDLINE and EMBASE. Search strategies have been developed to aid users in MEDLINE, but no empirical work has been done for EMBASE. The objective of this study was to develop search strategies that optimize the retrieval of methodologically sound systematic reviews from EMBASE. Study Design and Setting: An analytic survey was conducted, comparing hand searches of 55 journals with retrievals from EMBASE for 4,843 candidate search terms and 17,004 combinations. Candidate search strategies were run in EMBASE, and the retrievals were compared with the hand-search data. The sensitivity, specificity, precision, and accuracy of the search strategies were calculated. Results: Two hundred twenty (16.2%) of the 1,354 articles classified as reviews met basic criteria for scientific merit. Combinations of search terms reached peak sensitivities of 94.6% with specificity at 63.7%, whereas combinations of search terms to optimize specificity reached peak specificities of 99.3% with sensitivity at 61.4%. Conclusion: Empirically derived search strategies can achieve high sensitivity and specificity for retrieving methodologically sound systematic reviews from EMBASE.

  • Developing optimal search strategies for detecting clinically sound treatment studies in EMBASE
    Journal of The Medical Library Association, 2006
    Co-Authors: Sharon Wong, Nancy L Wilczynski, Brian R Haynes
    Abstract:

    Objective: The ability to accurately identify articles about therapy in large bibliographic databases such as EMBASE is important for researchers and clinicians. Our study aimed to develop optimal search strategies for detecting sound treatment studies in EMBASE in the year 2000. Methods: Hand searches of journals were compared with retrievals from EMBASE for candidate search strategies. Six trained research assistants reviewed fifty-five journals indexed in EMBASE and rated articles using purpose and quality indicators. Candidate search strategies were developed for identifying treatment articles and then tested, and the retrievals were compared with the hand-search data. The operating characteristics of the strategies were calculated. Results: Three thousand eight hundred fifty articles were original studies on treatment, of which 1,256 (32.6%) were methodologically sound. Combining search terms revealed a top-performing strategy (random:.tw. OR clinical trial:.mp. OR exp health care quality) with sensitivity of 98.9% and specificity of 72.0%. Maximizing specificity, a top-performing strategy (double-blind:.mp. OR placebo:.tw. OR blind:.tw.) achieved a value over 96.0%, but with compromised sensitivity at 51.7%. A 3-term strategy achieved the best optimization of sensitivity and specificity (random:.tw. OR placebo:.mp. OR double-blind:.tw.), with both these values over 92.0%. Conclusion: Search strategies can achieve high performance for retrieving sound treatment studies in EMBASE.
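The multi-term strategies above are built by ORing single terms; in a Boolean retrieval model OR is set union, so adding a term can only raise sensitivity while specificity can only hold or fall. A toy illustration with hypothetical article IDs (not the study's data):

```python
# Hand-search gold standard: IDs of methodologically sound treatment studies.
sound = {1, 2, 3, 4, 8}

# Hypothetical retrieval sets for three single terms (Ovid-style names).
random_tw = {1, 2, 5}
placebo_mp = {2, 3, 6}
double_blind_tw = {4, 6, 7}

combined = random_tw | placebo_mp | double_blind_tw  # the 3-term OR strategy

def sens(retrieved):
    # sensitivity against the hand-search gold standard
    return len(retrieved & sound) / len(sound)

# The union's sensitivity dominates every single term's.
assert all(sens(t) <= sens(combined) for t in (random_tw, placebo_mp, double_blind_tw))
print(f"combined sensitivity {sens(combined):.0%}")  # → combined sensitivity 80%
```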

  • Optimal search strategies for retrieving systematic reviews from Medline: analytical survey
    BMJ, 2005
    Co-Authors: Victor M Montori, Nancy L Wilczynski, Douglas Morgan, Brian R Haynes
    Abstract:

    Objective: To develop optimal search strategies in Medline for retrieving systematic reviews. Design: Analytical survey. Data sources: 161 journals published in 2000 indexed in Medline. Main outcome measures: The sensitivity, specificity, and precision of retrieval of systematic reviews of 4862 unique terms in 782 485 combinations of one to five terms were determined by comparison with a hand search of all articles (the criterion standard) in 161 journals published during 2000 (49 028 articles). Results: Only 753 (1.5%) of the 49 028 articles were systematic reviews. The most sensitive strategy included five terms and had a sensitivity of 99.9% (95% confidence interval 99.6% to 100%) and a specificity of 52% (51.6% to 52.5%). The strategy that best minimised the difference between sensitivity and specificity had a sensitivity of 98% (97% to 99%) and specificity of 90.8% (90.5% to 91.1%). Highest precision for multiterm strategies, 57% (54% to 60%), was achieved at a sensitivity of 71% (68% to 74%). The term “cochrane database of systematic reviews.jn.” was the most precise single-term search strategy (sensitivity of 56% (52% to 60%) and precision of 96% (94% to 98%)). These strategies are available through the “limit” screen of Ovid's search interface for Medline. Conclusions: Systematic reviews can be retrieved from Medline with close to perfect sensitivity or specificity, or with high precision, by using empirical search strategies.
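With systematic reviews at only 1.5% prevalence, even a highly sensitive filter retrieves far more non-reviews than reviews, which is why the most sensitive strategy's precision is low. A rough back-calculation from the figures quoted above (the rounded counts are this sketch's assumption, not the paper's):

```python
total, n_sr = 49028, 753      # articles hand searched; systematic reviews among them
sens, spec = 0.999, 0.52      # most sensitive five-term strategy

tp = round(n_sr * sens)                   # reviews retrieved: 752
fp = round((total - n_sr) * (1 - spec))   # non-reviews retrieved: 23172
precision = tp / (tp + fp)
print(f"precision ~ {precision:.1%}")     # roughly 3%: low prevalence caps precision
```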

  • Optimal search strategies for detecting health services research studies in MEDLINE
    Canadian Medical Association Journal, 2004
    Co-Authors: Nancy L Wilczynski, Brian R Haynes, John N Lavis, Ravi Ramkissoonsingh, Alexandra E Arnold-Oatley
    Abstract:

    Background: Evidence from health services research (HSR) is currently thinly spread through many journals, making it difficult for health services researchers, managers and policy-makers to find research on clinical practice guidelines and the appropriateness, process, outcomes, cost and economics of health care services. We undertook to develop and test search terms to retrieve from the MEDLINE database HSR articles meeting minimum quality standards. Methods: The retrieval performance of 7445 methodologic search terms and phrases in MEDLINE (the test) was compared with a hand search of the literature (the gold standard) for each issue of 68 journal titles for the year 2000 (a total of 25 936 articles). We determined the sensitivity, specificity and precision (the positive predictive value) of the MEDLINE search strategies. Results: A majority of the articles that were classified as outcome assessment, but fewer than half of those in the other categories, were considered methodologically acceptable (no methodologic criteria were applied for cost studies). Combining individual search terms to maximize sensitivity, while keeping specificity at 50% or more, led to sensitivities in the range of 88.1% to 100% for several categories (specificities ranged from 52.9% to 97.4%). When terms were combined to maximize specificity while keeping sensitivity at 50% or more, specificities of 88.8% to 99.8% were achieved. When terms were combined to maximize sensitivity and specificity while minimizing the difference between the 2 measurements, most strategies for HSR categories achieved sensitivity and specificity of at least 80%. Interpretation: Sensitive and specific search strategies were validated for retrieval of HSR literature from MEDLINE. These strategies have been made available for public use by the US National Library of Medicine at www.nlm.nih.gov/nichsr/hedges/Search.html.

Jessie McGowan - One of the best experts on this subject based on the ideXlab platform.

  • PRESS peer review of electronic search strategies: 2015 guideline statement
    Journal of Clinical Epidemiology, 2016
    Co-Authors: Margaret Sampson, Jessie McGowan, Douglas M Salzwedel, Elise Cogo, Vicki Foerster, Carol Lefebvre
    Abstract:

    Objective: To develop an evidence-based guideline for Peer Review of Electronic Search Strategies (PRESS) for systematic reviews (SRs), health technology assessments, and other evidence syntheses. Study Design and Setting: An SR, a Web-based survey of experts, and a consensus development forum were undertaken to identify checklists that evaluated or validated electronic literature search strategies and to determine which of their elements related to search quality or errors. Results: Systematic review: no new search elements were identified for addition to the existing (2008–2010) PRESS 2015 Evidence-Based Checklist, and there was no evidence refuting any of its elements. Results suggested that structured PRESS could identify search errors and improve the selection of search terms. Web-based survey of experts: most respondents felt that peer review should be undertaken after the MEDLINE search had been prepared but before it had been translated to other databases. Consensus development forum: of the seven original PRESS elements, six were retained: translation of the research question; Boolean and proximity operators; subject headings; text word search; spelling, syntax and line numbers; and limits and filters. The seventh (skilled translation of the search strategy to additional databases) was removed, as there was consensus that this should be left to the discretion of searchers. An updated PRESS 2015 Guideline Statement was developed, which includes the following four documents: PRESS 2015 Evidence-Based Checklist, PRESS 2015 Recommendations for Librarian Practice, PRESS 2015 Implementation Strategies, and PRESS 2015 Guideline Assessment Form. Conclusion: The PRESS 2015 Guideline Statement should help to guide and improve the peer review of electronic literature search strategies.
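The six retained PRESS elements double as a natural schema for a peer-review form. A minimal sketch (the data structure and field names are illustrative; only the element names come from the guideline):

```python
# The six PRESS 2015 elements retained by the consensus forum.
PRESS_2015_ELEMENTS = [
    "translation of the research question",
    "Boolean and proximity operators",
    "subject headings",
    "text word search",
    "spelling, syntax and line numbers",
    "limits and filters",
]

def new_review_form():
    # One pass/fail slot plus a free-text comment per element.
    return {element: {"ok": None, "comment": ""} for element in PRESS_2015_ELEMENTS}

form = new_review_form()
form["Boolean and proximity operators"] = {
    "ok": False,
    "comment": "OR used where AND was intended when combining concept sets",
}
```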

  • Electronic search strategies to identify reports of cluster randomized trials in MEDLINE: low precision will improve with adherence to reporting standards
    BMC Medical Research Methodology, 2010
    Co-Authors: Jessie McGowan, Jeremy M Grimshaw, Monica Taljaard, Jamie C Brehaut, Andrew D McRae, Martin P Eccles
    Abstract:

    Background: Cluster randomized trials (CRTs) present unique methodological and ethical challenges. Researchers conducting systematic reviews of CRTs (e.g., addressing methodological or ethical issues) require efficient electronic search strategies (filters or hedges) to identify trials in electronic databases such as MEDLINE. According to the CONSORT statement extension to CRTs, the clustered design should be clearly identified in titles or abstracts; however, variability in terminology may make electronic identification challenging. Our objectives were to (a) evaluate the sensitivity ("recall") and precision of a well-known electronic search strategy ("randomized controlled trial" as publication type) with respect to identifying CRTs, (b) evaluate the feasibility of new search strategies targeted specifically at CRTs, and (c) determine whether CRTs are appropriately identified in titles or abstracts of reports and whether there has been improvement over time.

  • An evidence-based practice guideline for the peer review of electronic search strategies
    Journal of Clinical Epidemiology, 2009
    Co-Authors: Margaret Sampson, Jessie McGowan, Elise Cogo, Jeremy M Grimshaw, David Moher, Carol Lefebvre
    Abstract:

    Objective: Complex and highly sensitive electronic literature search strategies are required for systematic reviews; however, no guidelines exist for their peer review. Poor searches may fail to identify existing evidence because of inadequate recall (sensitivity), or may increase the resource requirements of reviews as a result of inadequate precision. Our objective was to create an annotated checklist for electronic search strategy peer review. Study Design: A systematic review of the library and information retrieval literature for important elements in electronic search strategies was conducted, along with a survey of individuals experienced in systematic review searching. Results: Six elements with a strong consensus as to their importance in peer review were accurate translation of the research question into search concepts, correct choice of Boolean operators and of line numbers, adequate translation of the search strategy for each database, inclusion of relevant subject headings, and absence of spelling errors. Seven additional elements had partial support and are included in this guideline. Conclusion: This evidence-based guideline facilitates the improvement of search quality through peer review, and thus the improvement in quality of systematic reviews. It is relevant for librarians/information specialists, journal editors, developers of knowledge translation tools, research organizations, and funding bodies.

  • Errors in search strategies were identified by type and frequency
    Journal of Clinical Epidemiology, 2006
    Co-Authors: Margaret Sampson, Jessie McGowan
    Abstract:

    Objective: Errors in the electronic search strategy of a systematic review may undermine the integrity of the evidence base used in the review. We studied the frequency and types of errors in reviews published by the Cochrane Collaboration. Study Design and Setting: Data sources were MEDLINE searches from reviews in the Cochrane Library, Issue 3, 2002. To be eligible, systematic reviews must have been of randomized or quasi-randomized controlled trials, reported included and excluded studies, and used one or more sections of the Cochrane Collaboration's Highly Sensitive Search Strategy. MEDLINE search strategies not reported in enough detail to be assessed, or that were duplicates of a search strategy already assessed for the study, were excluded. Two librarians assessed eligibility and scored the eligible electronic search strategies for 11 possible errors. Dual review with consensus was used. Results: Of 105 MEDLINE search strategies examined, 63 were assessed; 31 were excluded because they were inadequately reported, and 11 were duplicates of assessed search strategies. Most (90.5%) of the assessed search strategies contained ≥1 error (median 2, interquartile range [IQR] 1.0–3.0). Errors that could potentially lower recall of relevant studies were found in 82.5% (median 1, IQR 1.0–2.0), and inconsequential errors (to the evidence base) were found in 60.3% (median 1, IQR 0.0–1.0) of the search strategies. The most common search errors were missed MeSH terms (44.4%), unwarranted explosion of MeSH terms (38.1%), and irrelevant MeSH or free-text terms (28.6%). Missed spelling variants, combining MeSH and free-text terms in the same line, and failure to tailor the search strategy for other databases occurred with equal frequency (20.6%). Logical operator errors occurred in 19.0% of searches. Conclusion: When the MEDLINE search strategy used in a systematic review is reported in enough detail to allow assessment, errors are commonly revealed. Additional peer review steps are needed to ensure search quality and freedom from errors.
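The reported counts are internally consistent, which a short computation over the abstract's figures confirms:

```python
# All figures come from the abstract above.
examined, inadequately_reported, duplicate_strategies = 105, 31, 11
assessed = examined - inadequately_reported - duplicate_strategies
assert assessed == 63                       # matches the reported number assessed

with_any_error = round(0.905 * assessed)    # 90.5% contained at least one error
print(assessed, with_any_error)             # → 63 57
```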

Nancy L Wilczynski - One of the best experts on this subject based on the ideXlab platform.

  • Cumulative Index to Nursing and Allied Health Literature search strategies for identifying methodologically sound causation and prognosis studies
    Applied Nursing Research, 2008
    Co-Authors: Cindy Walker-Dilks, Nancy L Wilczynski, Brian R Haynes

  • EMBASE search strategies achieved high sensitivity and specificity for retrieving methodologically sound systematic reviews
    Journal of Clinical Epidemiology, 2007
    Co-Authors: Nancy L Wilczynski, Brian R Haynes

  • Developing optimal search strategies for detecting clinically sound treatment studies in EMBASE
    Journal of The Medical Library Association, 2006
    Co-Authors: Sharon Wong, Nancy L Wilczynski, Brian R Haynes

  • Optimal search strategies for retrieving systematic reviews from Medline: analytical survey
    BMJ, 2005
    Co-Authors: Victor M Montori, Nancy L Wilczynski, Douglas Morgan, Brian R Haynes

  • Optimal search strategies for detecting health services research studies in MEDLINE
    Canadian Medical Association Journal, 2004
    Co-Authors: Nancy L Wilczynski, Brian R Haynes, John N Lavis, Ravi Ramkissoonsingh, Alexandra E Arnold-Oatley

Margaret Sampson - One of the best experts on this subject based on the ideXlab platform.

  • PRESS peer review of electronic search strategies: 2015 guideline statement
    Journal of Clinical Epidemiology, 2016
    Co-Authors: Margaret Sampson, Jessie McGowan, Douglas M Salzwedel, Elise Cogo, Vicki Foerster, Carol Lefebvre

  • An evidence-based practice guideline for the peer review of electronic search strategies
    Journal of Clinical Epidemiology, 2009
    Co-Authors: Margaret Sampson, Jessie McGowan, Elise Cogo, Jeremy M Grimshaw, David Moher, Carol Lefebvre

  • Errors in search strategies were identified by type and frequency
    Journal of Clinical Epidemiology, 2006
    Co-Authors: Margaret Sampson, Jessie McGowan

Carol Lefebvre - One of the best experts on this subject based on the ideXlab platform.

  • PRESS peer review of electronic search strategies: 2015 guideline statement
    Journal of Clinical Epidemiology, 2016
    Co-Authors: Margaret Sampson, Jessie McGowan, Douglas M Salzwedel, Elise Cogo, Vicki Foerster, Carol Lefebvre

  • An evidence-based practice guideline for the peer review of electronic search strategies
    Journal of Clinical Epidemiology, 2009
    Co-Authors: Margaret Sampson, Jessie McGowan, Elise Cogo, Jeremy M Grimshaw, David Moher, Carol Lefebvre