
Comparison of tools for assessing the methodological quality of primary and secondary studies in health technology assessment reports in Germany

Keywords: quality assessment, assessment quality, quality assessment tools, assessment tools, study quality, study assessment, clinical trials, evaluation criteria, methodologic quality, validity, quality, science, risk of bias, bias, confounding, systematic reviews, health technology assessment, HTA, health economics, health economic studies, critical appraisal, quality appraisal, checklists, scales, component ratings, components, tool, studies, interventional studies, observational studies



Health care policy background: Findings from scientific studies form the basis for evidence-based health policy decisions.

Scientific background: Quality assessments that evaluate the credibility of study results are an essential part of health technology assessment (HTA) reports and systematic reviews. Quality assessment tools (QAT) for assessing study quality examine the extent to which study results are systematically distorted by confounding or bias (internal validity). The tools can be divided into checklists, scales and component ratings.

Research questions: Which QAT are available to assess the quality of interventional studies or studies in the field of health economics, how do they differ from each other, and what conclusions can be drawn from these results for quality assessment?

Methods: A systematic search of relevant databases from 1988 onwards was performed, supplemented by a screening of references, of the HTA reports of the German Agency for Health Technology Assessment (DAHTA), and by an internet search. The selection of relevant literature, the data extraction and the quality assessment were carried out by two independent reviewers. The substantive elements of the QAT were extracted using a modified criteria list consisting of items and domains specific to randomized trials, observational studies, diagnostic studies, systematic reviews and health economic studies. Based on the number of items and domains covered, more and less comprehensive QAT were distinguished. A workshop was hosted to exchange experiences with problems arising in the practical application of the tools.

Results: A total of eight systematic methodological reviews were identified, as well as 147 QAT: 15 for systematic reviews, 80 for randomized trials, 30 for observational studies, 17 for diagnostic studies and 22 for health economic studies. The tools vary considerably in content, performance and quality of operationalisation. Some tools include not only items on internal validity but also items on quality of reporting and external validity. No tool covers all elements or domains. Design-specific generic tools that cover most of the content criteria are presented.

Discussion: Evaluating QAT by means of content criteria is difficult, because there is no scientific consensus on the necessary elements of internal validity, and not all of the generally accepted elements are based on empirical evidence. Comparing QAT with regard to content neglects the operationalisation of the respective parameters, whose quality and precision are important for tr

