Show simple item record

dc.contributor.author	Rakap, Salih
dc.contributor.author	Rakap, Serife
dc.contributor.author	Evran, Derya
dc.contributor.author	Cig, Oguzcan
dc.date.accessioned	2020-06-21T13:39:18Z
dc.date.available	2020-06-21T13:39:18Z
dc.date.issued	2016
dc.identifier.issn	0747-5632
dc.identifier.issn	1873-7692
dc.identifier.uri	https://doi.org/10.1016/j.chb.2015.09.008
dc.identifier.uri	https://hdl.handle.net/20.500.12712/13532
dc.description	Cig, Oguzcan/0000-0003-0448-0016; Rakap, Salih/0000-0001-7853-3825	en_US
dc.description	WOS: 000367755400016	en_US
dc.description.abstract	The use of evidence-based practices in education has gained considerable attention in recent years. Researchers often use meta-analyses to identify evidence-based practices. To conduct meta-analyses of studies employing single-subject experimental research (SSER) designs for the purpose of identifying the evidence base for a practice, a necessary step is to obtain raw data from published graphs. One method for obtaining raw data from published SSER graphs is the use of computer programs specifically designed to extract data from graphs. The purpose of the present study was to examine the reliability and validity of three data extraction programs, UnGraph, GraphClick, and DigitizeIt, using 60 graphs obtained from 15 SSER studies focused on a practice. Three coders extracted data from the graphs using the three programs. Values extracted by each coder were compared to (a) one another (reliability) and (b) the values reported in the original articles from which the graphs were obtained (validity). Results showed that raw data can be obtained reliably from SSER graphs using all three data extraction programs and that the values obtained with the three programs are highly valid. These results suggest that researchers can use data extracted with these programs with a high level of confidence when conducting meta-analyses of studies employing SSER designs. The authors make recommendations for improving the accuracy of data extraction using the three programs. (C) 2015 Elsevier Ltd. All rights reserved.	en_US
dc.description.sponsorship	National Center for Special Education Research, Institute of Education Sciences [R324A070008]	en_US
dc.description.sponsorship	Work completed in this manuscript was supported, in part, by a grant from the National Center for Special Education Research, Institute of Education Sciences to the University of Florida (R324A070008). The opinions expressed are those of the authors, not the funding agency.	en_US
dc.language.iso	eng	en_US
dc.publisher	Pergamon-Elsevier Science Ltd	en_US
dc.relation.isversionof	10.1016/j.chb.2015.09.008	en_US
dc.rights	info:eu-repo/semantics/closedAccess	en_US
dc.subject	Data extraction programs	en_US
dc.subject	Reliability and validity	en_US
dc.subject	Single-subject experimental research designs	en_US
dc.subject	UnGraph	en_US
dc.subject	GraphClick	en_US
dc.subject	DigitizeIt	en_US
dc.title	Comparative evaluation of the reliability and validity of three data extraction programs: UnGraph, GraphClick, and DigitizeIt	en_US
dc.type	article	en_US
dc.contributor.department	OMÜ	en_US
dc.identifier.volume	55	en_US
dc.identifier.startpage	159	en_US
dc.identifier.endpage	166	en_US
dc.relation.journal	Computers in Human Behavior	en_US
dc.relation.publicationcategory	Article - International Peer-Reviewed Journal - Institutional Faculty Member	en_US


Files in this item


There are no files associated with this item.
