Purpose – Through a two-stage survey, this paper examines how researchers judge the quality of answers on ResearchGate Q&A, an academic social networking site.

Design/methodology/approach – In the first-stage survey, 15 researchers from Library and Information Science (LIS) judged the quality of 157 answers to 15 questions and reported the criteria they had used. The content of their reports was analyzed, and the results were merged with relevant criteria from the literature to form the second-stage survey questionnaire. This questionnaire was then completed by researchers recognized as accomplished at identifying high-quality LIS answers on ResearchGate Q&A.

Findings – Most of the identified quality criteria for academic answers, such as relevance, completeness, and verifiability, have previously been found applicable to generic answers. The authors also found other criteria, such as comprehensiveness, the answerer's scholarship, and value-added. Providing opinions was found to be the most important criterion, followed by completeness and value-added.

Originality/value – The findings show the importance of studying the quality of answers on academic social Q&A platforms and reveal unique considerations for the design of such systems.
Li, L., Zhang, C., He, D. and Du, J.T. (2020), "Researchers' judgment criteria of high-quality answers on academic social Q&A platforms", Online Information Review, No. 2.