Evidence based assessment/Assessment Center
The HGAPS Assessment Center was created by the UNC-Chapel Hill chapter of Helping Give Away Psychological Science in collaboration with the Society for Clinical Child and Adolescent Psychology (SCCAP). The North Carolina Psychological Association (NCPA), Division 5 of the American Psychological Association, Division 12 Section 9 of the American Psychological Association, and clinical psychologists in the University of North Carolina at Chapel Hill Department of Psychology consulted on the creation of the Assessment Center. The Assessment Center has two versions: one for the general public and one for clinicians.
Frequently asked questions (FAQs)
Q: Are my answers confidential? Who can see them?
Q: I don't feel comfortable using measures when I don't know the research behind them. I realize that you have many different measures in your library, but how can I know whether they are reliable and valid?
A: I love this question! It warms my heart to have someone want to see the evidence, and not just use any old thing available!
Several parts to the answer:
How We Prioritize Measures
We use reviews and meta-analyses to help decide which measures to prioritize. Our original set started with the Beidas et al. (2015) review of the best free measures available. We picked some from it, and we supplemented with results from meta-analyses (Stockings et al., 2015; Youngstrom et al., 2018; Youngstrom, Genzlinger, Egerton, & Van Meter, 2015). Most recently, we have been asking content experts who are updating handbooks on assessment (the Mash & Barkley series with Guilford, which Mitch Prinstein and I are editing now, and the Hunsley & Mash Guide to Assessments That Work; Hunsley & Mash, 2018) to recommend a reasonable starter kit of measures.
To be clear, there is not perfect overlap: Not everything that fared well in the reviews is up on Wiki, nor is everything on Wiki top tier in the reviews (we didn't write every assessment page). Instead, think of it as a garden where we are using the research reviews to decide which things to cultivate and help grow more rapidly. Our priorities: (a) evidence of validity from multiple samples comes first (we pay more attention to validity than reliability -- if a measure shows good validity, it is reliable enough!); (b) free measures come before commercially distributed ones (the commercial ones have their own advertising budgets; the best of the free measures are underutilized because people do not know about them or where to find them); and (c) measures with promise of clinical utility take precedence over basic research measures.
The measures that are going into the Assessment Center are filtered even more: We are hand-picking each one and having ongoing conversations with colleagues to refine the list. Everything in this smaller set fared well in the reviews and has a good-sized research base (at least four different samples, and usually dozens, behind each measure).
Where Can We Find This Evidence?
It is always possible to do your own search in Google Scholar or PubMed (both of which are free). The TRIP Database is helpful as an aggregator, and it finds practice guidelines that would be harder to locate with other search engines, but it does not do a great job of covering psychological assessments. TRIP was designed to cover all of medicine, so in fairness, psychological assessment is a special niche. The Buros Mental Measurements Yearbook does not prioritize reviewing free measures (!!!), so it is not as helpful in this context as we had hoped.
The goal, though, is to have the information just a click away. We would like to get the supporting research onto the Wikipedia and Wikiversity pages. It has been a slower process than we expected, for several reasons, but we are getting the hang of it. We have added the equivalent of seven journal articles' worth of words to Wiki over the last two years.
Wikipedia is intended for the general public, so we are not putting technical information about measures there. Wikipedia editors want to see review articles, chapters, meta-analyses, and books as the citations, and they often reject citations to the underlying research studies. This makes it hard to put a bibliography of the evidence for a measure on Wikipedia. If someone writes a review or meta-analysis, we can cite it; until then, there is an awkward zone where a free measure has multiple research studies supporting it but no review, so we can't add the citations to a Wikipedia page for the measure and have them stick.
Wikiversity is our way around this. We can write a page on Wikiversity that gets more technical, includes a more thorough supporting bibliography, and has links to other goodies (scoring syntax, a “live” version in the Assessment Center). We have had a lot more artistic control over the content on the Wikiversity pages, and we have spent more time on them over the last two years as a result.
The rate-limiting factor right now is the number of people who know how to make edits. Most of the HGAPS editors are undergraduates or early graduate students, so they do not know the literature off the top of their heads. For researchers, writing content for Wiki has not been a high priority, because it counts as service, not peer-reviewed research.
That is changing, though.
Things to Look for in 2020:
- More citations on Wikiversity: We are building a “library in the cloud” with all of the citations from a couple dozen reviews. It lives in Zotero, a free reference-management program. HGAPS and others will be able to use it to add citations quickly to Wiki. It took a while to build, but if we have one resource with 2000+ citations and share it with 100+ people along with a to-do list, then I think we will make a lot of progress quickly.
- Peer-Reviewed Wiki Articles: There are Wikipedia journals that are a hybrid: open access (no fee!), peer reviewed, with an author byline (so the researchers get credit!), a DOI (so people know it is legit!), a stable version (a free PDF), and… written in Wiki format, so it has live hyperlinks and it is easier to move chunks onto Wikipedia. This is a relatively new development, but a very exciting one. It creates a path where researchers can get peer-reviewed credit while writing content for Wiki. Thomas Shafee, a molecular biologist from Australia, gave a couple of talks at the APA Convention in Chicago that laid out the model and some intriguing examples (see the Lysine page in Wikipedia!).
- Branding Pages: Mian-Li Ong got a pair of icons/boxes made so we can tag pages to show that there is a sister page on Wikipedia or Wikiversity. There's always more to be done, but the tag is a visual signal to those in the know that we were there and made some improvements to the content.
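As an illustrative sketch of the Zotero-to-Wiki step described above, the hypothetical helper below formats a Zotero-style item record as a Wikipedia {{cite journal}} template. The function name, field choices, and the sample record (including its placeholder author and DOI) are assumptions for demonstration, not part of the actual HGAPS toolchain.

```python
# Hypothetical helper: turn a Zotero-style item dict into a Wikipedia
# {{cite journal}} template. Field names follow Zotero's JSON export
# conventions; the sample record below is entirely made up.

def zotero_to_cite_journal(item):
    """Format a Zotero-style item dict as a {{cite journal}} template."""
    parts = []
    creators = item.get("creators", [])
    if creators:
        # Cite only the first author for brevity in this sketch.
        parts.append(f"last={creators[0]['lastName']}")
        parts.append(f"first={creators[0]['firstName']}")
    parts.append(f"title={item['title']}")
    parts.append(f"journal={item['publicationTitle']}")
    parts.append(f"year={item['date'][:4]}")  # year from an ISO-style date
    if item.get("DOI"):
        parts.append(f"doi={item['DOI']}")
    return "{{cite journal |" + " |".join(parts) + "}}"

# Placeholder record standing in for an entry from the shared library.
sample = {
    "creators": [{"lastName": "Doe", "firstName": "Jane"}],
    "title": "A Sample Measure Validation Study",
    "publicationTitle": "Journal of Examples",
    "date": "2015-03-01",
    "DOI": "10.1234/example.2015",
}

print(zotero_to_cite_journal(sample))
# → {{cite journal |last=Doe |first=Jane |title=A Sample Measure Validation Study |journal=Journal of Examples |year=2015 |doi=10.1234/example.2015}}
```

With a shared 2000+ item library, a small script along these lines could let editors paste ready-made citation templates into Wikipedia or Wikiversity pages instead of retyping each reference by hand.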
- This page details resources associated with the HGAPS Assessment Center, broken down by symptom area. This page is duplicated on the HGAPS Assessment Center.
- This page contains resources associated with the clinician version of the HGAPS Assessment Center: (1) a list of the assessments contained in the clinician version, (2) a list of clinician resources for specific content areas and symptom clusters, and (3) resources to otherwise assist the evidence-based assessment process and enhance the usefulness of the HGAPS Assessment Center for clinicians. This page is duplicated on the HGAPS Assessment Center.