

Original research
MyHEARTSMAP: development and evaluation of a psychosocial self-assessment tool, for and by youth
  1. Punit Virk1,
  2. Samara Laskin2,
  3. Rebecca Gokiert3,
  4. Chris Richardson1,
  5. Mandi Newton4,
  6. Rob Stenstrom5,
  7. Bruce Wright4,
  8. Tyler Black6,
  9. Quynh Doan2,7
  1. School of Population and Public Health, University of British Columbia, Vancouver, British Columbia, Canada
  2. Pediatrics, University of British Columbia, Vancouver, British Columbia, Canada
  3. Faculty of Extension, University of Alberta, Edmonton, Alberta, Canada
  4. Pediatrics, University of Alberta, Edmonton, Alberta, Canada
  5. Emergency Medicine, University of British Columbia, Vancouver, British Columbia, Canada
  6. Psychiatry, University of British Columbia, Vancouver, British Columbia, Canada
  7. BC Children's Hospital Research Institute, Vancouver, British Columbia, Canada

  Correspondence to Dr Quynh Doan; qdoan{at}


Background Paediatric mental health-related visits to the emergency department are rising. However, few tools exist to identify concerns early and connect youth with appropriate mental healthcare. Our objective was to develop a digital youth psychosocial assessment and management tool (MyHEARTSMAP) and evaluate its inter-rater reliability when self-administered by a community-based sample of youth and parents.

Methods We conducted a multiphasic, multimethod study. In phase 1, focus group sessions were used to inform tool development, through an iterative modification process. In phase 2, a cross-sectional study was conducted in two rounds of evaluation, where participants used MyHEARTSMAP to assess 25 fictional cases.

Results MyHEARTSMAP displays good face and content validity, as supported by feedback from phase 1 focus groups with youth and parents (n=38). Among phase 2 participants (n=30), the tool showed moderate to excellent agreement across all psychosocial sections (κ=0.76–0.98).

Conclusions Our findings show that MyHEARTSMAP is an approachable and interpretable psychosocial assessment and management tool that can be reliably applied by a diverse community sample of youth and parents.

  • child psychology
  • accident and emergency
  • measurement
  • screening
  • qualitative research

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See:



Mental health conditions affect approximately 13%–23% of North American youth.1 2 Delayed identification of mental health conditions may lead to crises and reliance on emergency department (ED) management.3 Among youth presenting with non-mental health-related complaints to the ED, 20%–50% are found on screening to have mild to severe unrecognised or unmanaged mental health conditions.4 5 These conditions may complicate management of physical complaints,6 and increase emergency services utilisation.7

Early recognition of mental health conditions can lead to timely access to mental health services, thus improving health outcomes and utilisation of care.8 While the American Academy of Pediatrics has recommended universal screening for mental health conditions among youth,3 this has yet to be effectively implemented. Rising paediatric visits,9 coupled with the ED’s access to vulnerable populations,10 11 and ability to manage acute screening results, make EDs a promising universal screening venue.12 The ED provides an opportunity to evaluate broader psychosocial health, including substance use, education and other lifestyle factors.13 Existing assessments include HEADS-ED, a clinician-administered evaluation of youths’ need for immediate intervention, with good inter-rater reliability and accuracy in predicting inpatient psychiatric admission.14 15 HEARTSMAP is an expanded but brief assessment and management tool for ED clinicians, which distinguishes psychiatric, social and behavioural concerns. This tool has good inter-rater reliability among diverse ED clinician types16 and good predictive validity for inpatient psychiatric admissions.17

Universal screening implementation barriers include ED clinicians’ inadequate mental health training,18 time constraints,19 integration into existing practices,20 strained hospital resources and limited awareness of community care.21 An online self-assessment could help reduce screening burden on clinicians and minimally impact ED flow.22 Youth may prefer disclosing sensitive information over electronic interfaces versus face-to-face interaction.23 Digital screening offers patients privacy, time to effectively articulate concerns and a sense of control over managing their well-being, without clinician judgement.24 In the ED, electronic self-assessment is time and resource efficient, which may facilitate screening uptake.

To enable universal mental health self-screening in the ED, we proposed modifying HEARTSMAP for use as a self-administered online assessment by youth and family members (MyHEARTSMAP), and to evaluate its inter-rater reliability among them.



We conducted a multiphasic, multimethod study. In phase 1, we used qualitative methods to develop MyHEARTSMAP, a youth and family version of the clinical HEARTSMAP emergency assessment and management guiding tool. We used focus groups with youth and parents to establish tool content and face validity, and ensure tool structure, readability and content appropriateness. In phase 2, we engaged a cross-section of youth and parents to assess 25 fictional clinical vignettes, evaluating MyHEARTSMAP’s inter-rater reliability.


A convenience sample of community-based youth and parents was recruited through the support of a mental health non-profit organisation, posters at a children’s hospital and postings on the study’s and non-profit partner’s social media. We excluded youth with severe overall disability and non-English speakers. Phase 2 sample size was based on an intraclass correlation coefficient (ICC) power analysis,25 equivalent to quadratically weighted kappas.26 Thirty parent and youth raters were required to achieve a power of 80% to detect a kappa of 0.60 (substantial agreement) under the alternative hypothesis, assuming a kappa of 0.42 (moderate agreement) under the null hypothesis.


The HEARTSMAP clinical tool served as a template in developing MyHEARTSMAP. The tool has clinicians report across 10 psychosocial sections: home, education, alcohol and drugs, relationship and bullying, thoughts and anxiety, safety, sexual health, mood, abuse, and professional resources. Sections map to four general domains: social, functional, youth health and psychiatry. For each section, concern severity is measured on a 4-point Likert-type scale from 0 (no concern) to 3 (severe concern), and services already accessed are measured on a separate 2-point scale (yes or no). Inputs from both scales feed into a built-in algorithm, triggering service recommendations with suggested time frames for access.16 17 Scoring options on each severity scale have descriptive statements expanding on each score’s conditions, helping clinicians decide on appropriate scores.
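To make the scoring flow concrete, the structure described above can be sketched in code. This is a minimal illustration only: the section names and the 0–3 severity and yes/no resource scales come from the description above, but the thresholds, recommendation wording and the `recommend` function itself are hypothetical, not the tool’s actual published algorithm.

```python
# Hypothetical sketch of HEARTSMAP-style scoring. The real triage algorithm
# and recommendation rules are not published in this excerpt; the thresholds
# and output strings below are illustrative assumptions only.

SECTIONS = [
    "Home", "Education", "Alcohol and drugs", "Relationship and bullying",
    "Thoughts and anxiety", "Safety", "Sexual health", "Mood", "Abuse",
    "Professional resources",
]

def recommend(severity: dict, accessing_services: dict) -> str:
    """severity: section -> 0..3; accessing_services: section -> bool."""
    worst = max(severity.get(s, 0) for s in SECTIONS)
    already_connected = any(accessing_services.get(s, False) for s in SECTIONS)
    # Illustrative thresholds (assumed, not from the paper):
    if worst == 3:
        return "urgent assessment recommended"
    if worst == 2:
        return ("follow-up already in place" if already_connected
                else "routine referral within weeks")
    if worst == 1:
        return "monitor and discuss with a trusted adult or clinician"
    return "no concern identified"
```

For instance, under these assumed rules, a moderate Mood concern with no services in place would trigger a routine referral, while any severe score would override it with an urgent recommendation.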

Study procedures

Phase 1 focus groups

Sixty-minute focus groups were held with up to five youth and three parents per group, in separate but simultaneous sessions. Smaller, more numerous focus groups were used to facilitate in-depth discussion and gain more varied input.

Each session followed the same structure. All participants had the opportunity to review the tool and inform its modification. A moderator introduced the tool’s purpose and thoroughly reviewed its 10 psychosocial sections while a research assistant took comprehensive notes on group discussions. The first youth and parent focus groups reviewed an expanded version of the clinical tool. Modifications were made after each set of simultaneous youth and parent sessions, and subsequent groups were presented with the up-to-date version, as shown in figure 1A.

Figure 1

Schematic diagram showing the process of iterative modification that MyHEARTSMAP underwent in phase 1 (A) and phase 2 (B), with corresponding tool versions, sessions/rounds and participants involved.

First, participants went through each tool section, reviewing guiding questions and the severity and resource scoring scale descriptors, with a focus on improving usability. For each tool section, open-ended questions were used to assess participants’ understanding of tool components, whether they felt the sections were important to youth their age (or to other parents), whether they could place themselves (or their child) on the scoring scale, and ways the tool could be improved. Each session ended with participants applying the reviewed MyHEARTSMAP version to three fictional vignettes. The first two cases familiarised participants with the tool and were completed as a group or independently, with the opportunity to ask questions. We retained responses from the independently completed final case, reflecting participants’ ability to use the tool.

Phase 2 inter-rater reliability evaluation

Participants completed MyHEARTSMAP for 25 fictional clinical vignettes describing a range of paediatric psychosocial visits to the ED, with issues ranging from none to severe. Each participant completed a 45–60 min telephone or in-person training session with a research assistant prior to reviewing vignettes. Training included a 3 min instructional video and a presentation overviewing MyHEARTSMAP sections, scoring guidelines and application to fictional cases. Participants also completed two to three training cases, scoring tool sections and seeking clarification when necessary. On training completion, vignettes were emailed in sets of five for remote completion at a self-directed pace, under parental supervision for youth participants. Vignette responses were captured in Research Electronic Data Capture (REDCap),27 an online survey system. REDCap’s activity logging feature was used to monitor completion duration, to ensure participants did not complete cases with unreasonable speed. After the first 10 cases, participants received a generic email highlighting close-reading strategies.

The procedures above were carried out in two consecutive rounds of evaluation, as shown in figure 1B. Between the rounds, participant feedback was incorporated into the tool and vignettes, allowing further refinement of their understandability (eg, medical jargon, acronyms, word choice).

Analytical approach

Focus groups

We used qualitative content analysis to evaluate focus group transcripts.28 Data saturation was reached when no new constructive feedback or tool modifications were proposed. Transcripts were coded, summarised into categories and reviewed by the study team to make tool modifications prior to subsequent groups. We compared average percent agreement for tool sections and domains on the independent test case, to measure changes in scoring consistency with iterative tool modifications. We compared average agreement between the first and second groups of youth using Fisher’s exact test. We compared overall agreement across tool sections using a χ2 test.

Inter-rater reliability evaluation

We used quadratically weighted kappa statistics to measure overall inter-rater agreement on tool sections and domains. We also conducted subgroup analyses, measuring section and domain agreement among participating youth and parents. The mean of all pairwise kappas was used as our index of agreement.26 Statistical comparisons of kappas between or within each round of evaluation were carried out using Welch’s t-test, χ2 test and Fisher’s exact test, with significance level at p=0.05. We report 95% CIs for all tests. Analyses were conducted using Microsoft Excel 2010 Data Analysis ToolPak (Microsoft, Redmond, Washington) and STATA V.15.0 (StataCorp, College Station, Texas).
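The agreement statistic described above can be computed directly from its definition. The sketch below implements quadratically weighted Cohen’s kappa (disagreement weights w_ij = (i − j)² / (k − 1)² over a k-point ordinal scale) and the mean of all pairwise kappas used here as the index of agreement; the function names and example ratings are our own, not from the study’s analysis code.

```python
import itertools
import numpy as np

def weighted_kappa(r1, r2, k=4):
    """Quadratically weighted Cohen's kappa for two raters scoring 0..k-1."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    # Observed agreement matrix (counts) and chance-expected matrix.
    observed = np.zeros((k, k))
    for a, b in zip(r1, r2):
        observed[a, b] += 1
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / len(r1)
    # Quadratic disagreement weights: w_ij = (i - j)^2 / (k - 1)^2.
    i, j = np.indices((k, k))
    w = (i - j) ** 2 / (k - 1) ** 2
    return 1 - (w * observed).sum() / (w * expected).sum()

def mean_pairwise_kappa(ratings, k=4):
    """Mean quadratically weighted kappa over all pairs of raters."""
    kappas = [weighted_kappa(a, b, k)
              for a, b in itertools.combinations(ratings, 2)]
    return sum(kappas) / len(kappas)
```

Note the quadratic weights are what make this statistic comparable in practice to an ICC: distant disagreements (eg, scoring 0 vs 3) are penalised far more heavily than adjacent ones. The formula is undefined (0/0) when every rater gives the identical constant score.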

Patient and public involvement

No patients were involved in the design, data collection or analysis of this study.


Focus groups

We recruited 38 participants (9 parents and 29 youth) into 11 focus groups (7 with youth and 4 with parents). Sixteen were youth–parent dyad members and 22 were independent. A total of 71.1% of participants were female. The median age of participating youth was 16.0 years, ranging from 10 to 17 years. All participants had some lived experience with mental health concerns. Additional details are summarised in table 1. Qualitative content analysis revealed two feedback categories—MyHEARTSMAP’s approachability (covering relatability and accessibility) and interpretability.

Table 1

Demographic characteristics of study participants in phase 1 (focus groups) and phase 2 (inter-rater session)

Approachability of MyHEARTSMAP

Participants evaluating versions 1–2 (sessions 1–4) stressed the importance of being able to answer tool items honestly, without judgement from themselves or others (table 2), and reported reluctance to choose a scoring option labelled ‘major concern.’ Thus, Likert scale labels were changed to include only 0–3 numbering. Scoring descriptors were kept so participants could understand the general severity of each option. However, score descriptors were sometimes only partially applicable, so an ‘or’ was introduced between statements to allow flexibility. Participants felt adding ‘or’ helped them score more comfortably. Reviewers also suggested descriptors be inclusive of youth with different experiences, such as ‘homeschooled youth’ and ‘different romantic relationships.’ From version 3 onwards, no new feedback emerged on how well participants related to the tool.

Table 2

Summary of key categories, feedback and tool modifications from phase 1 parent and youth focus group sessions

Interpretability of MyHEARTSMAP

On versions 3–6, feedback shifted towards tool language. Youth reviewing version 3 suggested some words might have multiple meanings, while on version 4, participants noted that idioms and terms such as ‘contraception’ and ‘consensual’ might be difficult for youth to understand. With these corrections, most comments on versions 5–7 (sessions 5–7) were reaffirming. Youth described the tool as ‘easy to understand’ and that it ‘makes sense.’ Figure 2 displays an example of progressive tool changes.

Figure 2

Progression and transformation of MyHEARTSMAP’s ‘Mood’ section, in accordance with tool versions shown in figure 1. N/A, not applicable.

Test case

Overall agreement of focus group participants on MyHEARTSMAP sections ranged from 55% (Safety) to 97% (Abuse), with similar agreement patterns between youth and parents. Across sessions, sectional and domain scoring distributions varied significantly (p<0.001).

Inter-rater reliability evaluation

We recruited and trained 32 participants; however, two youth withdrew after training, prior to case review, leaving 10 parents and 20 youth. Participating youth’s median age was 14.5 years, ranging from 12 to 17 years. Table 1 displays their demographic information. Only 57% responded to questions about ethnicity and mental health experience. Among respondents, 10% identified as visible minorities, and 17% as having past mental health experiences.

Overall, we report high weighted kappas, displaying substantial to almost perfect agreement in both rounds (table 3). Significant (p<0.001) improvements were seen in nearly all sections between rounds 1 and 2. Clinically meaningful and statistically significant improvement was observed for ‘Professionals & services’, where agreement rose from slight to substantial. When stratified by youth and parents, sectional kappas were higher in round 2; domain scores and tool-triggered recommendations also improved significantly (p<0.001).

Table 3

Quadratically weighted kappa statistics (95% CIs) measuring MyHEARTSMAP sectional agreement when applied by parents and youth (n=30) to a set of 25 fictional vignettes during phase 2 of the study


MyHEARTSMAP was developed through an iterative process to be a psychosocial self-assessment and management guiding application. We saw excellent face and content validity in a diverse community sample of youth and families. Participants stressed that the tool needed to be easily interpretable, approachable for users, reflective of different backgrounds and situations, and able to reduce fears of judgement. The tool displayed strong inter-rater reliability when applied to fictional cases. Scoring consensus and significant improvements between evaluation rounds are quality indicators of MyHEARTSMAP assessment data and sources of evidence for tool reliability.29

There are few valid, reliable and brief tools for youth mental health self-assessment in the ED. The Behavioural Health Screen has been evaluated for acceptability and feasibility in the paediatric ED, where it saw an uptake rate of 33%; however, it was not validated for ED use. While not specific to acute care, KIDSCREEN-27 is a European self-reporting tool for routine mental health monitoring and screening in school, home or clinical settings, for healthy and chronically ill youth.30 KIDSCREEN-27 has been broadly validated and shares similar content and completion time (5–10 min) with MyHEARTSMAP.31 32 KIDSCREEN-27 studies have shown inconsistent child–parent agreement, with ICCs ranging from 0.46 (poor to fair) to 0.74 (good).31 32

Variable and generally low agreement between youth and parents on psychosocial subscales in the above studies may reflect inherent tool properties (eg, response format, item content), or parental misperceptions. Youth can better assess their own experiences of internalising behaviours such as anxiety and depression compared with parents.33 Parents as key informants may introduce discrepancies in assessing youth’s mental health status. By providing all raters with standardised vignettes on a fictional youth’s psychosocial status, we eliminated the need for parental inference about their own child,34 and found higher levels of agreement that may more closely reflect rater precision in applying and scoring with MyHEARTSMAP. However, agreement comparisons with KIDSCREEN-27 are made cautiously, given the different study populations and the sensitivity of kappa and ICC statistics to sample heterogeneity and prevalence.35 Quadratically weighted kappas offer practical comparability to the ICCs used in KIDSCREEN-27 studies. While the primary outcome measure in those studies was child–parent agreement, we measured overall sectional agreement on MyHEARTSMAP; our values were comparable, as we saw nearly identical overall and among-group kappas.

Our study is strengthened by its methodological considerations for tool administration, using rater training and accountability measures for thoughtful scoring,36 which are infrequently reported in inter-rater studies of psychosocial measures.37 A self-administered psychosocial tool (YouthCHAT) for opportunistic primary care screening also had end-users inform tool development.38 While we received similar positive feedback for MyHEARTSMAP’s ease of use and simplicity, our iterative approach allowed us to make ongoing modifications to address participant concerns, raised in both study phases, regarding item difficulty and the need for age-appropriate language. MyHEARTSMAP’s ability to reliably recommend management options is a novel addition to standard psychosocial self-assessment. Patients receiving and connecting with mental healthcare recommendations made in the ED report generally higher healthcare satisfaction,39 and are more likely to remain connected to care.40 Generally, our participants spent 5–10 min on each case. However, as the tool is intended for self-assessment, evaluation of time spent self-reporting with MyHEARTSMAP will be conducted in an ongoing cohort study.

Study limitations include the use of note taking, rather than audio-recording, for focus group data collection; this prevented verbatim transcripts but provided sufficient documentation for MyHEARTSMAP modifications without potentially stressing participants with recording. We did not evaluate MyHEARTSMAP’s reading level, and while diverse, our small sample may not reveal reading comprehension issues that are more substantive in the general population. Furthermore, inter-rater agreement estimates may vary depending on whether the tool is applied to patients or vignettes,41 and vignette use required rater training to ensure participants could comfortably score the psychosocial information of fictional patients. While vignettes have been used in inter-rater studies and offer diverse, realistic42 ED mental health presentations, an ongoing cohort study will evaluate whether scoring reliability differs when youth self-report with MyHEARTSMAP.

MyHEARTSMAP demonstrates good content and face validity, and inter-rater reliability comparable to, if not higher than, that of similar tools. Following prospective evaluation of its predictive validity, we intend for MyHEARTSMAP to be accessible to youth and families visiting acute and paediatric primary care settings as a downloadable application. Clinicians may offer MyHEARTSMAP on a mobile device or stationary computer in waiting rooms for universal screening, and discuss appropriate mental health service recommendations as needed.

What is known about the subject?

  • Mental health concerns in youth often go unrecognised, leading to poor health outcomes, and crisis-driven management in acute care settings.

  • Universal screening has been recommended, but not implemented due to lack of reliable, effective and efficient methods.

What this study adds

  • A digital self-administered psychosocial assessment and management tool (MyHEARTSMAP) was developed and evaluated for use by youth and parents in emergency care.

  • MyHEARTSMAP is well positioned for evaluation for universal screening in primary and acute care settings that see youth with or without identified mental health concerns.




  • Funding This work was supported by the Canadian Institutes of Health Research (grant number F16-04309), in addition to seed funding through the BC Children’s Hospital Foundation.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval This study was approved by our local institutional ethics review board.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement Data will not be made available to protect participant identity, as confidentiality cannot be fully guaranteed, given the small sample size which was collected in a fixed time period through specific institutions.