Published in Vol 12 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/40689.
Digital Phenotyping for Stress, Anxiety, and Mild Depression: Systematic Literature Review

Authors of this article:

Adrien Choi1; Aysel Ooi1; Danielle Lottridge1

Review

School of Computer Science, Faculty of Science, University of Auckland, Auckland, New Zealand

Corresponding Author:

Danielle Lottridge, PhD

School of Computer Science

Faculty of Science

University of Auckland

38 Princes Street

Auckland, 1010

New Zealand

Phone: 64 9 373 7599 ext 82930

Email: d.lottridge@auckland.ac.nz


Background: Unaddressed early-stage mental health issues, including stress, anxiety, and mild depression, can become a burden for individuals in the long term. Digital phenotyping involves capturing continuous behavioral data via smartphones to monitor human behavior and can potentially identify milder symptoms before they become serious.

Objective: This systematic literature review aimed to answer the following questions: (1) what is the evidence of the effectiveness of digital phenotyping using smartphones in identifying behavioral patterns related to stress, anxiety, and mild depression? and (2) in particular, which smartphone sensors are found to be effective, and what are the associated challenges?

Methods: We used the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) process to identify 36 papers (reporting on 40 studies) to assess the key smartphone sensors related to stress, anxiety, and mild depression. We excluded studies conducted with nonadult participants (eg, teenagers and children) and clinical populations, as well as personality measurement and phobia studies. As we focused on the effectiveness of digital phenotyping using smartphones, results related to wearable devices were excluded.

Results: We categorized the studies into 3 major groups based on the recruited participants: studies with students enrolled in universities, studies with adults who were unaffiliated with any particular organization, and studies with employees of an organization. The study length varied from 10 days to 3 years. A range of passive sensors were used in the studies, including GPS, Bluetooth, accelerometer, microphone, illuminance, gyroscope, and Wi-Fi. These were used to assess locations visited; mobility; speech patterns; phone use, such as screen checking; time spent in bed; physical activity; sleep; and aspects of social interactions, such as the number of interactions and response time. Of the 40 included studies, 31 (78%) used machine learning models for prediction; most others (n=8, 20%) used descriptive statistics. Students and adults who experienced stress, anxiety, or depression visited fewer locations, were more sedentary, had irregular sleep, and used their phones more. In contrast to students and adults, less mobility was seen as positive for employees because less mobility in workplaces was associated with higher performance. Overall, travel, physical activity, sleep, social interaction, and phone use were related to stress, anxiety, and mild depression.

Conclusions: This study focused on understanding whether smartphone sensors can be effectively used to detect behavioral patterns associated with stress, anxiety, and mild depression in nonclinical participants. The reviewed studies provided evidence that smartphone sensors are effective in identifying behavioral patterns associated with stress, anxiety, and mild depression.

JMIR Mhealth Uhealth 2024;12:e40689

doi:10.2196/40689


Background

Digital phenotyping is “the moment-by-moment quantification of the individual level human phenotype in situ using data from personal digital devices” [1]. Digital phenotyping applies the concept of phenotypes, in other words, the observable characteristics resulting from the genotype and environment, to conceptualize observable patterns in individuals’ digital data. In the last decade, digital phenotyping studies have been able to compare typical and atypical patterns in daily activities to correlate atypical behavior with negative emotions [2,3]. Behavioral patterns include variations in mobility, frequency of being in various locations, and sleep patterns. Smartphones can capture, store, manage, and interpret enormous amounts of user data [1,4,5]. This can be done actively or passively. Active data collection requires the user to self-report and complete surveys, whereas passive sensing collects data automatically without user input [5]. Most studies combine active and passive sensing to more accurately detect and predict behavioral abnormalities. Modern smartphone analytics can be used for the discovery of commonalities and abnormalities in user behavior. The ease of using passive sensing makes it an ideal data gathering method for mental health studies [6-8] and an ideal technique for assessing mental health [9].

Digital phenotyping has been successful in the early detection and prediction of behaviors related to neuropharmacology [10]; cardiovascular diseases [11]; diabetes [12]; and severe injuries, such as spinal cord injury [13], motivating further adoption. Digital phenotyping has also proven useful for the detection of severe mental health issues, such as schizophrenia [14,15], bipolar disorder [16], and suicidal thoughts [17]. Digital phenotyping has been so successful for specialized, clinical populations that it is increasingly considered for mass market use with nonclinical populations. Digital phenotyping applications and software tools have been used to capture employee information, such as their screen time and clicking patterns [18]. However, few digital phenotyping studies have specifically examined the detection or prediction of stress, anxiety, and mild depression.

Individuals with stress, anxiety, and mild depression can develop chronic mental health symptoms that impact their mobility, satisfaction with life, and social interaction [19,20]. When these symptoms are not detected early, they worsen, and the impact is more significant [21-23], increasing the need for medication and hospitalization. This makes mild mental health symptoms a valid target for digital phenotyping, as its goal is to enable early detection and, subsequently, early treatment. Smartphones are increasingly ubiquitous [24], which makes them an optimal platform for digital phenotyping. We constrained our systematic literature search to the more challenging problem of the detection of mild mental health symptoms using only smartphone sensors and excluded studies that used additional wearable sensors. In general, we believe that additional wearables might increase the effectiveness of digital phenotyping in detecting stress, anxiety, and mild depression. Given the ubiquity of smartphones, we aimed to answer the following question: what is the effectiveness of digital phenotyping using smartphone sensors in detecting stress, anxiety, and mild depression?

Objectives

The objective of this systematic literature review was to better understand the current uses of digital phenotyping and the results of using digital phenotyping for the detection and prediction of behavioral patterns related to stress, anxiety, and mild depression. The 2 research questions this review sought to answer were as follows:

  1. What is the evidence of the effectiveness of digital phenotyping using smartphones in identifying behavioral patterns related to stress, anxiety, and mild depression?
  2. In particular, which smartphone sensors are found to be effective, and what are the associated challenges?

For these research questions, we considered statistically significant associations between sensor patterns and behavioral patterns as evidence of effectiveness.


Methods

Type of Studies

This review followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [25] (Multimedia Appendix 1). Figure 1 shows the reviewing process and search results. In the first round of screening, 1 author excluded studies that were not relevant to the research questions. Another author reran the queries for confirmation. Studies were included in this review if they were conducted to measure and detect stress, anxiety, or mild depression, even if they included other variables, such as job performance, promotion, or discrimination. We included studies in which data were collected through smartphones with an iOS (Apple Inc) or Android (Google LLC) operating system. Data collected through wearable devices were excluded. We included studies in which the participants were adults aged ≥18 years and were from a nonclinical population. Studies conducted with nonadult participants (eg, teenagers and children) were excluded. Given our research questions, if the studies’ participants had or had had any severe mental health disorder, such as schizophrenia, bipolar disorder, or psychosis, they were not included. We also excluded personality and character measurement and phobia studies. The primary research language was English. The studies included were conducted from September 2010 to September 2023. Peer-reviewed conference articles and journal articles were included.

The data we wished to extract were the study aim, data collected, smartphone operating system used for data collection, behavioral patterns identified, surveys used for verification, and sample size. A total of 3 authors reviewed the studies independently to extract data and confirm the extracted data. After the first round of data extraction, 1 author re-examined the studies to extract the predictive modeling used. These data are presented in the Results section.

We noticed that participants in the included studies fell into 1 of 3 major groups (ie, students, adults, and employees). We refer to the participants of the studies that recruited adults enrolled in universities as “students,” participants of the studies that recruited adults unaffiliated with any particular organization as “adults,” and participants of the studies that recruited adults employed at a particular organization as “employees.”

Figure 1. Systematic literature reviewing process and search results with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) diagram.

Search Strategy

A total of 3 databases were queried: Web of Science, ACM, and PubMed. PubMed is a medicine-based database, ACM is a technology-based database, and Web of Science is a cross-domain database. The search query was the same for the 3 platforms: “digital phenotyping” OR “passive sensing” AND (stress OR anxiety OR ((mild OR moderate) AND depression)).
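The query as written leaves the Boolean grouping implicit. The snippet below is a minimal sketch of one explicit grouping consistent with the intended scope (sensing terms AND symptom terms); it is illustrative only, as the exact field syntax accepted by Web of Science, ACM, and PubMed differs, and this is not necessarily the literal string submitted to each interface.

```python
# Illustrative explicit grouping of the review's search terms; the field syntax
# of each database differs, so this is a sketch, not the submitted strings.
sensing_terms = '("digital phenotyping" OR "passive sensing")'
symptom_terms = '(stress OR anxiety OR ((mild OR moderate) AND depression))'

query = f"{sensing_terms} AND {symptom_terms}"
print(query)
```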


Results

Duration

The study length varied from 10 days [26] to 3 years [27]. One study [28] conducted in-depth interviews with students lasting an average of 4.5 hours per person, and another study was a controlled laboratory study [29]. These 2 studies are not presented in Table 1. In the studies conducted with students, a semester or spring or winter term was a common duration. The studies with general nonclinical adult populations were typically longer than those with students.

Table 1. Duration of the reviewed studies (N=38; 2 studies are excluded, as 1 [28] is interview based and the other [29] is a controlled laboratory study).
Study, year | Length of the study (d)
Adams et al [26], 2014 | 10
Cai et al [30], 2018 | 14
Boukhechba et al [31], 2018 | 14
Di Matteo et al [32], 2021 | 14
Jacobson et al [33], 2020 | 16
Wen et al [34], 2021 | 21
Melcher et al [35], 2023 | 28
Fukazawa et al [36], 2019 | 28
Rashid et al [37], 2020 | 35
Zakaria et al [38], 2019 | 35
DaSilva et al [39], 2019 | 43
Nepal et al [40], 2020 | 60
Saha et al [41], 2019 | 68
Morshed et al [42], 2019 | 70
Acikmese et al [43], 2019 | 70
Zakaria et al [38], 2019 | 81
Zakaria et al [38], 2019 | 81
Boukhechba et al [44], 2017 | 98
Tseng et al [45], 2016 | 98
Morshed et al [42], 2019 | 98
Xu et al [46], 2019 | 106
Chikersal et al [47], 2021 | 112
Meyerhoff et al [48], 2021 | 112
Xu et al [46], 2019 | 113
Rhim et al [49], 2020 | 121
Wang et al [50], 2018 | 121
Currey and Torous [51], 2022 | 147
Di Matteo et al [52], 2021 | 153
Sefidgar et al [53], 2019 | 153
Mendu et al [54], 2020 | 153
Pratap et al [55], 2017 | 181
Mirjafari et al [56], 2019 | 260
Currey et al [57], 2023 | 336
Huckins et al [58], 2020 | 458
Mack et al [59], 2021 | 458
Xu et al [60], 2023 | 458
Nepal et al [61], 2022 | 730
Servia-Rodríguez et al [27], 2017 | 1095

Number of Participants

The number of participants ranged from a minimum of 7 adults [26] to a maximum of 18,000 adults [27]. Apart from the 3-year longitudinal study with 18,000 participants [27], the average number of participants was 129.4 (SD 184.01). We observed a pattern of attrition, where the number of participants who completed a study was lower than the number recruited. The number of participants reported in this review is the final sample size. For example, one of the studies [52] recruited 112 participants, of whom 84 (75%) completed the study. In the study by Pratap et al [55], there was a drastic drop in participants, with only 359 (30.42%) of the 1180 enrolled participants completing the study. Another significant drop was seen in the study by Nepal et al [40], in which 750 participants expressed interest in the research but only 141 (18.8%) completed the study. Some studies were less affected; for example, 86 participants started the study by Rhim et al [49], and 78 (91%) completed it.
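As a small worked example of how the completion percentages quoted above follow from the recruited and completing counts, the snippet below recomputes them; the counts are those reported in the cited studies, and minor differences from the quoted figures are due to rounding.

```python
# Recompute the retention percentages quoted above from recruited vs completed counts.
cohorts = {
    "Di Matteo et al [52]": (112, 84),
    "Pratap et al [55]": (1180, 359),
    "Nepal et al [40]": (750, 141),
    "Rhim et al [49]": (86, 78),
}
for study, (recruited, completed) in cohorts.items():
    print(f"{study}: {completed}/{recruited} completed ({100 * completed / recruited:.1f}%)")
# Di Matteo et al [52]: 84/112 completed (75.0%)
# Pratap et al [55]: 359/1180 completed (30.4%)
# Nepal et al [40]: 141/750 completed (18.8%)
# Rhim et al [49]: 78/86 completed (90.7%)
```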

Publication Years of the Studies

Although the search covered publications from 2010 onward, the earliest included publication was from 2014 [26], and the most recent articles were published as of April 2023 [35]. Over the years, interest in detecting and predicting stress, anxiety, and mild depression in the nonclinical population has increased (Table 2).

Table 2. Number of reviewed reports (N=36) by year.
Year | Publication, n (%)
2014 | 1 (3)
2016 | 1 (3)
2017 | 3 (8)
2018 | 4 (11)
2019 | 10 (28)
2020 | 6 (17)
2021 | 6 (17)
2022 | 2 (6)
2023 | 3 (8)

Studies With the iOS and Android Operating Systems

The Android operating system was more common than iOS. Among the 40 included studies, only 2 (5%) were compatible exclusively with iOS [29,51]. A total of 27 (68%) studies were available for both iOS and Android [26,28,30,34,35,37-42,45-47,50,53-61]. A total of 11 (28%) studies were available only for Android users [27,31-33,36,42-44,48,49,52]. The reasons identified for the use of the Android operating system were that it offers more freedom to capture additional modalities, such as keyboard typing and app use, and that Android devices allow apps to run in the background more easily [49].

Studies With Students

Table 3 presents the data extracted from the studies that were conducted with student populations. The average length of the studies with students was 158.6 (SD 176.4) days. The average number of participants was 137.3 (SD 152.1). There were considerably more studies with students than studies with employees or general adults. The sample sizes of the studies with students were similar to those of the studies with adults but smaller than those of the studies with employees. In the studies with students, various passive sensors were used, and some were found to be effective for detection, prediction, or both.

Of the 28 studies with students, 23 (82%) used machine learning models for prediction. A total of 12 studies (43%) [30,31,33,37,38,44,46,47,54] used decision tree–based methods, and 9 studies (32%) [37,39,42,49-51,57,58] used regression-based methods. A total of 3 (11%) studies conducted in recent years [43,60,61] used deep neural networks because of their enhanced ability to discern underlying patterns in large unstructured data sets. Tree-based models perform best when trained with structured data, and the reported studies mostly used tree-based models and structured data. Among the 28 studies, 2 studies [57,60] conducted in 2023 addressed the generalizability of their proposed detection method and verified its applicability across students from various years, classes, and institutions. Two (7%) studies [42,43] in Table 3 used the StudentLife data set [62]. Each study contributed substantial original analyses, including different behavioral patterns, and was considered a “study” in this systematic review. Entries with “N/A” in the predictive modeling column indicate that the study did not involve any attempts to predict future occurrences. However, these studies may still contain statistical analyses as part of their research approach. Overall, students who experienced depression, anxiety, and stress visited fewer locations [39,44,50,58-60] and were more sedentary [47,50,58-60]. Depression was also associated with shorter or irregular sleep [35,46,47,50,52,59,60] and increased phone use [46,47,50,51,58-60].
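To make the modeling setup concrete, the sketch below illustrates the general pattern many of these student studies describe: structured daily sensor features, a survey-derived binary label, a tree-based classifier, and subject-wise cross-validation. The feature names, label rule, and synthetic data are our own illustrative assumptions, not the pipeline of any specific reviewed study.

```python
# Minimal sketch of a tree-based detection pipeline in the spirit of the student
# studies: per-day sensor features, a survey-based binary label per participant,
# and leave-one-subject-out evaluation. All names and data are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_participants, days = 30, 14
n_rows = n_participants * days

X = np.column_stack([
    rng.poisson(5, n_rows),        # hypothetical daily locations visited
    rng.normal(9, 2, n_rows),      # hypothetical sedentary hours
    rng.normal(7, 1.2, n_rows),    # hypothetical sleep duration (h)
    rng.poisson(60, n_rows),       # hypothetical phone unlocks
])
groups = np.repeat(np.arange(n_participants), days)      # participant IDs
y = np.repeat(rng.integers(0, 2, n_participants), days)  # eg, a survey cutoff label

clf = GradientBoostingClassifier(random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Leave-one-subject-out accuracy: {scores.mean():.2f}")
```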

Table 3. Summary of the reviewed studies with student participants.
Study, year | Aim | Data collected | Operating system | Behavioral patterns | Predictive modeling | Verification surveys | Sample size, n
Huckins et al [58], 2020 | Understand how students’ behavioral health and mental health are affected by the COVID-19 pandemic | GPS, accelerometer, phone lock and unlock, and light sensor data | iOS (Apple Inc) and Android (Google LLC) | At the start of the COVID-19 pandemic, students were more depressed and anxious, used their phones more, visited fewer locations, and spent more time sedentary. Depression and stress were associated with increasing COVID-19–related news coverage. | Linear regressors were used to inspect how behavioral changes were affected by COVID-19 news reports. | PHQa-4 | 217 students
Melcher et al [35], 2023 | Understand how behavioral patterns correlate with mental health for students during the COVID-19 pandemic | GPS, accelerometer, call log, and phone use data | iOS and Android | Individuals with more irregular sleep patterns had worse sleep quality and were experiencing more depression and more stress than those with consistent sleep patterns. | N/Ab | PHQ-9, DASSc, SIASd, GAD-7e, PQf, PSSg, PSQIh, BASISi, SFj-36, SFSk, Flourishing Scale, CGIl, HDRSm, CASn, HAIo, and UCLAp-Loneliness Scale | 100 students
Jacobson et al [33], 2020 | Predict social anxiety symptom severity and discriminate between depression, negative affect, and positive affect | Accelerometer, call log, and SMS text message data | Android | Measures of SMS text message and call response time discriminated among depression, negative affect, and positive affect. Accelerometer patterns suggested that persons with low social anxiety walked at a steady pace, whereas persons with high social anxiety walked more quickly with more irregularity. | XGBoostq with LOOCVr was used to predict social anxiety symptom severity. | SIAS, DASS-21, and PANASs | 59 students
DaSilva et al [39], 2019 | Predict stress | GPS, accelerometer, phone lock and unlock, microphone, and light sensor data | iOS and Android | Students with stress were more likely to spend less time in campus food locations and more time in schoolwork locations. Students with stress traveled less, engaged in fewer conversations, and were in quieter environments during evenings. | Penalized generalized estimating equations were used to prune features and fit a marginal regression model to predict stress. | MPSMt | 94 students
Acikmese and Alptekin [43], 2019 | Predict stress level | Accelerometer, microphone, Bluetooth, light sensor, phone lock and unlock, phone charge, and app use data (GPS and Wi-Fi data were collected but not used) | Android | Students were successfully categorized as stressed or nonstressed using the measured sensors. | LSTMu, CNNv, and CNN-LSTM were used to classify stress, with LSTM yielding the best accuracy. | Self-reported stress | 48 students
Rooksby et al [28], 2019 | Understand students’ perspectives about digital phenotyping | GPS, phone lock and unlock, phone charge, battery, microphone, Bluetooth, light sensor, SMS text message, email, app use, call log, camera, and keyboard data | iOS and Android | None of the results related sensors to symptoms of depression or anxiety. Students have privacy concerns regarding the use of app use logs, Bluetooth data, call logs, camera data, keyboard data, and microphone data but not regarding the use of battery or light sensor data. Students had privacy concerns with the use of SMS text message content but not with counts of messages. | N/A | PHQ-9, GAD-7, and WEMWBSw | 15 students
Chikersal et al [47], 2021 | Predict postsemester depressive symptoms | GPS, accelerometer, Bluetooth, Wi-Fi, phone use, call log, and microphone data | iOS and Android | Depression was predicted by participants’ social context in the afternoons and evenings, phone use throughout the day, long periods without exercise, periods of disturbed sleep at night, and time spent outdoors. | Trained an ensemble classifier with the outputs from models containing features from 1 sensor, with different setting combinations. | BDIx-II | 138 students
Morshed et al [42], 2019 | Predict mood instability | Accelerometer, microphone, Bluetooth, light sensor, Wi-Fi, GPS, phone lock and unlock, and phone charge data | Android | Mood instability was negatively correlated with the duration of sleep, the number of conversations, the amount of activity, and outdoor mobility. | Ridge regression with regularization was used to infer mood instability score. | EMAsy, PAMz, and PANAS | 48 students
Zakaria et al [38], 2019 | Detect depression and stress | Wi-Fi data | iOS and Android | Students with severe stress spent significantly less time on campus and were less involved in work-related activities than students with normal stress. Students with severe stress were more involved in these activities at the start of the semester, but the involvement decreased over time. | The random forest stress model with domain-specific features achieved the best result, with feature sets changed every 6 days. | PSS-4, PHQ-8, and BFIaa | 62 students
Zakaria et al [38], 2019 | Detect depression and stress | Wi-Fi data | iOS and Android | Same patterns as those mentioned earlier. | The random forest model that excluded domain-specific features achieved the best result, with feature sets changed every 6 days. | PSS-4, PHQ-8, and BFI | 11 students
Zakaria et al [38], 2019 | Detect depression and stress | Wi-Fi data | iOS and Android | Same patterns as those mentioned earlier. | The best model is a random forest model with the neuroticism score added as an additional feature, with sensor data sets calculated with a 6-day interval. | PSS-4, PHQ-8, and BFI | 35 students
Wang et al [50], 2018 | Predict depression | Light sensor, GPS, accelerometer, microphone, screen on and off, and phone lock and unlock data | iOS and Android | Students who experienced depression had more irregular sleep patterns, used their phones more at study places, spent more time stationary, and visited fewer locations. | LASSOab regression was used to predict presurvey and postsurvey PHQ-9 scores. | PHQ-4 and PHQ-8 | 83 students
Exposito et al [29], 2018 | Detect stress | Keyboard 3D touch data | iOS | Students’ typing pressure increased under stress. | N/A | Self-reported stress | 11 students
Rhim et al [49], 2020 | Detect subjective well-being and stress | Accelerometer, GPS, screen on and off, app use, and notification data | Android | Lower subjective well-being was associated with more time spent on campus, more time spent stationary, increased phone use in the evenings, and more expenses. | Hierarchical regression models were used to predict subjective well-being. | COMOSWBac, PHQ, SASad, PPCae, and BFI | 78 students
Sefidgar et al [53], 2019 | Detect stress, anxiety, and gender discrimination | Accelerometer, GPS, phone lock and unlock, screen on and off, and call log data | iOS and Android | Students who experienced discrimination became more physically active; their phone use increased in the morning, they had more calls in the evening, and they spent more time in bed on the day of the discrimination. | Linear regression was used to predict long-term changes in mental health states; hierarchical linear modeling was used for short-term prediction. | UCLA Loneliness Scale, SSSaf, MAASag, ERQah, BRSai, PSS, CES-Daj, STAIak, and self-reported affect and fairness of treatment | 176 students
Cai et al [30], 2018 | Detect state affect, stress, anxiety, and depression | Accelerometer, GPS, call log, and SMS text message data | iOS and Android | Negative emotions were related to geographical locations, but this was affected by personal routines and preferences, for example, liking cinema theatres. On Fridays and Saturdays, students reported less negative states. | Compared support vector machine, random forest, and XGBoost with LOSOCVal and LOOCV to predict negative affect. The best model was support vector machine with LOOCV. | SIAS and self-reported affect (EMAs) | 220 students
Boukhechba et al [31], 2018 | Predict response rate and latency to EMA | GPS, call log, accelerometer, and SMS text message data | Android | None of the results related sensors to symptoms of depression or anxiety. | Used random forest, support vector machine, and a multilayer perceptron of 1 hidden layer with LOOCV to predict the compliance rate of EMA responses. | Self-reported affect (EMAs) | 65 students
Xu et al [46], 2019 | Detect depression | Accelerometer, battery or charge, Bluetooth, call log, screen, location, and phone lock and unlock data | iOS and Android | Students who experienced depression had more disturbed sleep patterns and more phone interactions than students who did not experience depression. | AdaBoostam with decision tree–based components achieved the best performance when features were hybrid (contextually filtered + unimodal). | BDI-II | 138 students
Xu et al [46], 2019 | Detect depression | Accelerometer, battery or charge, Bluetooth, call log, screen, location, and phone lock and unlock data | iOS and Android | Same patterns as those mentioned earlier. | AdaBoost with decision tree–based components achieved a similar result to majority-based baseline predictors. | BDI-II | 212 students
Boukhechba et al [44], 2017 | Predict social anxiety | GPS, call log, and SMS text message data | Android | Students who experienced high social anxiety may be more likely to buy food so they can eat at home; they tended to visit fewer places and had a narrower range of activities. | Decision tree was used to predict SAS. | SIAS | 54 students
Rashid et al [37], 2020 | Predict social anxiety and evaluate the effectiveness of imputation methods in handling missing data | GPS, pedometer, accelerometer, call log, and SMS text message data | iOS and Android | The level of social anxiety was predicted, but there were no specific patterns relating sensors to symptoms of social anxiety. | Evaluated 7 predictive models: linear regression, decision tree, XGBoost, lightGBMan, random forest, MERFao, and CatBoost. | SIAS and self-reported dimensions of social anxiety | 80 students
Mendu et al [54], 2020 | Explore the relationships among private social media messages, personality traits, and symptoms of mental illness | Facebook (Meta Platforms, Inc) private messages | iOS and Android | Students who experienced anxiety received responses later, had more night-time communications, talked less about games and sports, and used more plural pronouns. | Used random forest classifier to select features and support vector machine with LOOCV to predict each psychological measure binarily. | STAI, UCLA Loneliness Scale, and TIPIap | 103 students
Tseng et al [45], 2016 | Detect stress and its relationship with academic performance | Location, activity, step count (iOS only), audio, accelerometer (iOS only), device use, charging event, battery, light (Android only), SMS text message (Android only) and call (Android only) data and data about currently running apps (Android only) | iOS and Android | Students slept less during examination periods and more during breaks; they felt more stressed during the breaks and examination periods; sensor data were able to capture different routines during weekdays, weekends, and breaks. | N/A | PSQI, ESSaq, MCTQar, PROMISas-10, BHMat-20, CD-RISCau, Flourishing Scale, Perceived Stress Scale, BFI, PHQ-8, and UCLA Loneliness Scale | 22 students
Mack et al [59], 2021 | Understand the association between behavioral and mental health and the COVID-19 pandemic | GPS, accelerometer, phone lock and unlock, and light sensor data | iOS and Android | During the COVID-19 pandemic, students experienced more depression and anxiety and increased sedentary time and phone use, whereas sleep and the number of locations visited decreased. | N/A | PHQ-4 and EMAs | 217 students
Xu et al [60], 2023 | Evaluate the cross–data set generalizability of depression detection | GPS, accelerometer, phone lock and unlock, Bluetooth, Wi-Fi, call log, microphone, gyroscope, and light sensor data | iOS and Android | Individuals who experienced depression had shorter sleep duration, had more interrupted sleep, had more frequent phone locks and unlocks, spent more time at home, were more sedentary, had fewer physical activities, visited fewer uncommon places, and had more consistent mobility patterns. | A multitask learning model with the 1D-CNNav–based embedding, fully connected layers for reordering and classification. | Weekly surveys on self-reported depression symptoms and affect, BDI-II, and PHQ-4 | 534 students
Nepal et al [61], 2022 | Explore the association between students’ COVID-19 concerns and behavioral and mental health | GPS, accelerometer, phone lock and unlock, light sensor, and phone use data | iOS and Android | Heightened COVID-19 concerns correlated with increased depression, anxiety, and stress. No specific results relating sensors to symptoms of depression, anxiety, or stress were observed. | Evaluated different deep learning models in terms of their classification of COVID-19 concerns: CNN, InceptionTime, MCDCNNaw, ResNetax, multilayer perceptron, TWIESNay, LSTM, and FCNNaz; FCNN performed the best, with an AUROCba score of 0.7. | Self-reported affect and PHQ-4 | 180 students
Currey and Torous [51], 2022 | Predict survey results on mental health from passive sensors | GPS, accelerometer, call, and screen time data | iOS | Individuals at higher risks of psychosis spent less time at home. Individuals who were lonelier had longer sleep duration and fewer calls. Individuals who experienced stress or depression had longer outgoing calls. | Logistic regression was used to predict survey scores. | PHQ-9, GAD-7, PSS, UCLA Loneliness Scale, PQ-16, and PSQI | 147 students
Currey et al [57], 2023 | Explore the cross–data set generalizability of symptom improvement based on the surveys | GPS, accelerometer, and screen time data | iOS and Android | Logistic regression was able to predict changes in mood across 2 data sets of student participants. No results relating sensors to symptoms of depression or anxiety were observed. | Logistic regression was used to predict weekly score improvement from both active and passive features. | PHQ-9, GAD-7, PSS, UCLA Loneliness Scale, PSQI, PQ-16, and DWAIbb | 698 students

aPHQ: Patient Health Questionnaire.

bN/A: not applicable.

cDASS: Depression Anxiety Stress Scales.

dSIAS: Social Interaction Anxiety Scale.

eGAD-7: Generalized Anxiety Disorder Scale-7.

fPQ: Prodromal Questionnaire.

gPSS: Perceived Stress Scale.

hPSQI: Pittsburgh Sleep Quality Index.

iBASIS: Behavior and Symptom Identification Scale.

jSF: Short Form Health Survey.

kSFS: Social Functioning Schedule Scale.

lCGI: Clinical Global Impressions Scale.

mHDRS: Hamilton Depression Rating Scale.

nCAS: Coronavirus Anxiety Scale.

oHAI: Health Anxiety Inventory.

pUCLA: University of California, Los Angeles.

qXGBoost: extreme gradient boosting.

rLOOCV: leave-one-out cross validation.

sPANAS: Positive and Negative Affect Schedule.

tMPSM: Mobile Photographic Stress Meter.

uLSTM: long short-term memory.

vCNN: convolutional neural network.

wWEMWBS: Warwick-Edinburgh Mental Well-Being Scale.

xBDI: Beck Depression Inventory.

yEMA: ecological momentary assessment.

zPAM: Patient Activation Measure.

aaBFI: Big Five Inventory.

abLASSO: least absolute shrinkage and selection operator.

acCOMOSWB: Concise Measure of Subjective Well-Being.

adSAS: Sport Anxiety Scale.

aePPC: Perceived Personal Control.

afSSS: Social Support Scale.

agMAAS: Mindful Attention Awareness Scale.

ahERQ: Emotion Regulation Questionnaire.

aiBRS: Brief Resilience Scale.

ajCES-D: Center for Epidemiological Studies-Depression.

akSTAI: State Trait Anxiety Inventory.

alLOSOCV: leave-one-subject-out cross validation.

amAdaBoost: adaptive boosting.

anLightGBM: light gradient boosting machine.

aoMERF: mixed-effects random forest.

apTIPI: Ten-Item Personality Inventory.

aqESS: Epworth Sleepiness Scale.

arMCTQ: Munich Chronotype Questionnaire.

asPROMIS: Patient-Reported Outcomes Measurement Information System.

atBHM: Behavioral Health Measure.

auCD-RISC: Connor-Davidson Resilience Scale.

av1D-CNN: 1-dimensional convolutional neural network.

awMCDCNN: multi-channel deep convolutional neural network.

axResNet: residual network.

ayTWIESN: time warping invariant echo state network.

azFCNN: fully convolutional neural network.

baAUROC: area under the receiver operating characteristic curve.

bbDWAI: Digital Working Alliance Inventory.

Studies With Adults

Table 4 presents the data extracted from the studies conducted with the general adult population. The average study duration was 201.6 (SD 367) days. Apart from a 3-year longitudinal study with 18,000 participants, the average number of participants was 123.4 (SD 139.8). Of the 8 studies with adults, 2 (25%) [32,52] were conducted with the same set of participants. A total of 3 (38%) studies used predictive modeling, with regression-based models being the most common [34,36,52], and 1 (12%) study identified gender differences in behavioral patterns [27]. Overall, the research with adults showed that GPS, accelerometer, ambient audio, and illuminance data were related to individuals’ emotional states. Adults with depression were less likely to leave home and were less physically active, whereas adults who were socially anxious were more active and left their homes more often but avoided going to places where they needed to interact socially.

Table 4. Summary of the reviewed studies with adult participants.
Study, year | Aim | Data collected | Operating system | Behavioral patterns | Predictive modeling | Verification surveys | Sample size, n
Di Matteo et al [32], 2021 | Understand whether ambient speech correlates with social anxiety, generalized anxiety, and depressive symptoms | Microphone data | Android | Generalized anxiety and depression were correlated with reward-related words. Social anxiety was correlated with vision-related words. | N/Aa | LSASb, GAD-7c, PHQd-8, and SDSe | 86 Canadian adults
Di Matteo et al [52], 2021 | Predict general anxiety disorder, social anxiety disorder, and depression | GPS, microphone, screen on and off, and light sensor data | Android | Depression and social anxiety were associated with increased screen use. Depression was associated with sleep disturbance and death-related word features. | A total of 3 logistic regression models were used to predict social anxiety disorder and generalized anxiety disorder with repeated k-fold cross validation. | LSAS, GAD-7, PHQ-8, and SDS | 84 Canadian adults
Wen et al [34], 2021 | Detect impulsive behavior, positivity, and stress | Call log, phone lock and unlock, and phone charging data | iOS and Android | Impulsivity was correlated with increased phone use and screen checking. | Used LASSOf regularization to first select features and trained a linear regression model to estimate trait impulsivity scores. | BISg-15, UPPSh, PAMi, and self-reported feelings | 26 adults
Fukazawa et al [36], 2019 | Predict anxiety levels and stress | Light sensor, gyroscope, accelerometer, and app use data | Android | Anxiety was higher from Monday to Thursday than on Friday and Saturday. Increased anxiety was associated with decreased mobility. During mild exercise, anxiety was reduced. | Used linear classifier by LASSO and XGBoostj to classify the change of anxiety. | STAIk | 20 adults
Pratap et al [55], 2017 | Detect depression | GPS, call log, and SMS text message data | iOS and Android | None of the results related sensors to symptoms of depression. | N/A | PHQ-2 and PHQ-9 | 359 Hispanic or Latino adults
Adams et al [26], 2014 | Detect stress level | Microphone data | iOS and Android | Stress can be recognized from pitch, speaking speed, and vocal energy. | N/A | PANASl, PSSm-14, MAASn, and self-reported affect | 7 adults
Meyerhoff et al [48], 2021 | Detect anxiety and depression | GPS, call log, app use, and SMS text message data | Android | Changes in the number of locations visited and social activity duration were associated with depression. Time spent at exercise locations was positively correlated with changes in depressive symptoms. | N/A | GAD-7, PHQ-8, and SPINo | 282 adults
Servia-Rodríguez et al [27], 2017 | Predict mood | GPS, Wi-Fi, cell tower, accelerometer, microphone, SMS text message, and call data | Android | A strong correlation was identified between daily routines and users’ personality, well-being perception, and other psychological variables; the participants who were the most emotionally stable tended to be more active, stayed in more noisy places, and texted less than participants who were unstable. | Used stacked RBMsp to classify moods. | Big-5 personality test, self-reported mood, and self-reports of locations | 18,000 adults mainly

aN/A: not applicable.

bLSAS: Liebowitz Social Anxiety Scale.

cGAD-7: Generalized Anxiety Disorder Assessment-7.

dPHQ: Patient Health Questionnaire.

eSDS: Sheehan Disability Scale.

fLASSO: least absolute shrinkage and selection operator.

gBIS: Barratt Impulsiveness Scale.

hUPPS: Impulsive Behavior Scale.

iPAM: Patient Activation Measure.

jXGBoost: extreme gradient boosting.

kSTAI: State Trait Anxiety Inventory.

lPANAS: Positive and Negative Affect Schedule.

mPSS: Perceived Stress Scale.

nMAAS: Mindful Attention Awareness Scale.

oSPIN: Social Phobia Inventory.

pRBM: Restricted Boltzmann Machine.

Studies With Employees

Table 5 presents the data extracted from the studies that were conducted with employees. Among the 4 studies with employees, 1 (25%) study recruited its own participants [56], and the other 3 (75%) studies [40-42] used the Tesserae data set [63]. Compared with students and adults, the employee population was the least studied, with the fewest articles. However, the studies with employees had the largest number of participants, with a mean of 427.3 (SD 280.3). All 4 studies used regression-based predictive modeling, and 2 (50%) of them [40,56] evaluated a variety of models, with logistic regression, support vector machine, and random forest being the most common methods. Detecting and predicting employees’ stress in workplaces was examined in tandem with employees’ work performance. The research goal for these studies was to understand the underlying reasons for lowered work-related productivity. In contrast to the other 2 populations (ie, students and adults), less mobility was seen as positive for employees because less mobility in workplaces was associated with more positivity and higher performance.
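The sketch below mirrors the kind of model comparison these employee studies report (logistic regression, support vector machine, and random forest compared under k-fold cross-validation). The synthetic features stand in for behavioral aggregates such as evening unlocks and mobility; this is not the Tesserae data set or any study’s actual code.

```python
# Sketch of comparing common classifiers with cross-validation on behavioral
# aggregates; features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))        # placeholder behavioral aggregates
y = rng.integers(0, 2, size=200)     # placeholder high/low performer label

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "support vector machine": make_pipeline(StandardScaler(), SVC()),
    "random forest": RandomForestClassifier(random_state=1),
}
for name, model in models.items():
    print(f"{name}: 5-fold accuracy {cross_val_score(model, X, y, cv=5).mean():.2f}")
```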

Table 5. Summary of the reviewed studies with employee participants.
Study, year | Aim | Data collected | Operating system | Behavioral patterns | Predictive modeling | Verification surveys | Sample size, n
Mirjafari et al [56], 2019 | Predict stress and job performance | Accelerometer, GPS, phone lock and unlock, and light sensor data | iOS and Android | Higher performers unlocked their phone fewer times during evenings, had less physical activity, visited fewer locations on weekday evenings, were more mobile, and visited more locations during weekends. | Evaluated logistic regression, support vector machine, random forest, and XGBoosta in terms of employee performance classification; XGBoost was the best model with 5-fold cross validation. | ITPb, IRBc, OCBd, and CWBe | 554 employees
Nepal et al [40], 2020 | Detect stress, well-being, and mood | GPS, phone lock and unlock, accelerometer, Bluetooth, and phone use data | iOS and Android | Promoted employees spent more time on their phones during early mornings and late evenings and had more unlocks during the night time than nonpromoted employees. Women’s mobility increased after promotion, whereas men’s mobility decreased. | Evaluated logistic regression, support vector machine, Gaussian naive Bayes, random forest, and k-nearest neighbor in terms of their classification between promoted and nonpromoted periods; the best model was logistic regression trained on ROCKETf-based features. | CWB, OCB, IRB, and ITP | 141 employees
Saha et al [41], 2019 | Predict stress and workplace performance | Light sensor, GPS, accelerometer, and phone lock and unlock data | iOS and Android | Stress was higher with increased role ambiguity. | Linear regression was used to predict a well-being score. | IRB, ITP, and OCB | 257 employees
Morshed et al [42], 2019 | Predict mood instability | Light sensor, GPS, accelerometer, and phone lock and unlock data | iOS and Android | Mood instability was negatively correlated with the duration of sleep, the number of conversations, the amount of activity, and outdoor mobility. | Ridge regression with regularization was used to infer a mood instability score. | EMAsg, PAMh, and PANASi | 757 employees

aXGBoost: extreme gradient boosting.

bITP: Psychological Type Indicator.

cIRB: in-role behavior.

dOCB: organizational citizenship behavior.

eCWB: counterproductive work behavior.

fROCKET: random convolutional kernel transform.

gEMA: ecological momentary assessment.

hPAM: Patient Activation Measure.

iPANAS: Positive and Negative Affect Schedule.

Passive Sensors

Overview

Table 6 provides an overview of the range of sensors used to detect patterns related to mild mental health symptoms and summarizes the evidence of the effectiveness of the various sensors. The first column lists the sensor, and the second column presents how the data from that sensor are interpreted; in other words, it presents the behavior-related information that the sensor data are intended to represent. The third column indicates which articles found significant associations between the specific sensor and stress, anxiety, or mild depression. The fourth column indicates which articles found no significant associations between the specific sensor and mental health outcomes (ie, explicitly stated so in the articles). In the subsequent sections, we discuss the types of activities detected by the sensors.

Table 6. Sensor summary of the reviewed studies.
Sensor | Behavior | Evidence for effectiveness | No evidence
GPS | Location and physical activity | [27,30,35,37,39-42,44-53,56-61] | [28,31,55]
Microphone | Voice recognition, ambient sound, and sleep | [26,27,32,39,42,43,45,48,50,52,60] | [28,41,47]
Light sensor | Time spent in darkness and sleep | [36,39,41,43,45,50,52,56,58-61] | [28,42,48]
Accelerometer | Movement and physical activity | [27,30,35-37,39-43,45-47,49-51,53,56-61] | [31,33]
Phone locks and unlocks | Phone use | [34,35,39,40,43,45-47,50,53,56,58-61] | [28,41,42]
Call logs | Social interaction and incoming and outgoing calls | [27,33,34,37,44-46,51,53,60] | [28,30,31,35,47,48,55]
Bluetooth | Social interaction | [40,42,43,46,47,60] | [28,51]
Wi-Fi | Indoor location | [27,38,42,47,60] | None
Keyboard | Typing patterns and muscle activity | [29] | [28]
SMS text messages and emails | Social interaction and incoming and outgoing messages | [27,32,33,37,44,45,52] | [28,30,31,48,55]
App use | Phone use and social media | [28,35,36,40,43,45,48,49,61] | None
Screen on and off | Phone use | [40,45,46,49,50,52,53,55,57] | [51]
Gyroscope | Orientation of the smartphone | [36,60] | None

Social Interaction: Call and Text Logs, Audio, Microphone, and Bluetooth

The social interaction of an individual is reflective of their current mood and mental state [44,64-66]. Individuals with depression and stress may be expected to decrease their social interactions. This is measured through the frequency of receiving texts and calls, how quickly individuals respond, and the frequency of being around others. Among the 40 included studies, 18 (45%) [27,28,30,31,33-35,37,44-48,51,53,55,60] examined call logs to understand social interaction patterns, mainly through the number of incoming and outgoing calls, the number of missed calls, and the duration of calls. Individuals who experience depression and stress may engage in longer outgoing calls [51]. Evening communications were predictive of depression [47], anxiety, and loneliness [54]. Students who experienced discrimination [53] and participants with anxiety [54] had more evening communications. Metadata on SMS text messages were examined in 10 (25%) [27,28,30,31,33,37,44,45,48,55] of the 40 studies, including the frequency of receiving SMS text messages and the average time of responses. People who are socially anxious were found to take different amounts of time to respond to SMS text messages and calls [33]. Increases in the number of calls were associated with increased social anxiety [48]. Those who experienced social anxiety were less likely to call or text in public [44]. For students, fewer conversations were associated with more stress [39] and more mood instability [42]. One of the studies found that more emotionally unstable individuals tended to text more than emotionally stable individuals [27].
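As an illustration of how such social-interaction features can be derived from raw logs, the sketch below computes call counts, mean call duration, evening events, and SMS response latency from toy records. The log format, field names, and thresholds are hypothetical assumptions rather than the feature definitions used in the reviewed studies.

```python
# Sketch of deriving social-interaction features (call counts, duration,
# evening events, SMS response latency) from hypothetical event logs.
from datetime import datetime, timedelta

calls = [  # (timestamp, direction, duration in seconds)
    (datetime(2023, 5, 1, 9, 15), "incoming", 180),
    (datetime(2023, 5, 1, 20, 40), "outgoing", 420),
    (datetime(2023, 5, 2, 13, 5), "missed", 0),
]
sms = [  # (timestamp, direction)
    (datetime(2023, 5, 1, 10, 0), "received"),
    (datetime(2023, 5, 1, 10, 27), "sent"),
    (datetime(2023, 5, 2, 22, 10), "received"),
]

features = {
    "outgoing_calls": sum(d == "outgoing" for _, d, _ in calls),
    "missed_calls": sum(d == "missed" for _, d, _ in calls),
    "mean_call_duration_s": sum(s for _, _, s in calls) / len(calls),
    "evening_events": sum(t.hour >= 18 for t, *_ in calls + sms),
}

# Response latency: time from a received SMS to the next sent SMS.
latencies, last_received = [], None
for t, direction in sorted(sms):
    if direction == "received":
        last_received = t
    elif direction == "sent" and last_received is not None:
        latencies.append((t - last_received) / timedelta(minutes=1))
        last_received = None
features["mean_sms_response_min"] = sum(latencies) / len(latencies) if latencies else None

print(features)
```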

Location: GPS, Bluetooth, and Wi-Fi

Location data can provide insights into individuals’ mental health state in terms of the normal or abnormal variety and frequency of locations visited [67]. As presented in Table 6, GPS has been one of the most commonly used passive sensors for stress, anxiety, and mild depression research. The findings regarding location consistently demonstrate that students and adults who experienced depression, anxiety, or stress tended to visit fewer places [39,44,50,58-60]. One of the studies [48] found that location data are highly inversely correlated with mild depression severity. The main ways in which this is measured are the frequency of leaving the house, the variety of locations visited, and mobility. Individuals who are depressed leave the house less frequently, and individuals who are socially anxious visit a narrower variety of locations. Individuals who feel depressed often have less energy [68,69]. Overall, negative emotions were associated with time spent at specific locations, but this is also affected by personal routines and preferences [30]. For students, stress and lower subjective well-being were associated with more time spent on campus [39,49] and less time spent at campus food locations [39]. Students who experienced depression spent more time at home [60], whereas individuals at higher risk of psychosis spent less time at home [51]. Time spent at exercise locations was positively correlated with changes in depressive symptoms [48]. Another study [38] distinguished between students experiencing severe stress and those with normal stress levels, revealing that students with severe stress spent significantly less time on campus and were less involved in work-related activities compared with their counterparts with normal stress levels. As for employees, higher performers were found to visit fewer locations on weekday evenings but more locations during weekends [56].
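To make these location features concrete, the sketch below counts distinct places, estimates the fraction of time spent at the most-visited place (a crude home proxy), and computes location entropy by snapping GPS fixes to a coarse grid. Real studies typically use proper clustering (eg, DBSCAN) and much denser data; the coordinates and grid size here are illustrative assumptions.

```python
# Sketch of location features: distinct places, time at the most-visited place,
# and location entropy, using grid-snapped GPS fixes. Data are made up.
import math
from collections import Counter

samples = [  # hypothetical (lat, lon) fixes at a fixed sampling interval
    (40.4443, -79.9436), (40.4444, -79.9437), (40.4431, -79.9531),
    (40.4443, -79.9436), (40.4571, -79.9185), (40.4444, -79.9436),
]

def snap(lat, lon, precision=3):
    """Group nearby fixes by rounding to roughly 100 m grid cells."""
    return (round(lat, precision), round(lon, precision))

visits = Counter(snap(lat, lon) for lat, lon in samples)
home_cell, home_count = visits.most_common(1)[0]   # most-visited cell as a home proxy

n_places = len(visits)
time_at_home = home_count / len(samples)
entropy = -sum((c / len(samples)) * math.log(c / len(samples)) for c in visits.values())

print(f"distinct places: {n_places}, time at home: {time_at_home:.2f}, entropy: {entropy:.2f}")
```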

Voice Recognition: Audio

The microphone is used to measure audio data of speech and ambient noises. One of the studies [26] examined how people with stress speak by analyzing their voice, including the speed of speech, vocal energy, and pitch. One caveat is that the study by Adams et al [26] used audio captured within laboratory environments and found that stress could be recognized from the absence of speech. In more variable environments, it will be harder to recognize such changing voice patterns. One study found that generalized anxiety and depression related to reward-related words in ambient speech, and social anxiety related to vision-related words [32]. Another study [52] identified that people with depression tend to speak less and use more death-related words.
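As a simple illustration of the kind of prosodic features involved (vocal energy and a pause-based proxy for the amount of speech), the sketch below frames a waveform and thresholds per-frame energy using NumPy only. The synthetic signal and the silence threshold are illustrative assumptions; the reviewed studies used their own, more sophisticated audio pipelines.

```python
# Sketch of crude prosodic features from a waveform: per-frame RMS energy,
# voiced-frame ratio, and pause count. Signal and threshold are illustrative.
import numpy as np

sr = 16_000                                   # sample rate (Hz)
t = np.arange(0, 3.0, 1 / sr)
signal = 0.5 * np.sin(2 * np.pi * 220 * t)    # stand-in for voiced speech
signal[sr:2 * sr] *= 0.01                     # a 1 s "pause" in the middle

frame = int(0.025 * sr)                       # 25 ms frames
n_frames = len(signal) // frame
frames = signal[: n_frames * frame].reshape(n_frames, frame)

rms = np.sqrt((frames ** 2).mean(axis=1))     # per-frame energy
voiced = rms > 0.1                            # crude silence threshold
features = {
    "mean_energy": float(rms.mean()),
    "voiced_ratio": float(voiced.mean()),     # proxy for speech vs pauses
    "n_pauses": int(np.sum(np.diff(voiced.astype(int)) == -1)),
}
print(features)
```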

Sleep: Accelerometer, Audio, and Illuminance

Sleep is highly correlated with individuals’ mental state [26,35,36,42,45-47,59,60]. Among the 40 included studies, 5 (13%) [35,46,52,60] found that more disturbed sleep correlated with more depressive symptoms. However, occasional sleep disturbance is not necessarily predictive. For example, for those with social anxiety, sleep disturbance might be positive because it suggests night-time activity and social interactions. Metadata on the time spent in darkness can be indicative of sleep patterns. The study by Fukazawa et al [36] stated that anxiety levels increase when the time spent in darkness increases. The study by Di Matteo et al [52] found that individuals with symptoms related to social anxiety and depression spent less time in darker environments. Another study [39] stated that stress changed students’ sleep patterns, where they became less likely to move around between 6 PM and midnight. Of the 40 studies, 6 (15%) found that shorter sleep duration was correlated with more mood instability [42], more depressive symptoms [59,60], and more stress [36,44]. One of the studies [45] also found that the student population, in general, tended to sleep less during examination periods and slept more during breaks, and they felt more stressed during both breaks and examination periods.
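A minimal sketch of the kind of heuristic these sleep estimates rest on appears below: the longest overnight stretch during which the screen is off, the environment is dark, and the phone is stationary is taken as the main sleep bout. The hourly toy data and the lux and movement thresholds are illustrative assumptions, not a validated algorithm from the reviewed papers.

```python
# Heuristic sleep estimate: longest run of hours with screen off, low light,
# and little movement. One row per hour: (hour, screen_on, lux, movement).
hours = [
    (20, True, 120, 0.9), (21, True, 80, 0.7), (22, False, 15, 0.2),
    (23, False, 2, 0.05), (0, False, 1, 0.02), (1, False, 1, 0.01),
    (2, False, 1, 0.02), (3, False, 1, 0.01), (4, False, 1, 0.03),
    (5, False, 3, 0.04), (6, True, 40, 0.5), (7, True, 200, 0.8),
]

def looks_asleep(screen_on, lux, movement, lux_max=10, move_max=0.1):
    return (not screen_on) and lux < lux_max and movement < move_max

best = run = 0
for _, screen_on, lux, movement in hours:
    run = run + 1 if looks_asleep(screen_on, lux, movement) else 0
    best = max(best, run)

print(f"Estimated main sleep bout: {best} hours")   # 7 hours with these toy data
```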

Phone Use: On and Off Screen, Lock and Unlock, and App Use

Today, smartphones are used for self-regulated “distractions,” such as the use of social media [38]. This type of self-regulated distraction can temporarily reduce stress. The study by Chikersal et al [47] showed that depression can impact concentration levels, so if distraction by phone can be measured, this could be a potential predictive marker. Several studies found that increasing phone use was correlated with more depressive symptoms [46,47,50,52,58-60], anxiety [52,59], impulsivity [34], and lower subjective well-being [49]. The study by Morshed et al [42] outlined that for postsemester depression, phone use at night is not predictive, whereas another study [47] summarized that phone use during the day is predictive of depression. More frequent phone locks or unlocks correlated with higher levels of depressive symptoms [60] and impulsivity [34]. Higher performing employees tended to unlock their phones less frequently in the evenings [56]. Additionally, individuals who were promoted spent more time on their phones during early mornings and late evenings, with more unlocks occurring during nighttime compared with their nonpromoted counterparts [40].
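The phone-use features discussed here are typically simple counts over time-of-day segments; the sketch below derives unlock counts per segment from a list of unlock timestamps. The timestamps and segment boundaries are illustrative assumptions.

```python
# Sketch of phone-use features: unlock counts by time-of-day segment.
from collections import Counter
from datetime import datetime

unlocks = [  # hypothetical unlock timestamps
    datetime(2023, 5, 1, 7, 55), datetime(2023, 5, 1, 12, 30),
    datetime(2023, 5, 1, 19, 45), datetime(2023, 5, 1, 23, 10),
    datetime(2023, 5, 2, 1, 20),
]

def segment(hour):
    if 6 <= hour < 12:
        return "morning"
    if 12 <= hour < 18:
        return "afternoon"
    if 18 <= hour < 24:
        return "evening"
    return "night"

counts = Counter(segment(ts.hour) for ts in unlocks)
print(dict(counts))   # eg, {'morning': 1, 'afternoon': 1, 'evening': 2, 'night': 1}
```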

Physical Activity and Mobility: Accelerometer

According to Table 6, along with GPS, the accelerometer is one of the most widely used passive sensors in digital phenotyping research to monitor participants’ mobility, activity, and sedentary periods. Increased sedentary time was correlated with increased depressive symptoms [47,48,50,58-60], increased mood instability [27,42], increased stress [36], and decreased subjective well-being [49]. Exercise duration was positively correlated with changes in anxiety [36] and depressive symptoms [48]. The study by Mirjafari et al [56] found that the amount of movement and physical activity was related to employees’ stress levels and highlighted that if the activity is regular, it should reduce stress. Different occupations require different levels of physical activity, social interactions, and mobility. For instance, developers spend most of their time at their desks, and their tasks might require less social interaction and mobility at work, but this does not mean they are more stressed. Project managers have more mobility during the day, and this may be because they need to move around to meet with stakeholders [56]. Several studies have observed variations in mobility and gait consistency. The study by Boukhechba et al [44] reported that individuals with high social anxiety exhibited a narrower range of activities, whereas the study by Xu et al [60] revealed that students experiencing depression demonstrated more consistent mobility patterns. Additionally, accelerometer data indicated that individuals with low social anxiety maintained a steady walking pace, whereas those with high social anxiety tended to walk more rapidly and with greater irregularity [33].
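As an illustration of how sedentary time can be derived from accelerometer data, the sketch below labels 1-minute windows as sedentary when the variability of the acceleration magnitude falls below a threshold. The synthetic signal and the threshold are illustrative assumptions, not a parameterization taken from the reviewed studies.

```python
# Sketch of a sedentary-time estimate: fraction of 1-minute windows whose
# acceleration-magnitude variability is below a stillness threshold.
import numpy as np

rng = np.random.default_rng(2)
hz = 50
still = 1.0 + rng.normal(0, 0.005, hz * 60 * 20)   # ~1 g while sitting (20 min)
moving = 1.0 + rng.normal(0, 0.15, hz * 60 * 10)   # walking adds variance (10 min)
magnitude = np.concatenate([still, moving])        # vector magnitude in g

window = hz * 60                                   # 1-minute windows
windows = magnitude[: len(magnitude) // window * window].reshape(-1, window)
sedentary_fraction = (windows.std(axis=1) < 0.02).mean()

print(f"Sedentary fraction of this 30-minute span: {sedentary_fraction:.0%}")   # ~67%
```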

Muscle Activity: Keyboard

Stress can cause muscle tension [70,71]. One of the studies [29] collected the data of users with stress via a keyboard in a laboratory environment and found that typing pressure significantly increased under stressful conditions.

Challenges

Digital phenotyping for mild mental health symptoms in nonclinical participants can present ethical challenges, limitations to the research, and technical challenges. We review the challenges that were stated in the literature.

Ethical Challenges

Among the 40 included studies, 7 (18%) specifically mentioned privacy-related ethical concerns [28,31,35,36,40,41,43]. A major concern for participants across several studies was whether authorities, such as employers or teachers, would have access to their data. One of the studies [28] conducted in-depth interviews with 15 students to understand their perspectives on digital phenotyping through app prototypes. They found that the students’ core concern was whether university staff whom they knew had access to the data. They also found that students’ acceptability of such apps depends on the perceived relevancy of the data collected and the effects on students’ devices. The study by Nepal et al [40] with employees reported a similar privacy concern of whether the employees’ data would be leaked to their boss; if the boss is aware of a potential mental health issue, it may impact the employee’s work performance ratings.

The methods of collecting and storing passive sensing data also present privacy concerns [28,70,72], particularly when the tracked data involve sensitive topics, such as mental health [72]. Sensors that infer individuals’ social interactions provide insights into their mental health status [26,36,53]. However, these types of data were less likely to be shared by participants because of privacy concerns. In the study by Rooksby et al [28], students identified camera, microphone, call log, and keyboard data as highly unacceptable types of data to capture.

Location data were associated with privacy and security concerns. In the study by Wen et al [34], participants felt uncomfortable with location tracking because it might breach their privacy and were hesitant to log their location when they moved from one place to another. Some studies excluded specific sensors to protect participants’ privacy. Location data were not recorded owing to security concerns, even though they could provide valuable insights into the mental state [36,38]. In the study by Adams et al [26], the microphone was disabled so that it would not capture calls and conversations while individuals were talking to their family members. Another ethical concern was regarding the misuse of data. The main focus in studies of digital phenotyping using smartphones was on tracking participants’ usual behavioral patterns and identifying whether they behaved unusually. There were concerns regarding secondary uses. For example, participants’ leaked data could be used for advertising purposes or to create content [34,41].

Limitations to the Research

Coping mechanisms related to stress and anxiety vary among individuals [22]. Individual differences can make it challenging to label individuals as stressed, anxious, or depressed, particularly nonclinical participants. Certain behavioral patterns can be generally expected; however, not all individuals will follow the same pattern. To make generalizable and powerful analyses and understand behavioral patterns associated with mild mental health concerns, it is recommended to study diverse groups for longer than a 2-week period. Of the 40 included studies, 2 (5%) [33,39] focused on a particular demographic subset, namely, undergraduate students; therefore, the generalizability of their findings is limited. In the studies by Rooksby et al [28], Exposito et al [29], and Wang et al [50], limited variation in representation was seen as a major limitation. The studies by Rhim et al [49], DaSilva et al [39], and Fukazawa et al [36] stressed the importance of selecting a wider age group, as younger people use their smartphones proactively, whereas older people’s behavioral patterns might show differences when they are experiencing mild mental health symptoms. The study by Nepal et al [40] suggested that diverse population testing is required for more reliable results, considering interindividual differences. Furthermore, the accuracy and effectiveness of machine learning models are highly affected by data set quality. We noticed that over the last 4 years [38,46,57,60], there has been an increased focus on the generalizability of machine learning models, with the goal of assessing generalizability across students from various years, classes, and institutions.

Technical Challenges

Digital phenotyping studies on mild mental health symptoms with nonclinical participants presented technical challenges. A main concern was the accuracy of the sensor data collected from smartphones. The study by Fukazawa et al [36] sought to understand time spent in darkness and its effect on the relationship between stress and anxiety patterns and sleep. However, when individuals carried their smartphone in a pocket or bag, the smartphone could not sense the darkness of the environment. This presented a challenge because illuminance data were captured even when the phone was not being used actively. Similar concerns were raised in the study by Di Matteo et al [52]: the time-spent-in-darkness feature did not distinguish whether the device was in a dark room or simply in a dark location (eg, a pocket). The study by Melcher et al [35] stated that the captured accelerometer data may not accurately represent daily activity, as not all participants carried their phones throughout the day. In the study by Di Matteo et al [32], environmental audio did not produce clear transcripts in louder environments. That study also noted that transcripts were produced using dictionary-based methods, so complex language such as metaphors and sarcasm was not analyzed; therefore, the content of a conversation might not be interpreted correctly. Similar challenges were identified in the study by Di Matteo et al [52], as the speech data captured by smartphones were often unclear: participants’ recorded voices were masked by those of the people around them or by sound from other sources, such as a television or radio. Moreover, it was not possible to identify whether death-related words came from the participants or from the people they interacted with.
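
One possible mitigation for the pocket-darkness ambiguity, not taken from the reviewed studies, is to restrict illuminance analysis to periods when the phone is plausibly out of a pocket, for example, when the screen is on. The following minimal Python sketch illustrates this heuristic; the data structures and the 10-lx darkness threshold are illustrative assumptions.

# Minimal sketch: keep only illuminance samples taken while the screen was on,
# as a crude proxy for "phone not in a pocket or bag" (illustrative assumption)

# Hypothetical data: (unix_timestamp, lux) samples and (start, end) screen-on intervals
lux_samples = [(1000, 0.0), (1030, 250.0), (1100, 0.0), (1200, 5.0)]
screen_on_intervals = [(1020, 1060), (1190, 1260)]

def screen_was_on(ts, intervals):
    """Return True if the timestamp falls inside any screen-on interval."""
    return any(start <= ts <= end for start, end in intervals)

# Treat only samples captured during active phone use as valid ambient-light
# measurements; discard the rest as ambiguous (dark room vs dark pocket)
valid_lux = [(ts, lux) for ts, lux in lux_samples if screen_was_on(ts, screen_on_intervals)]
dark_while_in_use = [(ts, lux) for ts, lux in valid_lux if lux < 10.0]  # 10 lx: assumed darkness threshold

print("valid samples:", valid_lux)
print("dark while phone in use:", dark_while_in_use)

Such filtering trades coverage for accuracy: fewer samples remain, but each is more likely to reflect the actual ambient environment.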

Another technical challenge identified was battery life [47]. As expected, moment-by-moment data collection requires substantial power, which can shorten battery life. Participants had to charge their phones more often, which was inconvenient and altered their usual behavior because they could not carry their phones as usual while the phones were charging. The study by Chikersal et al [47] mentioned another technical limitation: when the app stopped working unexpectedly, data were neither collected nor transferred during those periods. Finally, with the increasing use of 5G technology, Wi-Fi data for indoor locations may become less relevant; in the study by Zakaria et al [38], some users relied on 5G indoors rather than Wi-Fi, which may point to a future trend. We now turn to the discussion.

Discussion

Principal Findings

This literature review examined digital phenotyping studies that detected and predicted stress, anxiety, and depression in their mild states in nonclinical populations using data collected from smartphones. The primary objective of digital phenotyping in the context of mild mental health was similar across the 3 participant cohorts: students, adults, and employees. However, notable distinctions emerged among these groups. Among university students, the geographical proximity and relevance of the university campus were influential factors, as were academic activities such as coursework and studying. Among employees, work-related factors and the workplace environment were more salient. The remaining studies involved general adult populations that were not affiliated with a particular organization. Overall, we found that identifying behavioral abnormalities related to stress and anxiety was possible but raised certain challenges. Generalized stress and anxiety symptoms vary widely among individuals, whereas serious diagnoses, such as bipolar disorder or schizophrenia, have well-documented behavioral changes. Sleep was a strong predictor variable, yet some individuals tended to sleep more when stressed, whereas others lacked sleep under stress. This may be one reason why fewer studies and reviews have been completed on stress and anxiety compared with serious conditions such as bipolar disorder, severe depression, and schizophrenia. Another reason is that clinical psychologists and psychiatrists who are familiar with clinical populations are leading much of the digital phenotyping research.

Studies tended to use self-report to categorize nonclinical populations as stressed, depressed, or anxious, and it was not always clear whether the identified patterns in the passive sensor data would effectively discriminate among groups. Most studies used prestudy and poststudy surveys to identify participants’ mental state, and concerns were raised regarding the accuracy of this self-report categorization. For instance, the study by Sefidgar et al [53] stated that students with stress may not report themselves as very stressed. Melcher et al [70] conducted a review and found that students were concerned about their professors learning about their data [71]. Thus, the accuracy of self-report remains an issue for passive sensing studies that use self-report labels, especially when there are privacy concerns. This may also be related to the high dropout rates in the studies.

Many types of sensor data were used in the reviewed studies, but few articles related sensor patterns to specific symptoms validated by relevant psychological evidence. One of the studies [46] used association rule mining to extract interpretable rules (eg, intermittent sleep episodes, the number of bouts of sleep, or the number of outgoing calls during weekends) that distinguished the behavioral patterns of students who were depressed from those of students who were not. However, although behavioral patterns were identified, they were not validated as exclusive to the targeted mental health issue; for example, high mobility and physical activity do not necessarily mean that a person is not stressed. In the study by Tseng et al [45], students were more mobile during the examination week, despite being under high pressure and stress; in the same study, some students were less mobile when studying for their examinations, which cannot necessarily be interpreted as a sign of stress. Of the 40 included studies, 4 (10%) [35,58,59,61] explored the effects of the COVID-19 pandemic on behavior and mental health. Additional recent investigations, which independently gathered their own data sets during the COVID-19 pandemic, have shown that quarantine measures influenced individual behavioral patterns. To make precise predictions in digital phenotyping, it is imperative to consider contextual and environmental factors.
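
For readers unfamiliar with the technique, the following minimal Python sketch shows brute-force mining of simple association rules (support and confidence) over binarized behavioral features; the features, data, and thresholds are hypothetical, and this is not the pipeline used in the cited study [46].

# Minimal sketch of mining simple association rules over binarized behavioral
# features (brute force; illustrative only, not the pipeline from [46])
from itertools import combinations

# Hypothetical person-days encoded as sets of observed behavioral "items"
days = [
    {"intermittent_sleep", "few_locations", "depressed"},
    {"intermittent_sleep", "few_locations", "depressed"},
    {"weekend_outgoing_calls"},
    {"few_locations", "weekend_outgoing_calls"},
    {"intermittent_sleep", "few_locations", "depressed"},
    {"weekend_outgoing_calls"},
]

def support(itemset):
    """Fraction of person-days containing every item in the itemset."""
    return sum(itemset <= day for day in days) / len(days)

features = {"intermittent_sleep", "few_locations", "weekend_outgoing_calls"}
label = "depressed"

# Rules of the form {behavioral pattern} -> depressed are kept if they are
# frequent enough (support) and reliable enough (confidence)
for k in (1, 2):
    for antecedent in combinations(sorted(features), k):
        a = frozenset(antecedent)
        supp_a = support(a)
        if supp_a == 0:
            continue
        supp_rule = support(a | {label})
        confidence = supp_rule / supp_a
        if supp_rule >= 0.3 and confidence >= 0.8:
            print(f"{set(a)} -> {label}  support={supp_rule:.2f}  confidence={confidence:.2f}")

In practice, studies typically use an efficient algorithm such as Apriori rather than the brute-force enumeration shown here, but the resulting rules are interpreted in the same way.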

Privacy and secondary data uses were the main concerns identified for digital phenotyping. Individuals using digital phenotyping systems have the right to provide informed consent: they should be made aware of how their data will be used, who will have access to the data, and where and for how long the data will be stored, and they have the right to decline to participate. We urge researchers and medical practitioners to carefully consider system design and requirements because data transferred to the cloud and other services may fall under various service agreements. To empower end users and improve the quality of digital phenotyping systems, we recommend that transparent algorithms and explainable artificial intelligence be combined with user-accessible, understandable displays so that adults can engage in the process of identifying and categorizing patterns related to mild mental health symptoms.

The digital phenotyping research focused on in this review may enable the design of tailored intervention programs for nonclinical participants who are showing symptoms of stress, anxiety, and mild depression. Most of the studies included in this review were conducted within a restricted timeline and limited scope of detection and prediction. Only 4 (10%) of the 40 studies mentioned potential intervention programs upon predicting stress, anxiety, and mild depression [31,38,47,53].

Our review has some limitations. We excluded studies conducted with teenagers, children, and adults who were clinically diagnosed; thus, we missed studies that focused on the detection and prediction of stress, anxiety, and mild depression in these populations, which are likely to show different patterns from adults who are not clinically diagnosed. Further, we excluded studies conducted using technologies other than smartphones; we chose this narrower scope to focus the findings on widely available technology. The availability of technologies is changing rapidly, and wearables such as smartwatches are becoming more common. As wearable technologies become ubiquitous, we recommend including them in future systematic reviews.

This literature review is unique in that it examines studies focused on the behavioral patterns of nonclinical populations, namely students, employees, and adults who are stressed, anxious, or mildly depressed. We examined each type of sensor and indicated when it was significantly associated with mild mental health symptoms. We identified commonalities in the studies in terms of ethical challenges, limitations to the research, and technical challenges.

Conclusions

This systematic literature review found that digital phenotyping can be an effective way of identifying certain behavioral patterns related to stress, anxiety, and mild depression. A range of passive sensors was used in the studies, including GPS, Bluetooth, accelerometers, microphones (ambient audio), illuminance (light) sensors, and Wi-Fi. We found that location, physical activity, and social interaction data were strongly related to participants’ mental health and well-being. The surveyed literature discussed ethical and technical challenges that limit the accuracy and generalizability of results. One of the greatest challenges was privacy, with concerns primarily related to camera, location, SMS text message, and call log data. Another challenge was the substantial variation among individuals and their unique behaviors related to mental health. Finally, technical limitations have not been fully resolved; for example, the illuminance sensor continued to capture data while the phone was not in active use, reducing the accuracy of the collected data. It is hoped that this overview of digital phenotyping and mental health studies conducted in the last decade, including their common privacy and technical concerns, can help move this area of research forward, ultimately improving the quality of passive sensing and supporting the early detection of mild mental health phenomena.

Acknowledgments

The authors would like to thank the School of Computer Science at the University of Auckland for their financial support.

Conflicts of Interest

None declared.

Multimedia Appendix 1

The PRISMA 2020 checklist.

DOCX File , 31 KB

  1. Onnela JP, Rauch SL. Harnessing smartphone-based digital phenotyping to enhance behavioral and mental health. Neuropsychopharmacology. Jun 2016;41(7):1691-1696. [FREE Full text] [CrossRef] [Medline]
  2. Houts CR, Patrick-Lake B, Clay I, Wirth RJ. The path forward for digital measures: suppressing the desire to compare apples and pineapples. Digit Biomark. Nov 26, 2020;4(Suppl 1):3-12. [FREE Full text] [CrossRef] [Medline]
  3. Huckvale K, Venkatesh S, Christensen H. Toward clinical digital phenotyping: a timely opportunity to consider purpose, quality, and safety. NPJ Digit Med. Sep 06, 2019;2:88. [FREE Full text] [CrossRef] [Medline]
  4. Berry JD, Paganoni S, Carlson K, Burke K, Weber H, Staples P, et al. Design and results of a smartphone-based digital phenotyping study to quantify ALS progression. Ann Clin Transl Neurol. Apr 3, 2019;6(5):873-881. [FREE Full text] [CrossRef] [Medline]
  5. Trifan A, Oliveira M, Oliveira JL. Passive sensing of health outcomes through smartphones: systematic review of current solutions and possible limitations. JMIR Mhealth Uhealth. Aug 23, 2019;7(8):e12649. [FREE Full text] [CrossRef] [Medline]
  6. Torous J, Gershon A, Hays R, Onnela JP, Baker JT. Digital phenotyping for the busy psychiatrist: clinical implications and relevance. Psychiatr Ann. 2019;49(5):196-201. [FREE Full text] [CrossRef]
  7. Maharjan SM, Poudyal A, van Heerden A, Byanjankar P, Thapa A, Islam C, et al. Passive sensing on mobile devices to improve mental health services with adolescent and young mothers in low-resource settings: the role of families in feasibility and acceptability. BMC Med Inform Decis Mak. Apr 07, 2021;21(1):117. [FREE Full text] [CrossRef] [Medline]
  8. Adler DA, Ben-Zeev D, Tseng VW, Kane JM, Brian RE, Campbell AT, et al. Predicting early warning signs of psychotic relapse from passive sensing data: an approach using encoder-decoder neural networks. JMIR Mhealth Uhealth. Aug 31, 2020;8(8):e19962. [FREE Full text] [CrossRef] [Medline]
  9. Potier R. The digital phenotyping project: a psychoanalytical and network theory perspective. Front Psychol. Jul 15, 2020;11:1218. [FREE Full text] [CrossRef] [Medline]
  10. Kokel D, Rennekamp AJ, Shah AH, Liebel U, Peterson RT. Behavioral barcoding in the cloud: embracing data-intensive digital phenotyping in neuropharmacology. Trends Biotechnol. Aug 2012;30(8):421-425. [FREE Full text] [CrossRef] [Medline]
  11. Teo JX, Davila S, Yang C, Hii AA, Pua CJ, Yap J, et al. Digital phenotyping by consumer wearables identifies sleep-associated markers of cardiovascular disease risk and biological aging. Commun Biol. Oct 4, 2019;2:361. [FREE Full text] [CrossRef] [Medline]
  12. Fagherazzi G. Deep digital phenotyping and digital twins for precision health: time to dig deeper. J Med Internet Res. Mar 03, 2020;22(3):e16770. [FREE Full text] [CrossRef] [Medline]
  13. Mercier HW, Hamner JW, Torous J, Onnela JP, Taylor JA. Digital phenotyping to quantify psychosocial well-being trajectories after spinal cord injury. Am J Phys Med Rehabil. Dec 2020;99(12):1138-1144. [FREE Full text] [CrossRef] [Medline]
  14. Rodriguez-Villa E, Rauseo-Ricupero N, Camacho E, Wisniewski H, Keshavan M, Torous J. The digital clinic: implementing technology and augmenting care for mental health. Gen Hosp Psychiatry. 2020;66:59-66. [FREE Full text] [CrossRef] [Medline]
  15. Barnett I, Torous J, Staples P, Sandoval L, Keshavan M, Onnela JP. Relapse prediction in schizophrenia through digital phenotyping: a pilot study. Neuropsychopharmacology. Jul 2018;43(8):1660-1666. [FREE Full text] [CrossRef] [Medline]
  16. Orsolini L, Fiorani M, Volpe U. Digital phenotyping in bipolar disorder: which integration with clinical endophenotypes and biomarkers? Int J Mol Sci. Oct 16, 2020;21(20):7684. [FREE Full text] [CrossRef] [Medline]
  17. Kleiman EM, Turner BJ, Fedor S, Beale EE, Picard RW, Huffman JC, et al. Digital phenotyping of suicidal thoughts. Depress Anxiety. Jul 2018;35(7):601-608. [FREE Full text] [CrossRef] [Medline]
  18. Onnela JP. Opportunities and challenges in the collection and analysis of digital phenotyping data. Neuropsychopharmacology. Jan 2021;46(1):45-54. [FREE Full text] [CrossRef] [Medline]
  19. Lardier DT, Lee CY, Rodas JM, Garcia-Reid P, Reid RJ. The effect of perceived college-related stress on depression, life satisfaction, and school satisfaction: the coping strategies of hispanic college students from a hispanic serving institution. Educ Urban Soc. Jan 02, 2020;52(8):1204-1222. [CrossRef]
  20. Rezaei A, Mousanezhad Jeddi E. Relationship between wisdom, perceived control of internal states, perceived stress, social intelligence, information processing styles and life satisfaction among college students. Curr Psychol. Feb 17, 2018;39:927-933. [FREE Full text] [CrossRef]
  21. Futo J. Dealing with mental health issues on campus starts with early recognition and intervention. Campus Law Enforc J. Jun 2011;41(3):22. [FREE Full text]
  22. Jonason PK, Talbot D, Cunningham ML, Chonody J. Higher-order coping strategies: who uses them and what outcomes are linked to them. Pers Individ Differ. Mar 1, 2020;155:109755. [FREE Full text] [CrossRef]
  23. Jorm AF. Mental health literacy: empowering the community to take action for better mental health. Am Psychol. Apr 2012;67(3):231-243. [FREE Full text] [CrossRef] [Medline]
  24. Number of smartphone mobile network subscriptions worldwide from 2016 to 2022, with forecasts from 2023 to 2028. Statista. URL: https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/ [accessed 2023-11-04]
  25. PRISMA flow diagram. PRISMA. URL: http://www.prisma-statement.org/PRISMAStatement/FlowDiagram [accessed 2023-11-04]
  26. Adams P, Rabbi M, Rahman T, Matthews M, Voida A, Gay G, et al. Towards personal stress informatics: comparing minimally invasive techniques for measuring daily stress in the wild. In: Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare. 2014. Presented at: 8th International Conference on Pervasive Computing Technologies for Healthcare; May 20-23, 2014; Oldenburg, Germany. [CrossRef]
  27. Servia-Rodríguez S, Rachuri KK, Mascolo C, Rentfrow PJ, Lathia N, Sandstrom GM. Mobile sensing at the service of mental well-being: a large-scale longitudinal study. In: Proceedings of the 26th International Conference on World Wide Web. 2017. Presented at: WWW '17: 26th International World Wide Web Conference; April 3-7, 2017; Perth, Australia. [CrossRef]
  28. Rooksby J, Morrison A, Murray-Rust D. Student perspectives on digital phenotyping: the acceptability of using smartphone data to assess mental health. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 2019. Presented at: CHI '19: CHI Conference on Human Factors in Computing Systems; May 4-9, 2019; Glasgow, UK. [CrossRef]
  29. Exposito M, Hernandez J, Picard RW. Affective keys: towards unobtrusive stress sensing of smartphone users. In: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. 2018. Presented at: MobileHCI '18: 20th International Conference on Human-Computer Interaction with Mobile Devices and Services; September 3-6, 2018; Barcelona, Spain. [CrossRef]
  30. Cai L, Boukhechba M, Wu C, Chow PI, Teachman BA, Barnes LE, et al. State affect recognition using smartphone sensing data. In: Proceedings of the 2018 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies. 2018. Presented at: CHASE '18: ACM/IEEE International Conference on Connected Health: Applications, Systems and Engineering Technologies; September 26-28, 2018; Washington, DC. [CrossRef]
  31. Boukhechba M, Cai L, Chow PI, Fua K, Gerber MS, Teachman BA, et al. Contextual analysis to understand compliance with smartphone-based ecological momentary assessment. In: Proceedings of the 12th EAI International Conference on Pervasive Computing Technologies for Healthcare. 2018. Presented at: PervasiveHealth '18: 12th EAI International Conference on Pervasive Computing Technologies for Healthcare; May 21-24, 2018; New York, NY. [CrossRef]
  32. Di Matteo D, Wang W, Fotinos K, Lokuge S, Yu J, Sternat T, et al. Smartphone-detected ambient speech and self-reported measures of anxiety and depression: exploratory observational study. JMIR Form Res. Jan 29, 2021;5(1):e22723. [FREE Full text] [CrossRef] [Medline]
  33. Jacobson NC, Summers B, Wilhelm S. Digital biomarkers of social anxiety severity: digital phenotyping using passive smartphone sensors. J Med Internet Res. May 29, 2020;22(5):e16875. [FREE Full text] [CrossRef] [Medline]
  34. Wen H, Sobolev M, Vitale R, Kizer J, Pollak JP, Muench F, et al. mPulse mobile sensing model for passive detection of impulsive behavior: exploratory prediction study. JMIR Ment Health. Jan 27, 2021;8(1):e25019. [FREE Full text] [CrossRef] [Medline]
  35. Melcher J, Lavoie J, Hays R, D'Mello R, Rauseo-Ricupero N, Camacho E, et al. Digital phenotyping of student mental health during COVID-19: an observational study of 100 college students. J Am Coll Health. Apr 2023;71(3):736-748. [FREE Full text] [CrossRef] [Medline]
  36. Fukazawa Y, Ito T, Okimura T, Yamashita Y, Maeda T, Ota J. Predicting anxiety state using smartphone-based passive sensing. J Biomed Inform. May 2019;93:103151. [FREE Full text] [CrossRef] [Medline]
  37. Rashid H, Mendu S, Daniel KE, Beltzer ML, Teachman BA, Boukhechba M, et al. Predicting subjective measures of social anxiety from sparsely collected mobile sensor data. Proc ACM Interact Mob Wearable Ubiquitous Technol. Sep 04, 2020;4(3):1-24. [FREE Full text] [CrossRef]
  38. Zakaria C, Balan R, Lee Y. StressMon: scalable detection of perceived stress and depression using passive sensing of changes in work routines and group interactions. Proc ACM Hum Comput Interact. Nov 07, 2019;3(CSCW):1-29. [FREE Full text] [CrossRef]
  39. DaSilva AW, Huckins JF, Wang R, Wang W, Wagner DD, Campbell AT. Correlates of stress in the college environment uncovered by the application of penalized generalized estimating equations to mobile sensing data. JMIR Mhealth Uhealth. Mar 19, 2019;7(3):e12084. [FREE Full text] [CrossRef] [Medline]
  40. Nepal S, Mirjafari S, Martinez GJ, Audia P, Striegel A, Campbell AT. Detecting job promotion in information workers using mobile sensing. Proc ACM Interact Mob Wearable Ubiquitous Technol. Sep 04, 2020;4(3):1-28. [FREE Full text] [CrossRef]
  41. Saha K, Reddy MD, Mattingly S, Moskal E, Sirigiri A, De Choudhury M. LibRA: on LinkedIn based role ambiguity and its relationship with wellbeing and job performance. Proc ACM Hum Comput Interact. Nov 07, 2019;3(CSCW):1-30. [FREE Full text] [CrossRef]
  42. Morshed MB, Saha K, Li R, D'Mello SK, De Choudhury M, Abowd GD, et al. Prediction of mood instability with passive sensing. Proc ACM Interact Mob Wearable Ubiquitous Technol. Sep 09, 2019;3(3):1-21. [FREE Full text] [CrossRef]
  43. Acikmese Y, Alptekin SE. Prediction of stress levels with LSTM and passive mobile sensors. Procedia Comput Sci. 2019;159:658-667. [FREE Full text] [CrossRef]
  44. Boukhechba M, Huang Y, Chow P, Fua K, Teachman BA, Barnes LE. Monitoring social anxiety from mobility and communication patterns. In: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers. 2017. Presented at: UbiComp '17: The 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing; September 11-15, 2017; Maui, Hawaii. [CrossRef]
  45. Tseng VW, Merrill M, Wittleder F, Abdullah S, Aung MH, Choudhury T. Assessing mental health issues on college campuses: preliminary findings from a pilot study. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct. 2016. Presented at: UbiComp '16: The 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing; September 12-16, 2016; Heidelberg, Germany. [CrossRef]
  46. Xu X, Chikersal P, Doryab A, Villalba DK, Dutcher JM, Tumminia MJ, et al. Leveraging routine behavior and contextually-filtered features for depression detection among college students. Proc ACM Interact Mob Wearable Ubiquitous Technol. Sep 09, 2019;3(3):1-33. [FREE Full text] [CrossRef]
  47. Chikersal P, Doryab A, Tumminia M, Villalba DK, Dutcher JM, Liu X, et al. Detecting depression and predicting its onset using longitudinal symptoms captured by passive sensing: a machine learning approach with robust feature selection. ACM Trans Comput Hum Interact. Jan 20, 2021;28(1):1-41. [FREE Full text] [CrossRef]
  48. Meyerhoff J, Liu T, Kording KP, Ungar LH, Kaiser SM, Karr CJ, et al. Evaluation of changes in depression, anxiety, and social anxiety using smartphone sensor features: longitudinal cohort study. J Med Internet Res. Sep 03, 2021;23(9):e22844. [FREE Full text] [CrossRef] [Medline]
  49. Rhim S, Lee U, Han K. Tracking and modeling subjective well-being using smartphone-based digital phenotype. In: Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization. 2020. Presented at: UMAP '20; July 14-17, 2020; Genoa, Italy. [CrossRef]
  50. Wang R, Wang W, daSilva A, Huckins JF, Kelley WM, Heatherton TF, et al. Tracking depression dynamics in college students using mobile phone and wearable sensing. Proc ACM Interact Mob Wearable Ubiquitous Technol. Mar 26, 2018;2(1):1-26. [FREE Full text] [CrossRef]
  51. Currey D, Torous J. Digital phenotyping correlations in larger mental health samples: analysis and replication. BJPsych Open. Jun 03, 2022;8(4):e106. [FREE Full text] [CrossRef] [Medline]
  52. Di Matteo D, Fotinos K, Lokuge S, Mason G, Sternat T, Katzman MA, et al. Automated screening for social anxiety, generalized anxiety, and depression from objective smartphone-collected data: cross-sectional study. J Med Internet Res. Aug 13, 2021;23(8):e28918. [FREE Full text] [CrossRef] [Medline]
  53. Sefidgar YS, Seo W, Kuehn KS, Althoff T, Browning A, Riskin E, et al. Passively-sensed behavioral correlates of discrimination events in college students. Proc ACM Hum Comput Interact. Nov 2019;3(CSCW):1-29. [FREE Full text] [CrossRef] [Medline]
  54. Mendu S, Baglione A, Baee S, Wu C, Ng B, Shaked A, et al. A framework for understanding the relationship between social media discourse and mental health. Proc ACM Hum Comput Interact. Oct 15, 2020;4(CSCW2):1-23. [FREE Full text] [CrossRef]
  55. Pratap A, Anguera JA, Renn BN, Neto EC, Volponi J, Mooney SD, et al. The feasibility of using smartphones to assess and remediate depression in Hispanic/Latino individuals nationally. In: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers. 2017. Presented at: UbiComp '17: The 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing; September 11-15, 2017; Maui, Hawaii. [CrossRef]
  56. Mirjafari S, Masaba K, Grover T, Wang W, Audia P, Campbell AT, et al. Differentiating higher and lower job performers in the workplace using mobile sensing. Proc ACM Interact Mob Wearable Ubiquitous Technol. Jun 21, 2019;3(2):1-24. [FREE Full text] [CrossRef]
  57. Currey D, Hays R, Torous J. Digital phenotyping models of symptom improvement in college mental health: generalizability across two cohorts. J Technol Behav Sci. Mar 02, 2023:1-14. [FREE Full text] [CrossRef] [Medline]
  58. Huckins JF, daSilva AW, Wang W, Hedlund E, Rogers C, Nepal SK, et al. Mental health and behavior of college students during the early phases of the COVID-19 pandemic: longitudinal smartphone and ecological momentary assessment study. J Med Internet Res. Jun 17, 2020;22(6):e20185. [FREE Full text] [CrossRef] [Medline]
  59. Mack DL, DaSilva AW, Rogers C, Hedlund E, Murphy EI, Vojdanovski V, et al. Mental health and behavior of college students during the COVID-19 pandemic: longitudinal mobile smartphone and ecological momentary assessment study, part II. J Med Internet Res. Jun 04, 2021;23(6):e28892. [FREE Full text] [CrossRef] [Medline]
  60. Xu X, Liu X, Zhang H, Wang W, Nepal S, Sefidgar Y, et al. GLOBEM: cross-dataset generalization of longitudinal human behavior modeling. Proc ACM Interact Mob Wearable Ubiquitous Technol. Jan 11, 2023;6(4):1-34. [FREE Full text] [CrossRef]
  61. Nepal S, Wang W, Vojdanovski V, Huckins JF, daSilva A, Meyer M, et al. COVID student study: a year in the life of college students during the COVID-19 pandemic through the lens of mobile phone sensing. In: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 2022. Presented at: CHI '22: CHI Conference on Human Factors in Computing Systems; April 29-May 5, 2022; New Orleans, LA. [CrossRef]
  62. Wang R, Chen F, Chen Z, Li T, Tignor S, Zhou X, et al. StudentLife: assessing mental health, academic performance and behavioral trends of college students using smartphones. In: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing. 2014. Presented at: UbiComp '14: The 2014 ACM Conference on Ubiquitous Computing; September 13-17, 2014; Seattle, WA. [CrossRef]
  63. Mattingly SM, Gregg JM, Audia P, Bayraktaroglu AE, Campbell AT, Chawla NV, et al. The tesserae project: large-scale, longitudinal, in situ, multimodal sensing of information workers. In: Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. 2019. Presented at: CHI '19: CHI Conference on Human Factors in Computing Systems; May 4-9, 2019; Glasgow, UK. [CrossRef]
  64. Teles A, Barros F, Rodrigues I, Barbosa A, Silva F, Coutinho L, et al. Internet of things applied to mental health: concepts, applications, and perspectives. In: Gupta N, Paiva S, editors. IoT and ICT for Healthcare Applications. Cham, Switzerland. Springer; Aug 13, 2020.
  65. Jin J, Gao B, Yang S, Zhao B, Luo L, Woo WL. Attention-block deep learning based features fusion in wearable social sensor for mental wellbeing evaluations. IEEE Access. May 13, 2020;8:89258-89268. [FREE Full text] [CrossRef]
  66. Nyqvist F, Forsman AK, Giuntoli G, Cattan M. Social capital as a resource for mental well-being in older people: a systematic review. Aging Ment Health. 2013;17(4):394-410. [FREE Full text] [CrossRef] [Medline]
  67. Diagnostic And Statistical Manual Of Mental Disorders, Fifth Edition, Text Revision (DSM-5-TR). Washington, DC. American Psychiatric Association; 2013.
  68. Park SC, Kim D. The centrality of depression and anxiety symptoms in major depressive disorder determined using a network analysis. J Affect Disord. Jun 15, 2020;271:19-26. [FREE Full text] [CrossRef] [Medline]
  69. Peltz JS, Bodenlos JS, Kingery JN, Rogge RD. The role of financial strain in college students' work hours, sleep, and mental health. J Am Coll Health. 2021;69(6):577-584. [FREE Full text] [CrossRef] [Medline]
  70. Melcher J, Hays R, Torous J. Digital phenotyping for mental health of college students: a clinical review. Evid Based Ment Health. Nov 2020;23(4):161-166. [FREE Full text] [CrossRef] [Medline]
  71. Cosgrove L, Karter JM, McGinley M, Morrill Z. Digital phenotyping and digital psychotropic drugs: mental health surveillance tools that threaten human rights. Health Hum Rights. Dec 2020;22(2):33-39. [FREE Full text] [Medline]
  72. Zacher H, Rudolph CW. Individual differences and changes in subjective wellbeing during the early stages of the COVID-19 pandemic. Am Psychol. Jan 2021;76(1):50-62. [FREE Full text] [CrossRef] [Medline]


PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses


Edited by L Buis; submitted 01.07.22; peer-reviewed by J Rooksby, KS Sahu, A Joseph; comments to author 09.08.22; revised version received 03.10.22; accepted 27.09.23; published 23.05.24.

Copyright

©Adrien Choi, Aysel Ooi, Danielle Lottridge. Originally published in JMIR mHealth and uHealth (https://mhealth.jmir.org), 23.05.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on https://mhealth.jmir.org/, as well as this copyright and license information must be included.