Review History

All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.


  • The initial submission of this article was received on December 20th, 2014 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on January 23rd, 2015.
  • The first revision was submitted on June 17th, 2015 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on June 19th, 2015.

Version 0.2 (accepted)

· Jun 19, 2015 · Academic Editor


Thank you for your re-submitted article and comments addressing the concerns of the reviewers. I am pleased to accept your article for publication in PeerJ.

Version 0.1 (original submission)

· Jan 23, 2015 · Academic Editor

Minor Revisions

Thank you for your submission, which the reviewers and I have found interesting. Please address each of the issues raised in your resubmission.


Basic reporting

In general, the structure conforms to PeerJ templates. Tables are relevant to the narrative.
Though part of a larger study, this manuscript represents an appropriate unit of publication.

Specific comments:

Abstract: Recommend adding detail to Methods.

Background: Provides adequate and informative background information.

Methods: Provides adequate detail in this section.

Results: As it reads currently, most findings and themes are in narrative form; however, the theme "current uses of information technologies for health-related purposes" has a summary of findings illustrated in Table 2. It seems this should be done throughout to tabulate and summarize findings related to the four primary areas listed on page 6 (lines 131-135). For example, it seems the topics overviewed on page 10 (lines 216-219) lend themselves to being tabulated. Furthermore, it would be beneficial to expand Table 2 to include exemplar quotes that relate to the topic and findings summary.

In Table 1, the alignment of the age row appears to be off relative to the other rows.

When reporting participant quotes, it is acceptable to delete nuances in spoken language to provide clear writing, e.g., delete "Um" from quotes (e.g., line 326).

Discussion and conclusion: Well written and provides a thorough review of findings.

Experimental design

This manuscript adequately describes original primary research.
Though the authors state, "we sought to understand homeless veterans’ access to and use of information technologies, and whether using these technologies to communicate with health care providers would be acceptable to them," the introduction could be improved with a clearly stated research question.

Methods are clearly described in the narrative, but could be more detailed in the abstract to improve readership.

Though the authors indicate the IRB # for this study, I recommend adding a sentence about the ethical standards by which they conduct their research (e.g., APA).

Validity of the findings

As this is a qualitative study, the concepts of robust, statistically sound, and controlled are not relevant; however, the authors can support the validity of their qualitative research by indicating the level of conceptual saturation, data richness, and meaningfulness of the findings.

Guidelines suggest, "The conclusions should be appropriately stated, should be connected to the original question investigated, and should be limited to those supported by the results." This can be done effectively with the addition of a clearly stated research question.


Basic reporting

The submission adheres to PeerJ policies, is very well-written, and provides an introduction/literature review to a self-contained study. I have a few additional comments about the introduction section of this manuscript:

1. Rather than 2012 data, it would be timely to use newer statistics from the 2014 PIT count: in January 2014, 49,933 Veterans were homeless, reflecting a substantial decrease over the last several years.

2. When describing substance use disorders, consider using DSM5 terms, e.g., “cocaine use disorder” as opposed to “cocaine use.”

3. In your literature review, is there any information within Veteran populations (presumably non-homeless) specifically that you can cite? Why is it important specifically to study the homeless Veteran (as opposed to all homeless adults) here? Does the study have broad generalizability to the general homeless population?

4. When describing the utility of HIT in non-homeless populations, the paper might benefit from using examples that are relevant to the conditions described in the first paragraph as having high prevalence in homeless Veterans. For example, increased vaccination rates could be indirectly relevant, e.g., flu shots, but do not tie into the type of conditions mentioned above.

Experimental design

The submission reflects primary research within the scope of PeerJ. The investigation was conducted rigorously and has IRB approval. The research question is relevant and meaningful, though in the aims (last paragraph of the introduction), it would help to understand what sort of IT access the authors seek to understand in homeless Veteran populations. Mobile phones only? Computers? Other IT tools for healthcare purposes? The manuscript does describe this in the methods, but it would be nice to understand up front what we are about to read. In addition, though the study is Veteran-specific, there is little information about how much of this work is translatable to non-Veteran populations, and about what Veteran-specific issues this study explored. Particularly for a broad audience not limited to the VA, this issue seems critical.

I have the following additional comments on the methods section:

1. Typographical error on line 50: remove the “,” after the word “explore.”

2. Participants – in your purposive sampling, how different are Veterans anticipated to be between these VA programs? Was there an effort to recruit across gender lines? Race and ethnicity considerations? Are some of these geographic locations rural vs. urban? Without this sort of information, it’s less clear to me whether recruiting across these programs alone allows for a heterogeneous group.

3. Data collection – so, is the key data for analytic purposes the systematic summarization? Or the verbatim transcripts, supplemented by the descriptive field notes? If not the transcripts, is there a reason why the summary was performed? What were the methods for “systematic summarization”?

4. Analysis – what was the inter-rater reliability?

Validity of the findings

A key comment on the findings - as currently written, it is a little hard to see why this study had to be done qualitatively. We don’t see as much rich qualitative data as I would expect for the study design, and much of the information provided could be obtained through a survey – on a much larger and more heterogeneous sample. I think this could be addressed by adjusting the results section to reflect the richness of data collected (though I do query if this is secondary to my questions about the analytic methods described above). The following are additional comments about the results:

1. The data collection does not mention any standardized tools to collect diagnostic information, yet the results include information about commonly mentioned conditions. Were these spontaneously brought up in the interview? Were participants’ medical records queried to get this information?

2. Similarly, I am not clear if the methods mentioned gathering information about sources of health services, yet the results describe a variety of clinic sites. How was this information obtained? Or, is this a general description of services available to these Veterans not obtained through study procedures – if so, I wouldn’t include this in the results section.

3. Both lines 154-156 and 165-166 discuss use of mobile phones. Can this be consolidated?

4. Any information on apps? There is a nice description of use of the phone and mention of texting, but what of applications on smartphones? Moreover, is there any information about how many of these phones were smartphones vs. traditional cellular phones?

5. I am not sure that non-VA readers will have any familiarity with My HealtheVet. Either consider not using its name, referring to the VA electronic personal health record and listing a few components, or provide a bit of detail about the system when it is first introduced.

Additional comments

This is an interesting and well-written manuscript that describes a small qualitative study of homeless Veterans in the Northeast US, exploring use of mobile phone and other IT for health-related purposes. Use of HIT in homeless populations is certainly an unexplored area that benefits from such needs assessment and the manuscript fills a gap in the literature.

I have some additional comments about the discussion/conclusion sections below:


1. It is striking that a study of homeless Veterans only, with a lot of questions about texting, is performed in the setting of the VA not approving mobile phone texting. Can you describe a bit about why this study (particularly parts about texting) is relevant in this light? For example, the VA has the home telehealth system that allows Veterans to call a 1-800 number to report on their chronic conditions. There are probably other examples as well.

2. I like the authors’ description of barriers to use of cell phones in vulnerable populations. One thing missing from the manuscript is tying the results back to the high prevalence of MH and SUD problems described in the introduction. These problems inevitably impact cell phone use and HIT use overall – can we see some discussion of this issue?

3. As this is not a VA journal, a key limitation is external validity to non-Veteran populations as well.

4. I appreciate the mention of different results that may have been obtained if street homeless Veterans were interviewed – what would the authors speculate these differences to be?


5. Half of the interview seemed to be about HIT venues other than just cell phones – can the authors describe future work surrounding use of the internet or non-mobile phone technology for future study?

6. The use of cell phones for individuals “susceptible to suicidality” is certainly possible, but I am not clear that it is justified by the manuscript and this is a much more complicated issue than the authors can easily report on from this particular study. How would you appropriately interact with an individual with high suicide risk using HIT? What do you do when suspicion is raised? How do the authors define susceptibility to suicide? I would like to see, as mentioned earlier, more discussion about how this work is relevant for a population with high rates of MH and SUD problems but this seems a bit of a leap.

7. I see the authors’ point about engagement/adherence to care having linkages to chronic disease management, but it would be helpful to bring more clarity to this linkage in the conclusion section. The introduction sets the stage for studying this important issue in a population with high rates of chronic medical/mental health conditions but this gets lost later on – it would be nice to tie the discussion and/or conclusion back to the introduction in this way.


Basic reporting

The article is well written and details the necessary steps in qualitative research. The purpose is clearly stated. The review of literature is relevant to the topic and appears complete.

Experimental design

The qualitative design is not described in the article. This appears to be a phenomenological study, but the authors should state the qualitative approach they took. The researchers sampled 30 Veterans. It is not clear how 30 was selected. The authors should indicate how/why they selected 30 Veterans (e.g., data saturation, funding, pilot study, etc.). The coding process was adequately described.

Validity of the findings

The authors list control methods for the coding and demonstrate rigor. The limitations are clearly presented.

Additional comments

This is a well-written and important article that adds to the body of research. The essential steps were clearly presented. I suggest 2 minor edits to make the paper stronger and clearer:
1. Describe the qualitative approach the researchers undertook.
2. Describe how it was determined you would stop at 30 Veterans.

A side note. The VA in San Francisco requires us to add this wording to all publications:

Note. The views expressed here are those of the authors and not necessarily the views of the Department of Veterans Affairs. This material is the result of work supported with resources and the use of facilities at the San Francisco Veterans Affairs Medical Center.

It may just be a San Francisco VA requirement, but it is something you should double-check.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.