ExPAM: Explainable personality assessment method using heterogeneous linguistic features and off-the-shelf LLMs


Abstract

Many organizations are increasingly adopting personalization techniques to enhance user satisfaction. However, current systems generally lack the ability to automatically infer and interpret individual Personality Traits (PTs), which are key drivers of user behavior. Large Language Models (LLMs) are widely used, but they remain ill-suited to reliable and explainable Personality Assessment (PA). To address this gap, we propose ExPAM, a novel Explainable Personality Assessment Method that leverages hybrid feature fusion and in-context learning with off-the-shelf LLMs to predict Big Five PTs from textual data. ExPAM explicitly grounds its predictions in interpretable linguistic patterns without requiring LLM fine-tuning. The hybrid fusion is designed to simultaneously enhance predictive performance and model interpretability in PA. Specifically, transformer-based embeddings encode local contextual information, while features extracted via the Linguistic Inquiry and Word Count (LIWC) dictionary provide complementary global and local linguistic indicators of PTs. These interpretable feature patterns are incorporated into prompts that guide the LLM to generate both PT predictions and human-understandable explanations. Evaluated on the ChaLearn First Impressions v2 corpus, ExPAM outperforms models relying on either feature type alone, achieving a mean accuracy (mACC) of 0.891 and a Concordance Correlation Coefficient (CCC) of 0.333. Moreover, prompting the LLM with hybrid global-local patterns yields a relative CCC improvement of 9.6%. A qualitative interpretability analysis reveals trait-specific linguistic patterns, offering valuable insights for psychological research, computational linguistics, and paralinguistic studies. The proposed method thus advances both accuracy and transparency in PA, with promising applications in psychological profiling, personnel selection, and personalized recommendation systems.
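The feature-grounded prompting idea described above can be sketched in simplified form. The toy lexicon, category names, feature function, and prompt template below are illustrative assumptions only: the actual ExPAM pipeline, the licensed LIWC dictionary, and the transformer embeddings are not reproduced here.

```python
# Minimal sketch: folding interpretable LIWC-style features into an
# in-context-learning prompt. TOY_LEXICON is a hypothetical stand-in for
# the licensed LIWC dictionary; category names are invented for illustration.

from collections import Counter

TOY_LEXICON = {
    "positive_emotion": {"happy", "great", "love"},
    "social": {"friend", "we", "talk"},
    "cognitive": {"think", "because", "know"},
}

def liwc_style_features(text: str) -> dict:
    """Relative frequency of each category's words in the text
    (a crude global linguistic indicator)."""
    tokens = text.lower().split()
    total = max(len(tokens), 1)
    counts = Counter()
    for tok in tokens:
        for cat, words in TOY_LEXICON.items():
            if tok in words:
                counts[cat] += 1
    return {cat: counts[cat] / total for cat in TOY_LEXICON}

def build_prompt(text: str, feats: dict) -> str:
    """Incorporate the interpretable feature pattern into a prompt that
    asks an off-the-shelf LLM for trait scores plus an explanation."""
    pattern_lines = "\n".join(f"- {cat}: {val:.2f}" for cat, val in feats.items())
    return (
        "Rate the author's Big Five personality traits (0-1) and explain "
        "your reasoning using the indicators below.\n"
        f"Linguistic indicators:\n{pattern_lines}\n"
        f"Text: {text}"
    )

sample = "I love to talk with my friend because we think alike"
feats = liwc_style_features(sample)
prompt = build_prompt(sample, feats)
```

In the full method, transformer-based embeddings would supply the local contextual signal alongside these dictionary-based frequencies; here only the dictionary half is shown for brevity.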