Predicting Explanation Needs in Software Systems: Challenges and Opportunities

Saturday 06 September 2025

The quest for transparency in software systems has become increasingly important in today’s digital age. As technology advances, so does its complexity, making it challenging for users to understand how their favorite apps work and why certain behaviors occur. A team of researchers from Leibniz Universität Hannover and University of Applied Sciences FHDW Hannover set out to investigate whether explanation needs can be predicted based on app properties.

The researchers analyzed a dataset of 4,495 app reviews, enriched with metadata such as app version, ratings, age restriction, and in-app purchases. They found mostly weak associations between app properties and explanation needs; only a few features, such as app version, number of reviews, and star ratings, showed stronger signals.

To probe this relationship further, the researchers trained linear regression models to predict how likely users were to request explanations based on these properties. The models showed limited predictive power, suggesting that app metadata alone cannot reliably forecast which apps will prompt explanation requests.
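The modeling idea can be illustrated with a minimal sketch: fit an ordinary least squares model that maps a few metadata features to the share of reviews expressing an explanation need, then check how much variance it explains. The feature names and all numbers below are hypothetical, not taken from the study's dataset.

```python
# Hypothetical sketch of regressing explanation-need rates on app metadata.
# Feature columns and values are illustrative, not from the paper.
import numpy as np

# Each row: [major app version, number of reviews (thousands), avg. star rating]
X = np.array([
    [3, 12.0, 4.1],
    [7,  0.8, 3.2],
    [1, 45.0, 4.6],
    [5,  5.5, 3.9],
    [2,  2.1, 2.8],
], dtype=float)

# Target: fraction of an app's reviews labeled as requesting an explanation.
y = np.array([0.04, 0.12, 0.02, 0.06, 0.15])

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# R^2 says how much of the variance the metadata explains; on real review
# data a low value would mirror the paper's "limited predictive power".
pred = A @ coef
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

On a toy sample this small the fit is nearly perfect, which is exactly why the authors' evaluation on thousands of real reviews, where R² stays low, is the meaningful test.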

The team also validated their results using a manually labeled dataset of 495 reviews and found similar patterns. They discovered that certain categories such as Security & Privacy and System Behavior had slightly higher predictive potential, while Interaction and User Interface remained the most difficult to predict.

These findings suggest that explanation needs are highly context-dependent and cannot be precisely inferred from app metadata alone. The researchers emphasized the importance of supplementing metadata analysis with direct user feedback to effectively design explainable and user-centered software systems.

In recent years, there has been an increasing focus on developing transparent and explainable AI systems, as well as ensuring that users can understand how their data is being used. This study highlights the need for a more nuanced approach to understanding explanation needs in software development.

The researchers’ work contributes to the growing body of literature on software explainability and provides valuable insights for developers and requirements engineers. By acknowledging the limitations of predictive models and the importance of user feedback, they offer a more comprehensive framework for designing transparent and user-friendly software systems.

As technology continues to evolve at an unprecedented pace, it is essential that we prioritize transparency and user understanding in software development. This study serves as a reminder that there is no one-size-fits-all solution to ensuring explanation needs are met, but rather a need for a multifaceted approach that incorporates both predictive models and direct user feedback.

Cite this article: “Predicting Explanation Needs in Software Systems: Challenges and Opportunities”, The Science Archive, 2025.

Software Explainability, Transparency, App Reviews, Metadata Analysis, Linear Regression Models, Predictive Power, User Feedback, Software Development, AI Systems, Data Usage.

Reference: Martin Obaidi, Kushtrim Qengaj, Jakob Droste, Hannah Deters, Marc Herrmann, Jil Klünder, Elisa Schmid, Kurt Schneider, “From App Features to Explanation Needs: Analyzing Correlations and Predictive Potential” (2025).
