Abstract
Introduction: Patient education materials (PEMs) play a vital role in ensuring that patients understand their medical conditions and treatment options. In prostate cancer, complex medical terminology can hamper comprehension and informed decision-making. This study evaluates the readability of prostate cancer PEMs to determine whether they meet recommended standards for lay audiences.

Methods: A selection of standardized prostate cancer PEMs, including standard surgical consent forms and patient brochures from major German cancer organizations, was analyzed. Readability was assessed using established metrics: the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), Gunning Fog Score (GFS), Simple Measure of Gobbledygook (SMOG) Index, Coleman-Liau Index (CLI), and Automated Readability Index (ARI). Following European Union recommendations for layperson summaries, layperson readability was defined as a FRES ≥ 70 (corresponding to a seventh-grade reading level or below) and all other readability indexes ≤ 7.

Results: Neither the surgical consent forms nor the patient brochures met the readability thresholds recommended by the European Union for layperson summaries. The median FRES for consent forms was 25.9 (SD: 1.52), ranging from 24.3 (prostate biopsy) to 28.0 (open radical prostatectomy, RPx). Patient brochures showed a median FRES of 23.2 (SD: 2.87), with scores of 23.2 (German Cancer Aid), 22.5 (DKFZ), and 28.9 (S3-Guidelines). Section-specific values varied, with the highest FRES observed in the “Basic Explanation and Screening” section of the S3-Guidelines (39.0, SD: 7.09) and the lowest in the “Follow-Up” section of the German Cancer Aid brochure (15.8, SD: 10.35). All grade-level metrics (FKGL, GFS, SMOG, CLI, ARI) exceeded the recommended maximum of grade 7.

Conclusion: The readability of prostate cancer PEMs in Germany falls short of recommended thresholds for lay comprehension. To enhance clarity and accessibility, the use of automated readability tools and standardized benchmarks (e.g., FRES ≥ 70, grade level ≤ 7) is recommended. Involving multidisciplinary teams may further support the development of patient-centered content. Future research should combine readability metrics with patient feedback to evaluate real-world comprehension and usability.
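For orientation, the FRES and FKGL are both derived from average sentence length (words per sentence) and average word length (syllables per word). The sketch below is not the study's own pipeline; it is a minimal illustration of automated scoring using the standard English-language formulas and a crude vowel-group syllable heuristic. German texts such as those analyzed here are typically scored with a German adaptation (e.g., the Amstad variant of the Flesch formula), and all names in the sketch are illustrative.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels.
    # Real readability tools use language-specific syllabification.
    return max(1, len(re.findall(r"[aeiouyäöü]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-zÄÖÜäöüß]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # words per sentence
    spw = syllables / len(words)        # syllables per word
    # Standard English formulas (Flesch Reading Ease, Flesch-Kincaid Grade
    # Level); German material would require an adapted formula.
    fres = 206.835 - 1.015 * wps - 84.6 * spw
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    return {"FRES": round(fres, 1), "FKGL": round(fkgl, 1)}

if __name__ == "__main__":
    sample = ("The prostate is a small gland. It sits below the bladder. "
              "Screening can find cancer early.")
    print(readability(sample))  # short, simple sentences score well above 70
```

A score at or above the EU-recommended FRES of 70 (roughly seventh-grade level) requires short sentences and predominantly one- and two-syllable words, which is precisely what the analyzed consent forms and brochures, with FRES values mostly in the 20s, do not provide.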