Competency-based medical education requires comprehensive assessment data to make progression decisions based on trainees’ competence rather than the amount of time spent in a residency program. Yet despite many efforts to gather quality data, performance-relevant information remains undocumented on assessment forms, making it unavailable for formal review. Alarmingly, our own preliminary data suggest that this missing information may reflect concerns in the critical and emerging competencies of professionalism, communication, and collaboration; if so, its absence threatens the validity of our measurements. While existing research has examined supervisor factors that contribute to missing assessment data, the processes that leave those comments “unwritten” remain a significant blind spot. Studying the absence of comments presents a profound methodological challenge, but it is one that is familiar to this team. Guided by constructivist grounded theory, we will use innovative and carefully triangulated data collection strategies to explore why some assessment information is not expressed in written comments, how the comments are written instead, and what information remains unwritten. With the goal of creating a taxonomy of “unwriteable” comments, as well as an in-depth understanding of the writing process, we expect to enable more assessment data to be documented. However, we also anticipate that this research could reveal incompatibilities in an assessment system that requires supervisors to frequently document brief, point-in-time assessment comments on topics that are socially constructed, nuanced, and complex.
Our findings may signal the need to refashion assessment systems so that they do not impede formal documentation of concerns about collegiality and professionalism, and so that they better support supervisors in communicating performance-relevant information that, at the moment, remains conspicuously absent from written comments.