AAU PhD Education Initiative

 

I led Duke’s participation in the AAU PhD Education Initiative as Assistant Dean of Assessment and Evaluation. I built university-wide data systems that uncovered structural gaps, guided policy and curriculum reform, and became a national model adopted by peer institutions across the country.

Modernizing PhD Education Through Strategic Analytics


In 2018, the Association of American Universities (AAU) launched a multi-campus initiative to modernize PhD education in the United States. The premise was clear: while most doctoral programs continued to prioritize tenure-track preparation, the vast majority of PhD graduates were building meaningful careers beyond academia. If research universities wanted to remain relevant and responsive, they had to confront this disconnect. The AAU challenged institutions to take a hard look at where their students actually go, what skills they need to succeed, and how well their programs were preparing them for the full range of professions they enter—across research, policy, teaching, industry, and entrepreneurship.

At Duke, I was appointed to lead our institutional participation in this effort. It was a natural extension of my role as Assistant Dean of Assessment and Evaluation, where I had been developing university-wide data systems to support continuous improvement across more than 100 graduate programs. But this project went beyond reporting. It asked whether we could use data not just to track outcomes, but to stimulate reflection on purpose, preparation, and possibility. Could we shift institutional research from a passive function to an active force for change? Could we help departments see their programs more clearly, and in doing so, reimagine what success could look like?

From Data Silos to Strategic Insight

Like most research universities, Duke had a wealth of data on graduate students—but it existed in disconnected systems, each designed for operational needs rather than strategic insight. Course records were stored in the registrar’s database, advising interactions in decentralized logs, and career outcomes—when collected at all—were scattered across exit surveys or faculty memory. This fragmentation made it difficult to understand the full trajectory of a doctoral student, much less identify where support structures were succeeding or failing.

I led the effort to build the connective tissue that had been missing. By collaborating across IT, institutional research, and academic departments, I designed a scalable analytics framework that stitched these systems together. The goal was not just to integrate data, but to reframe how we understood doctoral education—from a series of administrative checkpoints to a holistic and evolving student experience, mapped from admission to graduation and beyond.

Our system made it possible to:

  • Track course and milestone progression by discipline, cohort, and student demographics

  • Visualize career outcomes not just by broad sectors, but by specific fields and identity groups

  • Identify gaps in professional development access, flagging who participated, when, and in what formats

  • Surface patterns in attrition and time-to-degree, enabling departments to intervene earlier and more effectively
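The core of this kind of framework is joining per-system records on a shared student identifier and then aggregating by program, cohort, or demographic group. The sketch below is a minimal, hypothetical illustration of that pattern; the field names, data values, and metrics are invented for the example and do not reflect Duke's actual schema or systems.

```python
# Hypothetical sketch of longitudinal record linkage: merging rows from
# separate source systems (keyed on a shared student ID) into one record
# per student, then computing a simple time-to-degree metric by program.
# All schemas and values below are illustrative, not real institutional data.

from collections import defaultdict

# Each "system" is keyed by the same student ID.
registrar = {
    "s1": {"program": "Biology", "entry_year": 2014, "grad_year": 2020},
    "s2": {"program": "Biology", "entry_year": 2015, "grad_year": 2020},
    "s3": {"program": "History", "entry_year": 2013, "grad_year": 2020},
}
career_outcomes = {
    "s1": "industry",
    "s2": "academia",
    "s3": "policy",
}

def merge_records(registrar, outcomes):
    """Stitch per-system rows into one longitudinal record per student."""
    merged = {}
    for sid, row in registrar.items():
        merged[sid] = {**row, "outcome": outcomes.get(sid, "unknown")}
    return merged

def time_to_degree_by_program(merged):
    """Average years from entry to graduation, grouped by program."""
    years = defaultdict(list)
    for rec in merged.values():
        years[rec["program"]].append(rec["grad_year"] - rec["entry_year"])
    return {prog: sum(v) / len(v) for prog, v in years.items()}

merged = merge_records(registrar, career_outcomes)
print(time_to_degree_by_program(merged))
# Biology: (6 + 5) / 2 = 5.5 years; History: 7.0 years
```

The same merged structure supports the other views listed above—grouping by cohort or demographic field instead of program, or counting outcomes by sector—which is what turns isolated operational records into a diagnostic lens.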

We weren’t just documenting student activity—we were creating a diagnostic lens that helped departments understand themselves in sharper, more dynamic ways. The data moved from static reports to living tools, making it possible to spot misalignments, surface inequities, and test assumptions about how students advance, succeed, or leave. For faculty and program leaders, this shifted the role of data from an accountability burden to a source of clarity and possibility. It gave them a means to ask more strategic questions, design more responsive policies, and imagine different futures for their programs. In short, we built a foundation not just for measurement, but for momentum—helping departments move from compliance-driven reporting toward intentional, data-informed change.

Helping Departments See What They Could Improve

Once the tools were built, the real work began: turning data into action at the department level, where policy meets practice. I partnered closely with faculty, program directors, and departmental staff to interpret the findings not as compliance checks or performance audits, but as diagnostic tools that could reveal patterns and overlooked opportunities. My role was part translator, part strategist—helping academic leaders move from raw data to reflective insight, and from insight to practical change. Rather than prescribing solutions, we fostered a collaborative environment where departments could ask deeper questions about their own assumptions and design responses that fit their specific context. This approach honored faculty expertise while grounding decisions in evidence, and it helped build a culture of trust and shared ownership around improvement. Some examples stand out:

  • Teaching Load Disparities
    In a biomedical science program, data showed that graduate students were logging significantly more teaching hours than peers in similar disciplines. The department launched an internal review and implemented new policies to rebalance workloads and protect research time.

  • Access Gaps for International Students
    In the humanities, our data revealed that international PhD students were underrepresented in career and professional development programs. The department responded by embedding career exploration into required coursework and offering expanded resources for international trainees.

  • Mentoring Structures
    Several departments used our data to evaluate advising frequency and distribution, leading to more accessible mentoring assignments and clearer expectations for faculty.

Each insight opened the door to meaningful change. And each change made doctoral education more intentional, more equitable, and more aligned with the lived experiences of students. The process was iterative and collaborative, but its effects were structural. Departments began to see data not as an external judgment, but as an internal compass—a way to steer their programs with greater purpose. Over time, the focus shifted from fixing isolated issues to rethinking the very criteria by which success, support, and preparation were measured. We didn’t just help departments improve their programs. We helped them reframe what improvement could mean, turning institutional research into a tool for academic self-determination and cultural change.

In Washington, DC, I met AAU President Mary Sue Coleman at the AAU PhD Data Workshop.

Fun fact: Dr. Coleman was President of the University of Michigan when I was an undergraduate.

From Campus Pilot to National Framework

As the work matured at Duke, it began to resonate far beyond our campus. What had started as a localized response to the AAU’s call for innovation quickly evolved into a national model for using data to support, rather than dictate, graduate education reform. Our approach demonstrated that data could be more than a reporting tool; it could be a catalyst for institutional reflection, cultural change, and strategic alignment.

I was invited to consult with institutional research leaders and graduate deans at peer universities, sharing both the technical structure of our analytics framework and the relationship-building strategies that made it effective. These conversations often centered less on tools and more on trust: how we secured faculty buy-in, avoided surveillance-based models, and used data to prompt inquiry rather than judgment.

What caught people’s attention was not just the dashboards. It was the way we had integrated data into the rhythms of academic decision-making. We created space for departments to reflect, revise, and act with clarity, embedding continuous improvement into the culture rather than treating it as a one-time intervention. In doing so, we helped shift the narrative around doctoral education from a static pipeline to a dynamic ecosystem of learning, mentoring, and preparation.

Across the AAU network, institutions began adapting our approach to fit their own contexts. They used our model to:

  • Align curricula with the evolving skills and career paths valued across sectors

  • Track gaps in time-to-degree, attrition, and career placement across student populations

  • Integrate doctoral analytics into strategic planning, budget allocation, and accreditation cycles

  • Redefine success metrics for graduate programs, moving beyond academic job placement

This wasn’t just about data systems—it was about reimagining doctoral education itself. We helped campuses move from reporting for compliance to reflecting for change. And in doing so, we gave them a framework for continuous improvement grounded in evidence and aspiration.

This work at Duke unfolded alongside a growing national push for transparency and accountability in graduate education. As part of that broader movement, I was appointed by Duke President Vincent Price to represent the university in the Coalition for Next Generation Life Sciences (CNGLS)—a multi-institutional effort to standardize public reporting of PhD outcomes, admissions, and demographics. That role gave me a wider lens into the policy pressures and data challenges that many institutions were facing, and it reinforced my commitment to building tools that could serve both internal improvement and public trust.

This work reshaped how I think about data’s role in institutional change. Designed with care and empathy, analytics can be more than diagnostic—they can build momentum, surface inequities, and expand what’s possible in education reform. For me, this project reinforced that data strategy is not just technical work—it’s a form of leadership.


The formal announcement can be found here.

Founded in 1900, the AAU consists of leading research universities in the United States and Canada. Its member institutions award nearly half of all U.S. doctoral degrees. 
