I help colleges build systems that automate compliance reporting and deliver reliable analytics so they can make well-informed decisions.
Build reports and dashboards that refresh automatically—daily or on demand—eliminating manual updates and ensuring stakeholders always see current, trusted data.
Encode institutional business rules and compliance formulas (e.g., FTES, SCFF, persistence, completion) directly into code so metrics are consistent, auditable, and repeatable.
Map disparate source data into standardized schemas and canonical models, creating a single, well-documented structure that supports reporting, submissions, and long-term analysis.
Design and implement reliable, automated data pipelines that move data from source systems into clean, structured, analytics-ready formats on a consistent schedule.
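To make the "business rules in code" idea concrete, here is a minimal sketch of how an institutional rule might be encoded as a small, testable function. The rule, field names, and logic below are purely illustrative—not any institution's actual persistence formula:

```python
# Sketch: encoding an institutional rule as an auditable, testable function.
# The rule and field names are illustrative, not an official definition.

def persisted_fall_to_spring(fall_enrolled: bool, spring_enrolled: bool,
                             completed_award: bool) -> bool:
    """Illustrative rule: a fall enrollee 'persists' if they re-enroll
    in spring or complete an award in between."""
    if not fall_enrolled:
        raise ValueError("Persistence is only defined for fall enrollees")
    return spring_enrolled or completed_award

# Because the rule lives in version-controlled code, it can be unit-tested
# and every metric that uses it stays consistent across reports:
assert persisted_fall_to_spring(True, True, False) is True
assert persisted_fall_to_spring(True, False, True) is True
assert persisted_fall_to_spring(True, False, False) is False
```

Once a rule is expressed this way, every dashboard and submission that needs the metric calls the same function, so the number cannot drift between reports.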
Challenge: Enrollment dashboards were manually refreshed from static reports, leaving leadership without reliable daily visibility into registration trends.
Approach: Built scheduled, unattended automation that extracts live data from the enterprise SIS, transforms it through a validated pipeline, and refreshes interactive dashboards daily with no manual intervention.
Result: Administrators have up-to-date enrollment metrics every morning to support timely decisions, and the IR team is freed to focus on higher-value work.
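The extract-transform-refresh pattern behind this kind of automation can be sketched in a few lines. This is a simplified illustration—the field names are hypothetical, and a production job would query the SIS directly rather than read a CSV:

```python
# Sketch of a daily extract-transform-refresh job. Field names are
# hypothetical; a real pipeline would query the SIS, not a CSV file.
import csv
from datetime import date

def extract(path: str) -> list[dict]:
    """Pull raw rows from a source export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[dict]:
    """Validate and normalize before anything reaches a dashboard."""
    clean = []
    for r in rows:
        if r.get("student_id") and r.get("term"):  # drop malformed rows
            r["units"] = float(r.get("units", 0) or 0)
            clean.append(r)
    return clean

def refresh_dashboard(rows: list[dict]) -> dict:
    """Aggregate to the metrics the dashboard displays."""
    return {
        "as_of": date.today().isoformat(),
        "headcount": len({r["student_id"] for r in rows}),
        "total_units": sum(r["units"] for r in rows),
    }
```

A scheduler (cron, Task Scheduler, or an orchestration tool) runs the three steps each morning, so the dashboard is always built from freshly validated data rather than a manually pasted extract.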
Challenge: Postsecondary Data Partnership submissions required labor-intensive manual preparation, and error resolution was tedious and time-consuming.
Approach: Built SQL pipelines to extract and prepare submission data from the enterprise SIS, and developed Python-based cleaning, validation, and error detection processes that trace discrepancies back to their source.
Result: Streamlined the full PDP workflow from extraction through error correction, significantly reducing submission cycle time and improving data accuracy.
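The key design choice in this workflow is that every validation finding points back to a specific source record. A minimal sketch of that idea—with an illustrative rule set, not the actual PDP specification—looks like this:

```python
# Sketch: validation that traces each error to its source row.
# The rules and field names are illustrative, not the PDP spec.

def validate(rows):
    """Return (row_index, field, message) tuples for traceability."""
    errors = []
    for i, row in enumerate(rows):
        if not row.get("student_id"):
            errors.append((i, "student_id", "missing identifier"))
        gpa = row.get("gpa")
        if gpa is not None and not (0.0 <= gpa <= 4.0):
            errors.append((i, "gpa", f"out of range: {gpa}"))
    return errors

rows = [
    {"student_id": "A1", "gpa": 3.2},
    {"student_id": "",   "gpa": 2.5},
    {"student_id": "A3", "gpa": 4.7},
]
# Each finding names the exact record and field, so the fix
# happens at the source system instead of in the submission file.
print(validate(rows))
```

Because errors carry their row index and field name, discrepancies can be corrected upstream once rather than hand-patched in every submission cycle.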
Challenge: Inconsistencies and anomalies across institutional data sources eroded trust in reporting, with root causes buried in legacy system logic and undocumented assumptions.
Approach: Diagnosed data integrity issues by tracing data lineage across source systems, validating long-standing reporting assumptions, and building version-controlled data dictionaries and mapping documentation.
Result: Established a well-documented data foundation that resolved cross-system discrepancies and gave stakeholders a clear understanding of what data fields represent.
Challenge: Dashboard publishing lacked a structured approach to FERPA compliance, with no clear separation between internal analytical access and public-facing content.
Approach: Audited and hardened enterprise analytical platform security configurations, designed permissioned distribution models, and established policies for data aggregation and suppression in published dashboards.
Result: Implemented a compliant publishing framework enabling broad access to data while ensuring student privacy protections are enforced at every level of distribution.
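One building block of privacy-safe publishing is small-cell suppression: hiding any group count low enough to risk identifying individual students. A minimal sketch (the threshold of 10 is a common reporting convention, not a value mandated by FERPA):

```python
# Sketch: small-cell suppression before publishing aggregates.
# A threshold of 10 is a common convention, not a FERPA-mandated value.

def suppress_small_cells(counts: dict, threshold: int = 10) -> dict:
    """Replace any group count below the threshold so small groups
    cannot be re-identified in a public dashboard."""
    return {group: (n if n >= threshold else None)  # None renders as "<10"
            for group, n in counts.items()}

published = suppress_small_cells({"Biology": 214, "Astronomy": 4})
print(published)  # {'Biology': 214, 'Astronomy': None}
```

Applying rules like this at the publishing layer lets internal analysts keep full detail while the public view exposes only safe aggregates.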
I'm a consultant, programmer analyst, and data systems engineer specializing in automation for higher education.
I work in Institutional Research and Effectiveness, where I design and modernize data systems that support compliance, reporting, and executive decision-making.
I started this practice after repeatedly seeing the same problems: talented IR teams trapped in manual workflows, leadership making decisions without reliable data, and compliance deadlines creating unnecessary risk and stress.
I bring deep knowledge of higher-ed data requirements, the technical skill to build lasting solutions, and the ability to translate between IT and institutional leadership.
I hold an MS in Business Analytics from Carnegie Mellon University, with training in advanced analytics, machine learning, data architecture, and enterprise data management.
Based in Southern California. Available to work with institutions nationwide.
Whether you have a specific project in mind or just want to explore how I might help, I'm happy to talk.