Job Description
The BC Cancer Foundation is at the forefront of a transformative era in cancer research and care.
As the fundraising partner of BC Cancer, we are driving bold initiatives that inspire hope and save lives. Cancer is the largest health challenge globally, and we are mobilizing communities, partners and collaborators across British Columbia to accelerate wide-impact solutions.
With a team of over 90 staff across five sites, more than $110 million in annual revenue, and over 80,000 donors annually, we are one of the province’s largest non-profit organizations. We have a purpose-driven, high-performing culture – fuelled by talent, data and technology to accelerate progress and shape the future of cancer care.
We are a passionate, professional and positive team committed to advancing cancer research and supporting patients and families throughout their journey.
About the Opportunity:
Based in Vancouver, BC, this is a hybrid technical role combining core Data Engineering (building our Data Lakehouse and pipelines) with Business Process Automation (building Power Apps/Automate solutions) to modernize our fundraising operations.
Reporting to the Director, Data Platforms and Analytics, responsibilities of the Senior Data Engineer will include:
Data Architecture & Platform Engineering
- Architect, design, and implement scalable cloud data solutions using Microsoft Fabric/Snowflake for an enterprise-grade Data Lakehouse.
- Lead the implementation and architectural rollout of the Data Platform, including validating architectural decisions against PHSA’s security and infrastructure constraints.
- Partner with the analytics lead to design semantic models and optimize Power BI performance, including Direct Lake or DirectQuery performance tuning.
- Own cost guardrails and optimization including warehouse sizing, caching, job scheduling, capacity limits, and monthly budgets and alerts.
- Stay informed on emerging technologies, proposing innovative enhancements to the data platform.
Data Integration & Business Process Automation
- Develop and maintain robust ETL/ELT pipelines integrating diverse data sources such as Blackbaud’s Raiser's Edge, Financial Edge, Google Analytics, and various marketing platforms.
- Develop and maintain API integrations with BCCF’s CRM (Raiser’s Edge) and other systems, to be leveraged for both data pipelines and process automation.
- Optimize, troubleshoot, and maintain data pipelines to ensure performance, reliability, and cost-efficiency.
- Design and build low-code business applications using Microsoft Power Apps (Canvas/Model-driven) to replace manual Excel workflows and capture clean data at the source.
- Build automations with Power Automate, Azure Functions, or Logic Apps to operationalize data-driven workflows where appropriate.
Data Governance, Security & Quality Management
- Establish, communicate, and enforce best practices for governance and security, including quality assurance, data lineage and cataloging, sensitivity labels, data retention, access recertification, row-level security, and column masking for PII.
Stakeholder Collaboration, Requirements Definition and Mentorship
- Collaborate closely with stakeholders and business analysts to define and document clear technical and functional requirements.
- Produce comprehensive documentation and lead knowledge transfer sessions.
- Mentor junior team members, fostering a culture of continuous learning and technical excellence.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field, or equivalent experience.
- 7+ years of proven experience in data engineering, preferably 5+ within cloud environments (strong preference for Azure/Fabric ecosystem).
- Demonstrated success implementing large-scale data platforms (experience with Microsoft Fabric, Azure Synapse or Snowflake is strongly preferred).
- Demonstrated experience implementing medallion architectures and dimensional models at scale.
- Extensive expertise in data modeling, pipeline architecture, and ETL/ELT processes.
- Experience integrating APIs and managing complex data ingestion workflows.
- Experience with Power Platform (Power Apps/Power Automate) in an enterprise context is highly desirable.
- Preferred experience in healthcare, nonprofit, or regulated sectors (e.g., familiarity with FIPPA compliance).
- Familiarity with the BC healthcare ecosystem and PHSA standards is advantageous, including exposure to delivering in shared-service environments with strict security and change controls and the ability to navigate provisioning, firewall, and privacy reviews with PHSA.
Core Knowledge & Skills
- Advanced proficiency in Azure and Microsoft Fabric tools, including Data Factory, OneLake, Pipelines and Notebooks, or equivalent in Snowflake.
- Strong SQL and proficiency in Python, PySpark, or similar.
- Proficiency with Microsoft Power Platform (Power Apps, Power Automate, Power BI).
- Deep understanding of governance including lineage, metadata, master data management, classification, sensitivity labels, data retention, access recertification, and role- or row-level security.
- Working knowledge of business intelligence tools, particularly Power BI and semantic model design.
- Experienced with CI/CD processes, DevOps methodologies, and Agile practices.
- Exceptional analytical, problem solving, communication, and documentation skills.
- Proven ability to collaborate across diverse stakeholder groups and lead technical work independently.
Salary Range: $115,000 to $135,000
What We Offer:
- 5 weeks’ vacation plus office closure between Christmas Eve and New Year’s, as well as half-day office closures before each statutory holiday
- Comprehensive benefit package including coverage for health, dental, vision and various paramedical services, plus participation in the Employee & Family Assistance Program
- Participation in the Municipal Pension Plan
- A people-centred workplace recognized as one of B.C.’s Top Employers for 2026, fostering strong culture, employee well-being and progressive people practices
BC Cancer Foundation is committed to fostering, cultivating and preserving a culture of diversity & inclusion. All qualified applicants will receive consideration for employment regardless of age, ethnicity, gender identity or expression, language, national or Indigenous origin, family or marital status, physical and mental ability, political affiliation, race, religion, sexual orientation or socio-economic status.
How To Apply
Along with your application, please include brief responses to the two items below (applications without these responses may not be reviewed).
1. Architecture Diagram (Required): Please attach a high-level architecture diagram of a Data Lakehouse or complex pipeline you delivered. Briefly explain: What was the single most difficult technical trade-off you had to make in this design, and why did you choose that path? (Note: We value logic over aesthetics. Hand-drawn sketches (photo/scan) or anonymized block diagrams are perfectly acceptable. Please do not generate generic diagrams using AI.)
2. Scenario Question (Choose ONE):
- Option A - Cost Control: Describe a specific time you dealt with an unexpected cloud cost spike or a difficult budget constraint. What was the root cause, and exactly how did you fix it?
- Option B - Process Automation: Describe a manual business workflow (e.g., Excel-based) that you replaced with an automated app or pipeline. What was the hardest part of getting users to adopt the new solution?