Data Engineer (with Architecture Responsibilities)
Job summary
We are looking for a Data Engineer who is comfortable being hands-on while also taking on light architecture responsibilities.
Job description & requirements
Location: Nairobi (On-site)
Department: Analytics / Insights
Type: Full-time
About Afyanalytics Research Ltd:
Afyanalytics is a health intelligence company transforming fragmented healthcare data into a trusted, unified ecosystem for decision-making, policy formulation, and innovation.
Our mission is to build trust in data collection, analysis, and use across healthcare systems through ethical, patient-centered, and technology-driven analytics.
About the Role
We are looking for a Data Engineer who is comfortable being hands-on while also taking on light architecture responsibilities.
You will design how data flows from our hospital systems into Snowflake, build the pipelines that transform it, maintain quality and governance standards, and ensure the analytics team always has dependable, well-modeled data.
This is a practical, builder-focused role ideal for someone who wants to help lay the foundation of a modern health data platform.
Key Responsibilities
● Build and maintain ETL/ELT pipelines from EMR, pharmacy, lab, billing, and HR systems into Snowflake.
● Clean, transform, and model data using SQL, dbt, and Python.
● Design and maintain Snowflake schemas (staging, cleaned, and analytics layers).
● Set up workflow automation using Prefect or Snowflake Tasks.
● Implement and monitor data quality checks (Soda, Great Expectations, or custom scripts).
● Troubleshoot pipeline failures and improve performance, reliability, and cost efficiency.
● Document data flows, definitions, and table structures for internal teams.
● Work closely with the Data Analyst to ensure datasets meet reporting and dashboard needs.
● Collaborate with product/engineering teams to improve upstream data capture.
● Support future AI/ML workflows by preparing structured, ready-to-model datasets (no heavy ML required).
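To give a flavour of the data quality work above, here is a minimal sketch of a custom quality check in Python. The table shape, column names, and rules are hypothetical; in practice a tool like Soda or Great Expectations, run against Snowflake, would typically replace a hand-rolled script like this.

```python
from datetime import date

def check_lab_rows(rows):
    """Run basic quality rules over staged lab-result rows.

    Returns a list of (row_index, problem) tuples; an empty list
    means every row passed. The rules here are illustrative only.
    """
    problems = []
    for i, row in enumerate(rows):
        # Rule 1: every row must carry a patient identifier.
        if not row.get("patient_id"):
            problems.append((i, "missing patient_id"))
        # Rule 2: numeric results should not be negative.
        value = row.get("result_value")
        if value is not None and value < 0:
            problems.append((i, "negative result_value"))
        # Rule 3: collection dates cannot be in the future.
        collected = row.get("collected_on")
        if collected is not None and collected > date.today():
            problems.append((i, "collected_on is in the future"))
    return problems

# Example: one clean row and one row that breaks two rules.
rows = [
    {"patient_id": "P001", "result_value": 5.4, "collected_on": date(2024, 1, 10)},
    {"patient_id": "", "result_value": -1.0, "collected_on": date(2024, 1, 11)},
]
issues = check_lab_rows(rows)
```

A real pipeline would log or alert on `issues` rather than just collecting them, and would read rows from the staging layer instead of an in-memory list.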
Requirements
● 5+ years’ experience as a Data Engineer or in a similar role.
● Strong SQL expertise and solid Python skills (Pandas/ETL scripting).
● Experience with Snowflake or another cloud data warehouse.
● Hands-on experience with dbt or equivalent transformation frameworks.
● Experience with workflow orchestration (Prefect, Airflow, or similar).
● Comfortable working with messy, real-world data and resolving data quality issues.
● Familiarity with data governance, privacy standards, and secure data handling practices.
● Good communication skills; able to explain data structures to analysts and non-technical teams.
Nice to Have
● Experience in healthcare data (EMR, pharmacy, lab, billing).
● Exposure to data quality tools (Soda, Great Expectations).
● Basic understanding of ML workflows (Vertex AI, Snowpark, etc.).
● Experience with cloud infrastructure (GCP/AWS) and Git-based development.
Why Join Afyanalytics
You’ll help build the core data foundation at Afyanalytics. Your work will directly enable clean, trusted, high-quality data that improves clinical operations, supply chains, insights, and eventually AI-driven decision-making.
If you enjoy building pipelines, organizing data at scale, and shaping a modern health data platform, this role is for you.