Comment by ransom1538 11 days ago
Data Engineer | US Resident | REMOTE | Full-Time | 130-160k (open)
tl;dr: Python + GCP + BigQuery + k8s.
* Develop robust ETL/ELT pipelines to extract, transform, and load data from diverse sources into our data warehouse.
* Enhance and maintain our cloud-based data storage and processing systems for performance, reliability, and cost-efficiency.
* Implement rigorous data quality checks, monitoring, and security measures across all data assets.
* 5+ years of experience in data engineering, with a strong grasp of data warehousing, ETL/ELT principles, and data modeling.
* Experience with BigQuery (or similar: Redshift, Snowflake, etc.)
* Experience with infrastructure tools (e.g. Terraform, Kubernetes, Docker) is a plus.
* I will read all resumes! We need more resumes!

Job link: https://job-boards.greenhouse.io/epickids/jobs/6669024003
Just applied to this!
I have five years of experience building and optimizing data pipelines from diverse sources using Python, SQL, Snowflake, and AWS Aurora, supporting compliance and performance monitoring.
I've implemented schema-drift detection, null tracking, reconciliation layers, and metric QA. I've built automated QA layers in Python and SQL, and have ample experience aligning with Product, QA, compliance, and analytics teams on metric definitions and data products.
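As a rough illustration of the kind of QA layer described above, here's a minimal sketch of schema-drift detection and null-rate tracking over a batch of rows. The names (`EXPECTED_SCHEMA`, `detect_schema_drift`, `null_rates`) are hypothetical, not from any specific codebase:

```python
# Illustrative sketch: schema-drift detection and null tracking for a row batch.
# EXPECTED_SCHEMA and field names are made-up examples.

EXPECTED_SCHEMA = {"user_id": int, "email": str, "signup_ts": str}

def detect_schema_drift(row: dict) -> set:
    """Return field names that are missing, unexpected, or of the wrong type."""
    drifted = set()
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in row:
            drifted.add(field)  # expected column missing from the row
        elif row[field] is not None and not isinstance(row[field], expected_type):
            drifted.add(field)  # type changed upstream (e.g. int became str)
    drifted.update(set(row) - set(EXPECTED_SCHEMA))  # unexpected new columns
    return drifted

def null_rates(rows: list) -> dict:
    """Fraction of null values per expected field across the batch."""
    total = len(rows)
    return {
        field: sum(1 for r in rows if r.get(field) is None) / total
        for field in EXPECTED_SCHEMA
    }
```

In a real pipeline these checks would run per batch and feed alerts or a reconciliation report; this just shows the shape of the logic.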
Please let me know if I'm a good fit!