Data Engineer
Evnek
Posted: 1 day ago
Contract type: Full-time
Remote

This is a remote position.
Job Title: Data Engineer
Experience: 3–5 Years
Location: Remote
Notice Period: Immediate Joiners Only
Role Summary
We are seeking a skilled Data Engineer to design, develop, and maintain scalable and reliable data pipelines. The ideal candidate will have expertise in BigQuery, data ingestion techniques, and orchestration tools, plus a strong command of Python, FastAPI, and PostgreSQL. Experience owning end-to-end data workflows is essential.
Key Responsibilities
- Data Pipeline Development: Design and implement scalable data pipelines that handle growing data volumes without degrading performance.
- BigQuery Optimization: Write complex SQL transformations and tune query performance using best practices such as partitioning, clustering, and efficient JOINs (see the BigQuery sketch after this list).
- Data Ingestion: Build robust ingestion processes from sources such as RESTful APIs and file-based systems, ensuring data integrity and consistency (an ingestion sketch follows below).
- Workflow Orchestration: Use orchestration tools such as Prefect or Apache Airflow to schedule and monitor data workflows, ensuring timely and reliable processing (see the Prefect sketch below).
- Tech Stack Proficiency: Build data services and APIs with Python and FastAPI, and manage data storage and retrieval with PostgreSQL (see the FastAPI sketch below).
- End-to-End Workflow Management: Own the entire data workflow, from ingestion and transformation through delivery, ensuring data quality and availability.
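To illustrate the partitioning and clustering practices referenced above, here is a minimal sketch using the google-cloud-bigquery Python client. The dataset, table, and column names (analytics.events, event_date, customer_id, amount) are hypothetical, not part of this role's actual schema.

```python
# Minimal sketch: create a partitioned, clustered BigQuery table and query it
# efficiently. Assumes google-cloud-bigquery is installed and application
# default credentials are configured; all table/column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# DDL: partition by a date column, cluster by a high-cardinality filter column.
client.query("""
    CREATE TABLE IF NOT EXISTS analytics.events (
        event_date DATE,
        customer_id STRING,
        amount NUMERIC
    )
    PARTITION BY event_date
    CLUSTER BY customer_id
""").result()

# Filtering on the partition column prunes partitions, so less data is scanned.
job = client.query("""
    SELECT customer_id, SUM(amount) AS total
    FROM analytics.events
    WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
    GROUP BY customer_id
""")
for row in job.result():
    print(row.customer_id, row.total)
```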
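For the ingestion responsibility, a minimal sketch of pulling records from a paginated REST API with basic retry. The endpoint URL, page parameter, and response shape are assumptions for illustration; a real source will differ.

```python
# Minimal sketch: paginated ingestion from a JSON REST API with retry on
# transient server errors. Endpoint and payload shape are hypothetical.
import time
import requests

def fetch_all(base_url: str, max_retries: int = 3):
    """Yield records from a paginated JSON API, retrying transient failures."""
    page = 1
    while True:
        for attempt in range(max_retries):
            resp = requests.get(base_url, params={"page": page}, timeout=30)
            if resp.status_code < 500:
                break
            time.sleep(2 ** attempt)  # exponential backoff on 5xx responses
        resp.raise_for_status()
        records = resp.json().get("results", [])
        if not records:
            return  # an empty page signals the end of the dataset
        yield from records
        page += 1

for record in fetch_all("https://api.example.com/v1/orders"):
    print(record)
```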
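For orchestration, a minimal sketch of an extract-transform-load flow using Prefect 2.x decorators; the task bodies are placeholders standing in for real extract and load logic.

```python
# Minimal sketch of an ETL flow in Prefect 2.x. Task bodies are placeholders;
# swap in a real API/file read and a real warehouse write.
from prefect import flow, task

@task(retries=2, retry_delay_seconds=60)
def extract() -> list[dict]:
    return [{"id": 1, "amount": 10.0}]  # stand-in for an API or file read

@task
def transform(rows: list[dict]) -> list[dict]:
    return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]

@task
def load(rows: list[dict]) -> None:
    print(f"loaded {len(rows)} rows")  # stand-in for a warehouse write

@flow(log_prints=True)
def daily_pipeline():
    load(transform(extract()))

if __name__ == "__main__":
    daily_pipeline()
```

Retries and delays are declared on the task, so transient failures are handled by the orchestrator rather than by ad-hoc code inside each step.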
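Finally, a minimal sketch of a FastAPI data service backed by PostgreSQL via psycopg2. The DSN, orders table, and route are hypothetical, and a production service would use a connection pool; run it with, e.g., uvicorn.

```python
# Minimal sketch of a FastAPI endpoint that reads from PostgreSQL.
# DSN, table, and columns are hypothetical.
import os

import psycopg2
from fastapi import FastAPI

app = FastAPI()
DSN = os.environ.get("DATABASE_URL", "postgresql://localhost/analytics")

@app.get("/customers/{customer_id}/total")
def customer_total(customer_id: str):
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        # Parameterized query: the driver escapes the value, avoiding injection.
        cur.execute(
            "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = %s",
            (customer_id,),
        )
        (total,) = cur.fetchone()
    return {"customer_id": customer_id, "total": float(total)}
```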