
Job Description:
Our client is a global engineering and systems integration company. There is an urgent requirement for a Data Engineer who can join within a maximum of 30 days.
Desired Candidate Profile:
- 3+ years of data engineering experience.
- Prior experience building data platforms highly preferred.
- 3+ years of Python experience in data management (MapReduce, pipelines, distributed processing).
- 1+ years of experience with Snowflake: architecture, data modelling, SQL, Snowpipe, virtual warehouse management.
- 3+ years of experience with AWS (S3, KMS, Kinesis Firehose, API Gateway, Lambda, ALBs).
- 1+ years of pipeline development using Airflow or Prefect in a multi-tenant environment; exposure to data modelling/data quality tools such as dbt and Great Expectations highly desirable.
- In-depth understanding of the big data ecosystem (Hadoop, Presto, Hive, file formats, etc.) highly desirable.
- Experience using Git and CI tools for deployment required.
- Experience with Docker and Kubernetes on Linux required.
- Exposure to Spark or other big data tooling highly desirable.
- The ideal candidate has very strong Python and SQL experience, as well as modern data warehousing experience with a focus on efficient data modelling and validation.
- A huge plus: competence with scheduled DAGs, e.g., in Airflow, Prefect, or a similar tool.
