
Hi all, I have an opening to share with you.
Technical and Professional Requirements:
• Hands-on experience with Databricks, Python, PySpark, and SQL; Lakehouse knowledge; CI/CD.
• Ingest data from internal source systems into the bronze layer via a Kafka connector (to be built by another team), clean the data, and implement data quality checks.
• Implement business rules efficiently and effectively, following good coding principles so that other developers on the team can easily understand and build upon the code.
• Make data available to the consumption layer on a regular schedule without human intervention, in line with business requirements and with 99% availability and trustworthiness.
• Drive functional and technical discussions independently with stakeholders.
• DevOps understanding.
• Flexibility to work on both development and L2 support tasks.
Preferred Skills:
Technology->Analytics - Packages->Python - Big Data
Technology->Cloud Platform->Azure Analytics Services->Azure Databricks
Job code: 204557
If this sounds interesting, please share your resume when applying. Feel free to reach out with any queries.
Thanks,
Suriya


