Service Lee Technologies Pvt Ltd
Servify - Data Engineer - SQL/Python
Job Location
Mumbai, India
Job Description
About the Role :
We are seeking a highly skilled and experienced Data Engineer to join our dynamic team in Faridabad. In this role, you will be instrumental in designing, developing, and deploying end-to-end data pipelines that leverage best-in-class technologies and serverless architectures. You will be responsible for the complete data lifecycle, from development and testing to deployment and ongoing maintenance. Your expertise in advanced SQL, high-volume data management, and cloud platforms will be crucial in solving complex analytical problems and delivering impactful data solutions. You will also contribute to our Agile development process and collaborate effectively using productivity tools like Jira and Confluence.

Key Responsibilities :
- End-to-End Data Pipeline Development : Design, build, and maintain robust, scalable data pipelines using technologies like Python, Spark, and Databricks, adhering to serverless principles.
- Data Lifecycle Management : Manage the complete data lifecycle, including ingestion, transformation, storage, and processing, ensuring data quality and integrity throughout development and testing.
- Advanced SQL Expertise : Use advanced SQL concepts to query, manipulate, and analyze large datasets, ensuring efficient data retrieval and transformation.
- Deployment and CI/CD : Implement and manage automated deployment processes using CI/CD pipelines and code management tools like Git (GitHub, GitLab).
- High-Volume Data Management : Manage structured and unstructured high-volume data to address complex analytical challenges and provide valuable insights.
- Analytical Solutioning : Gain exposure to providing analytical solutions on visualization platforms such as Tableau, Power BI, and web reporting frameworks.
- Productivity Tools : Use tools like Jira and Confluence for task management, collaboration, and documentation.
- Agile Methodology : Work effectively within an Agile development environment, participating in sprints, stand-ups, and retrospectives.
- Cloud Platform Expertise : Leverage cloud platforms (preferably AWS) to build and deploy data pipelines and infrastructure.
- Orchestration and Monitoring : Orchestrate data pipelines using tools like Cron and Airflow; monitor pipeline performance and system health with tools like Prometheus, Grafana, Kibana, and Opsgenie.
- Database Design : Contribute to the design and implementation of database solutions, including SQL, NoSQL, and data warehouse designs (columnar and row-level storage).

Requirements :

Databases :
- Strong proficiency in SQL, including advanced concepts.
- Experience with NoSQL databases.
- Understanding of data warehouse designs (columnar and row-level storage).

Data Pipelines :
- Extensive experience developing data pipelines using Python and Spark.
- Hands-on experience with Databricks.

Cloud :
- Experience with at least one major cloud platform (AWS or GCP), with a preference for AWS.

Orchestration :
- Experience with data pipeline orchestration tools like Cron and Airflow.

Reporting :
- Exposure to data visualization tools like Tableau and Power BI.
- Familiarity with the Django framework for web reporting (a plus).

Code Management :
- Proficiency with code management tools like GitHub or GitLab.

Cloud Framework :
- Exposure to containerization technologies like Kubernetes and Docker.

Monitoring :
- Familiarity with monitoring tools such as Prometheus, Grafana, Kibana, and Opsgenie.

Productivity Tools :
- Hands-on experience with Jira and Confluence.

Skill Set Understanding :
Sound understanding of at least two of the following three skill sets :
- Database & DW Design : SQL, NoSQL, columnar/row-level storage concepts.
- Data Pipelines & Orchestration : Python, Spark, Databricks, Cron, Airflow.
- Cloud & Monitoring : AWS/GCP, Kubernetes/Docker, Prometheus/Grafana/Kibana/Opsgenie.

Qualifications :
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- Proven experience developing end-to-end data pipelines and managing the data lifecycle.
- Strong analytical and problem-solving skills, with the ability to work with complex datasets.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team in an Agile environment.
- A proactive, results-oriented approach.

(ref:hirist.tech)
Location: Mumbai, IN
Posted Date: 4/11/2025
Contact Information
Contact: Human Resources, Service Lee Technologies Pvt Ltd