Spigot Software

Data Engineer - Python/SQL


Job Location

Bangalore, India

Job Description

Role Description: The data engineering role involves creating and managing technical solutions on the modern data stack (Snowflake, dbt, Fivetran, Azure); architecting, building, and managing data flows/pipelines; and constructing data models within Snowflake for use by downstream processes or for analysis.

Role Responsibilities:
- Design, develop, and maintain scalable data pipelines and analytics solutions using dbt, Snowflake, and related technologies
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions
- Develop efficient code with unit testing and code documentation
- Ensure the accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
- Optimize and fine-tune data models, SQL queries, and transformations for performance and scalability
- Design, develop, and maintain scalable data models and transformations using dbt with Snowflake, ensuring that data from diverse sources is transformed and loaded effectively into the data warehouse or data lake (see the illustrative sketch after this description)
- Integrate Fivetran connectors to streamline data ingestion from various sources into Snowflake
- Develop custom Python scripts and functions to automate data workflows and enhance system capabilities
- Provide technical expertise and support to resolve data-related issues and troubleshoot system failures
- Collaborate with the API development team to integrate data pipelines with external systems and applications
- Contribute to the development of web-based data visualization solutions and dashboards
- Communicate project status to all project stakeholders
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules wherever required
- Stay updated on emerging trends and technologies in data engineering, cloud computing, and analytics

Role Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- At least 5 years of proven experience as a data engineer, ETL developer, or in a similar role, with a focus on dbt and Snowflake
- Strong proficiency in SQL and database concepts, with hands-on experience in the Snowflake data warehouse
- Subject-matter expertise in data warehouse and data lake concepts (dimensional modelling, change data capture, slowly changing dimensions, etc.)
- Proficiency in programming languages such as Python for data manipulation, automation, and scripting
- Knowledge of relational databases, non-relational databases, data pipelines (ELT/ETL), and file stores
- Knowledge of performance tuning and optimization
- Experience with cloud platforms like Azure and tools such as Azure OCR for data extraction and processing
- Familiarity with data integration tools like Fivetran
- Knowledge of API development principles and experience integrating data pipelines with external systems
- Proficiency in written, verbal, and presentation communication in English
- Ability to work effectively in a fast-paced, collaborative environment and to manage multiple priorities
- Excellent analytical, problem-solving, and communication skills

(ref:hirist.tech)
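
To illustrate the kind of dbt-on-Snowflake transformation work the responsibilities above describe, here is a minimal sketch of an incremental dbt model. It is not part of the posting: the model name, the raw_crm.customers source, the column names, and the customer_id unique key are all hypothetical placeholders.

    -- models/dim_customer.sql (hypothetical model and source names)
    -- Incremental dbt model materialized as a table in Snowflake.
    {{ config(materialized='incremental', unique_key='customer_id') }}

    select
        customer_id,
        customer_name,
        email,
        updated_at
    from {{ source('raw_crm', 'customers') }}

    {% if is_incremental() %}
      -- On incremental runs, process only rows changed since the last load.
      where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}

On the first run dbt builds the full table; on later runs it merges only new or changed rows keyed on customer_id, which is the usual pattern for keeping Snowflake transformations scalable as source volumes grow.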

Location: Bangalore, IN


Contact Information

Contact Human Resources
Spigot Software

Posted

October 21, 2024
UID: 4907894639
