Coders Brain Technology Private Limited
Data Engineer - Python/ETL
Job Location
Pune, India
Job Description
Responsibilities:

- Act as the SME for data warehousing architecture, overseeing the design patterns, data transformation processes, and operational functions of the data warehouse.
- Provide hands-on support for performance tuning, monitoring, and alerting, specifically within the Snowflake environment.
- Manage all facets of Snowflake administration, including role-based access control, environment monitoring, and performance optimization.
- Analyze complex data patterns to design and implement scalable, efficient data storage solutions in Snowflake.
- Define and document best practices for creating data models (source and dimensional), ensuring consistency across the organization.
- Mentor team members in data modeling techniques and work with business users to capture and implement data requirements.
- Architect and implement scalable, efficient data pipelines using Snowflake and DBT to support data processing and transformation.
- Build and optimize data models, warehouses, and data marts to drive business intelligence and analytics initiatives.
- Write clean, efficient, and reusable SQL queries within DBT to manage data transformations and ensure high-quality results.
- Establish and enforce data quality checks, validation processes, and continuous monitoring using DBT to maintain data integrity.
- Design, develop, and maintain robust, scalable data pipelines and ETL/ELT processes to efficiently ingest, transform, and store data from diverse sources.
- Collaborate with cross-functional teams to design, implement, and sustain data-driven solutions that optimize data flow and system integration.
- Organize and lead discussions with business and operational data stakeholders to understand requirements and deliver solutions.
- Collaborate with data analysts, developers, and business users to ensure data solutions are accurate, scalable, and efficient.

Qualifications:

- 8 to 10 years of experience in data engineering, with a focus on building data solutions at scale.
- 5 years of experience in data warehousing and data modeling techniques (both relational and dimensional).
- 5 years of hands-on experience writing complex, highly optimized SQL queries across large data sets.
- 4 years of hands-on experience working with Snowflake.
- 4 years of experience with scripting languages such as Python.
- 2 years of experience using DBT (Data Build Tool) for data transformation.
- Expertise in SQL with a strong focus on database optimization and performance tuning.
- Proven experience with data warehousing technologies such as Snowflake, including administration, performance tuning, and implementation of best practices.
- Extensive hands-on experience with DBT for data transformation, including developing and maintaining modular, reusable, and efficient DBT models.
- Strong ability to write and optimize DBT SQL models for transformation layers and data pipelines.
- Hands-on experience with data integration tools such as Azure Data Factory, FiveTran, or Matillion, with a preference for FiveTran.
- Proven experience with API integrations and working with diverse data sources.
- Ability to understand, consume, and use APIs, JSON, and web services for data pipelines.
- Experience designing and implementing data pipelines on cloud platforms such as AWS, GCP, or Azure.
- Proficiency in Python for data transformation and automation.
- Experience with CI/CD processes and automation in data engineering.
- Knowledge of Power BI or similar data visualization tools is a plus.
- Excellent communication skills, with the ability to work collaboratively in a team environment.

Education:

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Equivalent academic and work experience may be considered.

(ref:hirist.tech)
Location: Pune, IN
Posted Date: 4/13/2025
Contact Information
Contact: Human Resources, Coders Brain Technology Private Limited