Ascendion
Data Engineer - Google Cloud Platform
Job Location
Bengaluru, India
Job Description
Position Details:
Title: Data Engineer - GCP
Location: All Ascendion offices - Bengaluru, Pune, Chennai, Hyderabad, Vadodara
Years of Experience: 4 to 7

Description:
We are looking for an experienced GCP Data Engineer to join our team. The ideal candidate will have a strong background in designing, developing, and optimizing data workflows and ETL pipelines on Google Cloud Platform (GCP), with a particular focus on BigQuery for scalable data processing and analytics. You will work closely with cross-functional teams to build robust data architectures, ensuring seamless integration and high data quality.

- Design, develop, and optimize ETL pipelines and data workflows on Google Cloud Platform (GCP), leveraging BigQuery for scalable data processing and analytics.
- Implement data models and schemas in BigQuery, ensuring efficient storage, query performance, and adherence to best practices for data architecture.
- Collaborate with cross-functional teams to gather data requirements, create data ingestion strategies, and manage end-to-end data solutions on GCP.
- Monitor and troubleshoot data workflows, performing root cause analysis and implementing solutions to ensure data integrity, security, and system reliability.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines and data workflows on GCP, ensuring efficient and scalable data processing.
- Use Cloud Dataflow, Cloud Composer, and Cloud Functions to build automated and reliable data processing solutions.

Data Modeling and BigQuery Optimization:
- Implement and optimize data models and schemas in BigQuery for efficient storage, high performance, and easy querying.
- Ensure data architecture adheres to best practices for scalability, security, and cost efficiency.
- Develop strategies to ingest data from various sources (structured, semi-structured, and unstructured) into GCP, using tools such as Pub/Sub and Cloud Storage.
- Collaborate with data analysts, data scientists, and other stakeholders to gather and understand data requirements.
- Continuously monitor data workflows and troubleshoot issues, ensuring high reliability and data integrity.
- Perform root cause analysis for data quality issues and implement corrective measures.

Collaboration and Cross-Functional Engagement:
- Work with cross-functional teams to understand business needs, translate requirements into technical solutions, and ensure smooth data pipeline integration.
- Provide guidance on data governance, security, and privacy standards.

Required Skills:
- Expertise in GCP services, including BigQuery, Cloud Dataflow, Cloud Composer, Pub/Sub, and Cloud Storage.
- Strong knowledge of ETL processes, data modeling, and schema design, specifically for cloud-based data warehouses.
- SQL and BigQuery optimization: advanced SQL skills with experience optimizing BigQuery for performance and cost efficiency.
- Proficiency in Python or Java for data processing, with the ability to write and manage production-level code.
- Ability to analyze, troubleshoot, and solve complex data problems.
- Strong communication skills, with experience working in cross-functional teams and engaging stakeholders.
(ref:hirist.tech)
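To illustrate the data-integrity duties described above (validating incoming records and routing bad rows aside before they reach the warehouse), here is a minimal sketch in plain Python. The schema fields (`event_id`, `user_id`, `event_ts`) and function names are hypothetical, invented for this example; a real pipeline would run such checks inside Dataflow or a Composer task before loading to BigQuery.

```python
from datetime import datetime

# Hypothetical required schema for incoming events (not from the posting).
REQUIRED_FIELDS = {"event_id", "user_id", "event_ts"}

def validate_record(record: dict) -> tuple[bool, str]:
    """Return (is_valid, reason). Reject records missing required fields
    or carrying an unparseable timestamp, so bad rows can be routed to a
    dead-letter destination instead of corrupting the warehouse."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    try:
        datetime.fromisoformat(record["event_ts"])
    except (TypeError, ValueError):
        return False, "unparseable timestamp"
    return True, "ok"

def partition_records(records: list[dict]) -> tuple[list, list]:
    """Split a batch into loadable rows and (row, reason) dead-letter pairs."""
    good, bad = [], []
    for r in records:
        ok, reason = validate_record(r)
        if ok:
            good.append(r)
        else:
            bad.append((r, reason))
    return good, bad
```

Keeping the rejected rows alongside a reason string supports the root-cause-analysis work the role calls for: the dead-letter output can be inspected or loaded into a side table for later investigation.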
Location: Bengaluru, IN
Posted Date: 11/10/2024
Contact Information
Contact: Human Resources, Ascendion