Smartwork IT Services
GCP Teradata Engineer - Data Visualisation Tools
Job Location
India
Job Description
Job Title : GCP Teradata Engineer
Experience : 5 - 10 years
Location : Chennai, Hyderabad
Skills : GCP, BigQuery, Cloud Storage, Dataflow, Python, Cloud Functions, Pub/Sub, Teradata
Notice Period : Immediate

Responsibilities :

- Design, develop, and implement data migration strategies and solutions for moving data and workloads from Teradata to GCP.
- Build and maintain data pipelines using Google Cloud Dataflow, Python, and other relevant GCP services.
- Utilize Google BigQuery as the primary data warehouse on GCP, including data modeling, schema design, and query optimization.
- Leverage Google Cloud Storage for storing and managing large datasets in a scalable and cost-effective manner.
- Develop and deploy serverless data processing solutions using Google Cloud Functions.
- Implement message queuing and event-driven architectures using Google Cloud Pub/Sub.
- Write and optimize complex SQL queries in both Teradata and Google BigQuery.
- Develop and maintain Python scripts for data manipulation, automation, and integration with GCP services.
- Ensure data quality, integrity, and consistency throughout the data migration and integration processes.
- Troubleshoot and resolve issues related to data migration, GCP services, and data pipelines.
- Collaborate effectively with data architects, data scientists, business analysts, and other stakeholders to understand data requirements and deliver appropriate solutions.
- Implement best practices for data management, data security, and cloud infrastructure on GCP.
- Monitor and optimize the performance and cost of GCP data services and Teradata environments.
- Create and maintain comprehensive technical documentation for data pipelines, migration processes, and GCP infrastructure.
- Participate in the planning and execution of data migration and modernization projects.

Required Skills :

Extensive hands-on experience (3+ years) with core GCP services, including :

- BigQuery : Deep understanding of BigQuery architecture, data loading, query performance tuning, partitioning, clustering, and cost management.
- Cloud Storage : Experience with creating and managing buckets, data lifecycle policies, and access controls.
- Dataflow : Proven experience in designing, developing, and deploying data pipelines using Apache Beam and Google Cloud Dataflow (Python SDK preferred); a minimal pipeline sketch follows this list.
- Cloud Functions : Experience in developing and deploying serverless functions using Python.
- Pub/Sub : Understanding of message queuing concepts and experience with Google Cloud Pub/Sub for building asynchronous and scalable systems.
- Teradata : Solid understanding of Teradata architecture, data warehousing concepts, and SQL, including experience with :
  - Writing and optimizing complex Teradata SQL queries.
  - Data extraction and transformation from Teradata using various tools and techniques.
  - Teradata utilities (BTEQ, FastLoad, MultiLoad).
- Python : Strong proficiency in Python programming, including experience with data manipulation libraries (Pandas, NumPy) and building data pipelines.
- SQL : Excellent SQL skills with the ability to write complex queries and optimize performance in both Teradata and BigQuery.
- Data Warehousing Concepts : Deep understanding of data warehousing principles, ETL/ELT processes, dimensional modeling, and data lake architectures.
- Problem-Solving : Strong analytical and problem-solving skills with the ability to troubleshoot and resolve complex technical issues.
- Communication : Good verbal and written communication skills to collaborate effectively with team members and stakeholders.
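To make the Dataflow expectation concrete, below is a minimal sketch of a streaming Apache Beam pipeline (Python SDK) that reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, and table names are placeholders invented for illustration, not details from this posting, and runner/project settings would be supplied via pipeline flags in a real deployment.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # streaming=True because a Pub/Sub subscription is an unbounded source;
        # runner, project, and region flags would be added here in practice.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Placeholder subscription path, for illustration only.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/orders-sub")
                # Pub/Sub delivers raw bytes; decode and parse each message as JSON.
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                # Append rows to an existing BigQuery table (placeholder name).
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:my_dataset.orders",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
            )


    if __name__ == "__main__":
        run()
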
Good to Have Skills :

- Experience with other GCP services such as Composer (Airflow on GCP), Cloud Build, Cloud IAM, and Data Catalog.
- Knowledge of data governance and data quality frameworks on GCP.
- Experience with infrastructure-as-code tools like Terraform or Cloud Deployment Manager for automating infrastructure provisioning on GCP.
- Familiarity with agile development methodologies and tools (Jira, Confluence).
- Experience with data visualization tools (Looker, Tableau, Power BI).
- GCP certifications (Google Cloud Certified - Professional Data Engineer).
- Experience with NoSQL databases.

Qualifications :

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5 - 10 years of professional experience as a Data Engineer with a significant focus on Teradata and Google Cloud Platform.
- Proven experience in designing, developing, and implementing data migration and integration solutions involving Teradata and GCP; a minimal migration-step sketch follows below.
- Strong understanding of data warehousing principles and cloud-based data architectures.
- Ability to work independently and as part of a collaborative team.

(ref:hirist.tech)
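As a concrete illustration of one Teradata-to-GCP migration step, the sketch below extracts a table from Teradata using the teradatasql driver and Pandas, then loads it into a BigQuery staging table with the google-cloud-bigquery client. The host, credentials, column names, and table IDs are hypothetical placeholders, not details from this posting.

    import pandas as pd
    import teradatasql
    from google.cloud import bigquery

    # Hypothetical Teradata connection details; substitute real host/credentials.
    with teradatasql.connect(host="td.example.com", user="etl_user",
                             password="change-me") as conn:
        # Hypothetical source table and columns, for illustration only.
        df = pd.read_sql(
            "SELECT order_id, order_ts, amount FROM sales_db.orders", conn)

    # Load the extracted frame into a BigQuery staging table (placeholder ID).
    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, "my-project.staging.orders")
    job.result()  # block until the load job completes
    print(f"Loaded {job.output_rows} rows into staging.orders")
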
Location: India
Posted Date: 4/14/2025
Contact Information
Contact : Human Resources, Smartwork IT Services