Fork Technologies

Tableau Developer - Apache Airflow

Job Location

Bangalore, India

Job Description

Job Summary

We are seeking a highly motivated and experienced Tableau Developer with a strong background in data visualization and workflow automation. The ideal candidate will be responsible for designing, developing, and maintaining interactive Tableau dashboards and reports that provide actionable insights to our business stakeholders. This role also requires proficiency in using Apache Airflow to build, schedule, and monitor the data pipelines that feed our Tableau visualizations. You will collaborate closely with data engineers, analysts, and business users to understand data requirements, ensure data accuracy, and optimize data delivery processes.

Key Responsibilities

Tableau Development & Visualization:
- Design and develop interactive, visually appealing Tableau dashboards and reports to address specific business needs and KPIs.
- Translate business requirements into effective and efficient Tableau visualizations.
- Optimize Tableau workbooks for performance and scalability, ensuring fast load times and efficient data retrieval.
- Implement best practices for data visualization, user experience, and dashboard design.
- Maintain and update existing Tableau dashboards and reports based on evolving business requirements.
- Troubleshoot and resolve issues related to Tableau dashboards and data discrepancies.
- Stay up to date with the latest Tableau features and functionality.

Data Integration & Transformation:
- Connect Tableau to various data sources, including databases (SQL, NoSQL), cloud platforms, and flat files.
- Perform data extraction, cleaning, transformation, and loading (ETL) tasks within Tableau Prep Builder or through SQL queries.
- Collaborate with data engineers to understand data models and ensure data integrity and accuracy.

Airflow Workflow Management:
- Design, develop, and maintain data pipelines using Apache Airflow for scheduling, orchestration, and monitoring of data workflows.
- Write and debug Python code for Airflow DAGs (Directed Acyclic Graphs).
- Implement error handling, logging, and alerting mechanisms within Airflow pipelines.
- Optimize Airflow DAGs for performance and efficiency.
- Monitor and troubleshoot Airflow pipeline failures and ensure timely data delivery.
- Integrate Airflow with various data sources and target systems.

Collaboration & Communication:
- Collaborate effectively with business analysts, data engineers, and other stakeholders to understand reporting requirements and data needs.
- Communicate technical concepts and findings clearly and concisely to both technical and non-technical audiences.
- Participate in requirement-gathering sessions and provide technical input.
- Document development processes, dashboard specifications, and Airflow pipeline configurations.

Performance Optimization & Best Practices:
- Identify and implement strategies to improve the performance of Tableau dashboards and Airflow pipelines.
- Adhere to data governance policies and security standards.
- Promote and enforce best practices for Tableau development and Airflow usage within the team.

Requirements

- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum of 3 years of hands-on experience in Tableau development, including creating complex dashboards, visualizations, and calculated fields.
- Proven proficiency in Apache Airflow for task orchestration, workflow automation, and building data pipelines.
- Strong understanding of data warehousing concepts, data modeling, and ETL processes.
- Excellent SQL skills for querying and manipulating data from various databases.
- Experience working with different data sources (e.g., relational databases such as MySQL and PostgreSQL; cloud platforms such as AWS, Azure, and GCP).
- Solid understanding of Python programming, particularly for writing Airflow DAGs.
- Familiarity with version control systems such as Git.
- Excellent analytical and problem-solving skills with strong attention to detail.
- Strong communication and interpersonal skills, with the ability to collaborate effectively in a team environment.
- Ability to work independently and manage multiple tasks effectively.

Preferred Skills (Nice to Have)

- Experience with Tableau Server administration and user management.
- Familiarity with other data visualization tools such as Power BI or Looker.
- Knowledge of cloud-based data warehouses such as Snowflake, Redshift, or BigQuery.
- Experience with containerization technologies such as Docker and orchestration tools such as Kubernetes.
- Understanding of data governance and data quality principles.
- Experience working in an Agile development environment.

Posted Date: 3/28/2025

Contact Information

Contact Human Resources
Fork Technologies

UID: 5112867180
