Link Group
Data Engineer - Financial Markets Fixed Income
Job Location
Kraków, Poland
Job Description
We are looking for a Data Engineer with a strong background in ETL pipelines, database management, and data integrations. The ideal candidate will have hands-on experience with Python and SQL, developing and maintaining data processing workflows that ensure data integrity, performance, and security. This role involves building integrations with external APIs and internal systems, managing data storage solutions such as data warehouses and relational databases, and conducting data quality checks. The candidate will collaborate closely with data scientists and IT teams to ensure seamless and secure integration of data infrastructure within the organization.
Project Description
The project focuses on developing and optimizing data pipelines to support business intelligence, analytics, and machine learning initiatives. The Data Engineer will be responsible for integrating data from various sources, ensuring data consistency, and maintaining high-performance storage solutions. The role requires collaboration with data scientists and IT teams to align data architecture with business objectives while maintaining security and compliance standards. Additionally, the engineer will monitor system performance, troubleshoot issues, and contribute to the automation and scalability of data workflows. This is an excellent opportunity to work with cutting-edge data technologies in a dynamic environment, contributing to data-driven decision-making and business transformation.
Key Responsibilities
- Design, develop, and maintain ETL data pipelines using Python and SQL.
- Implement and manage data storage solutions, including relational databases and data warehouses.
- Build integrations with external APIs and internal systems to enable efficient data exchange.
- Perform data analysis and quality checks, ensuring data accuracy and reliability.
- Monitor data infrastructure to optimize performance, security, and scalability.
- Work closely with data scientists and IT teams to align data workflows with business needs.
Skills & Qualifications
- 3 years of experience developing ETL pipelines using Python and SQL.
- Strong data modeling and database management skills, particularly with relational databases (preferably PostgreSQL).
- Proficiency in Linux environments (especially Red Hat distributions) and version control systems (Git).
- Familiarity with DevOps pipelines, particularly Azure DevOps Services.
- Understanding of containerization technologies such as Docker or Podman.
- Strong problem-solving skills and willingness to learn new technologies.
Nice to Have
- Experience with Apache Airflow or other data orchestration tools.
- Familiarity with Databricks, Dataiku, MLflow, or similar machine learning platforms.
- Knowledge of kdb+/q and distributed processing frameworks such as Hadoop and Spark/PySpark.
- Understanding of financial markets, especially fixed income instruments.
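To illustrate the kind of work described above, here is a minimal sketch of an extract-transform-load step with a basic data quality check, in Python and SQL. All record fields, table names, and sample values are illustrative assumptions, and SQLite stands in for a production relational database such as PostgreSQL.

```python
import sqlite3

# Hypothetical sample: raw fixed-income trade records as they might
# arrive from an external API (field names are assumptions).
raw_trades = [
    {"isin": "XS1234567890", "price": "101.25", "qty": "1000000"},
    {"isin": "XS0987654321", "price": "99.80", "qty": "500000"},
    {"isin": "", "price": "100.00", "qty": "250000"},  # missing identifier
]

def transform(records):
    """Cast string fields to numeric types; drop rows failing quality checks."""
    clean = []
    for r in records:
        if not r["isin"]:  # quality check: instrument identifier must be present
            continue
        clean.append((r["isin"], float(r["price"]), int(r["qty"])))
    return clean

def load(rows, conn):
    """Load cleaned rows into a relational table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades (isin TEXT, price REAL, qty INTEGER)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_trades), conn)
count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
print(count)  # 2 rows survive the quality check
```

In a production pipeline the same shape would typically be wrapped in an orchestrator task (e.g. an Airflow operator) with the load step targeting PostgreSQL.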
Location: Kraków, PL
Posted Date: 4/8/2025
Contact Information
Contact: Human Resources, Link Group