In2 IT Technologies

In2IT Technologies - Data Engineer - Predictive Modeling

Job Location

Noida, India

Job Description

We are seeking a highly skilled Data Engineer with 6 years of experience in data engineering best practices and a proven track record of building, or significantly contributing to, data platforms and AIOps platforms from scratch. The ideal candidate must be proficient in developing end-to-end data engineering solutions and building core platform features, including data preparation, schema-on-read, schema-on-write, data lakes, data warehouses, ETL pipelines, headless architecture, microservices, and AIOps capabilities.

The candidate must possess strong expertise in:

- Data Retention Management: Designing and managing data retention policies within data lakes and data warehouses.
- Pipeline Mechanisms: Creating consumable mechanisms for end-users to build custom data pipelines.
- Data Ingestion: Facilitating seamless ingestion of raw data into data lakes and of processed data into data warehouses.
- Data Mart Development: Establishing procedures and mechanisms that empower end-users to build data marts on top of the data warehouse, serving as a foundation for Data-as-a-Service (DaaS) and enabling AIOps functionality.

Key Responsibilities:

- Platform Development: Build and contribute to scalable data and AIOps platforms supporting ingestion, preparation, transformation, observability, and automation.
- Data Lake Architecture: Design protocols to ingest batch and live data into elastic data lakes and integrate with external data lake and data warehouse providers.
- AIOps Features: Build or enhance features such as anomaly detection, predictive analytics, root cause analysis, event correlation, and intelligent alerting.
- Data Connections: Implement operators and mechanisms to enable file uploads, API connectors, message queues, cloud storage, and IoT stream ingestion.
- Data Processing: Build robust procedures for data cleaning, transformation, enrichment, validation, aggregation, classification, and anonymization.
- Data Destinations: Develop features for exporting data to warehouses, APIs, message queues, analytics tools, and visualization dashboards.
- ETL Pipelines: Create scalable ETL pipelines for seamless data integration and transformation.
- Open-Source Integration: Use open-source tools and frameworks for real-time processing, automation, and observability.
- Microservices: Ensure a modular, scalable design using headless architecture and microservice-driven approaches.
- Collaboration: Work closely with DevOps, SRE, and cross-functional teams to align data engineering with platform observability and automation.
- Governance: Implement robust data governance protocols to ensure security, quality, and compliance.

Mandatory Skills and Qualifications

Education:

- Minimum Bachelor's degree in Computer Science, Electronics and Communication Engineering (ECE), or Information Technology (IT) from a recognized institution.

Technical Skills:

1. Programming: Proficiency in Python, Java, or Scala.
2. Databases: Expertise in relational (MySQL, PostgreSQL, SQL Server) and NoSQL (MongoDB, Cassandra) databases.
3. Data Warehousing & ETL Tools: Experience with tools such as Amazon Redshift, Talend, Informatica, or Apache Airflow.
4. Data Lake Management: Strong expertise in data retention policies and lifecycle management in data lakes.
5. Cloud Platforms: Hands-on experience with AWS, Azure, and GCP.
6. Open-Source Frameworks: Proficiency with Apache Spark, Kafka, Flink, Druid, and Presto for data processing and orchestration.
7. AIOps Tooling: Familiarity with tools such as Prometheus, Grafana, Elasticsearch, and Fluentd for observability and monitoring.
8. Data-as-a-Service (DaaS): Proven experience in designing and exposing data marts as services.
9. Microservices and Architecture: Hands-on experience implementing headless architecture for scalable and extensible platforms.
10. Data Visualization: Proficiency with tools such as Tableau and Excel.
11. Machine Learning: Foundational knowledge of ML principles for integration with AIOps features.

Core Platform Features Knowledge:

- Data Connections: File upload, API connectors, message queue connectors, cloud storage, IoT stream ingestion.
- Data Processing: Real-time processing, data normalization, machine learning integration, and data classification.
- Data Destinations: Cloud storage, cold storage archiving, data warehouse writing, and dashboard building.

AIOps Features:

- Intelligent alerting mechanisms.
- Event correlation and anomaly detection.
- Predictive analytics for proactive issue resolution.
- Root cause analysis for faster troubleshooting.

Soft Skills:

- Strong critical thinking and problem-solving skills.
- Excellent communication and collaboration abilities.
- Effective time management to handle multiple priorities and deadlines.

(ref:hirist.tech)


Posted Date: 3/26/2025

Contact Information

Contact Human Resources
In2 IT Technologies

UID: 5047820946
