Digihelic Solutions Private Limited

DigiHelic Solutions - Data Analyst

Job Location

Noida, India

Job Description

Role: Data Analytics/Data Engineer
Location: Noida, India
Experience: 5 years
Work Mode: Hybrid

About the Role

We are seeking a seasoned Data Analytics/Data Engineer to join our team in Noida. This role requires a strong blend of data engineering and analytics skills, with a focus on building and maintaining robust data pipelines and delivering actionable insights. The ideal candidate will have extensive experience with cloud platforms (AWS/Azure), ETL processes, SQL, Databricks, Python/Scala, PySpark/Apache Spark, and either NiFi or Hive. You will be instrumental in transforming raw data into valuable business intelligence.

Responsibilities

Data Pipeline Development & Management:
- Design, develop, and maintain scalable and efficient ETL/ELT pipelines using tools like Databricks, PySpark/Apache Spark, and either NiFi or Hive.
- Ingest, transform, and load data from diverse sources into cloud-based data warehouses and data lakes.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.

Cloud Platform Expertise (AWS/Azure):
- Design and implement data solutions on either AWS or Azure cloud platforms.
- Utilize cloud-native services for data storage, processing, and analytics.
- Manage and optimize cloud resources for data engineering workloads.

Data Warehousing & Data Lake Management:
- Design and implement data warehousing and data lake solutions for analytical purposes.
- Develop and maintain data models and schemas.
- Ensure data quality, consistency, and integrity.

Programming & Scripting:
- Develop and maintain data processing scripts using Python and/or Scala.
- Utilize PySpark/Apache Spark for large-scale data processing and analysis.
- Implement data transformations and business logic using SQL.

Data Analysis & Reporting:
- Analyze data to identify trends, patterns, and insights.
- Develop and maintain data visualizations and dashboards.
- Generate reports and presentations for stakeholders.

Tool Proficiency (NiFi or Hive):
- Utilize either Apache NiFi for data flow automation and management, or Apache Hive for data warehousing and SQL querying.
- Design and implement data flows or Hive queries for data processing and analysis.

Collaboration & Communication:
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Communicate effectively with technical and non-technical audiences.
- Participate in agile development methodologies.

Troubleshooting & Performance Optimization:
- Identify and resolve data pipeline and system issues.
- Optimize data processing and query performance.
- Implement monitoring and alerting systems.

Required Skills

Experience:
- 5 years of experience in data engineering and/or data analytics.

Cloud Platforms:
- Experience with either AWS or Azure cloud platforms.
- Knowledge of cloud data services (e.g., S3, Azure Data Lake Storage, EC2, Azure VMs).

ETL/ELT:
- Strong understanding of ETL/ELT concepts and best practices.
- Experience with data integration and data warehousing.

SQL:
- Advanced SQL skills for data querying and manipulation.
- Experience with database design and optimization.

Databricks & Spark:
- Proficiency in Azure Databricks or similar Databricks environments.
- Strong experience with PySpark/Apache Spark for distributed data processing.

Programming:
- Proficiency in Python and/or Scala for data engineering tasks.

Tool Proficiency:
- Experience with either Apache NiFi or Apache Hive.

Data Modeling:
- Knowledge of data modeling and schema design.

Data Analysis:
- Ability to analyze data and derive meaningful insights.

Problem-Solving:
- Strong analytical and problem-solving skills.

Communication:
- Excellent verbal and written communication skills.

Preferred Skills

- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of machine learning concepts and techniques.
- Experience with CI/CD pipelines and DevOps practices.
- Experience with data governance and data quality management.
- Experience with real-time data streaming and processing.

Personal Attributes

- Highly motivated and self-directed.
- Strong team player with the ability to collaborate effectively.
- Detail-oriented and organized.
- Ability to work under pressure and meet deadlines.
- Commitment to continuous learning and professional development.
- Client-focused and results-oriented.

(ref: hirist.tech)


Contact Information

Contact Human Resources
Digihelic Solutions Private Limited

Posted

March 26, 2025
UID: 5105610611
