Contract
Remote
Posted 2 days ago

Seeking a highly skilled Data Engineer with expertise in the AWS stack, Python, and Apache Airflow to join our Enterprise Data Office team. As a Data Engineer, you will play a crucial role in designing, implementing, and maintaining robust data pipelines to support our growing Cloud Data Platform. If you are passionate about data engineering, enjoy solving complex problems, and have a deep understanding of AWS services, Python programming, and Airflow orchestration, we encourage you to apply.

Responsibilities:

  • Architect and Develop Data Pipelines:
    • Design, implement, and maintain scalable and efficient data pipelines on the AWS cloud platform.
    • Utilize best practices for data engineering to ensure reliability, performance, and maintainability of the pipelines.
  • AWS Expertise:
    • Leverage your expertise in AWS services, including but not limited to S3, Glue, EMR, Redshift, Lambda, and others, to build end-to-end data solutions.
    • Optimize and fine-tune AWS resources for cost-effectiveness and performance.
  • Python Programming:
    • Develop and maintain data engineering solutions using Python programming language.
    • Collaborate with cross-functional teams to implement data processing, transformation, and enrichment tasks.
  • Airflow Orchestration:
    • Design, implement, and manage complex workflows using Apache Airflow.
    • Create and maintain Airflow DAGs (Directed Acyclic Graphs) for orchestrating ETL processes and other data workflows.
  • Data Modeling and Schema Design:
    • Work closely with data stakeholders to understand data requirements and implement effective data models.
    • Design and optimize database schemas for performance and scalability.
  • Monitoring and Optimization:
    • Implement robust monitoring and alerting mechanisms for data pipelines.
    • Continuously optimize and improve existing data processes for enhanced efficiency.
  • Collaboration and Communication:
    • Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver high-quality solutions.
    • Clearly communicate technical concepts and solutions to non-technical stakeholders.
    • Maintain comprehensive documentation for data pipelines, workflows, and data models.
    • Document best practices and guidelines for data engineering within the organization.

Skills and Attributes:

  • Proven experience as a Data Engineer, with a focus on AWS, Python, and Airflow.
  • Strong experience with AWS services, especially S3, Glue, Athena, EMR, and Redshift.
  • Expertise in Python programming (including PySpark) for data processing and transformation.
  • In-depth knowledge of Apache Airflow for workflow orchestration.
  • Experience with data modeling, schema design, and database optimization.
  • Strong problem-solving skills and the ability to work in a dynamic and collaborative team environment.
  • Excellent communication skills with the ability to convey complex technical concepts to a non-technical audience.

Job Features

Experience: Open to fresh graduates (conditions apply)
Education: Bachelor’s Degree in Computer Science/Information Technology
Rates: RM 450 per day max, no benefits

Apply Online
