Data Engineer

Data and Business Analytics

MA

Permanent Placement

Remote

$175 – $195 per Year

Job Overview

We are seeking a Data Engineer to support the development and optimization of data pipelines that power critical systems and analytics. This role requires a strong foundation in modern data engineering tools and cloud technologies, with an emphasis on building reliable, scalable data workflows. Candidates who demonstrate hands-on experience with pipeline orchestration, data transformation, and cross-functional collaboration will be prioritized for interviews.

Must Haves

  • Proven experience building and maintaining data pipelines in a production environment
  • Strong proficiency in Python for data processing and workflow automation
  • Hands-on experience with workflow orchestration tools such as Apache Airflow or similar platforms
  • Solid experience working with cloud-based data services, preferably within AWS environments
  • Advanced SQL skills with the ability to write complex queries and optimize performance
  • Experience with data warehousing solutions, including Snowflake or comparable technologies
  • Understanding of data modeling concepts and best practices for structuring large datasets
  • Ability to implement data validation, monitoring, and quality assurance processes
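To give candidates a sense of the orchestration skills listed above, the core idea behind platforms like Apache Airflow, running tasks in dependency order and passing results downstream, can be sketched in plain Python. This is an illustrative toy only; the task names and DAG structure are hypothetical examples, not part of the role or of Airflow's API:

```python
# Toy dependency-ordered task runner illustrating the orchestration concept
# that tools like Apache Airflow implement at production scale.
# Task names (extract, transform, load) are hypothetical examples.
from graphlib import TopologicalSorter

def extract():
    return [1, 2, 3]                    # stand-in for reading a source system

def transform(rows):
    return [r * 10 for r in rows]       # stand-in for a transformation step

def load(rows):
    return rows                         # stand-in for writing to a warehouse

# DAG definition: load depends on transform, which depends on extract
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

def run_pipeline():
    results = {}
    # static_order() yields tasks so every dependency runs before its dependents
    for name in TopologicalSorter(dag).static_order():
        upstream = dag.get(name, set())
        args = [results[u] for u in sorted(upstream)]
        results[name] = tasks[name](*args)
    return results

results = run_pipeline()
```

In a real orchestrator, each task would also get scheduling, retries, and monitoring; the sketch only shows the dependency-ordering idea.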

What the Client Needs You to Do

In this role, you will play a key part in ensuring data is accessible, reliable, and optimized for downstream use cases. You will collaborate closely with technical and non-technical stakeholders to translate business needs into scalable data solutions. Success in this position requires a balance of technical execution, problem-solving, and a commitment to building high-quality data systems that support organizational goals.

  • Translate data requirements into efficient, scalable pipeline designs
  • Partner with cross-functional teams to deliver high-quality data solutions
  • Ensure data integrity, availability, and performance across systems
  • Contribute to continuous improvement of data engineering standards and practices

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines to process large volumes of structured and unstructured data
  • Build and optimize ETL and ELT workflows to support analytics and application use cases
  • Integrate data from multiple sources, ensuring consistency and reliability across systems
  • Manage and transform data within cloud-based data warehouse environments
  • Implement automated data quality checks, monitoring systems, and alerting mechanisms
  • Collaborate with engineers, analysts, and stakeholders to define and deliver data solutions
  • Conduct code reviews and contribute to best practices for testing, documentation, and maintainability
  • Troubleshoot and resolve data pipeline issues, ensuring minimal disruption to downstream processes
  • Continuously evaluate and improve pipeline performance, scalability, and cost efficiency
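The automated data quality checks mentioned in the responsibilities above can be as simple as validating a batch before it is published. The sketch below is illustrative only; the column names, checks, and rules are hypothetical examples, not a description of the client's systems:

```python
# Illustrative sketch of simple automated data quality checks a pipeline
# might run before publishing a batch. Field names and rules are hypothetical.
def check_quality(rows, required_fields=("id", "amount")):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    issues = []
    if not rows:
        issues.append("batch is empty")
        return issues
    # Completeness check: every row must populate the required fields
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                issues.append(f"row {i}: missing {field}")
    # Uniqueness check: ids must not repeat within the batch
    ids = [row.get("id") for row in rows]
    if len(ids) != len(set(ids)):
        issues.append("duplicate ids detected")
    return issues

good = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": 3.0}]
bad = [{"id": 1, "amount": None}, {"id": 1, "amount": 2.0}]
good_issues = check_quality(good)
bad_issues = check_quality(bad)
```

In practice, checks like these would feed a monitoring and alerting system rather than return a list, but the pass/fail structure is the same.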

W2 employees of Overture Partners who work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), 401(k) starting on day one, a variety of voluntary benefits including life and disability insurance, and sick time if required by law in the worked-in state/locality.

#25327


Apply now
