
Data Engineer
$30 - $100/hour
Required Skills
Python
ETL
About micro1
micro1 connects domain experts to the development of frontier AI models. Real-world expertise is turned into training data, evaluations, and feedback loops that improve how models perform. AI labs and enterprises use micro1 to train models and build reliable AI agents through advanced evaluations and reinforcement learning environments. Experts contribute directly to how AI systems learn, reason, and perform across domains like finance, healthcare, engineering, and more. Our platform identifies and vets top talent through an AI recruiter, enabling high-quality contributions at scale.
Our goal is to enable 1 billion people to do meaningful work by applying their expertise to AI. We’ve raised $40M+ in funding, and our AI recruiter has powered over 1 million AI-led interviews as our global network of experts grows into the human intelligence layer for AI.
Job Description
Job Title: Data Engineer
Job Type: Contractor
Location: Remote
Job Summary:
Join our customer’s team as a Data Engineer, where you will play a pivotal role in designing and optimizing robust ETL pipelines and data workflows. This expert-level, remote position is perfect for professionals who thrive in fast-paced environments and are passionate about delivering scalable data solutions using Python.
Key Responsibilities:
- Design, develop, and maintain scalable ETL processes to ensure smooth data integration and transformation.
- Collaborate closely with cross-functional teams to analyze data needs and implement tailored solutions.
- Optimize existing workflows for performance, reliability, and scalability.
- Monitor, troubleshoot, and resolve issues in production data pipelines to uphold data integrity.
- Write clean, well-documented Python code adhering to industry standards and best practices.
- Champion data quality and implement validation mechanisms throughout data processes.
- Communicate complex technical concepts to both technical and non-technical stakeholders, prioritizing clear written and verbal interactions.
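The responsibilities above center on Python ETL pipelines with built-in data validation. As a rough illustration of that pattern only (the function names, sample records, and in-memory sink below are hypothetical, not drawn from any actual customer stack):

```python
# Minimal extract-transform-load sketch with a validation gate.
# All names and data here are illustrative placeholders.

def extract():
    """Extract: return raw records (hard-coded sample rows for this sketch)."""
    return [
        {"id": "1", "amount": "10.50"},
        {"id": "2", "amount": "not-a-number"},  # fails validation below
        {"id": "3", "amount": "7.25"},
    ]

def validate_row(row):
    """Validation gate: keep only rows whose amount parses as a number."""
    try:
        float(row["amount"])
        return True
    except (KeyError, ValueError):
        return False

def transform(rows):
    """Transform: cast types, dropping rows that fail validation."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in rows
        if validate_row(r)
    ]

def load(rows, sink):
    """Load: append cleaned rows to a sink (a list standing in for a warehouse)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In practice the extract and load steps would talk to real sources and targets, but the shape — validate early, transform explicitly, load only clean records — is the core of the data-quality work this role describes.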
Required Skills and Qualifications:
- Expert-level proficiency in Python programming.
- Extensive hands-on experience building and maintaining ETL pipelines and data workflows.
- Proven ability to work independently in a fully remote environment.
- Exceptional written and verbal communication skills, with a strong focus on clarity and collaboration.
- Strong analytical and problem-solving mindset with acute attention to detail.
- Demonstrated expertise in debugging and optimizing large-scale data systems.
- Solid understanding of data modeling, data warehousing concepts, and best practices in data engineering.
Preferred Qualifications:
- Experience within global, distributed teams and working directly with customers.
- Exposure to additional programming or scripting languages and modern data stack tools.
- Background in supporting highly regulated or data-centric industries.