Data Engineer

$30 - $100/hour

Required Skills

MySQL
Python
ETL
Data Warehousing
About micro1

micro1 connects domain experts to the development of frontier AI models. Real-world expertise is turned into training data, evaluations, and feedback loops that improve how models perform. AI labs and enterprises use micro1 to train models and build reliable AI agents through advanced evaluations and reinforcement learning environments. Experts contribute directly to how AI systems learn, reason, and perform across domains like finance, healthcare, engineering, and more. Our platform identifies and vets top talent through an AI recruiter, enabling high-quality contributions at scale.
Our goal is to enable 1 billion people to do meaningful work by applying their expertise to AI. We’ve raised $40M+ in funding, and our AI recruiter has powered over 1 million AI-led interviews as our global network of experts grows into the human intelligence layer for AI.

Job Description

Job Title: Data Engineer


Job Type: Contractor


Location: Remote


Job Summary:

Join our customer's team as a Data Engineer and play a pivotal role in designing, developing, and optimizing robust data pipelines and warehousing solutions. Utilize your expertise in MySQL, Python, ETL, and Data Warehousing to drive data-driven decision-making and enable impactful business insights. We value professionals with a passion for clear, effective communication and a commitment to excellence.


Key Responsibilities:

  1. Design, build, and maintain scalable ETL pipelines to support data ingestion, transformation, and integration from multiple sources.
  2. Develop and optimize logical and physical data models within modern data warehousing environments.
  3. Collaborate closely with cross-functional teams to understand data requirements and deliver reliable data solutions.
  4. Monitor, troubleshoot, and enhance data infrastructure for optimal performance and scalability.
  5. Ensure data accuracy, consistency, and security across all platforms and solutions.
  6. Document technical processes, system designs, and data flows with clarity and precision.
  7. Leverage advanced SQL and Python scripting to automate data workflows and processes.


Required Skills and Qualifications:

  1. Expert-level proficiency in MySQL for complex querying, schema design, and performance tuning.
  2. Strong programming skills in Python for data manipulation and automation.
  3. Hands-on experience building and maintaining ETL pipelines in production environments.
  4. Demonstrated expertise with data warehousing concepts and best practices.
  5. Excellent written and verbal communication skills, with a focus on clear documentation and collaborative problem-solving.
  6. Proven ability to manage and prioritize tasks in a dynamic, remote work setting.
  7. Detail-oriented mindset with a commitment to delivering high-quality data solutions.


Preferred Qualifications:

  1. Experience with cloud-based data warehouse platforms (e.g., AWS Redshift, Snowflake, or Google BigQuery).
  2. Background in data modeling or data architecture.
  3. Familiarity with agile development methodologies.

Apply now

Please note that after completing the interview process, you’ll be added to our talent pool and considered for this and other roles that match your skills.

