GCP Data Architect

Required Skills

Python
GCP
BigQuery
ETL
Spark
About micro1
micro1 connects domain experts to the development of frontier AI models. Real-world expertise is turned into training data, evaluations, and feedback loops that improve how models perform. AI labs and enterprises use micro1 to train models and build reliable AI agents through advanced evaluations and reinforcement learning environments. Experts contribute directly to how AI systems learn, reason, and perform across domains like finance, healthcare, engineering, and more. Our platform identifies and vets top talent through an AI recruiter, enabling high-quality contributions at scale.
Our goal is to enable 1 billion people to do meaningful work by applying their expertise to AI. We’ve raised $40M+ in funding, and our AI recruiter has powered over 1 million AI-led interviews as our global network of experts grows into the human intelligence layer for AI.

Job Description

Job Title: GCP Data Architect

Location: Riyadh, Saudi Arabia

Contract Duration: 12 Months (Extendable)

Salary: 22,000 – 25,000 SAR/month

Experience Required: 10+ Years

Notice Period: Immediate to 30 Days

Nationality: Open

Mandatory: GCC Experience


Job Summary

We are seeking an experienced GCP Data Architect with a strong background in designing and implementing scalable data solutions on Google Cloud Platform (GCP). The ideal candidate should have hands-on experience in GCP solution architecture, data engineering, and cloud-based data platforms within GCC environments.


Key Responsibilities

  1. Design, develop, and implement end-to-end data architecture solutions on GCP.
  2. Lead GCP solutioning, including architecture design, data modeling, and system integration.
  3. Build scalable and secure data pipelines using GCP services.
  4. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
  5. Ensure data governance, security, and compliance standards are met.
  6. Optimize performance, scalability, and cost-efficiency of GCP data solutions.
  7. Provide technical leadership and guidance to data engineering teams.
  8. Work closely with DevOps teams for deployment and CI/CD processes.
  9. Troubleshoot and resolve data-related issues in cloud environments.


Required Skills & Expertise

  1. Strong experience in Google Cloud Platform (GCP) services such as:
     - BigQuery
     - Cloud Dataflow
     - Cloud Composer
     - Pub/Sub
     - Cloud Storage
  2. Expertise in data architecture, data modeling, and ETL/ELT processes.
  3. Hands-on experience with Python, SQL, and Spark.
  4. Experience in real-time and batch data processing.
  5. Strong understanding of data warehousing concepts.
  6. Knowledge of cloud security, IAM, and governance frameworks.
  7. Experience with API integrations and microservices architecture.


Preferred Qualifications

  1. GCP certifications (e.g., Professional Data Engineer / Cloud Architect).
  2. Experience working on large-scale enterprise data platforms.
  3. Exposure to multi-cloud or hybrid environments is a plus.
  4. Strong communication and stakeholder management skills.


Additional Requirements

  1. Must have prior GCC project experience.
  2. Ability to join immediately or within 30 days.


Apply now
