
Databricks Architect
Required Skills
Databricks
Apache Spark Structured Streaming
Delta Live Tables
Kafka
Python
Scala
SQL
AWS
Azure
GCP
Big data architecture
Data pipeline design
ETL/ELT processes
Real-time data analytics
Cloud migration
Data modeling
Data lakes
Lakehouse architecture
Technical consulting
Architecture review
Stakeholder management
Mentoring
Communication
Project experience in the GCC region
About micro1
micro1 connects domain experts to the development of frontier AI models. Real-world expertise is turned into training data, evaluations, and feedback loops that improve how models perform. AI labs and enterprises use micro1 to train models and build reliable AI agents through advanced evaluations and reinforcement learning environments. Experts contribute directly to how AI systems learn, reason, and perform across domains like finance, healthcare, engineering, and more. Our platform identifies and vets top talent through an AI recruiter, enabling high-quality contributions at scale.
Our goal is to enable 1 billion people to do meaningful work by applying their expertise to AI. We’ve raised $40M+ in funding, and our AI recruiter has powered over 1 million AI-led interviews as our global network of experts grows into the human intelligence layer for AI.
Job Description
Job Title: Databricks Architect
Job Type: Contractor
Location: Remote
Job Summary:
We are searching for a seasoned Databricks Architect to join our team and spearhead transformative data initiatives. As an expert in big data architecture and data streaming, you will play a pivotal role in designing, implementing, and optimizing scalable data solutions for real-time processing on cloud platforms.
Key Responsibilities:
- Architect, design, and implement scalable end-to-end data solutions using the Databricks platform.
- Lead the development of real-time data streaming pipelines utilizing Apache Spark Structured Streaming and Delta Live Tables.
- Collaborate with business and technical stakeholders to gather requirements and translate them into robust technical solutions.
- Provide expert consulting and advisory support on data architecture, cloud migrations, and modern platform integrations.
- Integrate Databricks solutions with major cloud platforms (AWS, Azure, GCP) according to project needs.
- Optimize big data workflows for enhanced performance, cost efficiency, and scalability.
- Mentor and guide development teams in Databricks, real-time frameworks, and best engineering practices.
Required Skills and Qualifications:
- 10+ years of experience in Data Engineering, Big Data, or Data Architecture roles.
- Demonstrated hands-on expertise with Databricks, including advanced platform capabilities.
- Strong proficiency in streaming technologies such as Apache Spark Streaming, Structured Streaming, and Kafka integration.
- Proven track record in designing data pipelines, ETL/ELT processes, and real-time analytics solutions.
- Advanced programming skills in Python, Scala, and SQL.
- Experience delivering data solutions within cloud environments (AWS, Azure, GCP).
- Exceptional written and verbal communication abilities for effective stakeholder management.
- Mandatory experience working on projects in the GCC region.
Preferred Qualifications:
- Previous experience in consulting or client-facing architectural roles.
- In-depth understanding of data modeling, data lakes, and lakehouse architectures.
- Exposure to technical governance and architecture review processes.