Data Architect (Remote)
Job type
Full time
Experience
Lead
Work mode
Remote
Job description
We are seeking a highly skilled and experienced Data Engineering Lead/Architect to join our dynamic team. The ideal candidate will have a proven track record of designing, building, and maintaining scalable data pipelines, with strong expertise in Python programming, cloud technologies, and large-scale data systems. If you have a passion for working with data and enabling AI/ML capabilities in products, we want to hear from you.
Key Responsibilities
- Design, develop, and maintain robust, scalable ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) data pipelines to support analytics and machine learning applications.
- Collaborate with cross-functional teams, including data scientists and software engineers, to implement data-driven solutions.
- Optimize and manage data storage systems and ensure high availability, reliability, and performance.
- Ensure data pipelines are optimized for efficiency, reliability, and scalability, handling both structured and unstructured data.
- Handle large-scale datasets, ensuring data integrity and consistency across platforms.
- Provide technical expertise and mentorship to junior engineers and stakeholders.
- Implement best practices in data engineering, including version control, testing, and deployment.
- Stay updated with emerging technologies and tools in data engineering, AI/ML, and cloud ecosystems.
Requirements
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in data engineering or related roles.
- Proficiency in Python programming and its data-processing libraries (e.g., Pandas, PySpark).
- Proven expertise in handling large-scale data systems such as distributed databases, data warehouses, and data lakes.
- Strong experience with cloud platforms (AWS, Azure, or GCP) and associated tools for data storage, processing, and orchestration.
- Practical knowledge of data pipeline frameworks like Apache Airflow, Kafka, or Spark.
- Hands-on technical expertise in designing and implementing end-to-end data solutions.
- Familiarity with Generative AI (GenAI) and AI/ML technologies.
What We Offer
- Enjoy the flexibility to work from the comfort of your home, with no commute hassles.
- Work directly with the CXO team, gaining valuable insights and contributing to strategic decisions.
- Take the opportunity to initiate, own, and drive impactful data engineering projects across the organization.
- Become a key member of the engineering leadership team, driving innovation and excellence within the data domain.
- Work with state-of-the-art technologies in AI, ML, and data engineering.
- Competitive compensation and ample opportunities for career growth.
Required skills
Python, AWS, GCP, Azure, Pandas, Spark, Kafka, Airflow, ETL, Leadership