
Data Architect





Who We’re Looking For

The Data Architect defines and leads our Data Architecture, Data Quality, and Data Governance, ingesting, processing, and storing millions of rows of data per day. This hands-on role helps solve real big data problems. You will work with our product, business, and engineering stakeholders to understand our current ecosystems, build consensus on solution designs, write code and automation, define standards, establish best practices across the company, and build world-class data solutions and applications that power crucial business decisions throughout the organization. We are looking for an open-minded, structured thinker who is passionate about building systems at scale.

What You’ll Do
What You’ll Need
  • B.S. or M.S. in Computer Science, or equivalent degree

  • 10+ years of hands-on experience in Data Warehouse, ETL, Data Modeling & Reporting.

  • 7+ years of hands-on experience productionizing and deploying Big Data platforms and applications, including hands-on work with relational/SQL databases, distributed columnar data stores/NoSQL databases, time-series databases, Spark Streaming, Kafka, Hive, Delta, Parquet, Avro, and more

  • Extensive experience understanding a variety of complex business use cases and modeling the data in the data warehouse accordingly.

  • Highly skilled in SQL, Python, Spark, AWS S3, Hive Data Catalog, Parquet, Redshift, Airflow, and Tableau or similar tools.

  • Proven experience building a custom enterprise data warehouse or implementing tools like Data Catalogs, Spark, Tableau, Kubernetes, and Docker

  • Knowledge of infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience with Amazon Web Services (AWS)

  • Strong verbal and written communication skills are a must, with the ability to work effectively across internal and external organizations and virtual teams.

  • Demonstrated industry leadership in the fields of Data Warehousing, Data Science, and Big Data related technologies.

  • Strong understanding of distributed systems and container-based development using Docker and Kubernetes ecosystem

  • Deep knowledge of data structures and algorithms.

  • Experience in working in large teams using CI/CD and agile methodologies.

  • EST time zone preferred

What We Offer
  • Competitive salary
  • Remote work opportunity
  • Comfortable work in your local time zone
  • Flexible work schedule
  • Professional growth and development
  • Multicultural working environment