Data Engineering Team
Our Data Engineering team is part of the R&D organization and plays a significant role in almost every major feature we release to production. The ideal candidate is an experienced Data Engineer with a strong technical background in data infrastructure, data architecture design, and building robust data pipelines. The candidate must be able to lead the design and development of key features and interact effectively with other teams, including Product Managers, DevOps Engineers, Data Scientists, Onboarding Engineers, and Support staff.
Responsibilities:
- Play a significant role in deploying and maintaining critical data pipelines in production.
- Lead strategic technological initiatives and long-term plans, from initial exploration and POC through to going live in a fast-paced production environment.
- Design infrastructural data services and coordinate with the Architecture team, R&D teams, Data Scientists, and Product Managers to build scalable data solutions.
- Work in an Agile process with Product Managers and other team leads.
- Take end-to-end responsibility for developing the data crunching and manipulation processes within the Optimove product.
- Design and implement data pipelines and data marts.
- Create data tools for various teams (e.g. onboarding teams) that assist them in building, testing, and optimizing the delivery of the Optimove product.
- Explore and implement new data technologies to support Optimove's data infrastructure.
- Work closely with the core data science team to implement and maintain ML features and tools.
Requirements:
- B.Sc. in Computer Science or equivalent
- 3+ years of extensive experience with programming languages (preferably Python) - a must!
- 3+ years of extensive SQL experience (preferably in a production environment) - a must!
- Strong schema design and data modeling skills
- Experience building robust and scalable data pipelines in a microservices environment
- Experience with data services orchestration tools, such as Airflow
- Quick self-learner with strong problem-solving skills
- Good communication skills and a collaborative approach
- Process- and detail-oriented
- Passion for solving complex data problems
Desired:
- Experience with Snowflake and MSSQL
- Experience with MLOps and ML implementations
- Experience with Docker and Kubernetes
- Experience with GCP services
- Experience with Pub/Sub or Kafka