
Senior Data Engineer

accesa.eu
remote

Company Description

Accesa is a leading technology company headquartered in Cluj-Napoca, with offices in Oradea, Bucharest, and Timisoara, and 20 years of experience in turning business challenges into opportunities and growth. A value-driven organisation, it has established itself as a partner of choice for major brands in Retail, Manufacturing, Finance, and Banking. It covers the complete digital evolution journey of its customers, from ideation and requirements setup to software development and managed services solutions. With more than 1,200 IT professionals, Accesa also has a fast-growing footprint, establishing itself as an employer of choice for IT professionals who are passionate about problem-solving through technology. Coming together in strong tech teams with a customer-centric approach, its people enable businesses to grow, delivering value for clients, partners, the industry, and the community.

About the project

Our projects typically range between 8 and 20 weeks, and an account usually covers several projects with different deliverables. We also love to get involved in any kind of AI-related activity, be it in the discovery, prototyping, or implementation phase. We often deliver joint-effort projects as well, either for internal purposes or to help customers reach their goals, in collaboration with other teams: IoT, SAP, Hybris, and RPA. The projects we deliver are mainly focused on the digital manufacturing industry, but opportunities sometimes come from other industries such as Finance or Retail.

Your team

The team involved in delivering AI solutions and services often consists of Data Engineers, Data Scientists, and Machine Learning Engineers, working within a Delivery Team that includes several other roles: Project Manager, Business Analyst, UX Designer, Application/DevOps Architect, Frontend and Backend Developers, and QA Engineer.

Job Description

As part of our Artificial Intelligence Team, you will help shape the future of our software. You will develop, test, and maintain data architectures that keep data accessible and ready for analysis. Your tasks will include data modelling, ETL (Extract, Transform, Load), construction and development of data architectures, and testing of database architectures.
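The ETL workflow mentioned above can be sketched in plain Python. This is an illustrative example only: the function names, the sample sensor records, and the in-memory "warehouse" are all hypothetical and not part of any actual project stack.

```python
# Minimal ETL sketch: extract raw records, transform them into a clean
# shape, and load the result into a target store. All names and data
# here are hypothetical, for illustration only.

def extract(raw_rows):
    """Extraction: pull raw records from a source (here, an in-memory list)."""
    return list(raw_rows)

def transform(rows):
    """Transformation: normalize fields and drop incomplete readings."""
    return [
        {"machine": r["machine"].strip().upper(), "temp_c": float(r["temp"])}
        for r in rows
        if r.get("temp") is not None  # filter out records missing a reading
    ]

def load(rows, store):
    """Load: write the transformed records into the target store."""
    store.extend(rows)
    return len(rows)

source = [
    {"machine": " press-01 ", "temp": "71.5"},
    {"machine": "lathe-02", "temp": None},   # incomplete, filtered out
    {"machine": "press-01", "temp": "69.0"},
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded)        # 2
print(warehouse[0])  # {'machine': 'PRESS-01', 'temp_c': 71.5}
```

In a production pipeline each stage would typically read from and write to real systems (message queues, object storage, a relational database such as Postgres), with orchestration handled by a tool like Airflow, but the extract/transform/load separation stays the same.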

Daily responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Real impact one step at a time

Your impact will be shaped by the project's context and will also reach beyond it, through the Competence Area community you will be part of, which places a strong focus on your technical skills.

Professional Opportunities

You will have access to AI Community trainings and programs emphasizing technical and tactical skills, and you will be engaged in new projects and opportunities arriving in our business line.

Community insights

The community consists of Data Scientists and Machine Learning Engineers, along with Data Engineers, sharing knowledge and project insights on a regular basis. We engage in projects pertaining to Computer Vision, NLP, Advanced Analytics, and Prevention and Trend Analysis.

Qualifications

  • 5+ years of professional experience
  • Experience working with customer stakeholders
  • Experience working in Agile teams
  • Experience building and optimizing 'big data' pipelines, architectures, and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Knowledge of manipulating, processing, and extracting value from large, disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores
  • Technical experience with big data tools (Spark, Databricks), stream-processing systems (Storm, Spark Streaming), object-oriented languages (Python), visualization tools (Power BI, Tableau), data pipeline and workflow management tools (Airflow), and relational databases (Postgres)

Additional information

At Accesa you can enjoy our holistic benefits program, built around the four pillars that we believe support our wellbeing: social, physical, and emotional wellbeing, as well as work-life fusion.
