Position title
Data Engineer with Python & Azure (Investment Funds)

@ Accesa & RaRo


As part of our Artificial Intelligence Team, you will help shape the future of our software.

You will develop, test, and maintain data architectures that keep data accessible and ready for analysis. Your tasks will include data modelling, ETL (Extract, Transform, Load), construction and development of data architectures, and testing of the database architecture.

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud ‘big data’ technologies
  • Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs

Must have: 

  • 2+ years of professional experience
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Knowledge of manipulating, processing, and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience with:
  1. Big data tools: Hadoop, Spark, Kafka, etc.
  2. Azure cloud services
  3. Databricks
  4. Stream-processing systems: Storm, Spark Streaming, etc.
  5. Object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.

Willing to develop:

  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Visualization tools: Power BI, Tableau, etc.

Job Benefits
  • People-first company culture: We invest in creating genuine connections with all the people we work with.
  • Flatter organization: We make better, faster decisions, grounded in the present.
  • Self-driven career development: You're in charge of your professional journey.
  • Impactful and challenging projects: Make a difference for global brands and their millions of customers, within both regulated and non-regulated industries.
  • Courage to ask and understand, and to explore and develop solutions that have a real impact
  • People: those who work for us, those who work with us, and those who use the platforms and systems we engineer
  • Insight, developed and shared across the community, to know what to focus on and commit to the actions that capitalize on it
  • Progress, made through the consistent development of ourselves, our connections with each other and our clients, and the trust we nurture throughout all our collaborations
Accesa & RaRo
Data & AI Competence Area
Job Location
Remote work from: Romania