Data Engineer

🏢 Compri

Full-Remote
⚠️

This position has been closed, or the company is no longer accepting applications.

📝 Description

About Us

We’re a team with experience at unicorns (Satispay, Scalapay, Lansweeper), Microsoft, BCG, Bain, Ferrari, and more, tackling one of the biggest (and most outdated) verticals with AI: Procurement. Procurement teams are responsible for sourcing products and services in every company, managing trillions of euros every year, yet they still rely on emails and spreadsheets. We’re here to change that by giving them the tech stack they deserve, the same way Sales has Salesforce and HR has Workday. We’re growing fast and looking for top talent to join the ride. Why did we found Compri?

  • Products are becoming more complex, and supply chains are now global and highly interconnected. For industrial companies, the Cost of Goods Sold (COGS) can account for 50–70% of revenue. This makes Procurement one of the biggest strategic levers for profitability, yet it remains underutilized, especially in manufacturing SMEs.
  • Procurement is becoming more strategic and central to companies. We're seeing signals of this trend: salary increases for procurement professionals, IT companies scouting for procurement solutions, and consulting firms fully booked with procurement projects.

We are currently a tight-knit team of highly selected individuals, and we are continuously growing. As part of our mission, we are looking for a Data Engineer to play a pivotal role in building the future of our AI-based products.

Position Overview

We're seeking a mid-level Data Engineer to work independently on ERP integration projects using Python, SQL, and Apache Airflow. This role involves building scalable data pipelines that connect enterprise systems, with significant opportunities for process optimization.
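To make the ERP-integration side of the role concrete, here is a minimal sketch of paginated data extraction. `fetch_page` is a hypothetical stand-in for a real ERP REST endpoint (which you would normally call over HTTP); it is stubbed with in-memory data so the sketch runs on its own.

```python
# Sketch of paginated extraction from a hypothetical ERP API.
# fetch_page stands in for a real HTTP call; it is stubbed here
# with in-memory records so the example is self-contained.

def fetch_page(page: int, page_size: int = 2) -> list[dict]:
    """Return one page of supplier records; an empty list when exhausted."""
    records = [
        {"id": 1, "name": "Acme"},
        {"id": 2, "name": "Globex"},
        {"id": 3, "name": "Initech"},
    ]
    start = page * page_size
    return records[start:start + page_size]

def extract_all(page_size: int = 2) -> list[dict]:
    """Walk pages until the API returns an empty page."""
    page, out = 0, []
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            break
        out.extend(batch)
        page += 1
    return out

suppliers = extract_all()
# suppliers now holds all three stubbed records, in order
```

The same loop shape applies regardless of page size, which is why pagination logic is worth isolating from the transport layer.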

Minimum Experience

  • 2-3 years in data engineering or a related field
  • Proficiency in Python for data processing and ETL development
  • Strong expertise in SQL and database management
  • Ability to work independently on integration projects

Technical Skills

  • Data Pipeline Development: Design and maintain ETL/ELT pipelines using Python
  • Database Expertise: Complex SQL, schema design, and performance optimization
  • ERP Integration: Experience with enterprise systems integration and API connectivity
  • Python Libraries: pandas, numpy, requests, and other data processing tools
  • Version Control: Git workflow and collaborative development
  • Data Quality: Implementing validation, monitoring, and error handling
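The "Data Quality" item above typically means row-level validation before loading. A minimal sketch, using hypothetical field names (`supplier_id`, `amount`) rather than the company's actual schema: each rule collects error strings, and invalid rows are routed aside instead of crashing the load.

```python
# Row-level validation sketch for an ETL step (illustrative only).
REQUIRED_FIELDS = ("supplier_id", "amount")

def validate_row(row: dict) -> list[str]:
    """Return a list of error messages; empty means the row is valid."""
    errors = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative amount")
    return errors

def split_valid(rows: list[dict]):
    """Partition rows into (valid, rejected-with-reasons)."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = split_valid([
    {"supplier_id": "S1", "amount": 120.0},
    {"supplier_id": "", "amount": -5},
])
```

Keeping rejects alongside their reasons makes monitoring and error handling straightforward: the rejected list can be logged or written to a quarantine table.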

Mindset and Traits

  • Process Optimizer: Turn repetitive tasks into scalable, automated solutions
  • Independent Worker: Take ownership of projects from requirements to deployment
  • Systems Thinker: Understand how enterprise systems connect and impact each other
  • Problem Solver: Approach complex integration challenges with creativity

Bonus Skills

  • Experience with Apache Airflow for workflow orchestration
  • Experience with Docker and containerized applications
  • Cloud platforms (AWS, GCP, Azure) and their data services
  • CI/CD pipelines for data engineering workflows
  • Data warehousing tools (Snowflake, BigQuery, Redshift)
  • Modern data stack tools (dbt, Fivetran)

Compensation & Benefits

  • Salary Range: €40,000 - €50,000 based on experience
  • Meal allowance
  • Welfare benefits
  • Flexible work arrangements

What We Offer

  • High Impact Work: Your optimizations directly improve operational efficiency
  • Technical Growth: Architect enterprise-grade data infrastructure
  • Autonomy: Design solutions with minimal micromanagement
  • Process Improvement: Turn manual processes into automated workflows

Selection Process

  • Intro Interview
  • Technical Case Assignment
  • Tech Advisor Interview (optional)

Day-to-Day

  • Design data pipelines connecting ERP systems
  • Develop Python scripts for data extraction and transformation
  • Create and maintain workflow orchestration (Airflow experience preferred)
  • Optimize existing processes for performance and scalability
  • Monitor pipeline health and troubleshoot issues
  • Identify automation opportunities
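The day-to-day items above amount to an extract-transform-load loop. A compact, stdlib-only sketch under stated assumptions: the raw records and field names are invented, and an in-memory SQLite database stands in for the real target system.

```python
import sqlite3

# Extract (stubbed): in a real pipeline this would come from an ERP API or export.
raw = [
    {"supplier": " Acme ", "amount": "120.50"},
    {"supplier": "Globex", "amount": "80.00"},
]

# Transform: normalize strings, cast amounts to numbers.
clean = [
    {"supplier": r["supplier"].strip(), "amount": float(r["amount"])}
    for r in raw
]

# Load into an in-memory SQLite table (placeholder for the real target DB).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (supplier TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO spend (supplier, amount) VALUES (:supplier, :amount)", clean
)
total = conn.execute("SELECT SUM(amount) FROM spend").fetchone()[0]
```

In production each stage would be a separate task in an orchestrator such as Airflow, so failures can be retried per stage rather than rerunning the whole pipeline.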

🔎 Information

💼

Experience level

Junior/Middle

🖥️

Work arrangement

Full-Remote

💰

Annual salary

€40,000 - €50,000

🔹 Python 🔹 SQL 🔹 Data Pipeline 🔹 ERP