TechCompenso

Freelance Solution Engineer

🏢 Enterprise company

Full-Remote 💼 Freelance contract (Partita IVA)
⚠️ This position has been closed, or the company is no longer accepting applications.

📝 Description

Data Warehouse Maintenance

  • Manage and oversee daily activities of the data warehouse, including image creation, ETL/ELT processes, data loads, performance monitoring, and troubleshooting.
  • Proactively engage with the relevant stakeholders to address security and compliance topics.
  • Provide hands-on support and development in managing data pipelines, SQL optimization, and data validation processes.
  • Collaborate with data engineering, analytics, BI, and infrastructure teams to ensure seamless data availability and accuracy.
  • Proactively monitor and optimize warehouse performance, ensuring uptime, scalability, and cost-efficiency.
  • Develop and enforce best practices for data quality, governance, backup, and disaster recovery.
  • Create and maintain documentation for data workflows, data dictionaries, operational procedures, and technical designs.
  • Participate in and lead root cause analysis efforts to identify and resolve data-related issues quickly.
  • Evaluate and implement new tools, automation scripts, or practices that improve the reliability and efficiency of warehouse operations.

Project Management

  • Lead the end-to-end project lifecycle of big data solutions, ensuring timely delivery and alignment with business goals.
  • Define project scope, timelines, resources, milestones, and deliverables.
  • Coordinate and collaborate with cross-functional teams, vendors, and stakeholders.
  • Apply Agile/Scrum or other project management methodologies to drive iterative development and continuous improvement.
  • Manage risks, issues, and dependencies across project phases.

Solution Engineering

  • Architect and implement big data solutions using platforms such as Hadoop, Spark, Kafka, Hive, and cloud-native technologies (e.g., AWS, Azure, GCP).
  • Develop and optimize ETL/ELT pipelines and data integration strategies.
  • Design scalable data architectures and storage solutions that support analytics, machine learning, and real-time data processing.
  • Ensure data quality, security, and compliance standards are met.
  • Conduct code reviews and guide development best practices across teams.

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or related field.
  • 5+ years of experience in big data engineering or solution architecture.
  • 3+ years in a project management or technical leadership role.
  • Strong experience with Apache Spark, Kafka, or similar technologies.
  • Proficiency with cloud platforms (AWS, Azure) and modern data tools (Synapse/Fabric, Databricks, Snowflake, etc.).
  • Expertise in data modeling, data lakes, and data warehouses.
  • Hands-on experience with CI/CD pipelines, Git, Jenkins, or similar DevOps tools.
  • Knowledge of programming languages: Python, Scala, Java, or SQL.
  • Background in data governance, security, and compliance (e.g., GDPR, DORA…).
  • Familiarity with agile methodologies and incident management systems (e.g., Jira, PagerDuty).

Preferred Skills

  • Experience in financial services or telecom domains.
  • Familiarity with data governance frameworks and tools (e.g., IDMC/Informatica).
  • Strong communication, leadership, and stakeholder management abilities.
  • Familiarity with model-driven approaches to guide and automate the design, development, and maintenance of software systems (and tools like dbt Core/Platform).

Contract Details

  • €400 per day
  • Fully remote
  • Full time

🔎 Information

💼 Experience level: Senior

🖥️ Work mode: Full-Remote

💰 Daily rate: €390 - €400

🔹 400€/day 🔹 fully-remote 🔹 full-time 🔹 big-data-stack 🔹 technical-leadership