Data Engineer - Athens

Deadline

The deadline for applications is the 17th of May.

What we do

We are a team of data scientists, developers, economists, social researchers, and creatives working together on critical issues facing our society today. We are developing a series of cutting-edge products, as well as providing consulting services, working for the public sector and charities. Our key aim is social impact and doing good through our work. Our research work spans multiple policy areas, including education, health and social care, housing and homelessness, the environment, public finance, and international development, among others.

Our diverse staff combines decades of experience in economic and social research at the highest levels with the ability to develop state-of-the-art tech tools and powerful communications.

We work with some of the most forward-thinking organisations in the voluntary, public, and private sectors. Our clients include most UK government departments, the Welsh Government, and the Scottish Government. We also work with major supranational organisations including the World Bank, UNICEF, and the European Commission, as well as a number of national governments across the globe.

Who we want to work with

We’re looking for someone who takes real ownership of data — not only building pipelines, but ensuring the correctness, reliability, and trustworthiness of the data used across the organisation. You are detail-oriented, structured in your thinking, and naturally question inconsistencies, with a strong instinct for improving systems rather than working around them.

You’re pragmatic and solutions-focused, able to balance ideal architecture with real-world constraints. You enjoy tackling complex, imperfect environments and turning them into clear, reliable, and scalable data platforms, working closely with both technical and non-technical teams to drive meaningful impact.

You will join an established organisation with a startup ethos, encouraging ownership and collaboration. You’ll be an internal entrepreneur, identifying real-world pain points and transforming them into sophisticated, home-grown products. By treating our internal use cases as the primary engine for innovation, your creativity is fuelled by direct impact, and your personal growth is measured by the evolution of the tools we use every day.

Key responsibilities

  • Write high-performance SQL, manage indexing strategies, and tune queries to handle growing data volumes.
  • Own and evolve the end-to-end data pipeline architecture, ensuring reliability, scalability, and clarity across all stages (ingestion → transformation → outputs).
  • Ensure data integrity across the system by implementing validation layers and data quality checks.
  • Stabilise and improve existing pipelines, particularly those driven by manual inputs (e.g. Excel), by identifying failure points and data risks as well as reducing manual intervention.
  • Use dbt (or similar tools) to transform raw application data into clean, analysis-ready datasets.
  • Design robust error handling and recovery mechanisms, including safe re-runs (idempotent processes), retry strategies and failure isolation, backfilling and correction workflows.
  • Design and maintain scalable data models, supporting both normalised (operational) use cases and analytical reporting needs.
  • Drive continuous improvement of the data platform, including eliminating manual steps, introducing standards (data contracts, validation rules, documentation) and supporting the evolution towards a more scalable architecture.
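Several of the responsibilities above (validation layers, idempotent re-runs, safe backfills) follow one common pattern: validate inputs first, then write with an upsert so the pipeline converges to the same state no matter how many times it runs. A minimal sketch for illustration only, using SQLite and a hypothetical `payments` table:

```python
import sqlite3

def load_rows(conn, rows):
    """Validate incoming rows, then upsert them so re-runs are safe (idempotent)."""
    # Validation layer: reject rows with a missing key or a negative amount.
    valid = [r for r in rows if r.get("id") is not None and r.get("amount", -1) >= 0]
    # Idempotent write: re-running updates in place instead of duplicating rows.
    conn.executemany(
        "INSERT INTO payments (id, amount) VALUES (:id, :amount) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
        valid,
    )
    conn.commit()
    return len(valid), len(rows) - len(valid)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL NOT NULL)")
rows = [
    {"id": 1, "amount": 10.0},   # valid
    {"id": 2, "amount": -5.0},   # rejected by the validation check
    {"id": 1, "amount": 10.0},   # duplicate key: upserted, not doubled
]
loaded, rejected = load_rows(conn, rows)   # loaded == 2, rejected == 1
load_rows(conn, rows)                      # safe re-run: same end state
total = conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]  # total == 1
```

The same idea scales up: in production the validation step would typically live in dbt tests or data contracts, and the upsert in the warehouse's merge statement, but the invariant — every run leaves the table in the same state — is what makes retries and backfills safe.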

Requirements

Essential:

  • Expert-level SQL skills, with the ability to write, optimise, and debug complex queries.
  • Strong Python experience for scripting, data processing, and API integrations.
  • Proven experience designing and maintaining data pipeline architecture with a focus on reliability and data integrity.
  • Experience in PostgreSQL, including relational database design, constraints and data integrity enforcement, and query optimisation and performance tuning.
  • Strong understanding of data validation techniques, error handling and recovery strategies, and idempotent pipeline design.
  • Experience making pragmatic technical trade-offs, balancing build vs buy, speed vs robustness, and perfection vs business needs.

Desirable:

  • Experience with cloud platforms (e.g. AWS, GCP, Azure).
  • Familiarity with data transformation tools (e.g. dbt) for modular SQL modelling.
  • Experience with workflow orchestration tools (e.g. Apache Airflow, Prefect, Dagster).
  • Understanding of CI/CD practices for data pipelines.
  • Experience implementing monitoring and alerting, data lineage, and observability.
  • Experience working on ERP, finance, or planning systems or other business-critical data platforms.

Working arrangements

You will have the opportunity to make a significant contribution to the work we do from day one.

Alma Economics is a friendly and informal place, and our offices are designed for both work and play. Our team consists of enthusiastic and talented individuals, who love learning and are always ready to support others.

We do not have a one-size-fits-all WfH policy. Most staff members work from home for one or two days in a typical week, and it is also possible to arrange WfH for more extended periods (e.g., 2-4 weeks) to visit family or travel.

While most staff work 5-day weeks, for most full-time positions we also offer the option of a 4-day working week, either from day one or as an arrangement employees can transition into later in their Alma career.

Our offices can accommodate disabled access, and we are committed to providing all necessary support to colleagues who require it. We are also happy to consider remote working arrangements for applicants with disabilities or health conditions that prevent them from working on-site.

Application

Please state your preferred office location within your application notes.

To view all current global opportunities, please visit our Careers Page.

If you experience any issues while submitting your application, please try reloading this page. If the issue persists, contact us at tech@almaeconomics.com.
