Data Engineer

match digital.

Your digital & tech recruitment partner.


£59,000 – £69,000 + benefits
Flexible working (1 day in the office per week)


Our client

Over the past few years, we’ve scaled the global Customer Experience Team (a hybrid startup-consultancy) within one of the world’s most powerful brands. The team encompasses Product, Strategy, Design, Analytics and Implementation. They design and deliver experiences for customers – connecting customers’ devices with the brand and the world around them.

With over 150,000 employees spread across almost 200 countries, our client has innovation at their core and is proud to be building products and services that leave a positive and sustainable impact on society, the environment and in education.

They are an organisation that enriches lives, with a cross-functional, international environment built upon transparency and empathy. With almost 40 nationalities in the UK HQ, they embrace diversity and encourage applications from people of all backgrounds, genders, nationalities, ages and lifestyles – seeking to learn from these different perspectives.

The role

Our client is moving towards the next phase of customer experience – evolving to deliver integrated journeys that meet customer needs across both online and offline channels.

The Data Engineer will design, build, and maintain the infrastructure and systems required to collect, process, store, and analyse large volumes of data. This role will also be responsible for identifying key data sources within different systems.

As Data Engineer, you will:
  • Collaborate with cross-functional teams to understand data requirements, translating these into technical solutions.
  • Design and implement efficient and scalable data pipelines to collect, process, and transform raw data into usable formats for analysis and consumption.
  • Build and maintain data warehouses, data lakes and data lakehouses that function as repositories for both structured and unstructured data.
  • Ensure data quality, reliability, and accessibility.
  • Develop data models and schemas that optimise data storage, retrieval, and querying performance.
  • Integrate data from numerous sources (databases, APIs, streaming platforms, and third-party systems).
  • Extract, transform, and load (ETL) data into appropriate data structures and formats, leveraging tools and technologies such as Apache Spark, Hadoop, and cloud-based solutions, including data virtualisation and data semantic layers.
  • Implement processes and checks to ensure data accuracy, consistency, and compliance with regulatory requirements.
  • Identify and resolve performance bottlenecks in data processing and storage. Optimise query performance and ensure scalability.
  • Monitor data pipelines and systems, troubleshoot issues, and perform routine maintenance tasks (backups, upgrades, and patching).
  • Identify and implement data engineering best practices, standards, and processes. Ensure thorough data governance and security.
  • Stay up to date with emerging data technologies and trends, evaluating their potential application.
  • Document system designs, processes, and workflows to facilitate knowledge sharing and maintain a robust data infrastructure.
We would like you to have
  • Proven experience as a Data Engineer or Data Analyst with strong SQL skills (experience with another programming language – Python, Scala, Java – is beneficial).
  • Confidence in building data virtualisation layers and data semantic layers.
  • Strong knowledge of relational databases (MySQL, PostgreSQL) and experience with data warehousing concepts and tools (Snowflake, Redshift), data lakes and data lakehouses.
  • Familiarity with distributed computing frameworks like Apache Hadoop, Apache Spark, and NoSQL databases (such as MongoDB, Cassandra).
  • An understanding of data modelling techniques (relational, dimensional, and NoSQL) and proficiency in designing efficient and scalable database schemas.
  • Experience with workflow orchestration tools (Apache Airflow, Prefect) and data pipeline frameworks (Apache Kafka, Talend).
  • Familiarity with cloud platforms (AWS, GCP or Azure) and their data services (AWS Glue, GCP Dataflow) for building scalable, cost-effective data solutions.
  • Knowledge of data quality assessment and governance practices, including data profiling & cleansing. Knowledge of privacy and security regulations.
  • Excellent problem-solving, communication, and collaboration skills (essential).
The perks
  • The chance to develop your career with a global, multicultural team working on a fascinating customer experience transformation programme.
  • A flexible working environment, with the option of working from home and flexible hours.
  • Private healthcare and private dental insurance.
  • Competitive pension and 26 days’ holiday (excluding bank holidays).
  • Car lease scheme, season ticket loan and cycle-to-work scheme.

Match Digital specialises in connecting talented individuals with businesses in the digital, tech, media and marcomms industries.

Upload your CV and/or portfolio.

