Full Stack Data Engineer - Cape Town
Collinson is a global loyalty and benefits company, partnering with many of the world’s best-known brands to create customer experiences that help them acquire, engage and retain choice-rich customers. With more than 30 years of experience in over 170 countries and 2,500 colleagues in 20 locations worldwide, Collinson delivers smarter, more personal experiences that deepen loyalty for leading airlines, hotels, financial institutions and retailers.
About Collinson
Collinson specialises in Financial Services, Travel and Retail and also supports clients across multiple sectors, working with over 90 airlines, 20 hotel groups and more than 600 financial institutions and banks. Clients include Accor Hotels, Air France KLM, American Express, British Airways, Cathay Pacific, Diners Club, Mandarin Oriental, Mastercard, Radisson Hotel Group, Sephora, Visa and Vhi, and solutions include LoungeKey and Priority Pass, the world’s best-known airport experience programmes, as well as many leading reward and loyalty initiatives.
Purpose of the job
As a Full Stack Data Engineer, you will play a key role in delivering secure-by-design, customer-centric data solutions across Collinson’s global footprint. You will design, build and maintain data pipelines and platforms in AWS and other cloud environments, turning complex data into trusted, usable insight for internal and external stakeholders.
This is a hands-on role where you will work across the full data stack: shaping data models, building pipelines, improving performance and ensuring data is available in a timely, fit-for-purpose way. You will also mentor junior team members, share best practice and help create an inclusive, supportive environment where people at different stages of their careers can thrive.
If you enjoy solving complex data problems, care about quality and security and want to see your work influence real products and customer experiences, this role offers the opportunity to grow, innovate and have visible impact.
Key responsibilities
- Deliver audit-ready, traceable data solutions that support assurance, compliance reviews and fraud risk monitoring.
- Design, develop and maintain data pipelines for collecting, transforming and loading data into data warehouses and data lakes.
- Develop and deploy data models that meet diverse business needs across compliance, audit, fraud detection and assurance.
- Write efficient, scalable code in languages such as Python, Scala or Java.
- Lead the design of data solutions with quality, automation, cost effectiveness and performance in mind.
- Own and operate the data pipelines feeding the Data Platform, ensuring they are reliable, observable and scalable.
- Ensure data is available in a fit-for-purpose, timely manner for analytics, reporting and downstream products.
- Work with the Data Governance team to ensure solutions comply with GDPR, internal security policies and data quality standards.
- Maintain and optimise existing data pipelines to improve performance and quality while minimising business impact.
- Collaborate with cross-functional teams, including Product, Assurance and line-of-business stakeholders, to understand data requirements and support data-driven initiatives.
- Set and embed standards for data systems and solutions and share knowledge to keep the team engaged with the latest tools and techniques.
- Prototype and adopt new approaches, bringing innovation into the data platform and solutions.
- Communicate complex data concepts in a clear, accessible way to both technical and non-technical audiences.
- Mentor and guide junior data engineers, helping them build confidence and capability.
- Advocate for the Data Platform and Data & Analytics team across the business and champion the value of modern, high quality data solutions.
Knowledge, skills and experience
- Extensive experience leading data platform transformations in AWS and other cloud environments.
- Proven track record of delivering large-scale data and analytical solutions in the cloud.
- Hands-on experience with end-to-end data pipelines on AWS, including extraction, transformation and loading, normalisation, aggregation, warehousing, data lakes and data governance.
- Expertise in developing data warehouses and working with modern data architectures such as data lakes, Lakehouse and data mesh.
- Strong data architecture and data modelling skills, including designing cost-effective, maintainable pipelines.
- Familiarity with CI/CD driven data pipelines and infrastructure as code approaches.
- Experience working in Agile environments using Scrum and/or Kanban.
- Ability to scope, estimate and deliver committed work within agreed timelines, both independently and as part of a team.
- Experience supporting QA and UAT processes and understanding the impact of changes to business rules on data flows.
- Strong communication and influencing skills regarding data solutions, outcomes and trade-offs.
- Experience managing or leading a small team of data engineers and providing day-to-day coaching.
- Deep understanding of how data enables analytics, reporting and automated marketing or personalisation capabilities.
Core technologies include: Python, PySpark, SQL, NoSQL databases (such as MongoDB), Bash scripting, Snowflake, Kafka, NiFi, Glue, DataBrew, AWS, Kinesis, Terraform, APIs and Lakehouse architectures.
Personal attributes
- Self-motivated, with a desire to learn new skills and embrace new technologies in a changing data landscape.
- Able to thrive in a fast-moving environment, balancing focus and flexibility.
- Shows initiative and innovation and can work independently when needed while being a strong team player.
- Collaborative and collegiate, tackling project challenges with others and building positive relationships.
- Goal and outcome oriented, with thoroughness and strong attention to detail.
- Clear communicator, able to present, inform and guide others, and to bridge conversations between technical and business focused groups.
- Comfortable working with people at all levels of the organisation and open to feedback and different perspectives.
We know that people from underrepresented groups, including many women, may hesitate to apply unless they meet every requirement, so if this role sounds interesting and you meet most of the criteria, you are encouraged to apply.
Interview process
- Stage 1 – A screener call with the recruitment team, where you can learn more about Collinson, its values and ways of working, and we can understand what you are looking for in a new role.
- Stage 2 – A short (around 30-minute) call with an Engineering or Data leader, who will share more about the team and projects and ask about your experience and interests.
- Stage 3 – A take home technical exercise, followed by a review of your solution and approach.
- Stage 4 – A final conversation with members of the Data & Analytics leadership (for example the Data Engineering Lead and/or Head of Architecture & Engineering) to connect the role to the wider data strategy and give you space for any remaining questions.
This clear and structured process is designed to respect your time, reduce uncertainty and offer multiple opportunities to meet future teammates, supporting an inclusive and welcoming candidate experience.
- Division: Technology & Data
- Role: Data and ML Platform
- Locations: Cape Town
- Remote status: Hybrid
© 2023 Collinson International Limited. Registered in England & Wales under registration No. 2577557
Registered address : 3 More London Riverside, London, SE1 2AQ, United Kingdom.