We are a team on a mission: to put accessible and affordable healthcare in the hands of every person on earth. Our mission is bold and ambitious, and it's shared by a team that lives our values: dream big, build fast and be brilliant.
To achieve this, we've brought together one of the largest teams of scientists, clinicians, mathematicians and engineers, focused on combining the ever-growing computing power of machines with the best medical expertise of humans to create a comprehensive, immediate and personalized health service and make it universally available.
At Babylon our people aren't just part of a team; they're part of something bigger. We're a vibrant community of creative thinkers and doers, forging the way for a new generation of healthcare. We're only as good as our people, so finding the best people is everything to us.
We serve millions, but we choose our people one at a time…
Responsibilities
- Coach and mentor a chapter of data engineers
- Write and deliver performance reviews together with the squad leads
- Build and maintain a strategy to take our engineering practices to the next level
- Work closely with our Architect to build ETL frameworks
Key Skills Required
- You are a domain expert in Hadoop and Kafka.
- You are a domain expert in Python, Scala or Java.
- You can write SQL with your hands tied behind your back.
- You have worked in a production cloud environment.
- You are a servant leader of data engineers across one or more teams.
- You know how to manage the performance of engineers.
- You have contributed to engineers' development of both hard and soft skills.
- You can translate engineering development into strategy.
- You are enthusiastic and full of positive and infectious energy.
- You have been part of a distributed team and are a great communicator.
- You don't mind changing direction because being agile is in your genes.
- You are organised and pro-active, with strong attention to detail.
Qualifications and Experience
- A degree in a numerical subject such as Mathematics, Physics or Computer Science, or substantial experience in developing robust, scalable and production-grade data pipelines