Data Team Software Engineer - Remote
In this role you can expect to grow and gain a huge amount of experience in JavaScript, Node.js, Open Banking, AWS (particularly Athena and SageMaker), Kubernetes, the Elastic Stack and API development. We’re looking for candidates who are willing to learn (rather than being experts in these areas), but it’s great if you already have experience with a few things on our list, especially those in your preferred area of the development stack.
You’ll thrive using agile methods and enjoy working openly, collaboratively and as part of a multidisciplinary team focused on one or more projects, doing things the right way and producing high-quality code.
Your work on our platform will help people get on top of their finances and make better-informed financial decisions.
As a member of our technical team you’ll:
Design and build REST APIs to expose generated insights.
Configure and deploy real-time flows from production services into the data warehouse.
Implement services that apply existing machine learning and statistical techniques and run efficiently over millions of banking transactions.
Design and implement SQL queries that surface key information for platform billing, user trends and in-app nudge generation.
And more generally you’ll:
Increase code quality by actively participating in peer code review
Improve our processes and tools through communication, automation and optimisation
Build automated tests as part of our continuous integration and deployment environment
Share knowledge of tools, techniques, new features and ideas with the Moneyhub team of developers and non-developers
Apply broad knowledge of web technologies to provide security, performance and scalability
Solve issues and suggest solutions as part of feature development and support
Who you are
We’re interested in people who:
Have experience in back-end JavaScript development.
Understand software design principles such as functional programming (we use Ramda extensively).
Are fluent in SQL.
Are familiar with techniques for handling large volumes of data and/or high throughput. For instance, you may have used data warehouses such as AWS Athena or Redshift, or technologies in the Hadoop ecosystem.
Have built ETL pipelines as part of an organisation’s data warehousing strategy.
Have worked with microservices interacting via REST APIs and through message queues.
Are familiar with different database engines; we use PostgreSQL and MongoDB for OLTP workloads.
And more generally:
Enjoy researching and learning new programming tools and techniques, and telling others about them
Communicate accurately and effectively
Take a systematic approach to solving problems
Have experience of using testing to validate solutions
Understand agile environments and version control
Have a firm understanding of web security
Are aware of technologies used for web applications (e.g. databases, backups, CDNs and search) and of Unix-like operating systems such as Linux and/or macOS
Have experience of working on or with modern web technologies
Are familiar with working practices such as TDD, continuous integration, continuous delivery and DevOps (and want to learn more)
While not essential for the role, there will be the opportunity to:
Package and deploy machine learning models using Python, scikit-learn and AWS SageMaker.
Develop front-end dashboards in React to display the results of analysis and to enable decision-making by the business.