Senior Data Engineer - Remote

G5
Bend, OR

Who You Are:

You are an enthusiastic and capable Data Engineer who is excited to join a growing team of data and analytics experts central to the suite of products and services G5 offers. You have experience expanding and optimizing data and data pipeline architecture, as well as optimizing data flow and collection for use by teams across a software company. Ideally, you are an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Here at G5, the Data Engineer will support our traditional software developers, data scientists, and business analysts on data initiatives and will ensure high standards for data availability and fidelity are met. Your ability to self-direct and the ease with which you support the data needs of multiple teams, systems, and products make you a great fit for this opportunity. In addition, you are excited by the prospect of optimizing, or even re-designing, G5’s data architecture to support the next generation of cutting-edge products and data initiatives.

Does this sound like you? If so, apply today and let’s start the conversation!

What You’ll Do:

  • Create and maintain resilient data pipeline architectures.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Work with stakeholders including the Product, Data Science, and other Software Engineering teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across multiple data centers and regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our products.
  • Assist in the deployment and maintenance of machine learning and statistical models as ingestible, usable, and actionable products that scale and are highly available.
  • Build and maintain reporting infrastructure, including but not limited to data warehousing and ETL.