Senior Big Data Scalability Engineer (Remote, US-Based)
VividCortex: Database Performance Monitoring
*Only candidates residing in the United States will be considered for this role.*

About VividCortex

Are you excited by designing and developing a high-volume, highly available, and highly scalable SaaS product in AWS that supports Fortune 500 companies? Do you love the energy and excitement of being part of a growing, successful startup? Are you passionate about diving deep into technologies such as microservices/APIs, database design, and data architecture?

VividCortex, founded in 2012, is building a world-class company with a mixed-discipline team to provide incredible value for our customers. Hundreds of industry leaders like GitHub, SendGrid, Etsy, Yelp, Shopify, and DraftKings rely on VividCortex. Our company’s growth continues to accelerate (#673 on the Inc. 5000) for yet another year, so we need your help. We are extremely customer focused and engaged in building an authentic, low-drama team that is open and candid, sincerely practices ‘disagree and commit’, constantly learns and improves, and brings a focused, get-it-done attitude to its commitments. A successful candidate thrives in a highly collaborative and fast-paced environment. We expect and encourage innovation, responsibility, and accountability from our team members, and we expect you to make substantial contributions to the architectural and technical direction of both the product and the company.

About the Role

VividCortex needs an experienced, senior, hands-on data and software engineer who has “been there and done that” to help take our company to the next level. We are designing and building our next-generation system for continuous, high-volume data storage, analysis, and presentation. You will be hands-on, working at the intersection of data, engineering, and product, and you will be key in defining the strategy and tactics of how we store and process massive amounts of performance metrics and other data we capture from our customers' database servers.

Our platform is written in Go and hosted entirely on the AWS cloud. It currently uses Kafka, Redis, and MySQL, among other technologies. We are a DevOps organization building a 12-factor microservices application; we practice small, fast cycles of rapid improvement and full exposure to the entire infrastructure, but we don't take anything to extremes.

The position offers excellent benefits, a competitive base salary, and the opportunity for equity. Diversity is important to us, and we welcome and encourage applicants from all walks of life and all backgrounds.
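To give a concrete flavor of the stack described above, here is a minimal, hypothetical sketch (not our actual codebase) of the kind of Go service you would work on: a pipeline stage that consumes metric points from Kafka and persists them to MySQL. The topic name, message shape, table schema, and connection strings are illustrative assumptions only.

```go
// Hypothetical sketch only: a small pipeline stage that reads metric points
// from a Kafka topic and upserts them into MySQL. All names are assumptions.
package main

import (
	"context"
	"database/sql"
	"encoding/json"
	"log"

	_ "github.com/go-sql-driver/mysql"
	"github.com/segmentio/kafka-go"
)

// MetricPoint is an assumed shape for a single performance sample.
type MetricPoint struct {
	HostID string  `json:"host_id"`
	Metric string  `json:"metric"`
	TS     int64   `json:"ts"`
	Value  float64 `json:"value"`
}

func main() {
	ctx := context.Background()

	db, err := sql.Open("mysql", "app:app@tcp(127.0.0.1:3306)/metrics")
	if err != nil {
		log.Fatalf("open mysql: %v", err)
	}
	defer db.Close()

	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "metrics-writer",
		Topic:   "metric-points",
	})
	defer r.Close()

	for {
		msg, err := r.ReadMessage(ctx)
		if err != nil {
			log.Fatalf("read kafka: %v", err)
		}
		var p MetricPoint
		if err := json.Unmarshal(msg.Value, &p); err != nil {
			log.Printf("skip malformed message: %v", err)
			continue
		}
		// Idempotent upsert keyed on (host_id, metric, ts) so replays are safe.
		if _, err := db.ExecContext(ctx,
			`INSERT INTO metric_points (host_id, metric, ts, value)
			 VALUES (?, ?, ?, ?)
			 ON DUPLICATE KEY UPDATE value = VALUES(value)`,
			p.HostID, p.Metric, p.TS, p.Value); err != nil {
			log.Printf("write mysql: %v", err)
		}
	}
}
```

In practice a stage like this would batch writes, expose health and throughput metrics, and run as one of many 12-factor services, but the shape of the work is the same.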
What You Will Be Doing
- Discover, define, design, document, and help develop scalable backend storage and robust data pipelines for structured and unstructured data streams in an AWS environment, based on Linux and Golang
- Work with others to define, and propose for approval, a modern data platform design strategy, along with an architecture and technology choices to support it, with the goal of providing a highly scalable, economical, observable, and operable data platform for storing and processing very large amounts of data within tight performance tolerances.
- Perform high-level strategy and hands-on infrastructure development for the VividCortex data platform, developing and deploying new data management services in AWS.
- Collaborate with engineering management to drive data systems design, deployment strategies, scalability, infrastructure efficiency, monitoring, and security.
- Write code, tests, and deployment manifests and artifacts.
- Work with CircleCI and GitHub in a Linux environment.
- Issue pull requests, create issues, and participate in code reviews and approval.
- Continually seek to understand, measure, and improve performance, reliability, resilience, scalability, and automation of the system. Our goal is that systems should scale linearly with customer growth, while the effort of maintaining them scales sub-linearly (see the benchmark sketch after this list).
- Support product management in prioritizing and coordinating work on changes, and serve as a lead in creating user-focused technical requirements and analysis.
- Assist with customer support, sales, and other activities as needed.
- Understand and enact our security posture and practices.
- Rotate through on-call duty.
- Contribute to a culture of continuous learning and clear responsibility and accountability.
- Manage your workload, collaborating and working independently as needed, keeping management appropriately informed of progress and issues.
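As one example of the measure-and-improve loop above, here is a minimal, hypothetical Go benchmark (the function, sizes, and package name are illustrative assumptions) for a batching step in a metrics pipeline, the kind of measurement used to check that ingest cost scales linearly with input volume.

```go
// Hypothetical sketch only: benchmark a batching step of a metrics pipeline.
package pipeline

import "testing"

// batch groups points into fixed-size slices for bulk writes downstream.
func batch(points []float64, size int) [][]float64 {
	var out [][]float64
	for start := 0; start < len(points); start += size {
		end := start + size
		if end > len(points) {
			end = len(points)
		}
		out = append(out, points[start:end])
	}
	return out
}

func BenchmarkBatch(b *testing.B) {
	points := make([]float64, 100000)
	b.ReportAllocs()
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		_ = batch(points, 500)
	}
}
```

Running `go test -bench=. -benchmem` over time makes regressions in throughput or allocations visible before they reach customers.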
Basic Qualifications:
- Experience developing and extending a SaaS multi-tenant application
- Domain expert in scalable, highly available data storage: scaling, organization, formats, security, reliability, and related concerns
- Capable of deep technical understanding and discussion of databases, software and service design, systems, and storage
- 10+ years of experience in distributed software systems design and development
- 7+ years of experience programming in Golang, Java, C#, or C
- 7+ years of experience designing, implementing, and maintaining data pipelines at big data scale, employing a wide variety of big data technologies, as well as cleaning and organizing data to be reliable and usable
- Experience designing highly complex data infrastructures and maintenance of same
- Mastery of relational database concepts, including strong knowledge of SQL and of technologies such as MySQL and PostgreSQL (see the sketch after this list)
- Experience with CI/CD, Git, and development in a Unix/Linux environment using the command line
- Excellent written and verbal communication skills
- Ability to understand and translate customer needs into leading-edge technology
- Collaborative with a passion for highly effective teams and development processes
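For a sense of the day-to-day relational work, here is a short, hypothetical Go sketch (the table, metric name, and DSN are assumptions) that runs a parameterized aggregate query over a metrics table using the standard database/sql package.

```go
// Hypothetical sketch only: a parameterized aggregate query over an assumed
// metric_points table via database/sql and the MySQL driver.
package main

import (
	"context"
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	ctx := context.Background()
	db, err := sql.Open("mysql", "app:app@tcp(127.0.0.1:3306)/metrics")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Top five hosts by average latency since a given timestamp.
	rows, err := db.QueryContext(ctx,
		`SELECT host_id, AVG(value) AS avg_value
		   FROM metric_points
		  WHERE metric = ? AND ts >= ?
		  GROUP BY host_id
		  ORDER BY avg_value DESC
		  LIMIT 5`,
		"query.latency.us", 1600000000)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var host string
		var avg float64
		if err := rows.Scan(&host, &avg); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s\t%.1f\n", host, avg)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}
```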
Preferred Qualifications:
- Master’s degree in Computer Science or equivalent work experience
- Experience designing and deploying solutions with NoSQL technologies such as MongoDB and DynamoDB
- 3+ years of experience with AWS infrastructure development, including experience with a variety of ingestion technologies, processing frameworks, and storage engines, and an understanding of the tradeoffs between them
- Experience with Linux systems administration and enterprise security