About Community Data Platforms
Community Data Platforms (CDP) believes communities get smarter and stronger through evidence-based decision-making. CDP supports this mission by building data platforms in communities across the country, helping leaders and decision-makers answer their pressing questions. With a five-year goal of serving 100 communities, CDP aims to support data-driven decision-making among governments, businesses, and nonprofits, leading to better outcomes for a range of critical community organizations. Communities have pressing questions, and CDP’s team and approach develop actionable insights and communicate them clearly using advanced visualizations. We believe a community data platform is an essential community service.
Working with Community Data Platforms
We’re looking for ambitious and resourceful people to join a rapidly growing start-up serving communities across the country.
Benefits include:
A culture of intellectual humility and a passion for understanding the world around us (we don’t care who’s right – we care what’s right!)
Working with driven and intellectually curious people who are experts in their fields
A close-knit team that is devoted to a common mission
The opportunity to be involved at the early stages of a start-up with enormous potential
Being at the forefront of the big data wave and its application to community-based decision-making
A remote working environment – you set your own hours and complete your work on your own time – no micromanaging!
Duties and Responsibilities:
The successful candidate will be responsible for organizing and improving a growing data warehouse for a fast-paced organization. They will support various teams in creating scalable solutions to answer pressing questions across multiple communities in the United States. An excellent candidate will work directly with ETL developers, data scientists, and dashboard developers to guide data from raw sources to actionable insight.
Regular activities will include managing database user groups, participating in brainstorming sessions, researching appropriate solutions, and translating data science results into dashboard-ready tables.
Requirements include:
At least 1 year of experience developing PostgreSQL solutions in an enterprise environment.
Experience maintaining cloud infrastructure on AWS and DigitalOcean.
Ability to design and organize a data warehouse for efficiency and scalability.
Expert-level scripting of regular processes in Python (object-oriented and functional) and SQL.
Proficiency with managing database user groups.
Awareness of emerging technologies and approaches in IT to keep the company at the forefront of innovation.
Optional:
Ability to create Airflow DAGs, including productionizing Jupyter Notebook and R scripts written by data science colleagues.
Maintaining SSL certificates and managing web servers, including Apache, Nginx, and Node deployments.
Troubleshooting user issues with OpenVPN/FreeIPA (LDAP authentication).
Preferred Specific Technologies:
PostgreSQL Database Administration
Linux (Ubuntu, CentOS) Systems Administration
Airflow
FreeIPA
OpenVPN