About the position
The Sr. Data Engineer (SDE) will lead the expansion, development, and operation of our data pipeline architecture, integrations, and data transfer for Internet Society systems and tools.
Location: Remote
Essential Duties and Responsibilities
- Build, deploy, and maintain infrastructure required for extraction, transformation, and loading (ETL) of data from a wide variety of data sources.
- Structure and combine data from disparate sources.
- Identify, analyze, design, and implement solutions to automate manual reporting processes and data delivery.
- Apply strong analytical skills in working with stakeholders across the organization to assist with measurement and data-related technical solutions and to support their data infrastructure needs.
- Support development of visualizations and dashboards as needed.
- Contribute to the development of Internet Society’s Data Management and integration strategy.
- Be self-directed and adaptable in a fast-paced environment with evolving needs.
- Other duties as assigned.
Desired Qualifications
The ideal candidate will have 10 years of experience and proficiency in the following areas:
- Python 3.x (Primary libraries include Pandas, Requests, Asyncio, SQLAlchemy, Pytest, Falcon)
- Relational and non-relational databases (Postgres, MySQL, MongoDB)
- Pipeline orchestration tools (Apache Airflow)
- Cloud server deployment and management (AWS, Digital Ocean, Linux-based)
- Object storage solutions (Digital Ocean Object Store, S3)
- Docker
- CI/CD tools (Jenkins, TravisCI)
- Version control software (Git/Github/GitLab)
- Developing data models and visualizations in BI tools (Looker)
- Message queuing services (AWS SQS, RabbitMQ, Redis, Celery)
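As a purely hypothetical illustration of the core stack above (Python 3.x with Pandas and SQLAlchemy), a minimal extract-transform-load step might look like the following sketch; the data, schema, and table name are invented for the example:

```python
# Minimal, hypothetical ETL sketch using Pandas and SQLAlchemy.
# Source data, column names, and the "members" table are illustrative only.
import io

import pandas as pd
from sqlalchemy import create_engine


def run_etl(csv_text: str, engine) -> int:
    # Extract: read raw CSV data into a DataFrame.
    df = pd.read_csv(io.StringIO(csv_text))
    # Transform: normalize column names and drop incomplete rows.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna()
    # Load: write the cleaned data to a relational table.
    df.to_sql("members", engine, if_exists="replace", index=False)
    return len(df)


# Usage: an in-memory SQLite database stands in for Postgres/MySQL.
engine = create_engine("sqlite:///:memory:")
rows = run_etl("Name,Country\nAda,UK\nLin,\n", engine)
```

In practice a step like this would be wrapped in an orchestration tool such as Apache Airflow rather than run ad hoc.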
Additional experience/familiarity with one or more of the following domains is a plus:
- Data Warehouses (Snowflake, BigQuery, Redshift).
- Developing visualizations using JavaScript (Leaflet, Plotly, d3.js).
- Working with Salesforce, Netsuite, and SAP Concur APIs.
- Stream processing (Apache Kafka, Apache Spark, PySpark).
- Web scraping (Selenium, Splinter, BeautifulSoup).
- Commitment to the Internet Society’s mission, values and objectives.
- Experience working with a globally distributed workforce and the ability to work across time zones.
- Excellent interpersonal skills with the ability to interact positively in a multicultural and multidisciplinary environment.