Requirements
- 5+ years of experience building and maintaining ETL pipelines
- You have worked in multiple programming languages such as Java, Python, C++, and JavaScript
- Experience with big data platforms such as Hadoop, Spark, and BigQuery
- Experience creating data pipelines and backend aggregations
- Experience building ETL workflows and data pipelines with tools such as Apache Spark, Apache Beam, Apache Airflow, Smartstreams, Fivetran, or AWS Glue
- Experience with a cloud data warehouse such as Redshift, Snowflake, BigQuery, or Synapse
- You are comfortable manipulating large data sets and writing raw SQL
- You are a clear communicator