5 years' experience in ETL, analytics, business intelligence, and data architecture, including data modeling (logical and physical), data visualization, data warehousing, and analytical enablement
4 years' experience
Maintain and build on our data warehouse and analytics environment.
Design, implement, test, deploy, and maintain stable, secure, and scalable data engineering solutions and pipelines in support of data and analytics projects.
General data manipulation skills: read in data, process and clean it, transform and recode it, merge different data sets together, reformat data between wide and long, etc.
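The manipulation tasks listed above (read, clean, recode, merge, reshape wide/long) can be sketched with pandas; the column names and data here are illustrative assumptions, not part of any specific role.

```python
import pandas as pd

# Illustrative wide-format data: one column per year (assumed example).
wide = pd.DataFrame({
    "id": [1, 2],
    "sales_2023": [100, 150],
    "sales_2024": [120, 160],
})

# Reshape wide -> long, then recode the year column from text to int.
long = wide.melt(id_vars="id", var_name="year", value_name="sales")
long["year"] = long["year"].str.replace("sales_", "", regex=False).astype(int)

# Merge a second data set on the shared key.
regions = pd.DataFrame({"id": [1, 2], "region": ["East", "West"]})
merged = long.merge(regions, on="id", how="left")

# Reshape long -> wide again.
wide_again = merged.pivot(index="id", columns="year", values="sales")
```

`melt`/`pivot` and `merge` cover the wide-long reshaping and joining described; real pipelines would add cleaning steps such as `dropna` or type coercion.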
Use APIs to push and pull data from various data systems and platforms.
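A minimal standard-library sketch of pushing and pulling data over a REST API; the endpoint URL, paths, and token are hypothetical assumptions for illustration.

```python
import json
import urllib.request
from typing import Optional

API_BASE = "https://api.example.com"  # hypothetical endpoint, not a real service


def build_request(path: str, payload: Optional[dict] = None,
                  token: str = "") -> urllib.request.Request:
    """Build a GET (pull) or POST (push) request with a JSON body and auth header."""
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    data = json.dumps(payload).encode("utf-8") if payload is not None else None
    return urllib.request.Request(f"{API_BASE}{path}", data=data, headers=headers,
                                  method="POST" if payload is not None else "GET")


def send(req: urllib.request.Request) -> dict:
    """Execute the request and decode the JSON response."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    pull = build_request("/records", token="secret")                     # pull (GET)
    push = build_request("/records", payload={"id": 1}, token="secret")  # push (POST)
```

In practice a client library such as `requests` is more common; `urllib.request` keeps the sketch dependency-free.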
Comfort with data management techniques and ETL platforms
Strong SQL programming skills
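The kind of SQL implied here (aggregation, grouping, ordering) can be demonstrated against an in-memory SQLite database; the table name and data are assumptions for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 100.0), (2, 'acme', 50.0), (3, 'globex', 75.0);
""")

# Aggregate revenue per customer, highest total first.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()
```

The same statements run unchanged on most relational engines; warehouse platforms such as Snowflake or BigQuery add their own dialect extensions on top.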
GCP native services: GCS, GKE, Composer, Functions, etc.
Expert skill in writing Python code for data processing, familiarity with MySQL or other relational databases, and navigating Unix or Linux
Hands-on experience building serverless ETL pipelines in Python.
Data warehouse: Snowflake (required) or RDS.
Proficient with BigQuery and GCP data warehousing tools
At least one data warehouse project delivered with Snowflake is required.
Demonstrated ability to partner with peers and leaders using strong written and verbal communication
Good to have:
Strong applied knowledge of basic statistical concepts and data best practices
Excellent listening, interpersonal, communication, and problem-solving skills.
Effective time management skills, including demonstrated ability to manage and prioritize multiple tasks and projects
Demonstrated ability to work effectively in teams, in both lead and support roles.
Demonstrated ability to work independently and be a self-starter.
Good understanding of Agile methodology