Senior Application Developer – Specialization in Big Data and Spark
Budget: 25 L
Looking for candidates from Chennai / South India who are available to join immediately.
Job Description:
We are looking for someone with strong experience in Google Cloud Platform (GCP), data lakes, and Hadoop as part of a hybrid architecture.
Big Data Engineer key responsibilities:
· Gather and understand data requirements, build complex data pipelines, and work to achieve high-quality data ingestion goals.
· Drive the development of cloud-based and hybrid data warehouses, data ingestion and data profiling activities.
· Analyze, re-architect, and re-platform the on-premises data lake to data platforms on GCP using GCP and third-party services.
· Implement and maintain a standard data/technology deployment workflow to ensure that all deliverables and enhancements are delivered in a disciplined and robust manner.
· Highly self-motivated and able to work independently as well as in a team environment.
Qualifications:
· 5+ years of experience in a Data Engineer role.
· Experience with Google Cloud Platform services: BigQuery (must), Dataflow, Airflow, Dataproc (Hadoop, Spark, Hive), GCS, Pub/Sub, and Cloud Functions; Google Cloud Platform certification is a plus.
· Experience with object-oriented/functional programming languages: Scala, Java, etc.
· Knowledge of ETL concepts, data curation, and analytical jobs using distributed computing frameworks such as Spark and Hadoop.
· Exceptional Python and complex SQL query skills are essential.
· Experience with CI/CD pipeline integration and documentation.