Role & Responsibilities:
● 4+ years of hands-on experience in data or platform engineering, with a focus on implementing governance solutions using Google Cloud Dataplex.
● Strong, practical experience with Dataplex features for metadata harvesting, data quality assessments, data profiling, and implementing automated governance workflows.
● Experience in automation and operations (DevOps/DataOps), including building and maintaining CI/CD pipelines (e.g., GitHub Actions) and using Infrastructure as Code (Terraform).
● Proficiency in the GCP ecosystem, including BigQuery, Google Cloud Storage, IAM, and Pub/Sub.
● Advanced proficiency in Python for automation, scripting governance tasks, and building data pipelines.
● Experience with diverse data platforms such as BigQuery, Oracle, SQL Server, and Spanner.
● Prior experience with data governance, data cataloging, and metadata management is a significant plus.
● Experience in writing and executing unit and system integration tests for data pipelines and automation scripts.
● Strong communication and collaboration skills, and the ability to work in a global environment.
Skills required:
● Dataplex
● GCP
● Automation
● Python
● CI/CD
● Terraform
● DevOps
What’s in it for you?
● An opportunity to work with an award-winning AI and Big Data software and services company
● To be a part of a dynamic and result-driven team
● To work with a team of vibrant enthusiasts who are always upbeat and self-driven to achieve big milestones