Company Details
Swiggy is India’s largest on-demand convenience platform, redefining the way people access food, groceries, and everyday essentials. Established in 2014, Swiggy operates across 500+ cities with a robust logistics network that ensures fast and reliable doorstep delivery. From restaurant meals to groceries via Instamart and hyperlocal pick-up and drop services through Genie, Swiggy has become an integral part of daily life for millions of Indians, combining technology, scale, and convenience in a seamless digital experience.
Job Roles & Responsibilities
- Design and implement scalable data pipelines using Apache Kafka for seamless data ingestion and processing.
- Optimize data integration processes to support Swiggy’s on-demand convenience platform efficiently.
- Collaborate with cross-functional teams to align data solutions with Swiggy’s logistics network.
- Monitor and maintain data pipeline performance, ensuring reliability of Swiggy's supply chain data.
- Develop strategies for improving data quality, contributing to Swiggy's overall digital user experience.
- Integrate advanced data solutions to support evolving features on Swiggy's platform.
- Provide technical support and promptly resolve data-related issues within agreed response times.
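To make the first responsibility above concrete, here is a minimal sketch of a Kafka-style ingest-and-process stage. Because Kafka itself requires a running broker, an in-memory queue stands in for the topic; the event schema, field names, and function names are illustrative assumptions, not Swiggy's actual pipeline.

```python
import json
from collections import deque

# Hypothetical stand-in for a Kafka topic; in a real pipeline this would be
# a Kafka consumer subscribed to something like an "order-events" topic.
topic = deque()

def produce(event: dict) -> None:
    """Serialize an event and enqueue it (stands in for a Kafka producer)."""
    topic.append(json.dumps(event).encode("utf-8"))

def consume_batch(max_events: int = 100) -> list:
    """Drain up to max_events messages from the topic and deserialize them."""
    batch = []
    while topic and len(batch) < max_events:
        batch.append(json.loads(topic.popleft()))
    return batch

# Simulated ingestion: two order events flow through the pipeline stage.
produce({"order_id": 1, "city": "Bengaluru", "amount": 349.0})
produce({"order_id": 2, "city": "Mumbai", "amount": 120.5})
processed = consume_batch()
print(len(processed))  # 2
```

In a production setting the same consume-process loop would add batching by partition, offset commits after successful processing, and dead-letter handling for malformed messages.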
Ideal Candidate Profile
- Deep experience with:
  - Streaming data (Kafka, Kinesis, real-time pipelines)
  - AWS services: Glue, DMS, Athena, IAM, CloudWatch
  - Production operations: monitoring, alerting, incident management, and CI/CD for data platforms
- AWS certifications (Solutions Architect, Data Analytics Specialty)
- Java or Scala experience (especially for Spark-based systems)
- Data quality frameworks and validation tooling
- Docker and containerized workloads
- Infrastructure as Code (Terraform, CloudFormation)
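The "data quality frameworks and validation tooling" item above can be sketched as a simple record-level validator. The required fields and bounds below are hypothetical assumptions for illustration, not Swiggy's actual schema.

```python
def validate_record(record: dict) -> list:
    """Return a list of data-quality violations for one delivery record.

    The required fields ("order_id", "city", "amount") and the
    non-negative-amount rule are illustrative assumptions.
    """
    errors = []
    for field in ("order_id", "city", "amount"):
        if field not in record:
            errors.append(f"missing field: {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

good = {"order_id": 7, "city": "Hyderabad", "amount": 250.0}
bad = {"order_id": 8, "amount": -10.0}

print(validate_record(good))  # []
print(validate_record(bad))   # ['missing field: city', 'amount must be non-negative']
```

A dedicated framework such as Great Expectations generalizes this pattern: validation rules are declared once and run automatically against each pipeline batch, with failures routed to alerting.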