Title: AWS Cloud Data Engineer

Date: 5 Aug 2025

Location: Bangalore, KA, IN

Job Description

We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that’s bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com.

Looking to jump-start your career? We understand how important the first few years of your career are; they create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene’s high-speed growth.

We are purpose-driven. We enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work.

Role Overview, Responsibilities, Skills, and Qualifications
Position Summary
An AWS Cloud Data Engineer is a specialist who designs, develops, and maintains scalable data architectures and solutions using Amazon Web Services (AWS). This role sits at the intersection of data engineering and cloud computing, requiring deep expertise in both traditional data pipelines and the AWS ecosystem. The ideal candidate will be passionate about building robust data systems, automating workflows, and leveraging cloud technologies to solve complex business problems.
Key Responsibilities
•    Data Pipeline Development: Design, build, and manage scalable, reliable, and secure data pipelines on AWS, enabling seamless data ingestion, processing, and storage across various data sources.
•    Cloud Architecture Design: Architect and implement data solutions using AWS services such as S3, Redshift, Glue, Lambda, EC2, EMR, Kinesis, DynamoDB, RDS, Aurora, and others.
•    ETL (Extract, Transform, Load): Develop, optimize, and monitor ETL processes to transform raw data into meaningful insights. Leverage services like AWS Glue, Data Pipeline, and third-party ETL tools.
•    Data Modeling: Design efficient and scalable data models for both structured and unstructured data. Implement best practices for data modeling on cloud-native platforms.
•    Automation and Orchestration: Automate data workflows using AWS Step Functions, Lambda, CloudWatch Events, and other orchestration tools.
•    Data Quality & Governance: Ensure data quality, integrity, and security. Implement and enforce data governance and compliance standards (e.g., GDPR, HIPAA).
•    Performance Optimization: Monitor and tune data solutions for optimal cost, speed, and reliability.
•    Collaboration: Work closely with data scientists, analysts, and business stakeholders to gather requirements, understand data needs, and deliver solutions.
•    Troubleshooting & Support: Provide ongoing support for data infrastructure, identifying and resolving issues proactively.
•    Documentation: Maintain comprehensive documentation of data pipelines, architecture decisions, workflows, and operational procedures.
•    Research & Innovation: Stay up-to-date with the latest trends in AWS cloud and data engineering, recommending new tools and approaches to enhance the organization’s data capabilities.

Required Skills and Qualifications
•    Educational Background: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, Mathematics, or a related field.
•    Cloud Expertise: Deep hands-on experience with AWS services, particularly those related to data storage, processing, and analytics (S3, Redshift, Glue, EMR, Lambda, DynamoDB, etc.).
•    Programming Skills: Proficiency in at least one programming language commonly used in data engineering, such as Python, Java, or Scala.
•    Database Technologies: Experience with relational databases (PostgreSQL, MySQL, Microsoft SQL Server, Aurora) and NoSQL databases (DynamoDB, MongoDB).
•    ETL Tools: Familiarity with ETL frameworks and orchestration tools such as AWS Glue, Apache Airflow, and others.
•    Data Warehousing: Expertise in designing, implementing, and maintaining data warehouses, with a strong understanding of Redshift, Snowflake, or similar platforms.
•    Deep understanding of VPCs, subnets, routing tables, NAT gateways, and VPNs.
•    Expertise in AWS Athena, Glue, and EMR for big data processing.
•    Experience with QuickSight for data visualization.
•    Experience integrating generative AI models, Snowflake, Foundry, and third-party services with AWS.
•    Proficiency in SageMaker for building and deploying machine learning models.
•    Big Data Technologies: Experience working with large-scale data processing frameworks such as Apache Spark, Hadoop, or Flink.
•    DevOps & CI/CD: Understanding of DevOps practices and tools for deploying and managing data pipelines (e.g., CloudFormation, Terraform, CodePipeline, Jenkins).
•    Security & Compliance: Knowledge of AWS security best practices, IAM policies, encryption techniques, and compliance frameworks.

Preferred Qualifications and Certifications
•    AWS Certified Data Analytics – Specialty
•    AWS Certified Solutions Architect – Associate or Professional
•    Experience with containerization technologies (Docker, Kubernetes, AWS ECS/EKS)
•    Background in machine learning workflows and integration with data pipelines
•    Experience with data visualization tools (Tableau, QuickSight, Power BI)
•    Knowledge of data streaming solutions (Kinesis, Kafka, AWS MSK)
•    Experience working in an Agile/Scrum environment

•    Communication: Excellent verbal and written communication skills; ability to convey complex technical concepts to both technical and non-technical audiences.
•    Problem-Solving: Strong analytical and troubleshooting skills, with a knack for identifying and resolving data pipeline bottlenecks and inefficiencies.
•    Adaptability: Comfort adapting to new technologies and navigating rapid changes in cloud and data engineering landscapes.

EQUAL OPPORTUNITY

Indegene is proud to be an Equal Opportunity Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristic. All employment decisions, from hiring to separation, are based on business requirements and the candidate’s merit and qualifications. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic.