Country
India
Work arrangement
Full time
Work scheme
Hybrid
Relocation assistance available
No
Posting date
31-Mar-2026
Job ID
16691

Description and requirements

GG11.1
Role Value Proposition
The Data Engineer plays a critical role in the data and analytics life cycle and contributes significantly to production-grade data and analytics solutions. The role requires demonstrated Big Data, engineering, and cloud expertise, and includes mentoring other Data Engineers. It is an individual contributor role, expected to solve wide-ranging business problems.
Experience
8-10+ years of relevant experience
Education
Bachelor's degree in computer science, information technology, or an equivalent educational qualification
Responsibilities
• Design, build, and maintain robust ETL/ELT pipelines on cloud (Azure) or on-prem to collect, ingest, and store large volumes of structured and unstructured data for batch/real-time processing
• Monitor, optimize, and troubleshoot data pipelines to ensure reliability, scalability, and performance
• Ensure data processing, quality, security, and compliance guidelines, policies, and standards are followed
• Collaborate with multiple partners across Business, Technology, Operations, and D&A capabilities (Data Governance, Data Quality, Data Modeling, Data Architecture, Data Science, DevOps, BI & Insights)
• Mentor Data Engineers
• Independently lead design, solutioning & estimations
Technical Skills
• SQL, Python/Scala
• NoSQL and distributed databases (HBase, Cosmos DB)
• ETL pipeline design and development; solutioning and estimation
• Big Data frameworks: Apache Spark, Hadoop, Hive
• Cloud platforms: Azure Data Factory, Event Hubs, Azure Functions, Synapse, Databricks
• Data warehouses, data marts, data lakes
• Medallion architecture
• Performance tuning, optimization, and data quality validation
• Real-time and batch data processing; streaming pipelines with Spark
• Communication skills, analytical skills, structured problem-solving skills
• Mentorship experience
• Storytelling skills; partner and stakeholder engagement experience
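The Medallion architecture listed above is usually built on Databricks/Spark tables; as a minimal, language-agnostic sketch of the bronze/silver/gold layering in plain Python (the records, field names, and transforms here are illustrative assumptions, not part of the role description):

```python
# Sketch of the Medallion (bronze/silver/gold) pattern in plain Python.
# Production versions use Spark/Databricks tables; this only shows the
# layering idea: land raw, then clean, then aggregate.

def to_bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with a layer marker."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: apply data-quality rules -- drop keyless rows, normalize names."""
    silver = []
    for row in bronze_rows:
        if row.get("id") is None:
            continue  # quality rule: reject records without a key
        silver.append({"id": row["id"],
                       "name": str(row.get("name", "")).strip().lower()})
    return silver

def to_gold(silver_rows):
    """Gold: aggregate into a business-ready view (record count per name)."""
    counts = {}
    for row in silver_rows:
        counts[row["name"]] = counts.get(row["name"], 0) + 1
    return counts

raw = [{"id": 1, "name": " Alice "},
       {"id": None, "name": "x"},      # rejected at the silver layer
       {"id": 2, "name": "alice"}]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'alice': 2}
```

The same progression maps directly onto Delta tables in Databricks, with each function replaced by a Spark job writing to the next layer.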
Good to Have
• DevOps practices: Git, Azure DevOps, CI/CD pipelines
• Unix shell scripting, Kafka, MongoDB, NiFi
• Exposure to Gen AI technology and tools
• Banking, Financial Services, and Insurance (BFSI) domain knowledge
About MetLife

Recognized on Fortune magazine's list of the "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.

Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact Over Activity, and Think Ahead - we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!