Country
India
Employment Type
Full-Time Regular (Permanent)
Work Arrangement
Hybrid
Relocation Assistance Available
No
Posted Date
31-Mar-2026
Job ID
16659

Job Description & Requirements

 

Position Summary

 

MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, LatAm, and Corporate functions) enabled by enterprise COEs: data engineering, data governance, and data science.

 

Role Value Proposition 

 

The Senior Big Data Engineer plays a critical role in the data and analytics life cycle and contributes significantly to production-grade data and analytics solutions. The role requires demonstrated Big Data, engineering, and cloud expertise. It is an individual contributor role, expected to function independently.

 

 

Job Responsibilities

 

        Design, build, and maintain robust ETL/ELT pipelines on cloud (Azure) or on-prem to collect, ingest, and store large volumes of structured and unstructured data for batch/real-time processing

        Monitor, optimize, and troubleshoot data pipelines to ensure reliability, scalability, and performance

        Ensure data processing, quality, security, and compliance guidelines, policies, and standards are followed

        Collaborate with multiple partners across Business, Technology, Operations, and D&A capabilities (Data Governance, Data Quality, Data Modeling, Data Architecture, Data Science, DevOps, BI & Insights)


 

Education

 

Bachelor’s degree in Computer Science, Information Technology, or an equivalent educational qualification

 

Experience (in Years)

8-11+ years of relevant experience

 

Technical Skills

        SQL, Python/Scala

        NoSQL and distributed databases (HBase, Cosmos DB)

        ETL pipeline development

        Big Data frameworks: Apache Spark, Hadoop, Hive

        Cloud platforms: Azure Data Factory, Event Hub, Azure Functions, Synapse, Databricks

        Data warehouses, data marts, data lakes

        Medallion architecture

        Performance tuning, optimization, and data quality validation

        Real-time and batch data processing, streaming pipelines with Spark

        Communication, analytical, and structured problem-solving skills

        Partner and stakeholder engagement experience

 

Preferred skills

• DevOps practices: Git, Azure DevOps, CI/CD pipelines

• Unix shell scripting, MongoDB, NiFi

• Exposure to Gen AI technology and tools

 

    

 

About MetLife

Recognized on Fortune magazine's list of the "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.

Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact Over Activity, and Think Ahead - we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!