MetLife
MetLife is seeking a Big Data Lead Software Engineer to lead solutions to critical business issues and to ingest data into MDH (MetLife Data Hub) using Big Data technologies. The successful candidate will have the opportunity to work with and learn an emerging Big Data platform alongside the development and Global COE teams. This position will work closely with the existing development team to provide support in our development and production environments. This individual will also work closely with our application development teams to design, develop, and implement new enhancements and database applications in MDH. The ideal candidate will be technically savvy, have excellent communication skills, and possess the ability to build relationships and collaborate effectively with business and technology partners. A passion for technology with a proven track record of implementing new technologies and features is a must.
- Improve and reengineer existing solutions to enhance data speed and latency, data quality, and the ability to derive analytics insights.
- Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
- Join the emerging platform organization, which is responsible for building and deploying customer-facing solutions using Big Data technologies.
- Hands-on development with Spark, Hive, NoSQL, SOLR, and in-memory data processing.
- Write automation scripts using shell scripting and Python.
- Develop and maintain operational runbooks.
- Serve as an escalation point for deployments and change oversight, enabling highly visible business priorities.
- Develop data architecture and solution designs for the classic batch, speed, and serving layers.
- Bring your expertise in developing a strategic data supply chain for a superior customer experience.
- Optimize and tune queries and job flows in Hadoop environments to meet performance requirements.
- Experience working in cross-functional, multi-location teams.
- Experience with real-time data delivery, Kafka, NiFi, and change data capture.
- Cloud experience with Azure and Google Cloud is a plus.
- Work with Big Data developers to design scalable, supportable applications.
- Advanced knowledge of the principles of computing, including software engineering, systems analysis and design, design and analysis of algorithms, digital systems, computer system architecture, data structures, computer networks, database management systems, problem solving, and computer programming, among others.
- Core hands-on development skills in the Big Data ecosystem, i.e., Spark (Scala), SOLR, Pig, Hive, Kafka, and HBase.
- A bachelor's degree in computer science is required; a master's degree is preferred.
- Strong communication skills to lead interactions with business and technology leaders across the global lines of businesses.
- 7+ years of software design and development experience.
- 4+ years of experience deploying Hadoop components: Hive, Spark, HBase, and SOLR.
- Good understanding of release management and CI/CD.
- Excellent analytical and problem-solving skills.
Vacancy Type: Full Time
Job Location: Fayetteville, NC, US
Application Deadline: N/A