Company: Mastercard
Location: Pune, MH, India
Career Level: Associate
Industries: Banking, Insurance, Financial Services

Description

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary

Lead Data & Platform Engineer (AWS Native Services, Databricks, Hadoop, Python)

Overview:
We are looking for a self-motivated and enthusiastic Lead Cloud Data / Platform Engineer to join the Data Platforms and Engineering Services team. This individual will work with our Cloud Architects, Cloud Infrastructure, Information Security, Technical Product Managers, senior stakeholders, and other cloud data / platform engineers to ensure we deliver the right solutions for our customers.
The Data Platform provides real-time streaming, batch data processing, pipeline orchestration, data lake management, and data cataloging capabilities, and delivers data on time and with quality, supported by metrics. As a key member of our data platform team, you will have the opportunity to apply your expertise in solving big data problems, using your design, coding, and analytical skills to build core capabilities, frameworks, and data pipelines.

Role:
• Design, develop, and implement large-scale, high-volume, high-performance, highly available, scalable data platform infrastructure and pipelines for the lakehouse data platforms in the AWS cloud.
• Work closely with senior data / platform engineers and data architects using Agile methodology; take ownership of assigned tasks and deliver results in a timely manner.
• Assist in troubleshooting and resolving issues with data platforms, pipelines, and AWS services, ensuring they run smoothly and efficiently.
• Stay up to date with emerging technologies and trends in the data engineering and cloud space, and be willing to learn and use new tools and platforms that can improve efficiency.

All About You:
• 9-13 years of overall experience in software engineering or a related field
• Architect and implement robust data pipelines across large-scale distributed systems using AWS native services.
• Lead the development and maintenance of data platforms leveraging services such as:
--Data & Analytics: AWS Glue, EMR, MSK, Athena, Redshift
--Data Governance: Lake Formation, Glue Data Catalog
--Storage: S3 (general purpose and S3 Tables)
--Monitoring: CloudWatch, CloudTrail
• Work hands-on with open table and file formats, including Apache Iceberg and Parquet.
• Design and deploy infrastructure using CloudFormation and Infrastructure as Code (IaC) principles.
• Develop scalable solutions using at least one programming language (e.g., Python, Java).
• Experience with cloud data processing platforms such as Databricks, Apache Spark, and Snowflake for large-scale data processing and distributed computing across massive datasets.
• Collaborate with cross-functional teams to ensure data integrity, governance, and accessibility.
• Utilize data modeling techniques for both SQL and NoSQL systems and apply ETL and data warehousing best practices.
• Contribute to machine learning workflows and analytics using tools like Amazon SageMaker and Amazon MWAA (Managed Workflows for Apache Airflow).
• Proven experience in building and managing data platforms in cloud environments, especially AWS.
• Experience with Bitbucket, Rally, and Jenkins is a plus.
• Support all tiers of applications deployed in the cloud, configuring networks, cloud native services, infrastructure, and privileges for development teams (AWS accounts, IAM users and roles, VPCs, S3 buckets, and other cloud native services); knowledge of cloud networking concepts such as subnets, routing, load balancing, firewalls, and cloud security.
• Knowledge of Hadoop (CDP), relational databases and SQL, ETL development, data validation and testing, and data analysis, along with experience with Hadoop platform technologies such as Hive, Impala, Sqoop, Hue, Spark, and Python, is a plus.
• Strong communication and collaboration skills, with the ability to work effectively with data professionals and business stakeholders.
• Detail-oriented with the ability to manage multiple priorities in a fast-paced, deadline-driven environment.
• Bachelor's degree or higher in Computer Science, Information Technology, or a related field.

Corporate Security Responsibility


All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:

  • Abide by Mastercard's security policies and practices;

  • Ensure the confidentiality and integrity of the information being accessed;

  • Report any suspected information security violation or breach; and

  • Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

