
Lead Software Engineer - AWS, PySpark, BigData

JPMorganChase
1 day ago
Full-time
On-site
Telangana, India
Description

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Lead Software Engineer at JPMorganChase within Consumer and Community Banking's Card & Connected Commerce, you are an integral part of an agile team that works to enhance, build, and deliver trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

  • Executes creative software solutions: designs, develops, and deploys cloud-native applications using AWS services (EMR, S3, Lambda, RDS, DynamoDB, ECS, EKS, etc.)
  • Automates infrastructure provisioning and application deployment using infrastructure-as-code tools (Terraform, AWS CDK)
  • Develops secure, high-quality production code, and reviews and debugs code written by others
  • Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
  • Monitors, troubleshoots, and optimizes application performance and reliability in the AWS environment
  • Ensures security and compliance by implementing IAM policies, encryption, and monitoring
  • Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
  • Adds to team culture of diversity, opportunity, inclusion, and respect


Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 5+ years of applied experience
  • Advanced-level skills in programming languages such as Python or Java
  • Extensive hands-on experience with AWS cloud services and infrastructure
  • Experience developing and maintaining applications on public cloud
  • Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
  • Experience with batch and real-time/streaming pipelines: end-to-end design of ETL/ELT in Spark, orchestration with Airflow/EventBridge, schema evolution, and backfills
  • Proficiency in automation and continuous delivery methods
  • In-depth experience with Big Data, Spark, and Databricks
  • Hands-on practical experience delivering system design, application development, testing, and operational stability
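As a flavor of the backfill work mentioned in the pipeline qualification above, the sketch below shows how an orchestrator such as Airflow typically fans a backfill out into one task per date partition. This is an illustrative, hypothetical example (function and variable names are not from the posting), not part of the job description:

```python
from datetime import date, timedelta

def backfill_partitions(start: date, end: date, step_days: int = 1):
    """Yield partition dates from start to end inclusive -- the unit of
    work an orchestrator would schedule as individual backfill runs."""
    current = start
    while current <= end:
        yield current
        current += timedelta(days=step_days)

# Example: plan a one-week backfill of daily ETL partitions.
plan = [d.isoformat() for d in backfill_partitions(date(2024, 1, 1), date(2024, 1, 7))]
print(plan)  # ['2024-01-01', '2024-01-02', ..., '2024-01-07']
```

In practice each yielded date would parameterize a Spark ETL job run (for example, as an Airflow task instance's logical date), so failed days can be re-run idempotently.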

Preferred qualifications, capabilities, and skills
  • AWS certifications (Solutions Architect, Data Engineer, Developer)
  • Familiarity with GenAI and code-generation tools such as Copilot and Claude
  • Familiarity with modern, cloud-based data platforms like Snowflake or Databricks
  • Experience with data modeling, data security, and privacy controls in the financial industry