Software Engineer II- Python, Databricks, AWS, Spark, IDMC

JPMorgan Chase & Co.

Full-time · Bengaluru · 3 days ago

Job Description

We have an exciting opportunity for you to advance your data engineering career and make a meaningful impact by joining our innovative team.

Job summary

As a Data Engineer II at JPMorgan Chase within the Corporate Data and Analytics Services team, you will design and deliver trusted, scalable data solutions using modern technologies. You will collaborate across teams to drive critical technology initiatives that support business objectives and foster a culture of growth and inclusion.

Job responsibilities

  • Design, develop, and maintain scalable data pipelines using Python and Spark
  • Build and optimize ETL workflows in Databricks, leveraging Delta Lake features
  • Integrate and manage data across AWS services such as S3, Lambda, and EKS
  • Collaborate with data analysts and business stakeholders to deliver solutions
  • Ensure data quality, integrity, and security across engineering processes
  • Monitor, troubleshoot, and optimize pipeline performance and resource usage
  • Document data flows, architecture, and processes for internal knowledge sharing

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 2+ years applied experience
  • Proficient in Python for data processing and automation
  • Strong experience with Apache Spark (PySpark) for distributed data processing
  • Hands-on experience with the Databricks platform and Delta Lake
  • Solid understanding of AWS cloud services, including S3, Lambda, EKS, and Aurora DB
  • Experience with ETL design, data modeling, and data warehousing concepts
  • Familiarity with CI/CD tools and practices for data engineering

Preferred qualifications, capabilities, and skills

  • Familiarity with modern front-end technologies
  • Exposure to cloud technologies
  • Experience with orchestration tools such as Airflow
  • Experience with REST APIs and data integration

Requirements

Candidates must have formal training or certification in software engineering concepts along with a minimum of two years of applied experience, demonstrating proficiency in Python and strong experience with Apache Spark (PySpark). Solid hands-on experience with the Databricks platform, Delta Lake, and core AWS services like S3, Lambda, and EKS is required.

Education: Professional Certificate

Experience Level: 2-5 years


Job Details

Job Type: Full-time

Location: Bengaluru

Experience Level: Mid-level

Posted: Mar 20, 2026