Apply for the Chubb Off Campus Drive 2025! Chubb is hiring a Software Engineer in Hyderabad for candidates with a BE/BTech/IT degree and 4+ years of experience. The role calls for expertise in Delta Live Tables, Snowflake, Python, ETL automation, Medallion Architecture, and machine learning/NLP to build scalable data pipeline solutions.
Candidates interested in the Chubb Off Campus Drive job openings can go through the details below for more information.
Key Details of the Software Engineer Job
Company: Chubb
Qualifications: BE/BTech/IT
Experience Needed: 4+ years
Job Req ID: 24273
Location: Hyderabad

Start Date: 10th November 2025
Click here to Join on WhatsApp:- https://wa.link/28qo7j
Click here to Join on Telegram:- https://telegram.me/qaidea
Job Description
We are searching for a seasoned Databricks Engineer with strong Machine Learning (ML) experience to pioneer Databricks solutions. The Databricks Engineer will serve as a member of our data engineering team, delivering highly sophisticated technological solutions using Databricks, Delta Live Tables, Snowflake, and Medallion Architecture.
Key Responsibilities:
- Building and maintaining medallion-architecture-based data pipelines on the Databricks platform.
- Handling raw data ingestion through Delta Live Tables while ensuring data quality and low access latency.
- Utilizing knowledge of Snowflake to offer insight into data warehousing and management.
- Writing and reviewing production code in Python, with experience designing scalable end-to-end Machine Learning/NLP systems.
- Working with distributed, high-throughput, low-latency architectures.
- Applying NLP techniques such as text cleaning/pre-processing, entity extraction, encoder-decoder architectures, and similarity matching.
- Familiarity with Continuous Integration tools such as Jenkins.
- Working in conjunction with data science teams to implement complex statistical models.
- Ensuring data security and compliance in Databricks and Snowflake environments.
- Developing and deploying ETL job workflows with an emphasis on automation and scalability.
- Keeping up to date with industry best practices and emerging trends in Databricks technology.
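To make the medallion flow in the responsibilities above concrete, here is a minimal, hypothetical sketch in plain Python. In a real Databricks deployment these stages would be Delta Live Tables with declared expectations; the stage names, record schema, and quality rule here are illustrative assumptions, not Chubb's actual pipeline.

```python
# Illustrative bronze -> silver -> gold medallion pipeline over in-memory
# records. All schemas and rules here are hypothetical examples.

def bronze_ingest(raw_rows):
    """Bronze: land raw data as-is, tagging each row with its source."""
    return [{"source": "landing", **row} for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: enforce a simple data-quality rule (non-null amount) and normalize."""
    return [
        {"customer": r["customer"].strip().lower(), "amount": float(r["amount"])}
        for r in bronze_rows
        if r.get("amount") is not None
    ]

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate -- total amount per customer."""
    totals = {}
    for r in silver_rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

raw = [
    {"customer": " Alice ", "amount": "10.5"},
    {"customer": "bob", "amount": "4.5"},
    {"customer": "alice", "amount": None},  # dropped by the silver quality check
]
report = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(report)  # {'alice': 10.5, 'bob': 4.5}
```

In Delta Live Tables the same shape appears as `@dlt.table`-decorated functions with expectations replacing the hand-written quality filter.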
Requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field
- 4+ years of experience with the Databricks platform, data engineering, and Delta Live Tables
- Experience with medallion architecture in a data lake environment
- Good to have: knowledge of the Snowflake data warehousing platform
- Strong understanding of ETL workflows and data pipeline automation.
- Skilled in SQL, Python, or Scala.
- Experience with LLM tooling and frameworks such as LangChain and LlamaIndex; knowledge of basic algorithms, object-oriented and functional design principles, and best-practice patterns
- Experience building software on top of containerization technology (Kubernetes, Docker, etc.), and familiarity with frameworks/tools such as FastAPI and Uvicorn.
- Good problem-solving abilities, with an innovative approach to tackling complex challenges.
- Exceptional communication skills with proven ability to bridge communication gaps between technical and non-technical stakeholders.
- Ideal candidates are proactive, collaborative, and passionate about data and Databricks technologies. If you have a desire to be at the forefront of data innovation, we would like to meet you.
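The similarity-matching skill listed above can be illustrated with a minimal, self-contained sketch: cosine similarity over bag-of-words count vectors in pure Python. Production NLP systems would use embeddings from an encoder model rather than raw word counts; the tokenizer, function names, and sample texts here are illustrative assumptions.

```python
import math
from collections import Counter

def bow_vector(text):
    """Naive tokenization (lowercase, whitespace split) into a bag-of-words count vector."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two sparse count vectors (Counters)."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

query = bow_vector("scalable data pipeline")
doc1 = bow_vector("building a scalable data pipeline on databricks")
doc2 = bow_vector("annual company picnic announcement")
# The on-topic document scores higher than the unrelated one.
print(cosine_similarity(query, doc1) > cosine_similarity(query, doc2))  # True
```

The same interface generalizes: swap `bow_vector` for an embedding model and the ranking logic is unchanged.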
Apply Now for Chubb Software Engineer Job
How to Apply for the Chubb Off Campus Drive 2025
Click on the official Chubb apply link above; you will be taken to the company's official site.
First, carefully check the experience needed, the job description, and the skills required.
Stay connected and subscribe to Jobformore.com to get the latest job updates from Jobformore for freshers and experienced candidates.
Interview Questions
- How have you designed and optimized data pipelines using Databricks and Delta Live Tables?
- Describe your experience implementing Medallion Architecture in data lake environments.
- Explain your proficiency with Snowflake and data warehousing concepts.
- How do you integrate machine learning/NLP models into scalable data platforms?
- What ETL automation tools and techniques have you applied?
- Share your experience with Python or Scala in production data engineering.
- How do you ensure data quality, security, and compliance in cloud data platforms?
- Describe your knowledge of LLM tooling and frameworks like LangChain or LlamaIndex.
- What strategies do you use for deploying containerized data services with Kubernetes or Docker?
- How do you communicate complex technical concepts to non-technical stakeholders?