**With you, Chubb is better!**
Chubb is the world's largest publicly traded P&C insurance company and a leading commercial lines insurer in the United States. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. As an underwriting company, we assess, assume, and manage risk with insight and discipline. We service and pay our claims fairly and promptly. We combine the precision of craftsmanship with decades of experience to conceive, craft and deliver the very best insurance coverage and service to individuals and families, and businesses of all sizes.
Chubb is also defined by its extensive product and service offerings, broad distribution capabilities, direct-to-consumer platform partnerships, exceptional financial strength and local operations globally. The company serves multinational corporations, mid-size and small businesses with property and casualty insurance and risk engineering services; affluent and high net worth individuals with substantial assets to protect; individuals purchasing life, personal accident, supplemental health, homeowners, automobile and specialty personal insurance coverage; companies and affinity groups providing or offering accident and health insurance programs and life insurance to their employees or members; and insurers managing exposures with reinsurance coverage.
**Position Overview**:
**Primary Responsibilities**:
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders, including management, domain leads, and delivery teams, to assist with data-related technical issues and support their data infrastructure needs.
- Apply hands-on experience with Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos DB, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Ensure enterprise data policies, best practices, standards, and processes are followed.
- Write and maintain technical specifications, design documents, and process flows.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for the technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate, and collaborate with business, IT architecture, and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in data architecture design, tool selection, and data flow analysis.
- Work with large amounts of data: interpret data, analyze results, perform gap analysis, and provide ongoing reports.
- Handle ad-hoc analysis and report-generation requests from the business, and respond to data-related inquiries to support business and technical teams.
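As a minimal sketch of the extract-transform-load pattern the responsibilities above describe (in plain Python, with SQLite standing in for a warehouse; all table, column, and field names here are illustrative, not Chubb systems):

```python
# Minimal ETL sketch: extract raw records, transform them, and load them
# idempotently into a target table. SQLite stands in for a real warehouse;
# the "policies" table and its fields are hypothetical examples.
import sqlite3

def extract(rows):
    """Extract: yield raw records (here, from an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize the key and derive an annual premium."""
    return {
        "policy_id": record["policy_id"].strip().upper(),
        "annual_premium": round(record["monthly_premium"] * 12, 2),
    }

def load(conn, records):
    """Load: upsert so that re-running the pipeline is idempotent."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policies "
        "(policy_id TEXT PRIMARY KEY, annual_premium REAL)"
    )
    conn.executemany(
        "INSERT INTO policies (policy_id, annual_premium) "
        "VALUES (:policy_id, :annual_premium) "
        "ON CONFLICT(policy_id) DO UPDATE SET "
        "annual_premium = excluded.annual_premium",
        records,
    )
    conn.commit()

raw = [
    {"policy_id": " p-001 ", "monthly_premium": 120.50},
    {"policy_id": "p-002", "monthly_premium": 89.99},
]
conn = sqlite3.connect(":memory:")
load(conn, [transform(r) for r in extract(raw)])
result = dict(conn.execute("SELECT policy_id, annual_premium FROM policies"))
```

The upsert in `load` is one common ETL/ELT best practice: it makes reloads safe to repeat without duplicating rows.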
**Technical Skills / Experience**:
- 7+ years of proven working experience in ETL methodologies, data integration, and data migration; hands-on development skills in Informatica IICS, Databricks/Spark, and Python are a must.
- Hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, Cosmos DB), cloud platforms (e.g., AWS, Azure, Google Cloud), and NoSQL databases (e.g., Cosmos DB, MongoDB, DynamoDB).
- Skilled in backend technologies (Python, Java, Scala) and front-end development (JavaScript, React, Angular).
- Extensive experience developing complex data ecosystem solutions.
- Extensive knowledge of data and analytics frameworks supporting data lakes, warehouses, marts, and reporting.
- In-depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with big data tools and building data solutions for advanced analytics and machine learning frameworks.
- Solid understanding of P&C Insurance data
- Technical expertise in data architecture, data models, and database design and development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong skills in analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Quick to learn new technologies, with the ability to rapidly understand their capabilities and work with others to guide them into development.
- Good communication and presentation skills
- Solid problem-solving and decision-making skills.