Create Optimized and Scalable Data Models (star schemas, snowflake schemas, etc.) for Enterprise Analytics Solutions.
- Experience with Databricks / Snowflake
- Experience with APIs, Python, and Java
- Work experience in data ingestion from Salesforce
- Develop ELT/ETL pipelines and implement best practices for ELT/ETL development.
- Create Data Pipelines that are highly optimized with very large data sets.
- Work effectively using scrum with multiple team members to deliver analytical solutions to the business functions.
- Have a high sense of urgency to deliver projects as well as troubleshoot and fix data queries/issues
- Always be on the lookout to automate and improve existing data processes for quicker turnaround and high productivity
- Experience with designing complex Data Models and Data Engineering Solutions for large-scale Data Warehouses/Data Lakes from various heterogeneous Data Sources.
- Experience with Data Integration, Business Intelligence and Analytics tools and/or other open-source and self-service Analytics tools (e.g., Pentaho Data Integration).
- Strong database management system knowledge; experience with Microsoft SQL Server, PostgreSQL, MySQL, Oracle, and NoSQL databases is required.
- Experience with the AWS technology stack, including S3, Redshift, Glue, RDS, SageMaker, or similar solutions.
- Experience designing data models with Salesforce, Workday, Marketo, Gainsight, or Adobe Analytics is preferred.
- Experience with Agile development methodologies (Scrum, Pair Programming).
**Job Type**: Contract
**Salary**: Up to $42.00 per hour
Ability to commute/relocate:
- New York, NY 10122: Reliably commute or willing to relocate with an employer-provided relocation package (required)
**Experience**:
- Informatica: 5 years (required)
- SQL: 5 years (required)
- Data warehouse: 5 years (required)
**Speak with the employer**
+1 603 483 3320