Summary: The Data Engineer role centers on Salesforce CRM data structures and on designing and maintaining the data pipelines that ingest that data. The position requires expertise in cloud and DevOps tools, with a preference for candidates familiar with Microsoft Fabric. The engagement is remote, targets candidates in the US Eastern time zone, and is expected to last 10-14 weeks, starting in early to mid-December.
Key Responsibilities:
- Design and maintain data pipelines ingesting Salesforce CRM data.
- Work with Salesforce CRM data structures.
- Utilize Azure DevOps for repository and pipeline management.
- Implement AWS services such as S3, EC2, and Lambda.
- Use orchestration tools like Airflow.
- Support migration, configuration, or pipeline deployment in Microsoft Fabric if required.
Key Skills:
- Strong experience with Salesforce CRM data structures.
- Hands-on experience with Azure DevOps and AWS.
- Familiarity with Airflow or similar orchestration tools.
- Exposure to Microsoft Fabric environments is preferred.
- Understanding of data engineering components in Microsoft Fabric.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Engagement Details:
Job Title(s): Data Engineer
Duration: 10-14 weeks initial phase, starting early to mid-December (will break for holidays)
Location/Travel: Remote; US Eastern time zone preferred
Required Expertise:
Salesforce CRM Expertise
- Strong experience working with Salesforce CRM data structures
- Ability to design and maintain pipelines ingesting Salesforce CRM data (see the sketch after this list)
- Bonus: Experience with IQVIA CRM data feeds or pipeline integrations
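As a rough illustration of the Salesforce ingestion work described above, the sketch below pulls Account records over SOQL and lands them as a flat file. It assumes the simple-salesforce Python client, username/password/token authentication, and placeholder credentials and object names; the actual objects, landing zone, and pipeline framework are not specified in this engagement.

```python
# Minimal Salesforce CRM extraction sketch (assumptions: simple-salesforce client,
# username/password/token auth, Account object; a production pipeline would add
# incremental watermarks, schema handling, retries, and secret management).
import pandas as pd
from simple_salesforce import Salesforce

def extract_accounts(username: str, password: str, security_token: str) -> pd.DataFrame:
    """Query Account records via SOQL and return them as a DataFrame."""
    sf = Salesforce(username=username, password=password, security_token=security_token)
    result = sf.query_all("SELECT Id, Name, Industry, LastModifiedDate FROM Account")
    # Each record carries a Salesforce 'attributes' metadata dict; drop it for tabular output.
    return pd.DataFrame(result["records"]).drop(columns=["attributes"])

if __name__ == "__main__":
    df = extract_accounts("user@example.com", "password", "token")  # placeholder credentials
    df.to_parquet("accounts.parquet", index=False)                  # landing file for downstream steps
```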
Cloud, DevOps & Workflow Tools
Candidates should have hands-on experience with:
- Azure DevOps (repos, pipeline management)
- AWS (e.g., S3, EC2, Lambda, IAM basics)
- Airflow (or similar orchestration tools; see the sketch after this list)
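To make the orchestration and AWS expectations above concrete, here is a minimal sketch of an Airflow DAG (TaskFlow API, Airflow 2.x assumed) that stages an extract locally and pushes it to S3 with boto3. The bucket name, key, schedule, and task bodies are placeholders rather than details from the engagement.

```python
# Illustrative Airflow DAG sketch (assumptions: Airflow 2.x TaskFlow API, AWS credentials
# available to boto3, placeholder bucket/key names).
from datetime import datetime

import boto3
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 12, 1), catchup=False, tags=["salesforce"])
def salesforce_to_s3():
    @task
    def extract() -> str:
        # In practice this would call the Salesforce extraction step; here it just
        # writes a placeholder file and returns its path.
        path = "/tmp/accounts.parquet"
        open(path, "wb").close()
        return path

    @task
    def load_to_s3(path: str) -> None:
        # Upload the staged file to a raw landing prefix in S3 (placeholder bucket/key).
        boto3.client("s3").upload_file(path, "example-raw-bucket", "raw/salesforce/accounts.parquet")

    load_to_s3(extract())

salesforce_to_s3()
```

Azure DevOps would typically hold the repository and CI/CD pipelines that deploy DAGs like this; that deployment configuration is outside the scope of the sketch.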
Microsoft Fabric (preferred, not mandatory)
- Exposure to or hands-on experience with Microsoft Fabric environments
- Understanding of Fabric's data engineering components (Lakehouse, Pipelines, Dataflows, etc.; see the sketch below)
- Ability to support migration, configuration, or pipeline deployment in Fabric if required
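For the Fabric-related items above, a minimal sketch of writing ingested data into a Lakehouse is shown below. It assumes execution inside a Microsoft Fabric notebook with a default Lakehouse attached (so the `spark` session and the "Files/" path already exist); file and table names are placeholders.

```python
# Fabric Lakehouse sketch (assumptions: run inside a Microsoft Fabric notebook with a
# default Lakehouse attached, so `spark` is pre-defined and "Files/" resolves to the
# Lakehouse Files area; names are placeholders).
raw = spark.read.parquet("Files/raw/accounts.parquet")   # file staged in the Lakehouse Files area

cleaned = raw.dropDuplicates(["Id"])                     # illustrative transformation only

(cleaned.write
    .mode("overwrite")
    .format("delta")
    .saveAsTable("accounts"))                            # managed Delta table in the Lakehouse Tables area
```

The same load could equally be expressed as a Fabric Data Pipeline or Dataflow Gen2; the notebook form is shown only because it is the shortest to sketch.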