Azure Snowflake Data Engineer – Boston, MA (Onsite) – Contract

Acestack
Location: Boston, MA, USA
Published: 6/14/2022
Technology
Full Time

Job Description

Job Title: Azure Snowflake Data Engineer

Location: Boston, MA – Hybrid (4 days per week onsite)
Duration: 6-12+ Months Contract – W2/C2C

Job Summary: We are looking for a highly skilled Azure Snowflake Data Engineer to join our data engineering team on a contract basis. This role will focus on building robust, scalable data solutions using Snowflake and Azure, enabling real-time and batch data processing pipelines. The ideal candidate will be well-versed in modern data engineering practices and Snowflake-specific capabilities such as Time Travel, Zero Copy Cloning, and Snowpipe.

Key Responsibilities:

  • Design and implement end-to-end data pipelines using Azure Data Services and Snowflake.
  • Develop scalable ELT/ETL frameworks and manage large datasets efficiently.
  • Implement Snowpipe for continuous data ingestion and streaming.
  • Leverage Time Travel and Zero Copy Cloning features for data versioning, recovery, and testing.
  • Optimize query performance and storage usage within Snowflake.
  • Collaborate with data analysts, engineers, and business stakeholders to understand data requirements.
  • Ensure data reliability, governance, and security across platforms.
  • Monitor, troubleshoot, and maintain existing data workflows and infrastructure.
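
For orientation, the three Snowflake features named in the responsibilities above can be sketched in Snowflake SQL roughly as follows (all database, table, stage, and pipe names here are hypothetical placeholders, not part of any actual environment):

```sql
-- Time Travel: query a table as it existed 30 minutes ago,
-- useful for recovery, auditing, and comparing data versions.
SELECT * FROM sales.orders AT (OFFSET => -60 * 30);

-- Zero Copy Cloning: create an instant clone of a database for testing;
-- no data is physically copied until the clone diverges.
CREATE DATABASE sales_test CLONE sales;

-- Snowpipe: continuous ingestion from an external stage.
-- (With Azure Blob Storage, auto-ingest additionally requires a
-- notification integration, omitted here for brevity.)
CREATE PIPE sales.orders_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO sales.orders
     FROM @sales.azure_stage
     FILE_FORMAT = (TYPE = 'JSON');
```
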

Required Skills:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data engineering or analytics roles.
  • 4-6+ years of hands-on experience with Snowflake and Azure data platforms.
  • Snowflake: Advanced expertise including Time Travel, Zero Copy Cloning, and Snowpipe.
  • Azure Data Services: Azure Data Factory, Azure Synapse, Azure Data Lake, Azure Blob Storage.
  • Strong proficiency in SQL for data manipulation and optimization.
  • Experience with Python for scripting and automation.
  • Solid understanding of data warehousing, data lakes, and distributed systems.

Nice to Have:

  • Experience with dbt, Apache Airflow, or similar orchestration tools.
  • Familiarity with CI/CD in data engineering workflows.
  • Knowledge of Delta Lake or Databricks is a plus.
