
HYRGPT
Snowflake Data Engineer : 5-10 years : Anywhere in India

Responsibilities :
• Design and develop scalable, high-performance data pipelines in Snowflake.
• Lead ETL/ELT development using SQL, Python, and tools such as DBT, Airflow, Matillion, or Talend.
• Migrate complex T-SQL logic and stored procedures from SQL Server to Snowflake-compatible SQL or ELT workflows.
• Integrate AWS Glue to automate and orchestrate data workflows.
• Work with structured and semi-structured data formats (e.g., JSON, Parquet, Avro, XML).
• Optimize Snowflake performance and cost through effective query tuning and warehouse resource management.
• Design data models that support business intelligence and analytics use cases.
• Ensure high standards of data quality, validation, and consistency during migration and transformation processes.
• Enforce data governance, security, and access control policies to ensure compliance with organizational standards.
• Collaborate with data architects, business stakeholders, and analytics teams to understand requirements and deliver data solutions.
• Maintain up-to-date technical documentation, including data flows, mapping specifications, and operational procedures.

Skills & Qualifications :
• 5+ years of experience in data engineering or a similar role.
• Hands-on experience with Snowflake and cloud-based data platforms (AWS preferred).
• Strong expertise in SQL and at least one scripting language (preferably Python).
• Experience with ETL/ELT tools like DBT, Airflow, Matillion, or Talend.
• Familiarity with AWS Glue and other cloud-native data services.
• Proven ability to work with semi-structured data.
• Solid understanding of data modeling, data warehousing concepts, and BI tools.
• Strong focus on performance tuning, data validation, and data quality.
• Excellent communication and documentation skills.
(ref:hirist.tech)
To apply for this job please visit in.linkedin.com.