Posted: 15 January
Snowflake Developer
Company
Supersourcing
Supersourcing is an India-based B2B AI-powered talent acquisition and IT staffing company specializing in full-stack hiring and RPO solutions for tech industries, targeting global markets.
Remote Hiring Policy:
Supersourcing supports remote work for select roles, primarily hiring in India with some remote positions available for candidates in the United States. Team members are located in various regions, including Bengaluru, Pune, Chennai, Indore, and Noida, with remote roles accommodating U.S. time zones.
Job Type
Full-time
Allowed Applicant Locations
United States, India
Job Description
Responsibilities:
- Design, develop, and optimize Snowflake data warehouse solutions to meet business requirements.
- Collaborate with stakeholders to gather and understand data requirements, and translate them into technical specifications.
- Perform data modeling and schema design to ensure efficient and scalable data storage and retrieval.
- Develop and maintain ETL processes using Snowflake's features and capabilities, such as stored procedures, tasks, and streams.
- Implement data integration pipelines to ingest data from various sources into Snowflake, ensuring data quality and integrity.
- Optimize queries and data processing for performance and scalability, using Snowflake techniques such as clustering keys, micro-partition pruning, and search optimization.
- Troubleshoot and resolve data-related issues, including data quality, data transformation, and performance bottlenecks.
- Collaborate with cross-functional teams, including data engineers, data analysts, and business stakeholders, to deliver high-quality data solutions.
- Stay up-to-date with the latest Snowflake features and best practices, and evaluate their applicability to enhance existing systems.
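The stream-and-task ETL pattern named in the responsibilities above can be sketched as follows. This is a minimal illustration of Snowflake's change-data-capture features, not a production pipeline; all object names (`raw_orders`, `orders_stream`, `merge_orders_task`, `dim_orders`, `etl_wh`) are hypothetical, and the helper functions simply compose the SQL a developer would run against Snowflake.

```python
# Minimal sketch of an incremental load using a Snowflake stream plus a
# scheduled task. Object names are hypothetical; the functions only build
# the SQL text, so no Snowflake connection is required to run this file.

def create_stream_sql(source_table: str, stream_name: str) -> str:
    """DDL for a change-tracking stream over a source table."""
    return f"CREATE OR REPLACE STREAM {stream_name} ON TABLE {source_table};"


def create_merge_task_sql(task_name: str, stream_name: str,
                          target_table: str, warehouse: str,
                          schedule_minutes: int = 5) -> str:
    """DDL for a task that merges new stream rows into the target table.

    SYSTEM$STREAM_HAS_DATA lets the task skip runs when the stream is
    empty, so the warehouse is only billed when there are changes.
    """
    return (
        f"CREATE OR REPLACE TASK {task_name}\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"  SCHEDULE = '{schedule_minutes} MINUTE'\n"
        f"WHEN SYSTEM$STREAM_HAS_DATA('{stream_name}')\n"
        f"AS\n"
        f"  MERGE INTO {target_table} t\n"
        f"  USING {stream_name} s ON t.id = s.id\n"
        f"  WHEN MATCHED THEN UPDATE SET t.amount = s.amount\n"
        f"  WHEN NOT MATCHED THEN INSERT (id, amount)"
        f" VALUES (s.id, s.amount);"
    )


if __name__ == "__main__":
    print(create_stream_sql("raw_orders", "orders_stream"))
    print(create_merge_task_sql("merge_orders_task", "orders_stream",
                                "dim_orders", "etl_wh"))
```

Consuming the stream inside the task's MERGE advances the stream offset automatically, which is what makes this pattern incremental rather than a full reload.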
Requirements:
- Must have 6+ years of experience with Snowflake, ETL tools such as Informatica and DMX, and data analysis.
- Must have 3+ years of experience in Python.
- Hands-on experience with Unix and Unix shell scripting.
- Must have hands-on experience with Oracle, SQL, PostgreSQL, and Snowflake.
- Must have hands-on data warehousing experience.
- Experience with Hadoop, Sqoop, and Hive is an added advantage.
- Experience with the AWS cloud stack, particularly AWS S3 and Athena, is preferred.
- Must understand scheduling tools such as Autosys, Control-M, or Tivoli.
- Should be able to work with clients independently, gather requirements, and suggest solutions as needed.
Preferred Skills:
- Snowflake certification(s).
- Experience with data visualization tools, such as Tableau or Power BI.
- Knowledge of scripting languages such as Python or shell scripting.
- Familiarity with data governance and data security practices.