Job Description

Job Title: Snowflake Consultant
Position Type: Contract
Work Location: Charlotte, NC

We are looking for a candidate with strong expertise in Snowflake and the ability to write advanced SQL queries, including statistical aggregate functions, analytical functions, and complex joins. The ideal candidate will have a deep understanding of Snowflake cloud technology, particularly multi-cluster warehouse sizing and credit usage management.

Key Responsibilities:
- Design and implement Snowflake data warehouse solutions.
- Build Snowpipe pipelines, manage data sharing, and design database, schema, and table structures.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Optimize Snowflake performance and ensure data security and compliance.
- Develop and maintain ETL processes using Snowflake and related technologies.
- Provide technical guidance and support to team members and clients.
- Conduct data analysis and generate reports to support business decision-making.
- Stay current with the latest Snowflake features and industry best practices.

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is a plus.

Experience:
- Minimum of 5 years of experience in data warehousing and cloud computing.
- At least 3 years of hands-on experience with Snowflake.
- At least 7+ years of experience managing large projects, including the experience below.
- Proven experience designing and implementing Snowflake solutions.
- Experience loading data into Snowflake from both internal and external stages is essential.
- Prior experience with migration projects from Oracle (or similar databases) to Snowflake is a valuable asset.
- Experience working with large teams in an onsite/offshore model.
- Experience integrating near-real-time (NRT) / streaming data feeds from Kafka or APIs into S3 and transforming them using AWS Glue.
- Experience ingesting and transforming file formats such as JSON and Parquet, including ELT and data quality checks.
- Strong knowledge of and experience enabling ACID compliance on the S3 object layer through Iceberg or a similar technology.

Certifications:
- Snowflake SnowPro Core Certification is required.
- Additional certifications in cloud platforms (AWS, Azure, GCP) are preferred.

Skills:
- Primary skill set: Snowflake, AWS Glue, S3, Kafka, and Iceberg.
- Proficiency in Snowflake architecture and features.
- Strong knowledge of SQL and data modeling.
- Proficiency in Python or PySpark scripting for automation and orchestration, along with experience using orchestration tools such as ActiveBatch for Snowflake automation.
- Familiarity with AWS services such as S3 and EC2, and their integration with Snowflake.
- Knowledge of Delta Lake and data lakehouse architectures.
- Strong knowledge of DW/BI concepts.
- Knowledge of big data technologies is a plus.
- Experience with ETL tools and data integration.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.

Salary Range:
$110,000 - $140,000 per year, depending on experience and qualifications.

Team Information:
You will be part of a collaborative team of data professionals, including data engineers, analysts, and architects. Our team values innovation, continuous learning, and a client-centric approach.

Why Join Us:
- Opportunity to work with cutting-edge technologies and innovative projects.
- Competitive salary and benefits package.
- Supportive and inclusive work environment.
- Professional development and growth opportunities.

If you are a passionate Snowflake expert looking to make a significant impact, we would love to hear from you. Apply now to join our team in Charlotte, NC!
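For illustration, the "advanced SQL" the description calls for (analytical/window functions alongside aggregates) looks like the sketch below. The table and column names are hypothetical, and Python's built-in sqlite3 stands in for Snowflake here purely so the query is runnable; the same SQL pattern applies in Snowflake:

```python
import sqlite3

# Hypothetical sales table; SQLite (3.25+) stands in for Snowflake here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
      ('East', 100), ('East', 300), ('West', 200), ('West', 400);
""")

# Analytical (window) functions combined with an aggregate:
# rank each sale within its region and compare it to the regional average.
rows = conn.execute("""
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           AVG(amount) OVER (PARTITION BY region) AS region_avg
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY` clause scopes both the ranking and the average to each region, which is the kind of per-group analysis the role's reporting responsibilities would involve.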