At evolv Consulting, we are seeking an experienced Snowflake Architect to join our team. If you’re a passionate self-starter, evolv is a great place to get ahead.
Define the resource matrix to support client initiatives.
Serve as the single escalation point for performance and resourcing issues.
Define, design, build, and support source-extract-to-staging-load processing using Azure Databricks.
Define, design, and implement security in all aspects (firewall, proxy, roles, column- and row-level security, etc.) (non-functional).
Understand, model, and monitor cost allocation across Azure and Snowflake (technical tasks; non-functional).
Define, design, build, and support Tidal automation for all runtime environments (non-functional).
Define, design, build, and support Snowflake transformations from staging to the standard data model.
Understand operational impact.
Participate in data modeling exercises; translate logical models to physical models.
Configure Tidal agents and binaries (e.g., Oracle drivers) (non-functional).
Define the operational monitoring and escalation framework (non-functional).
Maintain documentation within Confluence, our documentation repository.
Participate in project planning and estimating activities for technical aspects.
Tune performance and optimize Snowflake resource utilization for cost efficiency.
Design and implement effective analytics solutions and models with Snowflake.
Examine and identify data warehouse structural requirements by evaluating business needs.
Assess data warehouses to ensure they comply with internal and external regulations.
Monitor system performance through regular testing, troubleshooting, and integration of new features.
Recommend solutions to improve new and existing Data Warehouses.
Offer support by responding to system problems in a timely manner.
Guide developers in preparing functional and technical specifications that define reporting requirements and ETL processes.
Develop complex data models in Snowflake.
Tune Snowflake for performance and optimize utilization.
Maintain deep understanding of complementary technologies and help organizations leverage Snowflake as part of their larger technology stack.
Expertise in Snowflake: data modeling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
Expertise in advanced Snowflake concepts, including resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and knowledge of how to apply these features.
Expertise in deploying Snowflake features such as data sharing.
Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe) and Big Data modeling techniques using Python.
Other duties as assigned.
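For candidates gauging fit, the advanced Snowflake features named in the expertise bullets above map to short SQL statements like the following. This is a minimal illustrative sketch; every object name (monthly_quota, reporting_wh, sales, sales_dev) is a hypothetical example, not a client configuration.

```python
# Illustrative Snowflake SQL for the advanced features listed above.
# All object names are hypothetical examples.

STATEMENTS = {
    # Resource monitor: suspend assigned warehouses at 90% of a 100-credit quota.
    "resource_monitor": (
        "CREATE RESOURCE MONITOR monthly_quota WITH CREDIT_QUOTA = 100 "
        "TRIGGERS ON 90 PERCENT DO SUSPEND;"
    ),
    # Virtual warehouse sizing: resize a warehouse to match its workload.
    "warehouse_sizing": "ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';",
    # Zero-copy clone: create a writable copy without duplicating storage.
    "zero_copy_clone": "CREATE TABLE sales_dev CLONE sales;",
    # Time Travel: query the table as it existed one hour (3600 seconds) ago.
    "time_travel": "SELECT * FROM sales AT(OFFSET => -3600);",
}

for feature, sql in STATEMENTS.items():
    print(f"-- {feature}\n{sql}")
```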
Bachelor’s degree in Engineering, Data Science, AI, Mathematics, Statistics, Computer Science, Computer Engineering, or a related discipline.
5+ years of professional technical experience, including 5+ years of hands-on Big Data architecture experience.
5+ years of experience building highly scalable Big Data solutions using Hadoop, Spark, and Lambda architecture.
Experience with Snowflake administration.
Experience with Snowflake transformations from Staging to Standard Data Model.
Experience working in cloud environments (AWS and/or Azure).
Strong client-facing communication and facilitation skills.
Strong database migration strategy and analytics fundamentals, including migration and ETL accelerators.
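As a flavor of the Snowpipe experience mentioned in the qualifications, continuous loading from a cloud stage into a staging table is defined with a CREATE PIPE statement such as the one composed below. The pipe, table, stage, and file-format names are hypothetical examples.

```python
# Illustrative sketch of a Snowpipe definition for continuous loading from a
# cloud stage into a staging table. All names are hypothetical examples.

def create_pipe_sql(pipe: str, table: str, stage: str, file_format: str = "CSV") -> str:
    """Compose a CREATE PIPE statement that auto-ingests files from a stage."""
    return (
        f"CREATE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = '{file_format}');"
    )

print(create_pipe_sql("staging_pipe", "staging.orders_raw", "orders_stage"))
```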