Snowflake DBA - Pharmatek Consulting Inc
New City, NY 10956
About the Job
Location: New York City, NY
Duration: 9 months C2H
Start: ASAP
Position: Hybrid (2 days onsite per week)
JOB DESCRIPTION:
We're looking for a Lead Snowflake DBA on a contract-to-hire basis. This is primarily an individual contributor role but may gain one direct report at some point (e.g., another DBA).
Summary:
• Responsible for managing and optimizing Snowflake data platforms, including performance tuning, security configurations, and data sharing.
• Handle tasks like user management, resource monitoring, capacity planning, and automating data pipelines.
• Ensure the platform runs efficiently, adheres to security policies, and is cost-optimized while also enabling seamless scaling of data operations across cloud environments.
Responsibilities:
• Creation and management of Snowflake objects such as databases, schemas, tables, views, materialized views, pipes, streams, and tasks.
• Management of account-level objects such as warehouses, resource monitors, network policies, integrations (storage, notification), and shares; resolution of connectivity issues with ODBC, JDBC, and Python connectors.
• Development of Snowflake UDFs in SQL, Python, JavaScript, and Java; helping teams tune their SQL and UDFs.
• Create replication and failover groups; restore data from Fail-safe or via Time Travel.
• Troubleshoot data loading, connection, and Snowpipe issues.
• User management and implementation of RBAC best practices.
• Access rights management, including experience with the privileged ACCOUNTADMIN, SYSADMIN, and SECURITYADMIN roles.
• Workload analysis and planning: multi-cluster warehouses, scaling policies, and creation and management of application-specific warehouses for efficient compute consumption.
• Generate, manage, and analyze cost/credit consumption reports and identify cost-optimization opportunities.
• ETL batch support and issue resolution.
• Secure data sharing via managed accounts: reader account provisioning and management of resource monitors, credit quotas, and IP whitelisting.
• In-depth knowledge of micro-partitions, caching, and table clustering; recommending clustering keys; analyzing SQL with Query Profile to identify gaps, patterns, and trends, and making recommendations to the BI team based on that analysis.
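For a concrete sense of the day-to-day administration described above, typical Snowflake SQL might look like the following sketch (all object names and quota values are hypothetical):

```sql
-- Hypothetical examples of routine Snowflake DBA tasks (names are illustrative).

-- Application-specific multi-cluster warehouse with a scaling policy.
CREATE WAREHOUSE IF NOT EXISTS bi_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  SCALING_POLICY = 'ECONOMY'
  AUTO_SUSPEND = 60;

-- Resource monitor to cap monthly credit consumption.
CREATE RESOURCE MONITOR IF NOT EXISTS bi_monitor
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE bi_wh SET RESOURCE_MONITOR = bi_monitor;

-- Point-in-time recovery using Time Travel and zero-copy cloning.
CREATE TABLE sales_restored CLONE sales AT (OFFSET => -3600);  -- state 1 hour ago
UNDROP TABLE orders;  -- recover a dropped table within the retention window
```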
Requirements
Must Have:
• Snowflake data warehouse administration.
• Snowflake utilities: SnowSQL, Snowpipe.
• Time Travel, Fail-safe, clustering, cloning, and metadata management.
• Scripting: JavaScript, Unix shell scripting, basic Python.
• Cloud platforms: AWS storage (S3), Azure Blob Storage and containers.
• Use of Snowflake's unique features (Time Travel, UNDROP, zero-copy cloning) as requirements dictate.
• Thorough testing and implementation of DevOps pipelines for object promotion using GitHub and Jenkins.
• Working with Snowflake Support: handling P1/P2 incidents and engaging Support for technical issues across all of the organization's Snowflake accounts, including reader accounts.
Good To Have:
• Expertise in SQL, Snowflake stored procedures, and UDFs using JavaScript.
• Automation: CI/CD with Jenkins pipelines (code promotion) and GitHub for version control.
• Scheduling: ESP job scheduling, AWX/Ansible Tower.
• Familiarity with JIRA and ServiceNow/GLPI.
• Experience with ITIL processes, change management, incident handling, JIRA, and Agile development.