CV For Snowflake Training
Professional Summary
IT SKILLS
Project 1
Data Engineer, July 2019 to Present
Meredith Corporation, formerly Time Inc., is one of the world’s leading companies, with a monthly global print audience of over 120 million and worldwide digital properties that attract more than 120 million unique visitors each month.
The scope of this project is to gather data related to the advertising system and load it into the Snowflake data warehouse hosted on the Amazon cloud. The data from the warehouse is then used by the reporting team to generate business reports.
Involved in coordinating with the business team to understand the business requirements and to prepare technical design documents. Extensively involved in requirement gathering, design, analysis, and development.
Expertise in Snowflake data modelling and ELT using Snowflake SQL, including implementing stored procedures and standard DWH ETL patterns.
Worked on multiple projects, using Python, AWS, and Snowflake to load data into the data warehouse.
Good development experience building data applications in Python.
Developed an end-to-end ETL process that pulls data from SFTP and loads it into Snowflake via the Python API (sketched after this list).
Used AWS S3, EC2, and Lambda services to bring the data from the SFTP server onto our servers.
Built a data platform in Snowflake, using a Python framework as the ETL tool to build the enterprise reporting data warehouse on Snowflake.
Involved in migrating terabytes of data from Teradata/Redshift to Snowflake.
Experience writing complex SQL & SnowSQL.
Have good knowledge of AWS EC2, IAM, and S3.
Prepared the migration checklist for the development, UAT, and production environments.
Created end-to-end ETL pipelines from multiple sources to load data into Snowflake.
Provided maintenance, support, and defect fixes after QA and production deployments.
Attended daily team meetings as well as design discussions as per business requirements.
Conceptualized, designed, developed, and productionized a new ETL pipeline using big data tooling, Python, SFTP, S3, EC2, and APIs.
Created external tables, standard tables, and views in the Snowflake database.
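The bullets above describe pulling files from SFTP with Python, staging them in S3, and loading them into Snowflake. The following is a minimal, hypothetical sketch of that pattern rather than the actual project code: the SFTP host, S3 bucket, external stage, credentials, and table names are placeholders, and the real pipeline also used EC2 and Lambda for hosting and triggering.

```python
# Minimal sketch of an SFTP -> S3 -> Snowflake load. All hosts, buckets,
# stages, credentials, and table names are hypothetical placeholders.
import paramiko                      # SFTP client
import boto3                         # AWS S3 client
import snowflake.connector           # Snowflake Python connector

SFTP_HOST = "sftp.example.com"              # hypothetical SFTP server
REMOTE_FILE = "/outbound/ad_impressions.csv"
LOCAL_FILE = "/tmp/ad_impressions.csv"
S3_BUCKET = "adsystem-landing-bucket"       # hypothetical landing bucket
S3_KEY = "ads/ad_impressions.csv"


def pull_from_sftp():
    """Download the daily extract from the advertising SFTP server."""
    transport = paramiko.Transport((SFTP_HOST, 22))
    transport.connect(username="etl_user", password="***")
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.get(REMOTE_FILE, LOCAL_FILE)
    sftp.close()
    transport.close()


def stage_to_s3():
    """Upload the extract to the S3 bucket behind the Snowflake stage."""
    boto3.client("s3").upload_file(LOCAL_FILE, S3_BUCKET, S3_KEY)


def copy_into_snowflake():
    """Load the staged file into the raw table with COPY INTO."""
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ADS_DW", schema="RAW",
    )
    try:
        conn.cursor().execute(
            # @ads_s3_stage is assumed to be an external stage over S3_BUCKET
            "COPY INTO raw_ad_impressions "
            "FROM @ads_s3_stage/ads/ad_impressions.csv "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    pull_from_sftp()
    stage_to_s3()
    copy_into_snowflake()
```

In a production setup this flow would typically be triggered on a schedule or by a Lambda event on the landing bucket, with error handling and logging around each step.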
Project 2
IBM Global Services US has entered into an agreement with DIRECTV for the outsourcing of software application development, installation, and maintenance. As an integral part of the solution proposed by IBM Global Services to DIRECTV, applications are maintained and supported on an ongoing basis by IBM Global Services.
DIRECTV requires information technology services presently performed and managed by or for DIRECTV, such as Business Intelligence ("BI") software development, testing, and project management services, BI maintenance services, BI production support, and other additional information technology services.
With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks from the perspectives of query writing, skewed redistributions, etc.
Understood and analysed the requirement documents and resolved queries.
Migrated data from Teradata to Hive in the Hadoop ecosystem using HDFS and Sqoop (see the sketch after this project's bullet points).
Created BTEQ, FastLoad, MultiLoad, and FastExport scripts according to client requirements, with experience in performance tuning.
Created and updated a number of mappings, reusable sessions, and workflows.
Developed Informatica mappings and workflows to load data from various sources using different transformations such as Source Qualifier, Router, Filter, Sequence Generator, and Expression.
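The Teradata-to-Hive migration above relied on HDFS and Sqoop. As a hypothetical illustration only (the bullets do not say how the Sqoop jobs were driven), the sketch below wraps the standard `sqoop import` command in a small Python script; the Teradata host, credentials, table list, and Hive database are placeholders.

```python
# Hypothetical wrapper driving the Teradata -> Hive migration with Sqoop.
# Hosts, credentials, the table list, and Hive names are placeholders.
import subprocess

TABLES = ["CUSTOMER", "SUBSCRIPTION", "BILLING_EVENT"]   # hypothetical tables


def sqoop_import(table: str) -> None:
    """Run a standard `sqoop import` for one Teradata table into Hive."""
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:teradata://td-prod.example.com/DATABASE=EDW",
        "--driver", "com.teradata.jdbc.TeraDriver",
        "--username", "etl_user",
        "--password-file", "hdfs:///user/etl_user/.td_password",
        "--table", table,
        "--hive-import",
        "--hive-table", f"edw.{table.lower()}",
        "--target-dir", f"/data/staging/edw/{table.lower()}",
        "--num-mappers", "4",
    ]
    subprocess.run(cmd, check=True)   # fail fast if a Sqoop job errors out


if __name__ == "__main__":
    for t in TABLES:
        sqoop_import(t)
```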
ACADEMIC DETAILS
2016 B.E. (Computer Science and Engineering) with 74.2%
SIGNIFICANT HIGHLIGHTS:
• Recognized with the Manager’s Choice Award in the 2019 and 2020 programs for creating a new framework in Python that saves the company roughly $5K per year in Lambda function costs.
• Recognized with the Manager’s Choice Award in the 2017 and 2018 programs for the "Put the Client First" practice.
• Teradata Certified Professional (Version: TD14 –Basic TE0-140)