
Naveen Kumar – Curriculum Vitae – Data Engineer

Mobile: +91-9666999623    DOB: 30 Aug 1990


Email: [email protected]
www.linkedin.com/in/naveen-kumar-49234a39

Objective and Summary

A creative, hands-on data engineer with 10+ years of experience, exceptional technical skills, and a
business-focused outlook. Adept at analyzing information system needs, evaluating end-user requirements,
and custom-designing solutions for complex information systems management.

 Diversified exposure to software process engineering and to developing and building enterprise
applications with Big Data technologies in domains such as Finance and Insurance.

 Proven ability in programming, implementation, and project conceptualization.

 Interacting with clients on a regular basis, gathering requirements, and implementing tasks with
team members.

Technical Skills

Programming and Scripting:

 Python
 SQL, PL/SQL
 AWS
 Data Vault
 Airflow
 Redshift
 DBT
 Athena
 AWS Glue
 PySpark

Database Development:

 Oracle SQL, PL/SQL Development
 NoSQL databases (JSON)
 SQL Server
 PostgreSQL

Environments:

 Windows 2012, Windows 2008 R2, Windows 2002

Version Control Tools:

 PVCS
 Git
 Bitbucket

Continuous Integration Tools:

 Bamboo
 Azure DevOps

Configuration and Build Management:

Strong skills in source control, continuous integration, and automated build, deployment, and release
for complex software solutions and their associated database objects, including change control, defect
tracking, and UAT requirements. Good knowledge of the branching strategies used in the development
life cycle.
Professional Summary:
 Interacting with the business team to discuss and understand data flows, and designing data
pipelines as per the requirements.
 Experience in driving the team to meet the target deliverables.
 Created automated pipelines to load data from S3 into Redshift and orchestrated with Airflow.
 Implemented Spark with Python and Spark SQL for faster processing of data.
 Leveraged Python, Spark Core, Spark SQL, Data Frames to develop high-performance data processing
applications, achieving significant data processing speedups.
 Handled importing of data from various data sources, performed transformations using Spark and
loaded data into S3.
 Identified data sources and developed pipelines using PySpark for data normalization into BRDs
(Business Ready Datasets).
 Experienced in loading and transforming large sets of structured and semi-structured data (including
MySQL, CSV, spreadsheets, text files).
 Designed and developed end-to-end ETL solutions and processing applications for streaming and batch
data.
 Experience in using version control tools like Visual Studio and Bitbucket to share code among
team members.
 Created a reusable framework to process data from staging into the core database.
 Vast experience in data-driven applications: creating data pipelines, building interfaces between
upstream and downstream applications, and tuning those pipelines.
 Solid knowledge of data lake and data mart concepts.
 Expertise in the design, development, and maintenance of applications using Oracle SQL and PL/SQL.
 Proficient in developing PL/SQL programs using advanced performance-enhancing concepts like bulk
processing, collections, and dynamic SQL.
 Sound knowledge of Oracle materialized views.

 Effective use of indexes, collections, and analytical functions.

 Sound knowledge of Oracle external tables.
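The S3-to-Redshift loading described above typically centers on Redshift's COPY command; the sketch below builds such a statement in Python (table, bucket, and IAM role names are hypothetical placeholders):

```python
# Minimal sketch of generating a Redshift COPY statement for an S3 load.
# The table, bucket, and IAM role below are hypothetical placeholders.

def build_copy_sql(table: str, s3_path: str, iam_role: str) -> str:
    """Return a Redshift COPY statement that loads CSV data from S3."""
    return (
        f"COPY {table}\n"
        f"FROM '{s3_path}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        "FORMAT AS CSV\n"
        "IGNOREHEADER 1;"
    )

sql = build_copy_sql(
    table="staging.orders",
    s3_path="s3://example-bucket/orders/2024-01-01.csv",
    iam_role="arn:aws:iam::123456789012:role/redshift-load",
)
print(sql)
```

In an Airflow pipeline a statement like this would normally be executed by a database operator on a schedule; manifest files and error handling are omitted here.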

Professional Experience:
Creditsafe Technology Private Limited, Hyderabad – Sr. Technical Lead / Data Engineer (Oct 2017 to Present)

Creditsafe’s core business offering is an on-line company credit report based on original source public and
private data which is brought together to give information on a company and an assessment of its
creditworthiness. Creditsafe provides a credit rating and suggested credit limit in all its reports.
Creditsafe reports are accessed via Creditsafe's own websites over the internet by most of its customers;
however, increasingly, large-volume customers purchase solutions that either integrate into their own
accounting software or use one of Creditsafe's own integrated services, such as Creditsafe 3D.
Creditsafe combines data from all its operating countries in its reports to provide full international
details on companies that operate in more than one country or have other international linkages.
Projects:
 Creditsafe USA
 DET
 Crystal(CRM Portal)

Technologies:
 SQL, PL/SQL
 Python, S3, Redshift, Airflow, Glue, Data Vault
 Oracle 12c, 19c, SQL Server, PostgreSQL

Role and Responsibilities:


 Migrating all ETL pipelines to Airflow.
 Handled importing of data from various data sources, cleaning the data, and writing the SQL required
to meet the business requirements.
 Worked on maintaining database code in the form of stored procedures, scripts, queries, views, triggers,
etc.
 Extensively used CSV files in ETL pipelines for efficient data processing.
 Good experience in developing data flows, stored procedures, and bulk data loads.
 Extensively worked on NoSQL structures (JSON) and imported them into the Oracle database using
user-defined packages.
 Enhancements of the applications using SQL and PL/SQL.
 Used pandas and NumPy for most data analysis tasks.
 Automated daily ad-hoc activities using Python.
 Built a continuous integration and delivery process for SQL code using Bamboo and Azure DevOps.
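Importing JSON (NoSQL-style) documents into a relational database, as described above, usually starts by flattening nested structures into column/value pairs; a minimal, stdlib-only sketch of that step (the field names are illustrative, not from an actual Creditsafe schema):

```python
import json

def flatten(record: dict, parent: str = "") -> dict:
    """Flatten a nested JSON object into a single-level dict
    whose keys can map directly onto relational columns."""
    flat = {}
    for key, value in record.items():
        column = f"{parent}_{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, column))  # recurse into nested objects
        else:
            flat[column] = value
    return flat

# Illustrative document; real data would come from files or an API.
doc = json.loads('{"id": 1, "name": "Acme", "address": {"city": "Hyderabad", "zip": "500001"}}')
row = flatten(doc)
print(row)  # {'id': 1, 'name': 'Acme', 'address_city': 'Hyderabad', 'address_zip': '500001'}
```

The flattened dict maps naturally onto an INSERT statement's column list; in Oracle this kind of mapping would sit inside a user-defined package as the résumé describes.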

IBM India Pvt Ltd , Hyderabad – Senior Software Engineer Jan/2017 to Oct/2017
Cisco Revenue Management – NGCCRM (Next Gen Compliance, Commitment and Revenue Manager).
The main purpose of the CCRM tool is to track revenue deferrals; the current CCRM manages
revenue deferrals for non-standard deals.
All deals having compliance terms that require a revenue deferral, transfer, or an invoice hold of at least
$100,000 must be maintained in CCRM.
Deals that are not yet formally won can also be included as pipeline deals if there is confidence of winning
them, a booking is expected within 60 days, and they will result in a revenue hold of at least $100,000.
Technologies:
 SQL, PL/SQL
 Informatica 9.1
Role and Responsibilities:
Managed end-to-end responsibilities, including:
 Implemented the backend test automation framework to reduce the manual testing efforts.
 Understanding the RAD document and the project requirements.
 Implementing a parallel processing mechanism for batch job execution to achieve better
performance.
 Enhancements of the applications using SQL and PL/SQL.
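The parallel batch-execution idea above can be illustrated with a small sketch, written here in Python for brevity (the original work used PL/SQL and Informatica; the job names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def run_job(job_name: str) -> str:
    """Stand-in for a batch job; a real version would call the database
    or an ETL tool. Here it simply reports completion."""
    return f"{job_name}: done"

jobs = ["load_deals", "calc_deferrals", "refresh_reports"]  # hypothetical job names

# Run independent jobs concurrently instead of one after another.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_job, jobs))

print(results)
```

`pool.map` preserves input order, so downstream steps can still assume a deterministic result list even though the jobs ran concurrently.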

AETINS Global Services Pvt Ltd, Hyderabad – Analyst Programmer (Oct 2013 to Dec 2016)
(Formerly known as Aetins India Pvt. Ltd)
Aetins offers a single end-to-end Insurance and Takaful solution that covers all lines of business: Individual Life,
Group Life, Investment Linked, and General. It spans functions such as illustration, quotation, new business,
policy servicing, claims, agency management, commission and benefits, accounting, and services. Our business is
to help Insurance and Takaful companies strategise and operate by leveraging Information Technology.

 General Insurance System


 Life Insurance System

Technologies:
 SQL, PL/SQL
 Oracle Forms (6i, 10g, 11g)

Customers I worked with in AETINS:


 HLA, Malaysia
 Salama Takaful, Dubai
 Medgulf, Bahrain
 LOLC Insurance, Sri Lanka
Role and Responsibilities:
 Managed end-to-end responsibilities, including:
 Understanding the RAD document and the project requirements.
 Enhancements of the applications using SQL and PL/SQL.
 Creating and modifying existing forms as per the user requirements.

Educational Qualifications
Bachelor of Technology – Jawaharlal Nehru University, Kakinada, India
Completed my studies with honors and received a First Class Degree (2007-2011).
