
Kotari Vikram

Cary

Summary

• Strong experience in migrating other databases to Snowflake.
• Work with domain experts, engineers, and other data scientists to develop, implement, and improve upon existing systems.
• Experience in analyzing data using HiveQL.
• Participate in design meetings for creation of the data model and provide guidance on best data architecture practices.
• Experience with Snowflake Multi-Cluster Warehouses.
• Experience in building Snowpipe.
• Experience in using Snowflake Clone and Time Travel (a brief sketch follows this summary).
• Experience in various methodologies such as Waterfall and Agile.
• Extensive experience in developing complex stored procedures/BTEQ queries.
• In-depth understanding of Data Warehouse/ODS, ETL concepts and modeling structure principles.
• Build the logical and physical data models for Snowflake as per the changes required.
• Define roles and privileges required to access different database objects.
• In-depth knowledge of Snowflake database, schema and table structures.
• Define virtual warehouse sizing for Snowflake for different types of workloads.
• Worked with cloud architects to set up the environment.
• Coding for stored procedures/triggers.
• Design batch cycle procedures on major projects using scripting and Control.
• Develop SQL queries using SnowSQL.
• Develop transformation logic using Snowpipe.
• Optimize and fine-tune queries.
• Good knowledge of ETL and hands-on ETL experience.
• Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premise databases to Azure Data Lake Store using Azure Data Factory.
• Analyze, design and build modern data solutions using Azure PaaS services to support visualization of data.
• Understand the current production state of applications and determine the impact of new implementations on existing business processes.
• Extract, transform and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics); ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and process the data in Azure Databricks.
• Operationalize data ingestion, data transformation and data visualization for enterprise use.
• SAS metadata and ETL developer with extensive knowledge of building and implementing metadata repositories and metadata security.
• Expertise in SAS Data Integration (DI) Studio, SAS Management Console and the SAS BI suite.
• Strong knowledge of installation, configuration and troubleshooting of SAS Information Map Studio, SAS OLAP Cube Studio, SAS Information Delivery Portal and SAS Web Report Studio.
• SAS ETL developer with expertise in design and development of extract, transform and load processes for data integration projects to build data marts; 16+ years of experience on the SAS Intelligence Platform and its components.
• Statistical data analyst with extensive experience in business data mining, database marketing and statistical modeling using SAS.
• More than four years of experience as a SAS programmer; expert in data mining, database marketing, direct marketing, predictive modeling, customer profiling, clustering and segmentation modeling using SAS Enterprise Miner & SAS EG.
• Extensive knowledge of Base SAS, SAS/Macros, SAS/SQL, SAS/STAT & SAS EG.
• Advanced knowledge of statistical models such as clustering, segmentation, predictive modeling, ANOVA, regression analysis and decision trees.
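
The Snowflake items above (Snowpipe, COPY bulk loads, zero-copy cloning, Time Travel) can be illustrated with a minimal sketch using the snowflake-connector-python package. The account, stage, table and pipe names are hypothetical placeholders, not details from any engagement listed below.

```python
# Minimal illustration of the Snowflake features referenced above.
# Connection parameters and stage/table/pipe names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # placeholder account locator
    user="ETL_USER",
    password="...",                # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Bulk load: COPY files already landed in an external stage into a table.
cur.execute("""
    COPY INTO STAGING.ORDERS
    FROM @AZURE_BLOB_STAGE/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Continuous load: a Snowpipe over the same stage
# (AUTO_INGEST additionally requires cloud event notification setup).
cur.execute("""
    CREATE PIPE IF NOT EXISTS STAGING.ORDERS_PIPE AUTO_INGEST = TRUE AS
    COPY INTO STAGING.ORDERS
    FROM @AZURE_BLOB_STAGE/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Zero-copy clone plus Time Travel: recover the table as it was an hour ago.
cur.execute("""
    CREATE TABLE STAGING.ORDERS_RESTORE CLONE STAGING.ORDERS
    AT (OFFSET => -3600)
""")

cur.close()
conn.close()
```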

Overview

22 years of professional experience
1 Certification

Work History

Lead Data Engineer

Siemens Healthineers
01.2019 - Current

• Created pipelines in Azure Data Factory utilizing Linked Services/Datasets/Pipelines to extract, transform and load data from sources such as Azure SQL, Blob storage and Azure SQL Data Warehouse, including write-back in the reverse direction
• Used Azure ML to build, test and deploy predictive analytics solutions based on the data
• Developed Spark applications with Azure Data Factory and Spark SQL for data extraction, transformation and aggregation from different file formats, in order to analyze and transform the data and uncover insights into customer usage patterns
• Analyzed existing SQL scripts and redesigned them using PySpark SQL for faster performance
• Applied technical knowledge to architect solutions that meet business and IT needs, created roadmaps, and ensured the long-term technical viability of new deployments
• Worked on SnowSQL and Snowpipe
• Created Snowpipe for continuous data load
• Used COPY to bulk load data
• Created internal and external stages and transformed data during load
• Redesigned views in Snowflake to increase performance
• Developed stored processes to perform deltas
• Loaded tables from the DWH to Azure Data Lake using the Azure Data Factory integration runtime
• Loaded tables from Azure Data Lake to Azure Blob storage to push them to Snowflake
• Migrated data from Teradata to Snowflake
• Developed Spark applications using Spark SQL in Databricks for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns (a minimal sketch follows this role's bullet list)
• Extracted, transformed and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and Azure Data Lake Analytics
• Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks
• Developed Spark applications using PySpark and Spark SQL for data extraction, transformation and aggregation from multiple file formats
• Extracted, parsed, cleaned and ingested data
• Responsible for estimating cluster size, and for monitoring and troubleshooting the Spark Databricks cluster
• Applied the Spark DataFrame API to complete data manipulation within a Spark session
• Good understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, driver and worker nodes, stages, executors and tasks, deployment modes, the execution hierarchy, fault tolerance and collection
• Experience in data ingestion and processing pipelines using Spark and Python
• Designed ETL processes in SAS Data Integration Studio to populate data from various hospital information systems (Invitro, Vista and others) and model the clinical workflow
• Built jobs in DI Studio to support ad hoc requests
• Performed ETL tuning by using analytical functions and executing SQL within Teradata
• Built stored processes in SAS EG with static and dynamic prompts for product-specific analyses
• Converted SAS stored processes to job execution services on SAS Viya
• Modified existing code to improve performance
• Built visualizations and reports using SAS Visual Analytics
• Built an automated CAS user transform to upload in-memory tables using deltas
• Migrated DI jobs from DI Studio to SAS Viya (imported metadata and jobs, created job services to replace existing stored processes)
• Built and deployed flows and sub-flows on LSF
• Environment: SAS 9.4, SAS Enterprise Guide 5.1/7.1, SAS Data Integration Studio 4.9, SAS Information Map Studio 4.4, SAS Web Report Studio 4.4, Teradata 14.10, SAS Cloud Analytic Services (CAS) Viya 3.4, Python 3.5.7, LSF (Platform Scheduler), ADF (Azure Data Factory), Databricks, PySpark, Snowflake, SQL Server
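
A minimal sketch of the Databricks-to-Snowflake pattern described in this role: read from Azure Data Lake, aggregate with PySpark, and push the result to Snowflake via the Spark Snowflake connector. The storage path, credentials and table names are hypothetical placeholders; credentials would normally come from a secret scope.

```python
# Hypothetical Databricks job: ADLS -> aggregate -> Snowflake.
# Paths, options, and table names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls_to_snowflake").getOrCreate()

# Read raw files landed in Azure Data Lake Storage (Parquet assumed here).
raw = spark.read.parquet("abfss://raw@examplestorage.dfs.core.windows.net/orders/")

# Example transformation: daily usage aggregates per customer.
daily = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("customer_id", "order_date")
       .agg(F.count("*").alias("order_count"),
            F.sum("amount").alias("total_amount"))
)

# Write the result to Snowflake with the Spark Snowflake connector
# (the connector must be installed on the cluster; outside Databricks the
# format name is "net.snowflake.spark.snowflake").
sf_options = {
    "sfURL": "xy12345.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "ETL_USER",
    "sfPassword": "...",                        # use a secret scope in practice
    "sfDatabase": "ANALYTICS",
    "sfSchema": "MARTS",
    "sfWarehouse": "TRANSFORM_WH",
}
(daily.write
      .format("snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_CUSTOMER_USAGE")
      .mode("overwrite")
      .save())
```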

Sr SAS Technical Consultant

Texas General
06.2017 - 07.2019
  • Design and develop ETL processes using Base SAS and SAS Data Integration Studio to populate data warehouses and reporting data marts
  • Experience in developing DI jobs to ingest data into HDFS and Hive
  • Build DI jobs to extract data from Hive
  • Migrate an Enterprise Miner project (credit models) from SAS to Databricks using Azure storage (SQL endpoint)
  • Validate and integrate data model changes for the warehouse as per new requirements
  • Build jobs in DI Studio to support the ETL functions
  • Perform ETL tuning (e.g., using lookups instead of joins where possible, removing unnecessary components); create custom transformations to optimize ETL jobs executed on DB2 (SCD1 & SCD2 transforms)
  • Build reports using SAS BI tools (Web Report Studio, Information Maps and SAS OLAP Cube Studio)
  • Build visualizations and reports using SAS Visual Analytics and SAS BI tools
  • Migrate reports from Cognos to Power BI
  • Data access, manipulation and reporting with R; implementation of analytics solutions as per project requirements
  • Migrate SAS DI jobs to pipelines and schedule them in Airflow
  • Ingest data from SQL Server & Teradata into Delta tables using pipelines (a rough sketch follows this role's bullet list)
  • Build Delta tables in Azure storage and import the deltas to Azure SQL Server
  • Design ETL & LSF flows and schedule LSF flows (DI Studio jobs) using LSF (IBM's Load Sharing Facility)
  • Environment: SAS 9.4, SAS Enterprise Guide 5.1/7.1, SAS Data Integration Studio 4.9, SAS Information Map Studio 4.4, SAS Web Report Studio 4.4, DB2, SQL Server 10.2, SAS VA 7.3, Hive 0.14, Power BI, Databricks (Python, Spark & PySpark), ADF, Airflow
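
A rough, hypothetical sketch of the SQL Server-to-Delta ingestion mentioned above: read a table over JDBC and land it as a Delta table in Azure storage. The JDBC URL, table names and path are placeholders; a job like this would typically be parameterized and scheduled from Airflow or ADF.

```python
# Hypothetical ingestion job: SQL Server table -> Delta table on Azure storage.
# JDBC URL, credentials, table names, and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver_to_delta").getOrCreate()

jdbc_url = "jdbc:sqlserver://example-host:1433;databaseName=Claims"

# Pull the source table over JDBC (SQL Server JDBC driver assumed on the cluster).
src = (spark.read.format("jdbc")
            .option("url", jdbc_url)
            .option("dbtable", "dbo.Encounters")
            .option("user", "etl_user")
            .option("password", "...")   # use a secret scope in practice
            .load())

# Land it as a Delta table in Azure storage for downstream marts.
(src.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://delta@examplestorage.dfs.core.windows.net/encounters/"))
```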

Sr SAS Technical Consultant

Schlumberger
12.2016 - 06.2017
  • Developed operational models and created reports and visualizations based on the data marts
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate data marts
  • Convert SAS code used in aggregator models to Python to provide insights to management using SAS VA (a minimal example follows this role's bullet list)
  • Design and develop ETL jobs in SAS DI Studio, using Base SAS wherever required, to upload data to the marts on a daily basis
  • Build stored processes and integrate JavaScript and HTML with stored processes in SAS to create an input form for a SAS report, enabling end users to enter required metrics
  • Migrate Python code to SAS, used to implement ad hoc requests, to automate and produce data pushed to the LASR server via a DI flow as a source for SAS visualizations
  • Migrate models developed in Python to SAS DI jobs
  • Execute PROC IMSTAT to maintain the LASR datasets on the LASR server
  • Build jobs using Hadoop transformations to read the files exported from the Big Data environment and export files from the SAS HDFS file system
  • Extract data from HDFS, load it into the LASR server and build visualizations using the Visual Analytics data builder and designer
  • Migrate Python code to SAS jobs and modify existing DI Studio jobs to replace SAS tables with Hive tables
  • Created stored processes in Enterprise Guide as objects in SAS VA to support end users' visualization needs
  • Build pricing index visualizations in SAS VA as per business requirements
  • Environment: SAS 9.2/9.4, SAS Enterprise Guide 5.1/7.1, SAS Data Integration Studio 4.6/4.9, SAS Information Map Studio 4.2, SAS Web Report Studio 4.2, MS SQL 10.9, Hive 0.14, Python 3.6.4, SAS VA 7.3
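
A minimal, hypothetical illustration of the SAS-to-Python conversion work in this role: the pandas groupby below stands in for a PROC MEANS/SUMMARY-style aggregation feeding a visualization. The file and column names are placeholders.

```python
# Hypothetical Python equivalent of a simple SAS aggregation
# (roughly: PROC MEANS with CLASS region product; VAR price volume).
import pandas as pd

df = pd.read_csv("aggregator_input.csv")  # placeholder input extract

summary = (
    df.groupby(["region", "product"], as_index=False)
      .agg(avg_price=("price", "mean"),
           total_volume=("volume", "sum"),
           n_obs=("price", "count"))
)

summary.to_csv("aggregator_summary.csv", index=False)  # feed for reporting/VA
```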

SAS BI Consultant

Shell
08.2016 - 12.2016
  • Build data marts, and design and develop ETL jobs, info maps and cubes as part of Phase 2 reporting to support campaigns across different countries
  • Build ETL jobs using DI Studio to extract and integrate the data required to implement campaigns
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate data warehouses and reporting data marts
  • Migrate existing DI Studio jobs to replace SAS tables with SAP HANA tables
  • Optimize jobs to ensure in-database processing is implicitly enforced to improve performance
  • Engage with clients to gather requirements for campaigns and drive the approach to implementing the campaigns in SAS
  • Create SAS Information Maps and customize them for SAS Customer Intelligence (CI) Studio
  • Set up and administer CI business contexts, campaign resources and templates to support SAS CI campaign management
  • Environment: SAS 9.2/9.4, SAS Enterprise Guide 5.1/7.1, SAS Data Integration Studio 4.6/4.9, SAS Information Map Studio 4.2, SAS Web Report Studio 4.2, MS SQL 10.9, SAP HANA, SAS Activity-Based Management 7.2, SAS MA 6.4 (Customer Intelligence)

SAS DI Consultant

UKAR
07.2015 - 08.2016
  • (Neptune & Phoenix brand separation) Sale or divestment of UKAR's mortgage servicing operations (Phoenix) and sale of assets (Neptune); parts of these portfolios were sold off to different clients (Cerberus, TSB, BAWA and Computershare). Servicing, though, was retained by UKAR, and the sale required generation of new reports and marts specific to each client
  • Migrate/separate existing data to reconcile with Treasury on the MI reports after separation of the books
  • Develop MI reports as specified in the BRS in DI Studio
  • Build data marts for different areas of the business and develop reports for Performance Based Pricing
  • Modify existing IFRS (International Financial Reporting Standards) reporting code residing in Enterprise Guide projects scheduled on a Windows server into DI Studio jobs, and schedule them on LSF based on requirements defined by the project team
  • Interact with the business users' onshore development team on a regular basis to consolidate requirements
  • Design and develop the ETL process for the Performance Based Pricing mart
  • Create ETL flows to incorporate the new jobs developed using SAS Data Integration Studio
  • Modify and optimize existing jobs and build new flows in LSF
  • Develop an automated process to maintain storage on databases by creating partitions and purging data where required
  • Modify existing web reports to meet new business requirements
  • Build summarised tables to be used in creating visualizations from the LASR server, and create stored processes as objects to implement statistical procedures
  • Environment: Windows, Unix, SAS 9.1/9.2/9.3, MS SQL Server, Base SAS, SAS Macros/Access, DI Studio 4.2/4.6, EG 5.1, Web Report Studio, Platform LSF, SAS VA 7.1

SAS BI Consultant

Shell, London UK
12.2014 - 06.2015
  • Build data marts, and design and develop ETL jobs, info maps and cubes as part of Phase 2 reporting to support campaigns across different countries
  • Build ETL jobs using DI Studio to extract and integrate the data required to implement campaigns
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate data warehouses and reporting data marts
  • Build jobs using Hadoop transformations to read the files exported from the Big Data environment and export files from the SAS HDFS file system
  • Extract data from HDFS, load it into the LASR server and build visualizations using the Visual Analytics data builder and designer
  • Migrate existing DI Studio jobs to replace SAS tables with SAP HANA tables
  • Created stored processes in Enterprise Guide to support end users' needs for Marketing Automation
  • Optimize jobs to ensure in-database processing is implicitly enforced to improve performance
  • Environment: SAS 9.2/9.4, SAS Enterprise Guide 5.1/7.1, SAS Data Integration Studio 4.6/4.9, SAS Information Map Studio 4.2, SAS Web Report Studio 4.2, MS SQL 10.9, SAP HANA

SAS Technical Consultant

The AA, Basingstoke
04.2011 - 12.2014
  • Develop Experian marts for the pricing campaigns
  • Migration of the existing ETL process on the data warehouse from SAS DI Studio 3.4 to SAS DI Studio 4.2; the DI Studio 3.4 processes used an SPD database that caused performance issues, so to make the process efficient the decision was made to move the database to Oracle
  • Interact with the business users on a regular basis to consolidate and analyze requirements; extensively involved in defining design & technical specifications
  • Migrated existing processes from SAS DI Studio 3.4 to SAS DI Studio 4.2, moving the underlying database from SPD to Oracle
  • Fine-tuned SAS programs to enable processes to execute on Oracle, increasing performance and reducing resource utilization
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate data warehouses and reporting data marts
  • Configuration and maintenance of users, groups and roles in SAS Metadata
  • Incident management and change management on SAS platforms in a timely manner
  • Provide process improvement recommendations for the development team
  • Redesign ETL flows to ensure that flows run efficiently, utilizing resources well and decreasing run time
  • Redesign user-written transforms to ensure they are metadata driven
  • Analyze data uploads for quality and validity
  • Prepare and maintain defect logs
  • Attend meetings for UAT test overview
  • Design ETL & LSF flows and schedule LSF flows (DI Studio jobs) using LSF
  • Develop the graphical design of the process workflow
  • Environment: Windows, Unix, SAS 9.1/9.2, Oracle 11g, Base SAS, SAS Macros/Access, DI Studio 3.4/4.2, EG (3.4/4.3/5.1), SMC (9.1/9.2), Platform LSF

SAS Technical Lead

British Gas, Staines UK
04.2010 - 04.2011
  • Design data marts from the business warehouse (SAP BW) to support analytics for British Gas
  • Data profiling and segmentation of the portfolio to implement new smart metering technology
  • Develop new reporting strategies and automate existing ones
  • Analyze the data requirements for new reports for the marketing department
  • Performed dimensional data modelling and created Star Schema and Snowflake Schema data models to meet reporting needs
  • Interacted with the business users to consolidate and analyze requirements and presented them with prototypes of report designs
  • Segmentation and profiling of data for analysis and uploads to the mart
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate data warehouses and reporting data marts
  • Designed and implemented a prototype of group-based and role-based security via SAS Management Console
  • Development of processes aimed at synchronizing groups/users between Active Directory and the Metadata Server
  • Create ETL flows and automate jobs
  • Develop and test the upload code using SAS
  • Analyze data uploads for quality and validity
  • Add new data sources to support new reports through Management Console
  • User and server administration from Management Console
  • Performance tuning
  • Performed test case reviews, walkthroughs and test execution
  • Performed manual system testing, functional testing and end-to-end testing of the reports developed
  • Design ETL & LSF flows and schedule LSF flows (DI Studio jobs) using LSF
  • Develop the graphical design of the process workflow
  • Environment: Windows, Unix, SAS 9.1/9.2, SAP R/3, Base SAS, Macros, SAP Data Surveyor, SAS/Access, DI Studio 4.2, EG 4.2, Management Console & DataFlux, Agile, Platform LSF

SAS Consultant

Hutchison 3G, Maidenhead
12.2009 - 04.2010
  • Implement changes to the existing Geo-Marketing database warehouse so the business can develop new pricing strategies for existing customers as part of cross-selling campaigns, and acquire a new customer base to generate revenue
  • Involved in defining business requirements
  • Identify changes required to the existing Geo-Marketing database warehouse and implement those changes
  • Extensively involved in defining design & technical specifications
  • Analyze the data requirements for new reports for the marketing department
  • Designed and developed ETL processes using Base SAS and SAS Data Integration Studio to populate reporting data marts
  • Development of info maps with SAS Information Map Studio to create relational or multidimensional reports in SAS Web Report Studio, integrating user prompts and stored processes
  • Deploy jobs through Grid Manager to ensure performance improvement
  • Create ETL flows and automate jobs
  • Develop and test the upload code using SAS
  • Analyze data uploads for quality and validity
  • Validate/profile data and apply data transforms using DataFlux
  • Add new data sources to support new reports
  • Develop the graphical design of the process workflow
  • Environment: Windows, SAS 9.1/9.2, Teradata, Base SAS, Macros, SAS/Access, DI Studio 4.2, EG 4.1, Management Console, DataFlux, SAS Grid Manager, SAS Web Report Studio 4.1, SAS BI Dashboard 4.1

Senior Data Analyst

Barclays PLC
03.2009 - 08.2009
  • (CREC Automation) Automated the CREC (Credit Risk Economic Capital) reporting process, which gauges risk for accounts (probability of default, loss given default, exposure at default) on the local business and mortgage portfolio for the remaining loan term with Barclays, by monitoring loan performance for individual accounts at present (point in time) and in the past (12 months back). This is required to estimate economic capital, the bank's internal estimate of the capital needed to meet FSA & Basel requirements
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Performed dimensional data modelling and created Star Schema and Snowflake Schema data models to meet OLAP needs
  • Create ETL flows (job chains) to automate jobs in the live/production environment
  • Retrieve data from the Teradata database using SAS/ACCESS
  • Automate the credit models code and stress-test the models as per requirements
  • Code and validate SAS programs as per technical specifications
  • Maintain the report specification document
  • Performed test case reviews, walkthroughs and test execution
  • Performed manual system testing, functional testing and end-to-end testing of the developed reports
  • Performance tuning
  • Identify triggers to initiate report generation and automate the report generation process
  • Responsible for maintaining reports
  • Use DDE to export data into Excel spreadsheets and create pivot tables, providing insights for ad-hoc requests
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Automate delivery of reports through the auto-email facility in SAS
  • Mentor the offshore development team
  • Environment: Windows, SAS 9.1, SAS (ETL) IAB Administrator, UNIX, Teradata, Base SAS, Macros, SAS/Access

Senior SAS Consultant

BankAmerica, MBNA
05.2008 - 07.2008
  • BOA has a wide range of products, among them loans and current accounts; some accounts in the portfolio turn bad and are moved to recovery and collections, where a generic scoring model is in place to calculate a risk score for these accounts
  • Based on the risk score, decisions are made
  • In place of the old system, a new and intelligent system was developed in which these accounts are scored by the corresponding model developed for each individual product, and the risk score is calculated accordingly
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create ETL flows (job chains) to automate jobs in the live/production environment
  • Retrieve data from the Teradata database using SAS/ACCESS
  • Code and validate SAS programs as per technical specifications
  • Performance tuning
  • Identify triggers to initiate report generation and automate the report generation process
  • Maintain the report specification document
  • Record, report and close defects through the entire system integration and user acceptance testing cycle
  • Prepare and maintain defect logs
  • Attend meetings for UAT test overview
  • Provide UAT support during formal acceptance testing
  • Prepare test data for the UAT phase
  • Responsible for validating scorecard model outputs
  • Use DDE to export data into Excel spreadsheets and create pivot tables, providing insights for ad-hoc requests
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Automate delivery of reports through the auto-email facility in SAS
  • Environment: Windows, SAS 9.1, SAS (ETL) IAB Administrator, UNIX, Teradata, Base SAS, Macros, SAS/Access

Project Lead

Target Corporation, Minneapolis
01.2007 - 04.2008
  • (Asset and Campaign Management) Extracted data as specified by econometricians for analysis of single-period models; the data covered different data points (6, 12 & 18 months) based on the specific portfolio and the risk probability of the asset
  • Analysis of specific factors that impact asset value over a period of time (change criteria as specified for the individual factors that impact the model)
  • Analysis of the liability of a specific portfolio to eliminate exposure, with change factors that influence the rate of interest
  • Based on account performance evaluation analysis, a decision is made to reprice existing credit card account holders; to determine the accounts that fall into these categories, the business defines metrics for segmentation, and these metrics are implemented using SAS
  • Retrieve the accounts and reprice them to the rates decided
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create jobs in the live/production environment
  • Retrieve data from the Oracle database using SAS/ACCESS
  • Data profiling and segmentation for campaigns
  • Code and validate SAS programs as per technical specifications
  • Automate the process and auto-email the generated report using SAS
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Environment: Oracle, Base SAS, Macros, SAS/Access
  • The ETL process was built using DataStage; SAS Base, Macros and SAS/Access were used to extract data from the marts and for report generation
  • Data is accessed from the Oracle database & DB2
  • Responsibilities:
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create ETL flows (job chains) to automate jobs in the live/production environment
  • Retrieve data from the Oracle database using SAS/ACCESS
  • Code and validate SAS programs as per technical specifications
  • Automate the process and auto-email the generated report using SAS
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Environment: DB2, SAS 9.1 (EG 4.1), Base SAS, Macros, SAS/Access

Manager Analytics

HSBC Retail Services Analytics, Chicago
05.2004 - 01.2007
  • Household Retail Services SAS data marts at subject-level detail for Consumer, Closed End & Commercial data
  • ETL applications are developed using SAS Base, Macros and SAS/Access
  • Data is accessed from AR source systems: flat files, VSAM files & DB2
  • The source system data is transformed into Data Warehouse staging areas
  • It is then validated, structured, integrated and summarized, then loaded into the respective data marts (SAS) from the staging area
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create ETL jobs in the live/production environment
  • Retrieve data from mainframes using the FTP infile option
  • Code and validate SAS programs as per technical specifications
  • Maintain the report specification document
  • Validate the SAS programs by cross-checking the data
  • Performance tuning
  • Identify triggers to initiate report generation and automate the report generation process
  • Automate reports using SAS ODS and schedule automated jobs using the CA-7 scheduler
  • Automate the process and auto-email the generated report using SAS
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Environment: SAS V8.1.2, MVS, Base SAS, Macros, SAS/Access

SAS Data Analyst

American Express
03.2003 - 05.2004
  • The CAS (Credit Authorization System) for American Express was based on fraud and credit rules developed in-house on UNIX; this online system's functionality was simulated in a development environment to test new policies and rules before implementation. Was involved in the creation of a SAS simulator that translates the fraud and credit policies and rules for approving and declining transactions, replicating the functionality of the online system for testing purposes
  • This project involved data profiling & quality checks for tracking critical variables used in implementing statistical models; the variables are monitored and, for the data specified, actions are performed
  • Planning the ETL process
  • Identify the feeder file (data source)
  • Identify data files and copybooks on the mainframe
  • Decide the data gathering approach (100% or sampling)
  • Identify critical variables by running across model variables
  • Coding for all the above activities in Base SAS
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create ETL jobs in the live/production environment
  • Retrieve data from mainframes using the FTP infile option
  • Code and validate SAS programs as per technical specifications
  • Performance tuning
  • Identify triggers to initiate report generation and automate the report generation process
  • Automate reports using SAS ODS and schedule automated jobs using the CA-7 scheduler
  • Automate the process and auto-email the generated report using SAS
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Environment: UNIX, Mainframes, SAS V8
  • Project Description: Credit Authorization System Simulation (CAS-AA Simulation)
  • Coding rules and policies in Base SAS & Macros, and extracting data from mainframes
  • Responsibilities:
  • Involved in defining business requirements
  • Extensively involved in defining design & technical specifications
  • Create ETL jobs in the live/production environment
  • Retrieve data from mainframes using the FTP infile option
  • Code and validate SAS programs as per technical specifications
  • Performance tuning
  • Identify triggers to initiate report generation and automate the report generation process
  • Automate reports using SAS ODS and schedule automated jobs using the CA-7 scheduler
  • Automate the process and auto-email the generated report using SAS
  • Use ODS to display outputs in HTML, RTF, CSV or other desired file formats
  • Environment: UNIX, Mainframes, SAS V8

Education

BACHELOR OF ENGINEERING - ELECTRONICS & COMMUNICATION

Osmania University
April 2000

Skills

  • Programming: SQL, VBScript, Base SAS, SAS/Macros, SAS/Connect, SAS/Share, SAS/Access, Python, PySpark, Spark
  • BI Software/ETL Tools: SAS Enterprise Guide 4.1, 4.2, 5.1, 7.1; SAS DI Studio 3.4, 4.2, 4.9; SAS Web Report Studio 3.1, 4.2, 4.4; SAS OLAP Cube Studio 9.1; SAS Information Map Studio 3.1, 4.2; SAS Grid Manager; Python 3.6.4; Spark; ADF; Databricks
  • Database: Oracle, MS SQL, Teradata, Hive, HANA (SAP), Snowflake
  • IDE/Reports/Middleware: SAS/EIS, Power BI, SAS Viya
  • Operating Systems: UNIX, Windows 9x/NT/2000, MVS
  • Scheduling Tools: Platform Scheduler, SAS IAB, Airflow
  • Applications: Excel 2003, PowerPoint, MS Word
  • Python Programming

Certification

  • SAS Certified Data Integration Developer for SAS 9
  • SAS Certified BI Content Developer for SAS 9
  • SAS Certified Advanced Programmer for SAS 9
  • SAS Certified Base Programmer for SAS 9

Timeline

Lead Data Engineer

Siemens Healthineers
01.2019 - Current

Sr SAS Technical Consultant

Texas General
06.2017 - 07.2019

Sr SAS Technical Consultant

Schlumberger
12.2016 - 06.2017

SAS BI Consultant

Shell
08.2016 - 12.2016

SAS DI Consultant

UKAR
07.2015 - 08.2016

SAS BI Consultant

Shell, London UK
12.2014 - 06.2015

SAS Technical Consultant

The AA, Basingstoke
04.2011 - 12.2014

SAS Technical Lead

British Gas, Staines UK
04.2010 - 04.2011

SAS Consultant

Hutchison 3G, Maidenhead
12.2009 - 04.2010

Senior Data Analyst

Barclays PLC
03.2009 - 08.2009

Senior SAS Consultant

BankAmerica, MBNA
05.2008 - 07.2008

Project Lead

Target Corporation, Minneapolis
01.2007 - 04.2008

Manager Analytics

HSBC Retail Services Analytics, Chicago
05.2004 - 01.2007

SAS Data Analyst

American Express
03.2003 - 05.2004

BACHELOR OF ENGINEERING - ELECTRONICS & COMMUNICATION

Osmania University