9+ years of overall experience in software development, including 4+ years in Hadoop development, covering the analysis, design, implementation, testing, and deployment of web-based, distributed, and enterprise applications.
Hands-on experience installing, configuring, and architecting Hadoop and Hortonworks clusters and services: HDFS, MapReduce, YARN, Hive, Flume, HBase, Spark, Sqoop, and Oozie.
Knowledge of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
Knowledge of installing and configuring, and of migrating and upgrading data across, Hadoop MapReduce, Hive, HDFS, HBase, Sqoop, Oozie, Pig, Cloudera, YARN, ZooKeeper, and Flume.
Experience importing and exporting data into and out of HDFS using Sqoop, then processing both schema-based and schema-less data using Pig.
Experience with the Oozie workflow engine, running workflow jobs with actions that launch Hadoop MapReduce and Pig jobs.
Hands-on experience with Spark tools such as RDD transformations and Spark SQL.
Experience in extraction, transformation, and loading (ETL) of data in file formats such as CSV, text, SequenceFile, Avro, Parquet, JSON, and ORC, using compression codecs such as gzip, LZ4, and Snappy.
Strong programming experience with PL/SQL packages, stored procedures, functions, cursors, constraints, triggers, indexes, views, and materialized views.
Good experience in extraction, transformation, and loading (ETL) of data from multiple database sources for medium-to-large enterprise data warehouses.
Strong understanding of data warehousing principles, including fact tables, dimension tables, and star/snowflake schema modeling.
Experience using Snowflake Cloning and Time Travel.
Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python.
Developed SQL queries in SnowSQL and transformation logic using Snowpipe.
In-depth knowledge of Snowflake database, schema, and table structures.
Participated in the development, improvement, and maintenance of Snowflake database applications.
Experience with version control tools such as CVS, Git, and SVN.
Built S3 buckets and managed their access policies, using S3 and Glacier for storage and backup on AWS.
Capable of setting up Chef, managing hosts files, and authoring cookbooks and custom modules with Chef.
Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
Experienced in integrating Hadoop with Kafka and NiFi, and in ingesting data into HDFS.
Built and deployed Ansible servers in AWS for infrastructure automation.
Experienced in cloud implementation using Terraform.
Extensive experience creating real-time data streaming solutions using Apache Spark/Spark Streaming and Kafka.
Expertise in full life-cycle implementation using the CDH (Cloudera) and HDP (Hortonworks Data Platform) distributions.
Hands-on experience in application development using Java, relational databases, and Linux shell scripting.
Experience applying current development approaches, including Spark applications written in Scala, to compare the performance of Spark against Hive and SQL/Oracle.
In-depth understanding of Hadoop architecture and its components, such as ResourceManager, ApplicationMaster, NameNode, and DataNode, as well as HBase design principles.
Experience handling messaging services using Apache Kafka.
Extensive knowledge of programming with Resilient Distributed Datasets (RDDs).
Created Confluence pages that integrate with JIRA projects.
Developed visualizations and dashboards using Tableau Desktop.
Experience working with databases such as NoSQL stores and MySQL, with exposure to Hibernate and JDBC for mapping an object-oriented domain model to a traditional relational database.
Built and deployed Chef servers in AWS for infrastructure automation.
Experience processing large volumes of data and executing processes in parallel using Talend.
Good experience working in an Agile development environment, including the Scrum methodology.
Possess strong communication, logical, analytical, and interpersonal skills; an active team player.
Responsive expert experienced in monitoring database performance, troubleshooting issues, and optimizing database environments.
Possess strong analytical skills, excellent problem-solving abilities, and a deep understanding of database technologies and systems.
Equally confident working independently or collaboratively as needed, utilizing excellent communication skills.