Srushti Kaushik
Professional Summary
7 years of experience in designing and developing Java applications using Java technologies, including system analysis, technical design, implementation, performance tuning and testing.
Around 2 years of experience with the Cloudera Hadoop ecosystem, including HDFS, MapReduce (MRv1, with an understanding of YARN) and Hadoop tools (Pig, Hive, HBase, Sqoop, ZooKeeper, Oozie).
Excellent hands-on experience in developing Hadoop architecture within projects on Windows and Linux platforms.
Analyzed clients' Big Data requirements and translated them into the Hadoop ecosystem, accounting for performance bottlenecks and tuning of the existing Hadoop infrastructure.
Proficient in data migration from existing DBMS/RDBMS systems to the Hadoop file system (HDFS) using Sqoop.
Experience in handling a variety of customer data sets, including structured as well as unstructured data, using HDFS and HBase.
Developed MapReduce programs to perform data transformation and analysis using Java, Hive and Pig.
Excellent scripting skills in Pig and Hive.
Good experience in table partitioning in Hive and parameter passing in Pig.
Built libraries, user-defined functions (UDFs), and frameworks around the Hadoop ecosystem (an illustrative Hive UDF sketch follows this summary).
Excellent hands-on experience handling 10 TB of data per week, successfully implemented on a production cluster running 100 nodes.
Experience writing Java MapReduce jobs and HiveQL for data architects and data scientists.
Good experience in loading data from Oracle and MySQL databases into HDFS using Sqoop (structured data) and Flume (log files and XML).
Defined job flows in the Hadoop environment using tools like Oozie, along with the capacity and fair schedulers.
Experience in Cluster coordination services through Zookeeper.
Understanding of loading streaming data directly into HDFS using Flume.
Prepared proofs of concept for Hadoop and microservices.
Working knowledge of databases such as Oracle 11g.
Strong experience in database design, writing complex MySQL queries and stored procedures.
Strong problem-solving, communication and interpersonal skills; a good team player.
Ability to work in a fast-changing environment and learn new technologies effortlessly.
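A minimal sketch of the kind of Hive user-defined function referred to above, using the classic org.apache.hadoop.hive.ql.exec.UDF API available in Hive 0.10-era releases; the class name and the normalization logic are illustrative assumptions, not taken from an actual project.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative Hive UDF: normalizes a free-text category value so that
// categorical distributions can be grouped consistently in HiveQL.
public class NormalizeCategory extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                        // pass nulls through untouched
        }
        String cleaned = input.toString().trim().toLowerCase();
        return new Text(cleaned.isEmpty() ? "unknown" : cleaned);
    }
}
```

After packaging the class into a jar, such a UDF would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in SELECT or GROUP BY clauses.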
Technical Skills:
Professional Experience:
DIRECTV is one of the world's leading providers of digital television entertainment services, delivering a premium video experience through state-of-the-art technology, unmatched programming, and industry-leading customer service to more than 37 million customers in the U.S. and Latin America.
Description: The project involves clickstream data analysis, which is used for targeting and pushing ads directly to customers and for reducing customer churn. The project includes ingesting data from the websites customers browse and storing that data directly in HDFS using Flume for streaming. DIRECTV uses this data to push ads to targeted customers.
Responsibilities:
Coordinated with business customers to gather business requirements and migrated the existing data
to Hadoop from RDBMS (MySQL) using Sqoop for processing the data.
Used the Cloudera distribution of Hadoop; at present, migration from Cloudera to the Hortonworks sandbox is under development.
Ingestion of log data to HDFS using Flume.
Analyzed clickstream data by querying it with the Hadoop components Hive and Pig.
Implemented performance tuning of Hive queries for transformations such as joins.
Involved in creating internal and external Hive tables, loading data, generating partitions and buckets, and writing user-defined functions for optimizing the categorical distribution over ingested data.
Designed and implemented MapReduce jobs to support distributed processing of large data sets on the Hadoop cluster.
Developed MapReduce programs to cleanse and parse data in HDFS obtained from various data sources and to perform joins on the map side (a representative mapper/reducer sketch follows this list).
Exported the required business information from HDFS to the RDBMS using Sqoop to make the data available to the BI team for generating reports.
Implemented daily workflow for extraction, processing and analysis of data with Oozie.
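A minimal sketch of the kind of cleansing MapReduce job described above, written against the Hadoop 2.0-era org.apache.hadoop.mapreduce API. The record layout (tab-separated with a customer id in the first field), class names and paths are assumptions for illustration only; it simply drops malformed clickstream records and counts page hits per customer.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative clickstream cleansing job: skips malformed records and
// counts page hits per customer id (assumed to be the first field).
public class ClickstreamCleanse {

    public static class CleanseMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length < 3 || fields[0].isEmpty()) {
                return;                                   // drop malformed records
            }
            context.write(new Text(fields[0]), ONE);      // customerId -> 1
        }
    }

    public static class SumReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long total = 0;
            for (LongWritable v : values) {
                total += v.get();
            }
            context.write(key, new LongWritable(total));  // customerId -> hit count
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "clickstream-cleanse");
        job.setJarByClass(ClickstreamCleanse.class);
        job.setMapperClass(CleanseMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. raw clickstream dir
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir must not exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```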
Environment: Java 1.6, Hadoop 2.0.0, MapReduce, HDFS, Sqoop 1.4.3, Hive 0.10.0, Pig 0.11.0, Linux, XML, Eclipse Juno, Cloudera CDH3/4 Distribution, Oracle 11g, MySQL, HBase 0.94.6
Responsibilities:
Developed and supported MapReduce programs running on the cluster.
Created Hive tables and worked on them using HiveQL.
Handled 2 TB of data volume and implemented the same in Production.
Weekly meetings with technical collaborators and active participation in code review sessions with senior
and junior developers.
Responsible for managing heterogeneous data coming from different sources.
Supported HBase architecture design with the Hadoop architect team to develop a database design on HDFS (a basic HBase client sketch follows this list).
Involved in HDFS maintenance and loading of structured and unstructured data.
Designed a workflow by scheduling Hive processes for log file data, which is streamed into HDFS using Flume.
Wrote Hive queries for data analysis to meet the business requirements.
Installed and configured Pig and wrote Pig Latin scripts.
Developed scripts and batch jobs to schedule various Hadoop programs.
Upgraded the Hadoop cluster from CDH3 to CDH4, set up a high-availability cluster and integrated Hive with existing applications.
Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
Installed Oozie workflow engine to run multiple Hive and Pig jobs.
Developed Hive queries to process the data and generate data cubes for visualization.
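A minimal sketch of a write/read round trip with the HBase 0.94-era client API (HTable, Put, Get) to illustrate the kind of HBase-backed design mentioned above; the "events" table, the "d" column family and the row key are illustrative assumptions.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative HBase 0.94-style client: writes one row into a hypothetical
// "events" table and reads the value back.
public class HBaseClientSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from classpath
        HTable table = new HTable(conf, "events");
        try {
            // Store a source attribute under row key "row-001"
            Put put = new Put(Bytes.toBytes("row-001"));
            put.add(Bytes.toBytes("d"), Bytes.toBytes("source"), Bytes.toBytes("weblog"));
            table.put(put);

            // Read the same cell back
            Get get = new Get(Bytes.toBytes("row-001"));
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("source"));
            System.out.println("source = " + Bytes.toString(value));
        } finally {
            table.close();                                   // release the table handle
        }
    }
}
```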
Environment: Cloudera Hadoop, CDH 3/4, Hive, Pig, MapReduce, Oozie, Sqoop, Flume, Eclipse, Hue, MySQL, Java, Shell, Linux.
The State of West Virginia is a state government organization that provides assistance in all areas of public interest, such as health, pensions for senior citizens, driver's licenses, etc.
Description: As a Java developer, the job was to program and test various Java applications that could then be deployed for public use; maintenance and reporting were also part of the role. The work covered large-scale application development and maintenance as well as web application design and development: programming and setting up databases, organizing website architecture and programming website functions, and procuring and managing complex databases and software to map, analyze or display epidemiological data. Conducted test runs, debugged programs and prepared documentation.
Responsibilities:
Involved in all phases of the SDLC of the project.
Followed SCRUM methodology for project implementation.
As a senior Java developer, was involved in understanding requirements and developing the application.
Performed design, coding and unit testing of the product in Java (a representative servlet sketch follows this list).
Involved in writing the design documents and technical specifications.
Involved in writing JavaScript and style sheets (CSS) for the HTML pages.
Involved in all phases of the SDLC for enhancing the Web Content Management System (WCMS), i.e. making the system faster and easier for users. Enhancements required by the internal content creation team were designed, implemented and tested.
Used ORACLE as the backend database for storage of the data.
Was also involved in the maintenance support for the project.
Used Team Track for quality control.
Mentored a team of Java J2EE developers in understanding the project, delegating tickets/ issues and
conveying user requirements/ design specifications to developers.
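A minimal sketch of the kind of Servlet 2.4-era page handler described above; the servlet name, request parameter and markup are illustrative assumptions rather than details from the actual application.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative servlet: reads a request parameter and renders a small
// HTML response. In real code the echoed value would be HTML-escaped.
public class ContentSearchServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String query = request.getParameter("q");       // user-supplied search term
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<html><body>");
        out.println("<h1>Search results</h1>");
        out.println("<p>You searched for: " + (query == null ? "" : query) + "</p>");
        out.println("</body></html>");
    }
}
```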
Environment: Oracle 11g, Serena TeamTrack, Eclipse, Windows XP (development), Linux (server deployment), Java 1.6, Servlets 2.4, JSP, JDBC, ANT, AJAX, HTML, CSS, JavaScript, TeamTrack 6.6
ICICI Bank is India's largest private sector bank and provides a wide range of banking services to customers. The bank currently has a network of 4,450 branches and 13,820 ATMs across India. The principal objective of the bank is to create a development financial institution providing medium-term and long-term project financing to Indian businesses.
Description: The project was to manage and develop applications for online portals. This included resolving application and production issues raised through service requests reported by users. The application helped users with their credit card activity as well as online banking services. The project also included bug fixing and correcting misbehaving services, such as wrong calculations of EMI or interest rates.
Responsibilities:
Responsible for gathering and analyzing requirements and converting them into technical specifications
Used Rational Rose for creating sequence and class diagrams
Developed presentation layer using Java, HTML and JavaScript
Used Spring Core Annotations for Dependency Injection
Performed Performance Tuning activities using SQL scripts. Involved in scripts preparation using SQL.
Designed and developed the Hibernate configuration and the session-per-request design pattern for database connectivity and for accessing the session during database transactions (a small sketch follows this list). Used SQL for fetching and storing data in databases.
Participated in the design and development of database schema and Entity-Relationship diagrams of the
backend Oracle database tables for the application
Implemented web services with Apache Axis
Designed and developed stored procedures and triggers in Oracle to cater to the needs of the entire application. Developed complex SQL queries for extracting data from the database.
Handled all types of issues for the PRIME and ONLINE applications, such as interest levied wrongly, transaction-related issues, installment plans, statements not generating, reward points not credited properly, EMI conversion, product changes, etc.
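A minimal sketch of the session-per-request access pattern mentioned above, assuming Hibernate 3 configured with a thread-bound current_session_context_class; the Account entity, its mapping and the DAO name are hypothetical examples, not the project's actual classes.

```java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;

// Hypothetical mapped entity (mapping assumed in Account.hbm.xml).
class Account {
    private Long id;
    private String holderName;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getHolderName() { return holderName; }
    public void setHolderName(String holderName) { this.holderName = holderName; }
}

// Illustrative session-per-request style data access with Hibernate 3.
// Assumes hibernate.cfg.xml sets current_session_context_class=thread.
public class AccountDao {
    private static final SessionFactory SESSION_FACTORY =
            new Configuration().configure().buildSessionFactory();

    public Account findAccount(Long id) {
        // getCurrentSession() returns the session bound to the current
        // request/thread; committing the transaction also closes it.
        Session session = SESSION_FACTORY.getCurrentSession();
        Transaction tx = session.beginTransaction();
        try {
            Account account = (Account) session.get(Account.class, id);
            tx.commit();
            return account;
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        }
    }
}
```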
Environment: Apache Axis, Rational Rose XDE, Spring 2.5, Notepad++, Eclipse, JavaScript, HTML, Oracle Database 11g, Log4j
SBI is the largest Indian multinational public-sector banking and financial services company, headquartered in Mumbai. It provides specialized and customized financial services designed to cater to the needs of corporate, SME, NRI and retail customers.
Description: Prepared use cases and test cases and developed UI screens using JSP/HTML/Struts. Followed Java/J2EE best practices, continuously refactored code, minimized database calls and optimized queries for better application performance.
Responsibilities:
Developed code for handling exceptions and converting them into Action Messages.
Used JavaScript for validations and other checking functionality for the UI screens.
Involved in Struts-based validation (a form-validation sketch follows this list).
Involved in Personal Information module.
Designed and developed the user interface layer using JSP, Java Script, Ajax, HTML, CSS.
Used HTML to control the display, position HTML elements and to handle events in the user interface.
Used JavaScript objects to handle events on text boxes, forms to call business logic.
Involved in resolving business technical issues.
Involved in writing code to invoke Web services in other applications based on the WSDL files
Used Hibernate ORM to interact with the Oracle database to retrieve, insert and update the data.
Wrote JUnit test cases for the functionalities.
Developed and tuned the database SQL queries.
Used Eclipse IDE and Tomcat 5.5 web application server in development.
Used CVS version control and Clear Quest in bug tracking.
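A minimal sketch of the kind of Struts 1 form validation and action-message handling mentioned above; the form bean, field name and message-resource key are illustrative assumptions.

```java
import javax.servlet.http.HttpServletRequest;
import org.apache.struts.action.ActionErrors;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionMapping;
import org.apache.struts.action.ActionMessage;

// Illustrative Struts 1 form bean: validates a hypothetical "personal
// information" screen and reports failures as ActionMessages keyed to
// entries in the message resource bundle.
public class PersonalInfoForm extends ActionForm {
    private String customerName;

    public String getCustomerName() { return customerName; }
    public void setCustomerName(String customerName) { this.customerName = customerName; }

    @Override
    public ActionErrors validate(ActionMapping mapping, HttpServletRequest request) {
        ActionErrors errors = new ActionErrors();
        if (customerName == null || customerName.trim().length() == 0) {
            // "errors.name.required" is an assumed resource-bundle key
            errors.add("customerName", new ActionMessage("errors.name.required"));
        }
        return errors;   // non-empty errors send the user back to the input page
    }
}
```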
Environment: Java/J2EE, Spring 3, Oracle 10g, JavaScript, CSS, AJAX, JUnit, Log4j, SOAP Web Services, RESTful Web Services, Eclipse IDE.
Sterling Group is one of the biggest private-sector hospital service providers in India. They provide a wide range of health services to patients.
Description: As a software engineer, the job was to maintain the system database and to retrieve and add patient details and diagnoses using an Oracle database and SQL queries.
Responsibilities:
This project covered hospital functions, management activities and decision-making, providing all-round support for a modern hospital.
Worked on the outpatient registration and emergency registration modules (a JDBC registration sketch follows this list).
Ensured all requests and processes were controlled by the system.
Used an interface-oriented programming style, improving the flexibility and extensibility of the system.
Built the database and tables according to client requirements.
Debugged and fixed the problems that were found during the different phases of the project
Maintained the database and the systems, and updated the system from time to time.
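A minimal sketch, assuming a hypothetical PATIENT table, of the kind of JDBC insert a registration module like this could use; the JDBC URL, credentials and column names are examples only.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Illustrative JDBC insert for an outpatient registration record.
// The driver, connection details and PATIENT table layout are assumptions.
public class PatientRegistrationDao {

    public void registerPatient(String patientId, String name, String diagnosis)
            throws ClassNotFoundException, SQLException {
        Class.forName("oracle.jdbc.driver.OracleDriver");        // classic Oracle thin driver
        String url = "jdbc:oracle:thin:@localhost:1521:ORCL";    // example URL only
        String sql = "INSERT INTO PATIENT (PATIENT_ID, NAME, DIAGNOSIS) VALUES (?, ?, ?)";

        Connection conn = DriverManager.getConnection(url, "hms_user", "hms_pass");
        try {
            PreparedStatement ps = conn.prepareStatement(sql);
            ps.setString(1, patientId);
            ps.setString(2, name);
            ps.setString(3, diagnosis);
            ps.executeUpdate();       // one row inserted per registration
            ps.close();
        } finally {
            conn.close();             // always release the connection
        }
    }
}
```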
Environment: Java, JDBC, Servlets, JSP, HTML, JavaScript, Eclipse, Windows 2000, Oracle database, Microsoft Excel.