
Venkateshwaran Gopal

+91-8197333366
[email protected]

PROFESSIONAL SUMMARY
• 12 years of overall experience in IT, including 3 years of current experience in Big Data development and 9 years of expertise in Oracle PL/SQL development and Unix shell scripting.
• Experienced with big data stacks such as HDFS, Hive, HBase, HBase-Hive integration, Sqoop and Spark.
• Developed Sqoop jobs with incremental loads from heterogeneous RDBMSs using native DB connectors.
• Hands-on experience with major components including Hive, Sqoop, basic Spark RDDs, Spark SQL using DataFrames, and HBase.
• Experienced in loading large sets of structured, semi-structured and unstructured data.
• Involved in creating Hive tables, loading them with data and writing Hive queries.
• Developed and implemented various methods to load Hive tables from HDFS and the local file system.
• Developed Hive queries to parse raw data, populate external and managed tables, and store the refined data in partitioned external tables.
• Experienced in working with different file formats such as TextFile, SequenceFile, ORC, Parquet and JSON.
• Applied Hive query tuning techniques such as partitioning, bucketing, indexing and the cost-based optimizer (CBO).
• Integrated Hive and HBase using the HBase storage handler (see the HiveQL sketch after this list).
• Experienced with the column-oriented NoSQL database HBase, including Phoenix queries.
• Good experience with Apache Spark, the open-source cluster computing framework for data analytics.
• Handled structured data (CSV, JSON) using DataFrames via SQLContext.
• Created basic RDDs, converted them to DataFrames and applied SQL queries for data analysis.
• Created DataFrames using reflection (case classes with toDF), programmatic schemas (StructType and StructField) and the direct read method for CSV, JSON, etc. (see the Scala sketch after this list).
• Worked extensively on developing Spark SQL applications using DataFrame concepts.
• Implemented Hive queries on the Spark engine by importing HiveContext.
• Working knowledge of Hortonworks on a production cluster with 80 live nodes and a development cluster with 20 nodes, with an incoming data flow of approximately 30 GB per day.
• Very good knowledge of Apache Kafka.
• Good understanding of the Tableau reporting tool.
• Expertise in Oracle Database systems.
• Created tables, views, constraints and indexes (B-tree, bitmap and function-based).
• Developed complex database objects such as stored procedures, functions, packages and triggers using SQL and PL/SQL.
• Developed materialized views for data replication in distributed environments.
• Experienced with Oracle-supplied packages, dynamic SQL, records and PL/SQL tables.
• Expertise in exporting query result sets to delimited files (e.g. for Excel) using the UTL_FILE package (see the PL/SQL sketch after this list).
• Loaded data into Oracle tables using SQL*Loader.
• Worked extensively with ref cursors, external tables and collections.
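
The DataFrame construction bullets above can be illustrated with a minimal, Spark 1.x-style Scala sketch. It is a sketch only; the Person case class, the sample rows and the temp table name are illustrative, not taken from any project.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

    // Case class for schema inference via reflection (must be top level)
    case class Person(name: String, age: Int)

    object DataFrameSketch {
      def main(args: Array[String]): Unit = {
        // local[*] master is for standalone testing only
        val sc = new SparkContext(new SparkConf().setAppName("DataFrameSketch").setMaster("local[*]"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // 1) Reflection: case class plus toDF
        val byReflection = sc.parallelize(Seq(Person("Asha", 34), Person("Ravi", 29))).toDF()

        // 2) Programmatic: StructType/StructField over an RDD[Row]
        val schema = StructType(Array(
          StructField("name", StringType, nullable = true),
          StructField("age", IntegerType, nullable = true)))
        val byCollection = sqlContext.createDataFrame(
          sc.parallelize(Seq(Row("Asha", 34), Row("Ravi", 29))), schema)
        byCollection.printSchema()

        // Register and query with SQL
        byReflection.registerTempTable("people")
        sqlContext.sql("SELECT name FROM people WHERE age > 30").show()

        sc.stop()
      }
    }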
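The Hive-side techniques (partitioned external tables, bucketing, ORC storage, and the HBase storage handler) look roughly like the HiveQL below; all table, column and path names are hypothetical.

    -- Partitioned, bucketed external table stored as ORC (illustrative names)
    CREATE EXTERNAL TABLE refined_claims (
      claim_id  BIGINT,
      member_id STRING,
      amount    DOUBLE
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (member_id) INTO 16 BUCKETS
    STORED AS ORC
    LOCATION '/data/refined/claims';

    -- Hive table mapped onto an HBase table via the HBase storage handler
    CREATE TABLE claims_hbase (rowkey STRING, amount DOUBLE)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:amount')
    TBLPROPERTIES ('hbase.table.name' = 'claims');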
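Exporting a result set with UTL_FILE can be sketched in PL/SQL as below; the EXPORT_DIR directory object and the employees table are assumptions for illustration.

    CREATE OR REPLACE PROCEDURE export_employees IS
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      -- EXPORT_DIR must be an Oracle directory object the schema can write to
      l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'employees.csv', 'w');
      FOR rec IN (SELECT employee_id, last_name FROM employees) LOOP
        UTL_FILE.PUT_LINE(l_file, rec.employee_id || ',' || rec.last_name);
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    EXCEPTION
      WHEN OTHERS THEN
        -- Close the file handle on failure, then re-raise for the caller
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        RAISE;
    END export_employees;
    /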

TECHNICAL SKILLS
BIG DATA ECOSYSTEMS Hadoop, HDFS, Hive, Sqoop, HBase, Spark, Phoenix
PROGRAMMING LANGUAGES Scala, SQL
DATABASE Oracle
NoSQL DATABASE HBase
OPERATING SYSTEM Linux, Windows
DEVELOPMENT TOOLS Eclipse, WinSCP, PuTTY
METHODOLOGIES Agile
DISTRIBUTION Hortonworks

EMPLOYMENT DETAILS

• System Analyst - Hexaware Technologies, June 2017 to date.
• Senior Analyst - DerivIT Solutions, Chennai, Nov 2016 to May 2017.
• Technical Lead - Happiest Minds Technologies, Nov 2012 to Oct 2016.
• Software Engineer - TATA Consultancy Services, Jul 2007 to Apr 2012.

PROJECTS

Project Name : Contour
Company Name : Hexaware Technologies Solutions Pvt Ltd.
Client Name : EnvisionRx
Role : Big Data Developer
Methodology : Agile

Project Scope:
This project provides a converged data lake that helps the enterprise acquire, curate, store, process and visualize the end-to-end touch points of each patient's pathway, using data ingested from heterogeneous sources such as enterprise data warehouses, paid claims, transactions, log data, call center chat, health plans, Electronic Health Records (EHR), etc.

Roles and Responsibilities:

• Worked closely with the business for requirements gathering.
• Developed Sqoop jobs with incremental loads from heterogeneous RDBMSs (IBM and Oracle) using native DB connectors.
• Designed the Hive repository with external tables, internal tables, buckets, partitions, UDFs and ORC compression for incremental loads of parsed data feeding analytical and operational dashboards.
• Developed Hive queries on different data formats such as text, CSV and log files; leveraging time-based partitioning yielded performance improvements in HiveQL.
• Implemented functionality-based data modeling on Hive tables and stored the resulting record sets in HBase via HBase storage handler column mapping.
• Implemented dynamic column binding on HBase tables whenever a new attribute of a functional entity is added.
• Performed interactive analysis of Hive tables through various DataFrame operations using Spark SQL.
• Loaded Hive tables into HBase for low-latency querying through the Phoenix layer (see the Phoenix sketch after this list).
• Used JSON and XML SerDes for serialization and deserialization to load JSON and XML data into Hive tables.
• Used Spark SQL to load JSON data, create schema RDDs and load them into Hive tables, as sketched in the Scala example after this list.
• Utilized the DataFrame and Spark SQL APIs for faster data processing.
• Created basic RDDs, formed DataFrames and applied SQL queries on top of them for data analysis.
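
The Spark SQL JSON-to-Hive load can be sketched in Scala as below, in the Spark 1.x HiveContext style; it assumes a Spark build with Hive support, and the input path and the refined.claims table are hypothetical.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object JsonToHive {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("JsonToHive"))
        val hiveContext = new HiveContext(sc)

        // Infer the schema directly from the raw JSON files
        val claims = hiveContext.read.json("/data/raw/claims")

        // Persist into a Hive table (the refined database must already exist)
        claims.write.mode("overwrite").saveAsTable("refined.claims")

        hiveContext.sql("SELECT COUNT(*) FROM refined.claims").show()
        sc.stop()
      }
    }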
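For the Phoenix layer, a minimal SQL sketch is shown below, assuming a pre-existing HBase table named claims with a column family cf; the names and the type mapping are illustrative and depend on how the bytes were written to HBase.

    -- Map a Phoenix view onto the existing HBase table for low-latency reads
    CREATE VIEW "claims" (
      pk VARCHAR PRIMARY KEY,
      "cf"."amount" DOUBLE
    );

    -- Point lookup served directly from HBase through Phoenix
    SELECT pk, "cf"."amount" FROM "claims" WHERE pk = 'C1001';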

Company Name : DerivIT Solutions Pvt Ltd.
Client Name : Standard Chartered Bank
Role : Database Developer
Methodology : Agile

Project Scope:
The Basel III framework was developed to act as a credible supplementary measure to the risk-based capital requirements. A simple leverage ratio framework is critical and complementary to the risk-based capital framework, and a credible leverage ratio is one that ensures broad and adequate capture of both the on- and off-balance-sheet sources of banks' leverage.

Roles and Responsibilities:

• Worked on the Fermat tool to understand the risk involved in the banking system.
• Exported and imported schemas using Data Pump.
• Performed additional schema exports and imports using the Fermat tool.
• Created scripts to create new tables, views and queries for the application using SQL Developer.
• Created stored procedures with XML parsing, extracting values from XML documents and inserting them into the corresponding tables (see the PL/SQL sketch after this list).
• Created triggers for audit functions.
• Created indexes on the tables for faster data retrieval to enhance database performance.
• Created records, tables and collections (nested tables and arrays) to improve query performance by reducing context switching.
• Extensively used advanced PL/SQL features such as records, tables, object types and dynamic SQL.
• Handled errors extensively with exception handling for ease of debugging and for displaying error messages in the application.
• Worked with SQL*Loader to load data daily from flat files obtained from various facilities.
• Identified database differences using dbForge.
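
A minimal PL/SQL sketch of such an XML-parsing procedure is shown below; the trades table and the XML layout are assumptions for illustration.

    CREATE OR REPLACE PROCEDURE load_trades(p_xml IN CLOB) IS
    BEGIN
      -- Shred /trades/trade elements into relational rows and insert them
      INSERT INTO trades (trade_id, notional)
      SELECT x.trade_id, x.notional
      FROM XMLTABLE('/trades/trade'
             PASSING XMLTYPE(p_xml)
             COLUMNS trade_id VARCHAR2(20) PATH '@id',
                     notional NUMBER       PATH 'notional') x;
    END load_trades;
    /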

Company Name : Happiest Minds Technologies Pvt Ltd.
Client Name : Dataflow
Role : Database Developer
Methodology : Agile

Project Scope:
Dataflow Group provides compliance and integrity services, helping to prevent illegal immigration and protecting professional bodies that maintain high standards. Dataflow also operates multiple portals so that customers, applicants and Dataflow employees can keep track of their incoming and outgoing activities.

Roles and Responsibilities:

• Created multiple databases with the required character sets.
• Created multiple schemas for different tracks.
• Granted privileges to the schemas based on requirements.
• Exported and imported schemas using Data Pump.
• Created scripts to create new tables, views and queries for the application using SQL Developer.
• Created stored procedures with XML parsing, extracting values from XML documents and inserting them into the corresponding tables.
• Created triggers for audit functions.
• Generated MIS reports using complex queries.
• Created indexes on the tables for faster data retrieval to enhance database performance.
• Created records, tables and collections (nested tables and arrays) to improve query performance by reducing context switching (see the sketch after this list).
• Extensively used advanced PL/SQL features such as records, tables, object types and dynamic SQL.
• Handled errors extensively with exception handling for ease of debugging and for displaying error messages in the application.
• Worked with SQL*Loader to load data daily from flat files obtained from various facilities.
• Identified database differences using dbForge.
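
How collections reduce context switching can be sketched with BULK COLLECT and FORALL; the applications table and its columns are hypothetical.

    DECLARE
      TYPE t_ids IS TABLE OF applications.app_id%TYPE;
      l_ids t_ids;
    BEGIN
      -- One SQL-to-PL/SQL switch to fetch all matching ids into a collection
      SELECT app_id BULK COLLECT INTO l_ids
        FROM applications
       WHERE status = 'VERIFIED';

      -- One PL/SQL-to-SQL switch to apply every update in bulk
      FORALL i IN 1 .. l_ids.COUNT
        UPDATE applications
           SET processed_date = SYSDATE
         WHERE app_id = l_ids(i);

      COMMIT;
    END;
    /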

Company Name : Happiest Minds Technologies Pvt Ltd.
Client Name : Essilor of America
Role : Database Developer
Methodology : Agile

Project Scope:

Essilor's OPTUITIVE Lab Management System is designed as SaaS. OPTUITIVE™ provides a specialized and comprehensive ERP and prescription eyewear manufacturing solution. The project was developed primarily within the scope of Lab Management, along with daily activities including Order Management and many others such as Sales and customer management. The project modules include Job Management, Order Management and Lab Management. Functional testing covers Order Management and Lab Management, with Lab Management being the core area of interest.

Roles and Responsibilities:

• Involved in the full development cycle of planning, analysis, design, development, testing and implementation.
• Developed and modified triggers, packages, functions and stored procedures for data conversions, and PL/SQL procedures to change database objects based on user inputs.
• Created all DDL scripts and executed them in the respective environments.
• Maintained all scripts up to date in SVN.
• Improved application performance by rewriting SQL queries.
• Wrote packages to fetch complex data from different tables using joins and subqueries.
• Wrote SQL, PL/SQL and SQL*Plus programs to retrieve data using cursors and exception handling (see the sketch after this list).
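
Cursor-based retrieval with exception handling can be sketched as below; the orders table and its columns are purely illustrative.

    DECLARE
      CURSOR c_orders IS
        SELECT order_id, status FROM orders WHERE lab_id = 42;
      l_count PLS_INTEGER := 0;
    BEGIN
      FOR rec IN c_orders LOOP
        l_count := l_count + 1;
        DBMS_OUTPUT.PUT_LINE(rec.order_id || ' -> ' || rec.status);
      END LOOP;
      IF l_count = 0 THEN
        RAISE NO_DATA_FOUND;
      END IF;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        DBMS_OUTPUT.PUT_LINE('No orders found for the given lab');
      WHEN OTHERS THEN
        -- Surface the error message, then re-raise for the caller
        DBMS_OUTPUT.PUT_LINE('Unexpected error: ' || SQLERRM);
        RAISE;
    END;
    /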

Company Name : TATA Consultancy Services
Client Name : SEI Investments
Role : Database Developer
Methodology : Waterfall

Project Scope:
SEI GIPP is an integrated banking solution for universal banks specializing in investment management for corporate, retail and investment banking operations, built on a highly flexible and scalable modular architecture.

Responsibilities:
• Involved in requirement analysis.
• Prepared high-level and technical designs for PowerBuilder components and PL/SQL code.
• Created stored procedures and functions in PL/SQL.
• Created unit test scenarios and cases.
• Interacted closely with business analysts regarding functionality.
• Provided support and solutions for production issues and live implementations.
• Analyzed bugs reported by customers.
• Analyzed reusable components to increase performance.
• Reviewed low-level designs, code and unit test cases.
• Ensured that project timelines were met.
• Shared relevant knowledge within the team.
• Ensured that high-quality work was delivered.

Declaration

I hereby declare that the above written particulars are true to the best of my knowledge and belief.

Venkateshwaran Gopal
