Venkateshwaran Gopal
+91-8197333366
[email protected]
PROFESSIONAL SUMMARY
Overall 12 years of experience in IT, including 3 years of current experience in Big Data development and 9 years of expertise in Oracle PL/SQL development and Unix shell scripting.
Experienced with big data stack components such as HDFS, Hive, HBase, HBase-Hive integration, Sqoop and Spark.
Developed Sqoop jobs with incremental loads from heterogeneous RDBMS sources using native DB connectors.
Hands-on experience with major components including Hive, Sqoop, basic Spark RDDs, Spark SQL using DataFrames, and HBase.
Experienced in loading large sets of structured, semi-structured and unstructured data.
Involved in creating Hive tables, loading them with data and writing Hive queries.
Developed and implemented various methods to load Hive tables from HDFS and the local file system.
Developed Hive queries to parse raw data, populate external and managed tables, and store the refined data in partitioned external tables.
Experience working with different file formats such as TextFile, SequenceFile, ORC, Parquet and JSON.
Applied Hive query tuning techniques such as partitioning, bucketing, indexing and the cost-based optimizer (CBO).
Integrated Hive and HBase using the Hive HBase storage handler.
Experienced with the column-oriented NoSQL database HBase, including Phoenix queries.
Good experience with Apache Spark, the open-source cluster computing framework for data analytics.
Handled structured data (CSV, JSON) with DataFrames using SQLContext.
Created basic RDDs, converted them to DataFrames and applied SQL queries on them to analyze the data.
Created DataFrames using reflection (case class with toDF), programmatic schemas (StructType/StructField) and the direct read methods for CSV, JSON, etc.; a brief sketch appears at the end of this summary.
Worked extensively on developing Spark SQL applications using DataFrame concepts.
Implemented Hive queries on the Spark engine using HiveContext.
Working knowledge of Hortonworks on a production cluster with 80 live nodes and a development cluster with 20 nodes, handling an incoming data flow of approximately 30 GB per day.
Very good knowledge of Apache Kafka.
Good understanding of the Tableau reporting tool.
Expertise in Oracle Database systems.
Created tables, views, constraints and indexes (B-tree, bitmap and function-based).
Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers
using SQL and PL/SQL.
Developed materialized views for data replication in distributed environments.
Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
Expertise in exporting query result sets to Excel sheets using the UTL_FILE package.
Loaded data into Oracle tables using SQL*Loader.
Worked extensively with ref cursors, external tables and collections.
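As a brief illustration of the DataFrame creation and Hive table handling points above, the minimal Scala sketch below uses SparkSession, the Spark 2.x entry point that wraps SQLContext/HiveContext; the file paths, table names and column names are illustrative assumptions rather than details taken from any project described here.

import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

// Case class used for the reflection-based DataFrame approach.
case class Patient(id: Int, name: String, plan: String)

object DataFrameCreationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dataframe-creation-sketch")
      .enableHiveSupport()                        // HiveContext-style access to Hive tables
      .getOrCreate()
    import spark.implicits._

    // 1. Reflection: build an RDD of case-class instances and call toDF.
    val reflected = spark.sparkContext
      .textFile("/data/raw/patients.csv")         // illustrative path
      .map(_.split(","))
      .map(f => Patient(f(0).toInt, f(1), f(2)))
      .toDF()

    // 2. Programmatic schema: StructType/StructField applied to an RDD[Row].
    val schema = StructType(Seq(
      StructField("id", IntegerType, nullable = false),
      StructField("name", StringType, nullable = true),
      StructField("plan", StringType, nullable = true)))
    val rowRdd = spark.sparkContext
      .textFile("/data/raw/patients.csv")
      .map(_.split(","))
      .map(f => Row(f(0).toInt, f(1), f(2)))
    val programmatic = spark.createDataFrame(rowRdd, schema)

    // 3. Direct read: CSV and JSON sources loaded straight into DataFrames.
    //    (programmatic, fromCsv and fromJson are kept only to show the three approaches.)
    val fromCsv  = spark.read.csv("/data/raw/patients.csv")
    val fromJson = spark.read.json("/data/raw/claims.json")

    // Partitioned external Hive table (ORC) for the refined data, created via Spark SQL.
    spark.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS patients_ext (id INT, name STRING)
        |PARTITIONED BY (plan STRING)
        |STORED AS ORC
        |LOCATION '/warehouse/staging/patients'""".stripMargin)

    // Register a temporary view and query it with Spark SQL.
    reflected.createOrReplaceTempView("patients")
    spark.sql("SELECT plan, COUNT(*) AS members FROM patients GROUP BY plan").show()

    spark.stop()
  }
}

On a Hortonworks cluster, a sketch like this would typically be packaged as a jar and run through spark-submit.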
TECHNICAL SKILLS
BIG DATA ECOSYSTEMS Hadoop, HDFS, Hive, Sqoop, HBase, Spark, Phoenix
PROGRAMMING LANGUAGES Scala, SQL
DATABASE Oracle
NoSQL DATABASE HBase
OPERATING SYSTEM Linux, Windows
DEVELOPMENT TOOLS Eclipse, WinSCP, PuTTY
METHODOLOGIES Agile
DISTRIBUTION Hortonworks
EMPLOYMENT DETAILS
PROJECTS
CONTOUR
Project Scope:
This project provides a converged data lake that helps the enterprise acquire, curate, store, process and visualize the end-to-end touch points of each patient's pathway, using data ingested from heterogeneous sources such as enterprise data warehouses, paid claims, transactions, log data, call center chat, health plans, Electronic Health Records (EHR), etc.
Implemented dynamic column binding on HBase tables whenever a new attribute is added to functional entities.
Performed interactive analysis of Hive tables through various DataFrame operations using Spark SQL.
Loaded Hive tables into HBase for low-latency querying through the Phoenix layer.
Used JSON and XML SerDes for serialization and deserialization to load JSON and XML data into Hive tables.
Used Spark SQL to load JSON data, create a schema RDD and load it into Hive tables; see the sketch after this list.
Utilized the DataFrame and Spark SQL APIs for faster data processing.
Created basic RDDs, formed DataFrames from them and applied SQL queries on top for data analysis.
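A minimal Scala sketch of the JSON-to-Hive flow described above, assuming a Spark 2.x SparkSession with Hive support; the path, view name, column names and target table are illustrative assumptions, not details of the Contour project itself.

import org.apache.spark.sql.SparkSession

// Sketch: load JSON with Spark SQL, refine it and persist it as a Hive table.
object JsonToHiveSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("json-to-hive-sketch")
      .enableHiveSupport()                        // enables reading/writing Hive tables
      .getOrCreate()

    // Load raw JSON into a DataFrame; Spark infers the schema from the documents.
    val claims = spark.read.json("/data/landing/claims.json")   // illustrative path

    // Register a temporary view and refine the data with a Spark SQL query
    // (claim_id, member_id and claim_amount are assumed fields of the JSON).
    claims.createOrReplaceTempView("claims_raw")
    val refined = spark.sql(
      "SELECT claim_id, member_id, claim_amount FROM claims_raw WHERE claim_amount > 0")

    // Persist the refined result as a managed Hive table; HBase/Phoenix serving
    // would be layered on top of such curated tables for low-latency lookups.
    refined.write.mode("overwrite").saveAsTable("claims_curated")

    spark.stop()
  }
}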
Project Scope:
The Basel III framework was developed to act as a credible supplementary measure to the risk-based capital requirements. A simple leverage ratio framework is critical and complementary to the risk-based capital framework, and a credible leverage ratio is one that ensures broad and adequate capture of both the on- and off-balance-sheet sources of banks' leverage.
Company Name : HappiestMinds Technologies Pvt. Ltd.
Client Name : Dataflow
Role : Database Developer
Methodology : Agile
Project Scope:
Dataflow Group provides compliance and integrity services that help prevent illegal immigration and help professional bodies maintain high standards. Dataflow also provides multiple portals so that customers, applicants and Dataflow employees can keep track of their incoming and outgoing activities.
Project Scope:
Roles and Responsibilities:
Involved in the full development cycle of planning, analysis, design, development, testing and implementation.
Developed and modified triggers, packages, functions and stored procedures for data conversions
and PL/SQL procedures to change database objects based on user inputs.
Created all DDL scripts and executed in the respective environments.
All scripts were kept up to date in SVN.
Improved the performance of the application by rewriting the SQL queries.
Wrote packages to fetch complex data from different tables using joins and subqueries.
Wrote SQL, PL/SQL, SQL*Plus programs required to retrieve data using cursors and exception
handling.
Project Scope:
SEI GIPP is an integrated banking solution for universal banks specializing in investment
management for Corporate, Retail and Investment Banking operations with a highly flexible and scalable
modular architecture.
Responsibilities:
Involved in requirement analysis.
Prepared high-level and technical design documents for PowerBuilder components and PL/SQL code.
Created stored procedures and functions in PL/SQL.
Created unit test scenarios and cases.
Interacted closely with business analysts regarding functionality.
Provided support and solutions for production issues and live implementations.
Analyzed bugs reported by the customer.
Analyzed reusable components to improve performance.
Reviewed low-level design, code and unit test cases.
Ensured that project timelines were met.
Shared relevant knowledge within the team.
Ensured that high-quality work was delivered.
Declaration
I hereby declare that the above written particulars are true to the best of my knowledge and belief.
Venkateshwaran Gopal