Technical Summary
Kafka: AWS Kafka cluster setup (POC), Confluent Kafka 6.1, Confluent Operator, open-source Apache Kafka v2.8
Big Data Technologies: Hortonworks, Cloudera, HDFS, Hive, MapReduce, Cassandra, Pig, Avro, HBase, Storm,
Confluent Kafka 4.0.x, 5.3, open-source Apache Kafka
DevOps Tools and Languages: Chef, Ansible, Jenkins, Git, Java, Shell Scripting, SQL
Application Servers: WebLogic Server, WebSphere
Cloud Platforms: Google Cloud Platform (GCP), Google Kubernetes Engine (GKE), Amazon Web Services (AWS)
NoSQL Databases: HBase, Cassandra
Operating Systems: Linux, UNIX, Mac OS X 10.9.5, Windows
Security: Kerberos, SSL, SASL, mTLS
Professional Experience:
Telefonica Wipro:
Working on installation of a new Confluent Kafka v6.1 cluster for a new client across multiple stages
Completed training on Google Cloud Platform and Kubernetes (GKE)
Involved in preparing the design document
Successfully installed Confluent Operator
Worked on generating mTLS certificates to secure the Kafka cluster using cfssl and cfssljson
Successfully enabled and tested Kafka ACLs with anonymous users and with different hostnames
Worked on generating external client certificates
Successfully installed Confluent Kafka in the Dev, Test, and Stage environments
Involved in setting up the Confluent Kafka cluster in the production environment
Successfully installed the Confluent Kafka components: Control Center, ZooKeeper, brokers, Connect, and ksqlDB
Successfully installed Schema Registry, Connect, and Replicator
Worked on connectivity testing and performance tuning (perf tests)
Worked on creating topics and on producing and consuming messages
Worked on storing credential secrets and the root CA in Git Secret
Successfully installed Kafka connectors (Debezium, etc.)
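The ACL testing described above (anonymous users, per-host rules) exercises Kafka's matching of a request's principal and host against allow rules. A minimal Python sketch of that matching idea follows; it is a simplification (real Kafka ACLs also carry resource, operation, and permission type), and the ACL entries below are hypothetical examples, not taken from a real cluster:

```python
# Simplified sketch of Kafka-style ACL matching: each entry names a
# principal (e.g. "User:ANONYMOUS") and a host (exact address or "*"),
# and access is allowed only if some entry matches both fields.
# Entries are hypothetical; real Kafka ACLs carry more fields.

def acl_allows(acls, principal, host):
    """Return True if any ACL entry matches both the principal and the host."""
    for entry in acls:
        principal_ok = entry["principal"] in (principal, "User:*")
        host_ok = entry["host"] in (host, "*")
        if principal_ok and host_ok:
            return True
    return False

acls = [
    {"principal": "User:ANONYMOUS", "host": "*"},    # anonymous from anywhere
    {"principal": "User:app1", "host": "10.0.0.5"},  # app1 from one host only
]

print(acl_allows(acls, "User:ANONYMOUS", "10.0.0.9"))  # True: wildcard host
print(acl_allows(acls, "User:app1", "10.0.0.6"))       # False: wrong host
```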
BestBuy Wipro, MN
Responsibilities:
Led the installation, configuration, and deployment of product software on new edge nodes that connect to the
Kafka cluster for data acquisition.
Installed a Kafka cluster with separate nodes for brokers and created a backup for all instances in the Kafka environment
Supported the Docker team in installing an Apache Kafka cluster in multi-node mode and enabled security in the
DEV environment.
Successfully completed a POC on KafkaHQ as a GUI tool for monitoring Kafka topics
Successfully configured LDAP for group access and SSL for KafkaHQ
Installed Kafka Manager for tracking consumer lag and monitoring Kafka metrics; also used it for adding topics,
partitions, etc.
Created a Chef site cookbook that installs and configures KafkaHQ for a multi-node cluster
Wrote shell (Bash), Python, and PowerShell scripts to automate tasks, such as deploying schemas into the
Schema Registry
Worked on creating Grafana dashboards for monitoring Kafka consumer groups
Involved in creating Kafka topics using Chef automation
Successfully generated consumer group lag from Kafka using its API
Successfully tested Kafka ACLs with anonymous users and with different hostnames
Successfully completed a POC on setting up Kafka clusters in the AWS environment
Worked on Kafka MirrorMaker to transfer data between AWS clusters and on-premises Kafka clusters
Worked on creating a Chef site cookbook for MirrorMaker
Worked on creating Chef site and role cookbooks that install and deploy Schema Registry
Involved in writing InSpec tests for the IBM MQ inventory Chef site cookbooks
Experienced working with production servers at multiple data centers
Set up MQ and Message Broker pub/sub environments
Documented all installations and maintenance, and organized and provided the documents for the offshore team.
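Consumer-group lag, as generated above from Kafka's API, is the gap between each partition's log-end offset and the group's last committed offset. A self-contained Python sketch of that calculation follows; the offset numbers are made-up stand-ins for what the API would return for a real topic:

```python
# Lag per partition = log-end offset minus the group's committed offset.
# The offsets below are illustrative placeholders, not real API output.

def consumer_lag(end_offsets, committed_offsets):
    """Return {partition: lag}, treating a missing commit as lag from offset 0."""
    return {
        partition: end - committed_offsets.get(partition, 0)
        for partition, end in end_offsets.items()
    }

end_offsets = {0: 1500, 1: 980, 2: 2040}  # latest offset per partition
committed = {0: 1500, 1: 950}             # partition 2 has no commit yet

lags = consumer_lag(end_offsets, committed)
print(lags)  # {0: 0, 1: 30, 2: 2040}
```

In practice these numbers would come from the consumer-group offsets endpoint and the partitions' end offsets; the arithmetic is the same.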
Environment: Confluent Kafka 5.3.1, UNIX, SQL, Kafka cluster, Linux, Git, Chef, Jenkins, IBM MQ
Responsibilities:
Installed and configured a Hadoop multi-node cluster and maintained it with Nagios.
Installed and configured Hadoop ecosystem components such as Spark, HBase, Hive, and Pig, as per requirements.
Configured HA for services such as NameNode, ResourceManager, and HUE, as required to maintain the SLA of
the organization.
Performed POCs to assess workloads, evaluate resource utilization, and configure Hadoop properties
based on the benchmark results.
Tuned the cluster based on the POC and benchmark results.
Commissioned and decommissioned cluster nodes.
Monitored system metrics and logs for problems.
Expertise in building Cloudera and Hortonworks Hadoop clusters on bare metal and on Amazon EC2.
Experienced in installation, configuration, troubleshooting, and maintenance of Kafka and Spark clusters.
Experience in setting up Kafka clusters on AWS EC2 instances.
Worked on setting up Apache NiFi and performed a POC using NiFi to orchestrate a data pipeline.
Currently working as an admin on the Hortonworks (HDP) distribution for 4 clusters ranging from POC to PROD.
Monitored system metrics and logs for problems using the Checkmk monitoring tool.
Handled user provisioning (creation and deletion) on Prod and Non-Prod clusters according to client requests.
Ensure that critical customer issues are addressed quickly and effectively.
Apply troubleshooting techniques to provide solutions to our customers' individual needs.
Investigate product-related issues both for individual customers and for common trends that may arise.
Resolve customer problems via telephone, email or remote access.
Maintain customer loyalty through integrity and accountability.
Research customer issues in a timely manner and follow up directly with the customer with recommendations and
action plans.
Escalate cases to the engineering team when the problem is beyond the scope of technical support or falls out of the
support team's expertise.
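Configuring Hadoop properties from benchmark results, as described above, typically means adjusting resource settings such as these in yarn-site.xml. The fragment below is illustrative only: the property names are standard YARN settings, but the values are examples, not recommendations from an actual benchmark:

```xml
<!-- Illustrative yarn-site.xml fragment: container memory sizing that a
     benchmark might inform; values are examples, not recommendations. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>49152</value> <!-- total memory YARN may allocate on each node -->
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>8192</value>  <!-- cap on any single container request -->
</property>
```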
Environment: Hadoop, YARN, Spark, Kafka, Hive, Pig, Sqoop, Cloudera, Kerberos, NiFi, Java 8, Log4j, Git, AWS, JIRA.
Responsibilities:
Installation, configuration, upgrading, and administration of Windows, Sun Solaris, and Red Hat Linux.
Linux and Solaris installation, administration, and maintenance.
User account management, including managing passwords, setting up quotas, and support.
Worked on Linux Kickstart OS integration and on DDNS, DHCP, SMTP, Samba, NFS, FTP, SSH, and LDAP integration.
Network traffic control, IPSec, QoS, VLAN, proxy, and RADIUS integration on Cisco hardware via Red Hat Linux software.
Installation and configuration of MySQL on Windows Server nodes.
Responsible for configuring and managing Squid servers on Linux and Windows.
Configuration and administration of the NIS environment.
Package and Patch management on Linux servers.
Worked with Logical Volume Manager to create file systems per user and database requirements.
Data migration at Host level using Red Hat LVM, Solaris LVM, and Veritas Volume Manager.
Expertise in establishing and documenting procedures to ensure data integrity, including system failover and
backup/recovery, on the AIX operating system.
Managed 100+ UNIX servers running RHEL and HP-UX on Oracle and HP hardware; Solaris disk mirroring (SVM), zone
installation and configuration.
Escalated issues accordingly and managed the team efficiently to achieve desired goals.
Environment: Linux, TCP/IP, LVM, RAID, networking, security, user management, MySQL.
Education:
MSc: 2000 to 2002
Kuvempu University, Shankaraghatta, India
BSc: 1997 to 2000
Kuvempu University, Govt. Science College, Chitradurga
References:
Ramesh Mourya, Manager; Contact No. 9164138156; Email: [email protected]
Pradeepta Sarangi, TCF Bank Manager; Contact No. 6124138939