
Bharathi Vasu - AWS Solution Architect


[email protected]
Ph: (309) 310-0068
LinkedIn: https://www.linkedin.com/in/bharathi-v-28ba8274/
_____________________________________________________________________________________

PROFESSIONAL SUMMARY
Innovative and assertive, with the ability to pick up new technologies, assess situations quickly, and incorporate business process improvements by automating the work. Demonstrated success in improving customer satisfaction in large, diverse organizations across 17+ years of experience. As an AWS Certified Solutions Architect, I hold 5+ years of extensive cloud design, architecture strategy, and infrastructure implementation experience. Currently working as a DevOps & AWS Solution Architect. Strong sense of ownership, high attention to detail, creative and analytical problem-solving skills, and experience in business operations and finance (Financial Planning Solutions).

CERTIFICATIONS
 Core: AWS Certified Solutions Architect – Professional, Dec 9, 2022.
 Core: AWS Certified Solutions Architect – Associate, Jan 4, 2021.
 Core: IBM Certified Cognos BI Developer, Mar 31, 2015. Grade: Applied and Mastered.
 Core: Certified ScrumMaster (CSM), Dec 2, 2017. (Applied and Mastered)

TECHNICAL SKILLS
AWS Services: AWS Glue, DynamoDB, IAM roles, Lambda, S3/EFS, Redshift, Kinesis, DMS, Athena, Event Hub, SNS, SQS, CloudWatch/EventBridge
Big Data Technologies: Hadoop, Hive, Impala, HDFS, PostgreSQL, Python 3.7, Snowflake, Databricks, Presales, Oracle SQL Developer, SQL Workbench, Teradata
BI Analytics Tools: Data modelling, data management, ETL, Tableau, Cognos
Others: HP ALM, Bitbucket

PROFESSIONAL EXPERIENCE

Client: The OCC Company
April 2023 – Present

Current Role: AWS Data Architect

Job Description:
AWS Data Architect at the Chicago location for a Test Data Management project: a migration of mainframe DB2 to the AWS cloud (PrestoDB) using a Java/Kafka data hydration mechanism. Below is the detailed JD:

The client is currently re-engineering a legacy platform onto the AWS cloud platform in a phased manner. The migration involves ~25-30 applications with multiple interdependent teams.
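
For illustration, a minimal sketch of a Kafka-based hydration consumer of the kind described above. The project used Java; this sketch is in Python (kafka-python) for consistency with the other examples in this document, and the topic, broker, and target are hypothetical placeholders.

    import json
    from kafka import KafkaConsumer

    # Hypothetical topic and broker; stand-ins for the real DB2 change feed.
    consumer = KafkaConsumer(
        "db2-change-events",
        bootstrap_servers="localhost:9092",
        group_id="hydration-workers",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        record = message.value
        # In the real mechanism, each change event would be written to the
        # cloud target (storage queryable via PrestoDB); here we just print.
        print(record)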

 Understand the overall architecture of the system and ensure the deliverables are in line with the proposed AWS architecture
 Build the AWS data processing solution, focusing on the development of reusable components
 Work with AWS services such as S3, Lambda, and Step Functions, and with relational databases (PrestoDB, RDS PostgreSQL)
 AWS Lambda function development experience with Java and/or Python (a minimal sketch follows this list)
 Experience working with the relational database hosting Appian internal data and metadata, as well as additional relational databases hosting Appian business data
 Analyze data processing requirements, source data, and data domain models
 Prepare architecture and design briefs that outline the key features and decision points of the application built in the Data Lab
 Cloud development experience with AWS services, including API Gateway, ETL data pipeline building, PrestoDB, and microservices
 Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real time
 Author extract, transform, and load (ETL) scripts for moving and curating data into data sets for storage and use by a data lake, data warehouse, and data mart
 Develop tools and procedures to monitor and automate system tasks on servers and clusters
 Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases
 Experience with stream-processing systems: Kafka

 Ability to provide a leadership role for the work group through knowledge in the area of specialization
 Bachelor's degree in computer science, computer engineering, or a related field, or the equivalent combination of education and related experience
 12 years of professional experience as a data software engineer
 3 years of experience with AWS or other cloud Big Data computing design, provisioning, and tuning
 Related AWS certification preferred
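
As a minimal sketch of the kind of Lambda data-processing function referenced in the list above, in Python: the bucket names, prefixes, and payload format are hypothetical placeholders, not details from the project.

    import json
    import boto3

    s3 = boto3.client("s3")

    # Hypothetical bucket; stands in for the project's real curated layer.
    CURATED_BUCKET = "example-curated-bucket"

    def handler(event, context):
        """Triggered by an S3 put event; validates the raw object and
        copies it into a curated prefix."""
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Read and validate the incoming object (assumed JSON lines).
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            rows = [json.loads(line) for line in body.splitlines() if line.strip()]

            # Write the validated payload to the curated layer.
            s3.put_object(
                Bucket=CURATED_BUCKET,
                Key=f"curated/{key}",
                Body=json.dumps(rows).encode("utf-8"),
            )
        return {"processed": len(records)}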

AUGUST 2021 – FEB 2023

Project II: Janssen R&D US

Client: Johnson & Johnson Services, Inc.
(Raritan, New Jersey)
Role: AWS Big Data Solution Architect

Description:
Working on the R&D Data Lake Ingestion & Integration project. As part of the project, configure the data ingestion and integration process into the centralized on-prem and cloud-based R&D Data Lake (a storage repository that can hold large amounts of structured, semi-structured, and unstructured data), using AWS services such as Glue, Athena, Kinesis, and CloudWatch/EventBridge, together with StreamSets and Big Data technologies such as Databricks. Following the data ingestion process, data currently residing in heterogeneous source applications is configured, curated, and tested into the centralized data lake platform, consumed by downstream applications, and integrated with business intelligence applications to support business users.

This work also requires functional and technical knowledge of the existing application and the skills to analyse, design, develop, test, and implement new solutions for various projects within the R&D Data Lake.

Requirement analysis and solution design, implementing each requirement after assessing its feasibility against the StreamSets architecture and its integration with custom tools.

Sources: Oracle, flat files, APIs, and external applications.

Target: Redshift, Athena, Snowflake, Appian, Databricks.

Duty Details:
 Conduct requirement elicitation through a combination of discussions and conference calls.
 Discuss with stakeholders to clarify queries and business rules around current and historical data, and validate data availability for the defined use cases.
 Design, build, and maintain flexible and scalable data warehouse solutions to support Business Intelligence (BI) and Reporting projects.
 Create agile technical stories for data architecture and development.
 Master Data Management, in conjunction with the new data warehouse.
 Impact analysis on the existing modules, solution design, and dry runs.
 Feasibility analysis of the solution within the boundaries of the StreamSets, Cloudera, and AWS architecture, to formally define the data sources, methods of access, and relationships.
 Develop use cases using StreamSets, shell scripts, Scala, Hive, and Kudu.
 Design the process workflow, covering data warehouse and ETL development and automation, Hadoop data models, and source-to-target transformations.
 Configure recurring data ingestion and integration scripts and ad-hoc ingestion of data coming from various sources into the R&D Data Lake using StreamSets and AWS services (a sketch follows this list).
 Review the implemented solution.
 Validate the implementation using the use cases to trace requirement coverage.
 Validate the implementation against the finalized, agreed, and shared solution.
 Validate with the boundary conditions.
 Compliance and data security validation.
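
As an illustration of the recurring-ingestion scripting mentioned above, a minimal Python sketch that starts an AWS Glue job with boto3 and polls until it reaches a terminal state. The job name is a hypothetical placeholder, not the project's actual job.

    import time
    import boto3

    glue = boto3.client("glue")

    # Hypothetical job name standing in for a real ingestion job.
    JOB_NAME = "rnd-datalake-ingest"

    def run_ingestion_job(job_name: str) -> str:
        """Start a Glue job run and block until it finishes."""
        run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
        while True:
            run = glue.get_job_run(JobName=job_name, RunId=run_id)
            state = run["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                return state
            time.sleep(30)  # poll every 30 seconds

    if __name__ == "__main__":
        print(run_ingestion_job(JOB_NAME))

In practice a scheduler such as EventBridge or Airflow would own the recurrence; the sketch covers only a single run.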

UAT, feedback recording and sharing.

Duty Details:
 Conduct SIT and UAT (User Acceptance Testing) with stakeholders using Jira X-ray.
 Record the feedback in Jira.
 Record any additional changes.
 Suggest possible improvements while conducting UAT.
 Share and communicate with the offshore team.
 Follow up on UAT approval to move the change to production (MTP).

Rollout of the software application and tool training

Duty Details:
 Configure the migration scripts for migrating StreamSets pipelines from lower environments to higher environments.
 Version-control code in Bitbucket to facilitate source code collaboration within the team.
 Roll out the implemented feature to the user community.
 Create and maintain technical specification documents in Confluence.

Support, maintenance, and recovery

Duty Details:
 Provide post-production support and maintenance.
 Address users' critical issues during the hypercare phase.
 Modify existing functions initiated by Problem Report/Change Request (PRCR) while maintaining product compatibility.
 Report design- and implementation-based issues.
 Log and track each issue until a resolution (temporary / workaround / permanent) is provided.
 Generate reports (total issues raised vs. closed, by priority, by impact, etc.) for the issues raised each month.
 Monitor the recovery schedule and create restore points.

Optimization, system improvement analysis, and value-adds

Duty Details:
 Process analysis and gap identification.
 Improvement discussions with stakeholders.
 Create mockups of the improvements and showcase the model.
 Optimize resource utilization in terms of data transfer and data storage.
 Identify scope to create value-adds for the customer or organization.
 Add the value-adds to upcoming releases as a Work Order or an Additional Change in the proposal.

SOLUTION ARCHITECT JUL 2017 – AUGUST 2021

Project I: Baxter HealthCare

Client: BAXTER, USA

Description: QAS and RPA EDH Project (Quality and Robotics): data warehousing, data lake, and data marts.

Applications and Tools: AWS Glue, S3, Athena, Okta, Kubernetes, Airflow, Snowflake, Python, Oracle SQL Developer, Tableau and Cognos BI tools, Pentaho, Vertica, UNIX, ETL jobs, and Control-M jobs.

The role involves a broad spectrum of application, information, technical, data, and business architectures, with an enterprise focus across a global group of stakeholders.

Different Modules Involved in the Project:

 Define and implement standards and processes to ensure alignment with the architectural strategy and goals of the business.
 Collaborate with key stakeholders to ensure data architecture alignment with key portfolio initiatives on the basis of cost, functionality, and practicality.
 Document and articulate relationships between business goals and information technology solutions.
 Conduct architecture assessments and reviews to ensure alignment with business goals and strategies.
 Worked on the creation of procedures, ETL jobs, and Control-M jobs.
 Translate complex business issues and requirements into structured analytics use cases.
 Rapidly develop, test, and iterate analyses that reveal insight and opportunities for the client, and build end-to-end analytical solutions.
 Lead technical pieces of client data warehouse implementations and onboarding efforts.
 Help clients troubleshoot the implementation of the product within their systems.
 Identify, document, triage, and track issues to ensure resolution.
 Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g., Amazon AWS).
 Experience implementing ETL pipelines using custom and packaged tools.
 Experience with ETL, application, and database performance tuning and debugging.
 Experience using AWS services such as Kinesis, S3, Elastic MapReduce, Data Pipeline, AWS Glue jobs, and AWS Athena.

Contribution:
As a Sr. Technical Leader, responsible for the following:
Extract: Data from SharePoint in CSV format and from different databases (automated change data capture into the S3 raw layer). Data is ingested into the translate/inbound layer, converted to Parquet format, and registered in the AWS Glue Catalog (database creation for the raw Athena layer). Athena tables can be created using an AWS Glue Crawler, by defining the schema manually, or through SQL DDL queries (a sketch follows below).
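
For illustration, a minimal Python sketch of registering an Athena table over the Parquet layer via a DDL query with boto3. The database, table, columns, and S3 locations are hypothetical placeholders.

    import boto3

    athena = boto3.client("athena")

    # Hypothetical names standing in for the real raw layer.
    DDL = """
    CREATE EXTERNAL TABLE IF NOT EXISTS raw_db.orders (
        order_id string,
        amount   double
    )
    STORED AS PARQUET
    LOCATION 's3://example-translate-layer/orders/'
    """

    response = athena.start_query_execution(
        QueryString=DDL,
        QueryExecutionContext={"Database": "raw_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print(response["QueryExecutionId"])

A Glue Crawler (glue.start_crawler) would achieve the same registration by inferring the schema instead of declaring it.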

Transform:
Integration of databases: metadata maintenance, table schema enhancements, data merged to the latest snapshot, flattening of the specified tables, a secured data layer, and materialized views and synonyms for Tableau.

Load:
Successfully created a dimensional model in Snowflake.
Created staging tables, dimension tables, and materialized views using copy scripts (copy from the S3 translate layer to staging tables), merge scripts (merge data between stage and dimension tables), and view scripts (create materialized views); a sketch of this pattern follows below.
Transactional data is processed into data sets consumed by BI tools and associated applications, with role-based data security to consume and share data.
UNIX shell scripts for database connectivity and for executing data pipeline scripts.
Demonstrated experience with developer tools such as CodeBuild, CodeDeploy, and CodePipeline.
Experience implementing complex queries against large datasets to support business intelligence functions.
Track record of delivering results in a dynamic enterprise or start-up environment.
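
A minimal sketch of the copy/merge pattern described under Load, using the Snowflake Python connector. The connection parameters, stage, tables, and columns are hypothetical placeholders.

    import snowflake.connector

    # Hypothetical connection parameters and object names.
    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="example_password",
        warehouse="ETL_WH",
        database="EDW",
        schema="CORE",
    )

    # Copy script: S3 translate layer (external stage) -> staging table.
    COPY_SQL = """
    COPY INTO stg_orders
    FROM @translate_stage/orders/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """

    # Merge script: staging table -> dimension table (latest snapshot).
    MERGE_SQL = """
    MERGE INTO dim_orders d
    USING stg_orders s
        ON d.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET d.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (order_id, amount)
        VALUES (s.order_id, s.amount)
    """

    cur = conn.cursor()
    cur.execute(COPY_SQL)
    cur.execute(MERGE_SQL)
    cur.close()
    conn.close()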

BI Projects:
 Legal Analytics and Demand Management project (design and implementation).
 PMDA with team (requirement gathering, design, documentation, SQL custom reports, and suggesting performance improvements with the Oracle DWH team) (design and implementation).
 QAD Migration Project for APAC and other countries (10.2 to 11 R12 version).
 Baxter TM1 project in Planning Analytics for APAC and EMEA; Cognos Admin project.
 QAD CMS enhancements; Baxlims and SQL LIMS development projects (AWS environment).

Responsibilities:
 Work with the Project Managers and Business Analysts to develop high-level project schedules and resource plans for implementation projects.
 Develop, document, communicate, and enforce environment/release management, configuration, and development best practices.
 Supervision and mentoring; manage vendor development resources as necessary.
 Perform design reviews to ensure that the implementation aligns with architectural plans and roadmaps; follow the agile Scrum process.

Development:
 BI reporting and data visualization using the latest version of Cognos 11 (R13).
 Build complex reports and ensure that BI solutions are aligned to the BI strategy.
 Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
 Strong experience with data modelling using Framework Manager for self-service reporting.
 Expertise in requirements gathering, design, and client interaction.
 Understand client requirements and propose specific solutions based on the Cognos BI platform.
 Prepare technical proposals, solution presentations, and statements of work.
 Core responsibility: execute BI projects, develop reports, and coordinate the technical work of business analysts.
 Sound understanding of ETL design, using ETL tools to extract data from heterogeneous data sources.

Shell India Markets Pvt Ltd. June 2016 – March 2017

Role: Cognos Developer
Technology: Cognos BI Analytics

Project I: Oil & Gas

Client: Shell India Markets Pvt Ltd
Contribution:
 Build complex reports and ensure that BI solutions are aligned to the BI strategy.
 Perform statistical analysis to support business needs.
 Design, develop, and deliver analytical solutions resulting in decision support or models.
 Provide complex and innovative technical solutions to business problems, with due consideration to the business problem.
 Strong problem-solving skills at multiple levels within the organization, on both technical and non-technical matters.
 Strong experience in client interaction, understanding business applications, and requirement gathering.
 Plan, design, and implement application database code objects, such as stored procedures and views.
 Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
 Work towards promoting UAT test development within the teams.

IBM India Pvt. Ltd., India: June 2012 – May 2016

Role: Cognos BI Application Developer

Project II: IBM Internal – ISL Solution Team / IBM External – Direct TV

Client: IBM India Pvt. Ltd., India
Contribution:
 Experience in the full life cycle of software development: analysis, design, development, customization, testing, and deployment.
 Create complex reports and metadata modelling in both relational and dimensional models.
 Created projects and models using Cognos Framework Manager and published packages to the Cognos server for report authoring.
 Created Dimensionally Modelled Relational (DMR) models in Framework Manager and created analytical reports using Analysis Studio.
 Complete migration of Brio reports to Cognos 10.2, SPSS, and QMF DB2 (using SQL).
 Experience in developing dynamic dashboards and scorecards.
 Implemented data security, object security, and package security in Framework Manager, and created user classes, roles, or groups as per the requirements.
 Strong experience in client interaction, understanding business applications, and requirement gathering.

IBM Daksh, Bangalore, India: Apr 2007 – June 2012

Role: Assistant Manager – Business Operations Finance

Project I: IBM Internal – Asia Pacific and Logistics

Client: Daksh, Bangalore, India

Contribution:
 Designing and querying using SQL, and functional consulting in analytics; financial and forecast modelling.
 Analysing the financial scorecard and performing cost and revenue analysis using Brio query results.
 Comparing and analysing financial position and budget allocation reports.
 Reviewing scorecards; preparing provision allocation reports and cost and revenue reports per usage, cost modules, etc.
 Accountable for team performance and meeting deadlines, with good interpersonal and communication skills; excelled in team coordination and client interaction.
 Ensuring client satisfaction by resolving queries and meeting deadlines, delivering work on time, and auditing the work before submitting it for the final audit trail.

Inventory planning team:

Analysis of stock-out situations, cost variance analysis, and total cost reporting and advisory.
Provide support to the senior managers on management reporting requirements.

ACHIEVEMENTS AND RECOGNITION

 Transition of reports from Hyperion Brio to Cognos BI, Framework Manager, and SPSS Modeler.
 IBM: Remote transition of the resource management project from Benelux, Israel, Australia, and New Zealand.
 IBM: Travelled to Sweden for the resource management transition.
 TESCO HSC: Travelled to Cardiff, UK, for the transition of B/S reconciliation (TESCO Freight Accounts).
 ACS India Pvt Ltd: Travelled to Barcelona, Spain, for the transition of the Vauxhall process.

EDUCATION

Masters:
1. Diploma in Business Management, 2013, ICFAI University, Hyderabad
2. Post Graduate Diploma in Business Administration, 2018, ICFAI University, Hyderabad
3. MBA in IT and Finance, Feb 2019, ICFAI University, Hyderabad
Bachelors:
1. B.Com (specialized in Taxation) with a basic programming language, NMKRV Women's College, 1998, Bangalore University
