Sewenet Asrat
OBJECTIVE: A highly communicative professional with strong interpersonal skills and the ability to adapt to team environments. Motivated by a challenge; an astute and dedicated professional who works to the highest standard and effectively manages the demands of reporting. Proficient in administering both on-premises and cloud-based database servers, including performance monitoring, optimization, and troubleshooting to maintain integrity and security.
PROFILE SUMMARY:
• 9 years of experience in the Information Technology industry with Database Design, Data Modeling, Data and Database Security, Requirements Analysis, Application Development, Testing, Implementation and Deployment using SQL Server 2016/2014/2012/2008/2005 and Oracle 11g/12c.
• Strong experience in Business Intelligence (BI) using Visual Studio 2017/2015/2010/2008/2005.
• Created, built, and successfully deployed dynamic SSIS packages into SQL Server 2017/2016/2014/2012/2008/2005 databases and scheduled them according to requirements.
• Extracted data using SSIS from OLTP (tabular) sources to OLAP. Based on the required reports, identified various data sources, established the connections using SSIS, and constructed data source views.
• Expertise in creating databases, users, tables, triggers, views, stored procedures, functions, packages, joins, and indexes.
• Expert in designing and writing complex reports, including reports with cascading parameters, drill-through reports, parameterized reports, report models, and ad hoc reports, using SQL Server Reporting Services (SSRS) based on client requirements.
• Expertise in developing and extending MS SQL Server Analysis Services (SSAS) cubes and data mining models, and in deploying and processing SSAS objects.
• Experience in supporting Microsoft Azure systems including SQL Server hosted on Azure Virtual
Machine (IaaS) and Azure SQL Databases (DBaaS).
• Very strong in Azure system configuration, database migrations, backup and recovery, geographic data replication, performance monitoring and tuning, and disaster recovery configuration and execution.
• Expertise in working on all activities related to the development, implementation, administration, and support of ETL processes for large-scale Data Warehouses using Bulk Copy Program (BCP) and Data Transformation Services (DTS).
• Experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using SSIS and Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Repository Manager), with a strong understanding of OLAP and OLTP concepts.
• Responsible for data quality testing and leading a team that performs data reconciliation to identify
data anomalies.
• Implemented OLAP Cubes, Facts, and Dimensions for providing summarized and aggregate views of
large sets of data.
• Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake modeling, fact and dimension tables, pivot tables, and modeling of data at all three levels: view, logical, and physical.
• Basic knowledge of Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and XML Source Qualifier.
• Proficiency and Expertise in SQL Server Replication, Backup/Recovery, Disaster recovery and planning.
• Performed database consistency checks at regular intervals.
• Experience in creating complex Views, Stored Procedures, DDL/DML Triggers, and User-Defined Functions to implement Business Logic and Data Protection (see the sketch after this list).
• Evaluation/design/development/deployment of additional technologies and automation for managed
services on AWS
• Investigate and debug issues in the databases and services created, and work with QA and Data Analysts to ensure the highest quality within the system
• Support the business development lifecycle (business development, capture, solution architecture for the next migration path, cost reporting and improvements)
• Create and execute a strategy to build mindshare and broad use of AWS within a wide range of
customers and partners
• Work with internal clients on onboarding
• Solution design for client opportunities in one or more AWS Competencies or general cloud managed
services
• Create a lift and shift process model clearly defining the individual steps of the lift and shift process
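As a brief illustration of the views, triggers, and user-defined objects mentioned above, a minimal T-SQL sketch (all table and column names are hypothetical, not from any specific engagement) might look like:

-- Hypothetical example: a view that masks sensitive data and a database-level
-- DDL trigger that blocks accidental table drops; object names are illustrative only.
CREATE VIEW dbo.vw_CustomerSafe
AS
SELECT CustomerID,
       CustomerName,
       'XXX-XX-' + RIGHT(SSN, 4) AS MaskedSSN   -- expose only the last four digits
FROM dbo.Customer;
GO

CREATE TRIGGER trg_BlockTableDrop
ON DATABASE
FOR DROP_TABLE
AS
BEGIN
    RAISERROR('Table drops are not permitted in this database.', 16, 1);
    ROLLBACK;
END;
GO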
TECHNICAL SKILLS:
Database Tools: SQL Server Management Studio, SQL Server Data Tools, SQL Profiler, Microsoft Azure
ETL Tools: SQL Server Integration Services (SSIS), Visual Studio 2015/2017
Reporting Tools: SQL Server Reporting Services (2012, 2014), MS Access, MS Excel
Other Tools: MS Office Suite (Microsoft Word, PowerPoint), OLAP & OLTP, OneNote, MS Outlook
PROFESSIONAL EXPERIENCE:
Crescent Bank and Trust provides auto loans to consumers across 32 states. The company also offers local
personal and business banking services.
Responsibilities:
• Created T-SQL queries, stored procedures, and views to store client and advisor data using SQL Server Management Studio. Created user-defined functions and views to store data into the appropriate tables as data arrives, and created indexes on the tables with the fewest updates to improve query performance (see the index sketch after this list).
• Tune SQL queries and database performance by improving overall table design, indexing, query plan analysis, and refactoring. Create database design documentation at the architectural and functional levels.
• Review database objects for data integrity, quality, security, recoverability, scalability, maintenance, and capacity, and recommend changes as necessary.
• Recommend new strategies and architecture to enhance efficiency of data systems. Plan, manage and
track database and SQL procedures system changes.
• Created tasks, sessions, etc. in the Workflow Manager and monitored them in the Workflow Monitor.
• Performed data domain discovery and identified data standardization, validation, and de-duplication needs in source data.
• Identified and eliminated duplicate datasets and performed column, primary key, and foreign key profiling using IDQ 9.5.1 for MDM.
• Extensively used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
• Worked with mappings to dynamically generate parameter files used by other mappings. Involved in
performance tuning of the ETL process by addressing various performance issues at the extraction and
transformation stages.
• Documented the mappings used in ETL processes, including the unit testing and technical documentation of the mappings, for future reference. Designed and developed SSIS packages to extract, transform, and load data into the target and to generate outbound files for various vendors.
• Scheduled the Informatica workflows to run repeatedly at a specified date and time. Participated in unit testing to validate the data in the flat files generated by the ETL process.
• Involved in designing and deploying cubes in the SSAS environment using snowflake and star schema designs for tabular, OLAP, and operational data.
• Developed Tabular Reports, Sub Reports, Matrix Reports, Drill down Reports and Charts using SQL
Server Reporting Services (SSRS).
• Data analytics and engineering experience in multiple Azure platforms such as Azure SQL, Azure SQL Data Warehouse, Azure Data Factory, and Azure Storage accounts for source stream extraction, cleansing, consumption, and publishing across multiple user bases.
• End-to-end understanding of all MCIO usage-related streams such as Compute, Azure SQL, Azure Storage, and Azure DocDB, built on a customer-centric platform involving constant communication with partners and upstream providers to guarantee 99% completeness and 100% accuracy.
• Expert knowledge of the Microsoft proprietary SCOPE language specification (the de facto standard for all data-related operations) on the internal Microsoft COSMOS platform for most usage streams, such as Azure Compute, Azure Storage, Azure SQL, and Azure DocDB.
• Cloud-based report generation, development, and implementation using SCOPE constructs and Power BI. Expert in U-SQL constructs for interacting with multiple source streams within Azure Data Lake.
• Used Data Factory for data movement from on-premises sources to Azure SQL and from Azure SQL to COSMOS, Microsoft's internal cloud implementation.
• Worked extensively on the Microsoft Optimization Grim Reaper project, which identifies SKU and VM recommendations across multiple platforms such as Azure Compute and Azure SQL using the Grim Reaper algorithm, which is based on historical trends.
• Experienced in creating MDX queries in SSAS cubes for calculated members, KPIs, and actions such as drill-through and reporting.
• Experienced in DAX expressions in tabular models as per business requirements.
• Used Repository Manager to import/export metadata and promote incremental changes between the DEV/QA/PRD environments. Experience using Metadata Manager for data lineage and Business Glossary to define business terms. Defined the roles and access privileges for developers, administrators, data stewards, requestors, etc. on a requirement basis.
• Created test cases and a test plan for each scenario, and performed sanity checks in the Dev/SIT and UAT environments on an as-needed basis for the Vendor domain.
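A minimal sketch of the indexing approach mentioned in the first bullet above; the table and column names are hypothetical:

-- Covering index on a column that is read often but rarely updated;
-- names are illustrative only.
CREATE NONCLUSTERED INDEX IX_Advisor_LastName
ON dbo.Advisor (LastName)
INCLUDE (FirstName, Email);
GO

-- Typical lookup the index is intended to support.
SELECT FirstName, LastName, Email
FROM dbo.Advisor
WHERE LastName = 'Smith';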
Tools and Technologies: MS SQL Server 2017/2016, Visual Studio 2017/2015, SQL Server Integration Services (SSIS), SSRS, Data Warehouse, Windows Server 2012, Linux server.
Texas Capital Bank is a commercial bank that delivers highly personalized financial services to businesses and
entrepreneurs.
Responsibilities:
• Assisted Business Analysts in gathering technical requirements.
• Developed Tables, Views, and Stored Procedures using SQL Server Management Studio.
• Used SSIS to migrate data from sources such as Excel files, flat files, and SQL Server databases to a centralized IT destination.
• Created many complex database code objects such as stored procedures, views, and functions, used them to generate reports, and created indexes on the tables with the fewest updates to improve query performance.
• Created various SSIS packages to transfer data from OLTP (tabular) databases to a staging area and then into the data warehouse. Deployed the packages to the appropriate destinations and successfully created dynamic package configurations in SSIS.
• Work with Service Management teams to establish and monitor service level agreements,
communication protocols with data suppliers and data quality assurance policies.
• Provide data quality advisory services to clients and to internal stakeholders. Recommend maintenance
enhancements to data acquisition processes to improve accuracy of data warehouse data.
• Perform support functions and develop fixes accordingly for our Association Management System; manage and actively participate in system and user acceptance testing.
• Provide technology options and recommendations to solve business challenges. Strong knowledge of database and system performance monitoring. Able to share and communicate ideas, both in writing and verbally, to staff at various levels within the organization in a clear and concise manner relevant to each level.
• Extensively used SSIS transformations and tasks such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task.
• Have good business analytical and technical knowledge in Healthcare Membership, Benefits and
Provider Modules in FACETS.
• Support batch and online processes involving the development, enhancement, and support of T-SQL procedures, views, triggers, and query tuning (see the sketch after this list).
• Used different control flow tasks and data flow tasks to create SSIS packages. Used different types of transformations for data conversion, sorting, and data cleansing from different sources into company formats.
• Created XML package configurations and error handling using event handlers for the OnError, OnWarning, and OnTaskFailed event types.
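A hedged sketch of the kind of batch-support T-SQL procedure described above; the procedure, schema, and table names are hypothetical, not taken from the actual system:

CREATE PROCEDURE dbo.usp_LoadDailyTransactions
    @BatchDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        -- Move the day's staged rows into the reporting table.
        INSERT INTO dbo.TransactionHistory (AccountID, Amount, BatchDate)
        SELECT AccountID, Amount, @BatchDate
        FROM staging.DailyTransactions
        WHERE BatchDate = @BatchDate;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;   -- re-raise so the calling batch job reports the failure
    END CATCH;
END;
GO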
Tools and Technologies: MS SQL Server 2014/2012, Visual Studio 2010, SQL Server Integration Services (SSIS),
SQL Server Reporting Services (SSRS), MS Office
CVS Health is a leading American pharmacy, retailer, and health care company dedicated to providing the best and most innovative solutions for access to healthcare and pharmaceuticals. The interface of the EPSO project will include a construction phase for Patient and Prescriptions with web services, and Drug, Employee, and Store for integration services, covering the Rx-Connect, Retail, Specialty, MinuteClinic, and Mail business units.
Responsibilities:
• Exporting files from the database to Excel for various uses.
• Monitor the SQL Server error log and space usage (data and log files); see the monitoring sketch after this list.
• Collect money from shareholders and deposit it in the designated bank account.
• Reconcile general ledger and subsidiary ledger balances and resolve any issues.
• Issue checks to various vendors or shareholders as required.
• Posting all journal entries to the ‘CUBIS’ Application on a regular basis.
• Prepare receipts and payments, income and expenditure account and balance sheet for the
organization.
• Preparing bank reconciliation.
• Any other duties assigned by management.
• Preparing daily collection sheet.
• Preparing deposit, cash disbursement and journal voucher for the office.
• Reconciling the general ledger with shareholders' individual ledger balances.
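A short sketch of the routine space-usage and error-log checks mentioned above, using standard SQL Server commands (xp_readerrorlog is undocumented but widely used; the search term is illustrative):

EXEC sp_spaceused;                        -- data and index space for the current database
DBCC SQLPERF(LOGSPACE);                   -- log file size and percent used for every database
EXEC xp_readerrorlog 0, 1, N'error';      -- scan the current SQL Server error log for 'error' entries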
Tools and Technologies: MS SQL Server 2008/2012, SQL Server Reporting Services (SSRS), MS Access, MS Excel,
SSIS, BCP, T-SQL, Tableau Desktop, Power BI
EDUCATION:
CERTIFICATION: