
Dear Hiring Manager,

Resume: Glen Silverman
[email protected]
215-262-1418

Please consider me for this role. I have completed my project with Abbott. Abbott has given me the flexibility to continue in a support role on a day-to-day basis while I look for what's next. The system is stable and I can start a new role on the Monday following the date of offer... An experienced consultant, adept at stepping in, filling gaps, and getting up to speed quickly... Looking for a new Excel/VBA, SQL, ETL assignment... Thank you.

Work Summary:

A business, systems, and data analyst, developer, and programmer; during the pandemic and periods of limited staffing, assigned the role of lead Tier-3/Tier-4 Excel/VBA trade-floor support person at Morgan Stanley.

Considerable RAD and traditional IT background, deployed and embedded with the business, liaising with stakeholders, PMO, and IT as a developer and programmer managing and supporting a large portfolio of complex Excel/VBA SQL applications that use ADO and ODBC connectivity with the major DBMSs (Oracle, SQL Server, Sybase, SAS (IOM), DB2, and MS Access), along with live data feeds from real-time information providers (Bloomberg, Reuters), plus SharePoint, cloud, and ETL…

I am dependable, easy-going, able to multi-task, and solid under pressure… An Excel/VBA and formula expert... An experienced consultant: I assimilate fast, and am adept at stepping in, filling gaps, and solving problems. Strong RCA... Comfortable working with all levels of management and the organization… An affinity for detail. Strong business acumen. Solid written, speaking, PowerPoint presentation, and communication skills, including documentation and technical writing…

Company Overview:
• Abbott Labs – Chicago/NYC, Contract, Excel/VBA SQL, Oracle Database and Order Fulfillment (WFH, Jan 24, 2022 to present)
• TIAA-CREF – NYC, Contract, Excel/VBA, Annuities, Investments, Tax, Cost Basis, Gain/Loss Reporting (June 2021 to Jan 2022)
• Walmart – Bentonville, Remote, Excel/VBA and Word/VBA Mass Mail-Merge during outbreaks of COVID-19 (May 2021 to July 2021)
• Morgan Stanley (MS) – NYC, Excel/VBA, R, C#, Python Trading Applications (Sep 2015 to Mar 29, 2021; WFH to May 2021)
• MFX/Fairfax / Ironshore P&C – NYC/NJ, Excel/VBA SQL, UW/Risk/Rating/Pricing, Contract (May 2013 to Sep 2015)
• Merrill Lynch – Pennington, NJ, Excel/VBA SQL, Java, Front Office and Middle Office re-engineering, Contract (Dec 2012 to May 2013)
• TV Guide (ROVI) – Philadelphia, Re-engineered Excel/VBA SQL ETL, Contract (June 11, 2012 through December 2012)
• Bank of Tokyo-Mitsubishi Securities – NYC, Excel/VBA SQL, Backfill, PNL (February 6, 2012 through June 8, 2012)
• Teleflex Medical Instruments Int'l – Philadelphia, Excel/VBA, HR Performance Rating, Salary/Bonus Planning (Sept 19, 2011 to Feb 2012)
• EmblemHealth Insurance Companies – NYC, Excel/VBA SQL, SAS, Actuary Architect (Jan 2010 to Sep 16, 2011)
• JP Morgan/Bear Stearns – NYC, Excel/VBA SQL Trading Applications Developer and DTCC Clearing (Sep 2004 to Jan 2010)
• Citigroup – NYC, Global Credit Card Portfolio Risk Mgmt Backfill, Excel/VBA SQL, C, R, Python, Java (Jan 2004 to Sep 2004)
• Advent Capital Management Investment Advisors – Midtown Manhattan, Excel/VBA and SQL Equities and Options Trading Developer
• Unilever National Starch – Bridgewater, NJ, Excel/VBA SQL, Project Mgmt, Power-User Developer, Programmer, Application Support
• Henkel, Ingredion – Bridgewater, NJ, Excel/VBA SQL, RPA/ML AI, ERP, Demand Planning, Forecasting, and Predictive Analytics
• Procter & Gamble, Oil of Olay – Ft. Washington, PA (formerly Richardson-Merrell – Vicks Chemical Co.), Programmer

Education:

University of Pennsylvania. Bachelor of Science, Mathematics (Quantitative Methods), Computer Science…

Application Experience:

Extensive technical background with Excel/VBA and SQL across a broad spectrum of industries, working with financial applications including CRM, FP&A, S&OP, ERP, Forecasting, Planning, Budget, Spend, and Runout. Expert skill with Excel/VBA, including intricate array formulas, PivotTables, PivotCharts, PowerPivot, Slicers, Sparklines, Power Query, Power BI/DAX, heatmaps, dashboards, scorecards, APIs, and custom UDF functions. Heavy experience with shared Excel workbooks, and with workbooks whose cell formulas reference cells in other workbooks, connecting large networks of these workbooks to one another. Developed workbook applications that support the culling and filtering of reference data, sometimes many hundreds of thousands of rows, where seriously advanced performance and optimization techniques are essential to making this kind of workbook feasible, especially for the more sophisticated trading applications with live, frequent, real-time refresh rates (e.g., Bloomberg, Reuters), including numerous continuously refreshed URL data connections. These technical skills are current. On the most recent projects with Abbott, TIAA, and Morgan Stanley I wrote thousands of lines of VBA, and I have maintained this pace of coding throughout my career while balancing project-lead responsibilities on many of these projects along the way...
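
To give a flavor of the performance techniques referenced above, here is a minimal, generic sketch, with hypothetical sheet names and layout rather than any client's actual code: suspend recalculation and screen painting, read and cull the data in memory as an array, then write it back in one block instead of cell by cell.

    Sub RefreshReferenceData()
        ' Sketch only: "RawFeed" and "Staging" are placeholder sheet names.
        Dim prevCalc As XlCalculation
        prevCalc = Application.Calculation
        Application.ScreenUpdating = False
        Application.EnableEvents = False
        Application.Calculation = xlCalculationManual
        On Error GoTo CleanUp

        Dim src As Variant, filtered() As Variant
        Dim r As Long, c As Long, n As Long
        src = ThisWorkbook.Worksheets("RawFeed").UsedRange.Value      ' one read into memory
        ReDim filtered(1 To UBound(src, 1), 1 To UBound(src, 2))
        For r = 2 To UBound(src, 1)                                   ' cull rows in memory, not on the sheet
            If src(r, 1) <> vbNullString Then
                n = n + 1
                For c = 1 To UBound(src, 2)
                    filtered(n, c) = src(r, c)
                Next c
            End If
        Next r
        If n > 0 Then
            ThisWorkbook.Worksheets("Staging").Range("A2").Resize(n, UBound(src, 2)).Value = filtered
        End If

    CleanUp:
        Application.Calculation = prevCalc
        Application.EnableEvents = True
        Application.ScreenUpdating = True
    End Sub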

Considerable experience embedded with the business, liaising with stakeholders, PMO, and IT, using SDLC (Waterfall) and ALM; authored many BRDs and made presentations to senior management. The essential first steps are conferring with stakeholders, business areas, and project teams, building consensus, and confirming that all of us are on the same page and fully understand what we are being asked to do... Authored many FRDs, DFDs, and Use-Case drawings, with use of Agile, Jira, and Scrum boards, wearing many hats, including JAD facilitator presenting the DFD, which is the deeper study: a detailed mapping, the business model, a forest view that identifies all operational areas, data sources, information, and dataflow to and from each bubble, one operational area to another. Essentially this is the scope of the project... If it's not on the drawing, it's not in scope, though revisions do happen. Heavily annotated lines and arrows with supporting narrative describe each exchange of data and information from bubble to bubble, and to and from proposed new and existing data stores, and if there were gaps or missing pieces, we caught them here... My goal at the final presentation is to gain final consensus, a key milestone, and then shift to a higher gear, developing the SRD and making the transition from SDLC Waterfall to the more traditional IT Waterfall, where I switch to using MS Project... Throughout my career I've gained the bandwidth and skill set to present to senior management or huddle at the whiteboard side-by-side with the project team, in roles as developer, designer, programmer, and everything in between.

Data Analyst :
Formal study of data normalization, data modeling, data science, and database design at the University of Pennsylvania (a math and computer science major), and, as an IT DBA and lead DBA with Unilever, considerable experience designing relational databases and data warehouses of considerable scale and complexity. Some examples of large database initiatives, data mining, data modeling, and analysis follow…
With ERP and related projects, built and performed studies against databases with millions of rows of historical data, testing for trend, cycles, interval, amplitude, seasonality: a distinguishable signal. Built many automated model-fitting (RPA/ML) and back-testing processes that evaluate many forecasting methodologies (triangles, time-series techniques, Bayesian linear techniques, single, double, and triple exponential smoothing with optimal smoothing-constant sets) and decide which algorithm is best suited, most accurate, product by product, to produce SKU-specific forecasts that feed a production-planning process for a multi-national company that manufactures tens of millions of pounds of product each month. Knowing which products to make, when to make them, in what quantities, and where to inventory them (warehouse distribution), in an industry that is always capacity constrained, has a direct impact on logistics costs and the company's bottom line. From this experience I found that the accuracy of a forecast isn't just a function of the arithmetic used to produce it; it's also a function of the preparation, organization, and quality of the data it feeds from (described more fully below). I developed many fully automated data-modeling and model-fitting techniques to identify which specific forecasting method to use for each of the varied products the company produces, SKU by SKU, where each SKU exhibits unique trend, cycles, and seasonality. Please see the following pages for the intricate detail of several large-scale ERP, RPA/ML AI applications...
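
As an illustration only, and not the production engine described above, a minimal model-fitting sketch in VBA: back-test candidate smoothing constants for single exponential smoothing against a SKU's history and keep the one with the lowest absolute error.

    Function BestAlpha(history As Variant) As Double
        ' history: 1-based array of demand values (illustrative helper, not the actual system).
        Dim alpha As Double, bestErr As Double, bestA As Double
        Dim s As Double, sumErr As Double, i As Long
        bestErr = 1E+308
        For alpha = 0.05 To 0.95 Step 0.05
            s = history(1)
            sumErr = 0
            For i = 2 To UBound(history)
                sumErr = sumErr + Abs(history(i) - s)   ' the forecast for period i is the prior smoothed value
                s = alpha * history(i) + (1 - alpha) * s
            Next i
            If sumErr < bestErr Then
                bestErr = sumErr
                bestA = alpha
            End If
        Next alpha
        BestAlpha = bestA
    End Function
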
At EmblemHealth, a leading health insurance company in NYC, a finding from my studies of the millions of rows of claim history analyzed each month was that, in order to more accurately calculate future claim reserve projections, the data must first be organized into homogeneous cohorts (projection cells), where each grouping of insureds shares a similar profile, the same demographics... This is a disciplined data-analysis process using methods such as chi-square, multiple regression, correlation coefficient (similar to a market beta), and covariance - the trend correlation across many datasets. With the use of Excel/VBA it was now possible to demonstrate that when data is organized into these cohorts, with greater homogeneity and similar profiles, and combined with the use of more advanced predictive models (triangles) that better understand medical event costs, specifically the considerable additional costs during the period of recovery (the runout), previously thought not predictable, the projections are greatly improved, more accurate, and consistently produce reliable results and a sufficient cash reserve number. That is a mandate under the New York State Insurance Commission's stringent requirements (they impose heavy fines) to never be under-reserved, and it was a major justification for this project. Please find a detailed description on the following pages...

With Morgan Stanley, and at all of the major investment banks in NYC where I was deployed, embedded with Quant and support teams, sitting side-by-side with derivatives (CLO, CDO, CDS, MBS, ABS, LBS), Fixed Income (Rates), FX, and Equities traders and trading desks, wearing many hats as developer, programmer, and rapid-response support person for urgent or broken Excel/VBA trading apps, or for modifications to live real-time pricing models, LIBOR, (re)building a yield curve, spreads, and RAD programming initiatives. Considered a high-pressure role, with many high-pressure moments. For me, a very fulfilling, rewarding part of my overall career and work experience.

ETL, EDI, and FTP data conversions, transformations, data mapping, and data migrations with CSV and TXT files during the startup phases of new projects, or as ongoing feeds to and from legacy platforms, enterprise databases, and the cloud, i.e., handshakes to and from other systems such as SAP. ETL has been an important part of every project throughout my career. Sometimes under-appreciated, these are critical and sometimes very complex processes that require sophisticated safeguards, control reports, exception detection, and sometimes suspense files that are recycled automatically with each next run to assure data quality and data integrity...
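
A minimal sketch of that suspense-file pattern, with hypothetical file layout and validation rules rather than any client's actual feed: rows that fail basic checks are routed to a suspense file for re-processing on the next run, and counts feed a simple control report.

    Sub LoadDailyFeed(path As String)
        Dim fIn As Integer, fSusp As Integer
        Dim lineText As String, fields() As String
        Dim readCount As Long, loadCount As Long, suspCount As Long

        fIn = FreeFile: Open path For Input As #fIn
        fSusp = FreeFile: Open path & ".suspense" For Append As #fSusp

        Do While Not EOF(fIn)
            Line Input #fIn, lineText
            readCount = readCount + 1
            fields = Split(lineText, ",")
            If UBound(fields) >= 2 And IsNumeric(fields(2)) Then      ' placeholder validation rule
                loadCount = loadCount + 1
                ' ... append the clean row to the staging worksheet or database here ...
            Else
                suspCount = suspCount + 1
                Print #fSusp, lineText                                ' recycled automatically with the next run
            End If
        Loop

        Close #fSusp: Close #fIn
        Debug.Print "Read " & readCount & ", loaded " & loadCount & ", suspended " & suspCount   ' control report
    End Sub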

HR/HCM Projects: Numerous projects as developer assisting HR with Budget, the annual Staffing Plan, Salary Planning, employee review/rating, and census collection. A template, essentially an Excel workbook, one for each profit/cost-center location, allowing for cross-border currency differences and transliterations. Also a salary increase and bonus funding pool attribution process using KPI metrics as predicates. Developed payroll compliance reports with sensitivities to diversity and discrimination in its many forms… See the Teleflex Int'l project details and description on the following pages…

On recent projects, on any given day you can find me at my desk coding, in meetings with stakeholders, making a presentation, or sitting side-by-side with a junior programmer debugging a program, and that same day you might find me on a trading floor fixing a broken trading app… Considered a seasoned professional, embedded with the business, a low-key, team-oriented individual who can step into a high-pressure role, technical and otherwise, on day 1 and hit the ground running... Please consider me for this opportunity... Thank you...

Company-Specific Project Descriptions:

Abbott Labs – Chicago/NYC, Contract, Excel/VBA SQL, Oracle Database and Order Fulfillment (WFH and onsite, Jan 24, 2022 to present)
A backfill role as an Excel/VBA programmer. Abbott, well into their project, determined they needed more experienced Excel developers to re-write their Order Entry, Pricing, Order Fulfillment, Supply-Chain, Shipping, and Invoicing applications, and they reached out to me. Having initially thought Excel would be the fastest and easiest way to build this application, Abbott realized well into the project that much of the complexity of building this kind of application would need to be written in VBA code. The UI/UX (user interface) aspects of the application were ideal for an Excel presentation layer, but there was no easy way to achieve data connectivity between the spreadsheets and Abbott's massive enterprise database. I developed numerous APIs to address what was identified as system- and enterprise-wide reusable code. This was a high-priority application that needed to be brought online as quickly as possible to strengthen an aging order fulfillment system and address supply issues in the overall marketplace. Abbott, at this time, was under considerable pressure from much-publicized supply-chain issues. This was an ideal role for me as a lead programmer and project lead: Excel and VBA with ODBC connectivity to an Oracle database are a strength, this project needed to be up and running ASAP, and it was clearly the top priority at Abbott...
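
A minimal, hypothetical sketch of the kind of spreadsheet-to-Oracle connectivity described above; the driver name, DSN, credentials, SQL, and sheet names are placeholders, not Abbott's actual objects.

    Sub PullOpenOrders()
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")            ' late-bound ADO, no project reference needed
        cn.Open "Driver={Oracle in OraClient11g_home1};DBQ=ORDERS_DB;UID=report_user;PWD=*****;"   ' placeholder

        Set rs = CreateObject("ADODB.Recordset")
        rs.Open "SELECT order_id, sku, qty, ship_date FROM open_orders", cn   ' placeholder query

        With ThisWorkbook.Worksheets("Orders")
            .Cells.ClearContents
            .Range("A1").Resize(1, 4).Value = Array("Order", "SKU", "Qty", "Ship Date")
            .Range("A2").CopyFromRecordset rs                ' one call drops the whole result set
        End With

        rs.Close: cn.Close
    End Sub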

TIAA-CREF – Contract, NYC / Durham, NC (Onsite/Remote) (June 2021 to January 2022), Excel/VBA SQL technology platform...
Assigned a role as analyst, developer, and programmer embedded with the Actuarial and Quant teams, developing an Excel workbook Pension and Retirement Funds Management and Administration System. Its purpose is the management of retirement, savings, and investments and the related transaction activity, for example Insurance, Dividends, Annuity Surrender, Premium Payments, Untaxed Gains, Taxable Gains, Loan Withdrawals, Taxable Income, Disbursements, Distributions, Cost Basis calculations and Valuations, and a predictive cash flow and runout projection process for millions of members (UFT Teachers Union) and many billions of dollars... I was asked to develop this sophisticated spreadsheet, with heavy emphasis on UI/UX features, just shy of 2,300 lines of VBA code, and a fully automated refresh process with ADO/ODBC SQL connectivity to back-office Oracle databases.

Morgan Stanley – Midtown Manhattan, NY (September 2015 to March 2021; WFH March 29, 2021 to May 2021)
Joined the STRATs and FX Quant teams as a developer/support person, initially asked to build a new class of trading, pricing, spread, and yield-curve applications with emphasis on data visualization, office automation, dashboards, heatmaps, and scorecards. On March 26, 2021, as COVID spread through our building at 1585 Broadway, Morgan Stanley invoked their continuity plan, and employees and consultants alike were asked to transition to working from home. I was assigned the role of Morgan Stanley's Tier-3, Tier-4 Excel/VBA go-to support person, embedded with the trading desks, sitting side-by-side with traders; at that time our top priority was simply to keep our technology platforms stable, up and running. A reputation for multi-tasking with trading desks, a go-to person who can do the "work of an army", was shared with me during my annual review... Pre-pandemic, some technical highlights include the creation of a new class of worksheet function (an add-in with 2,300 lines of VBA) that produces sophisticated business graphics. This was considered a very special capability and received lots of attention. Normally, a worksheet function works within the scope of the one cell it resides in; building a dashboard comprised of sophisticated charts and sparklines generated from a worksheet function is not generally thought doable. MS's Quant community, determined to have this capability, asked me to find a way to make it happen: the ability to rapidly build and distribute sophisticated charts with enhanced graphics from what looks to be a simple custom worksheet function, a formula that typically sits inside the cell that serves as the top-left corner of the chart it just generated. We named it FastChart, =FastChart(). As is true of all BI dashboard tools, Excel provides one primary and one secondary Y axis for its charts, but what I was being asked to do required the invention of a 3rd Y axis; I refer to it as a tertiary Y. And this required a modification, an enhancement, to the Microsoft Excel chart object. This was a high-visibility project, thought not possible to do, a request that came from senior management, the C-suite, to make happen (see the last-page discussion, "why there was so much interest with this dashboard"). I was given opportunities to develop numerous custom add-ins (APIs), many of them powerful tools that attach automatically to any workbook (using VBA code that writes VBA code, which it embeds inside the VBE VBProject of the receiving workbook during the initial add-in or API open event). These tools became popular throughout the Morgan Stanley Quant community, with Senior Management, and on the Derivatives and Options trading desks... I was also asked to participate in many urgent special projects including BI dashboards that feed from local and global databases and data warehouses via connections made from within the VBA code, e.g., SAS IOM and ADO ODBC connectivity to Access, SQL Server, Oracle, DB2, Sybase, and the cloud, pretty much all of Morgan Stanley's DBMSs... Also Pioneer, Bloomberg, and Reuters add-ins that provide real-time/live ticker data... These kinds of projects and opportunities have come my way throughout my career, which probably explains why, to this day, I love working with Excel and VBA. Every project is a thrill. Morgan Stanley's PMO team believed I would be the right person for this role after what I understand was a lengthy candidate search. I wasn't initially hired to be a technical IT person, but was recognized as having a rare combination of strong technical Excel/VBA skill and strong business acumen and experience, and I met the criteria for the person Morgan Stanley was looking for.
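
A minimal sketch of the "VBA that writes VBA" pattern mentioned above, with hypothetical add-in and macro names (FastChartAddin.xlam, RegisterFastChart); it requires "Trust access to the VBA project object model" to be enabled and is an illustration, not the firm's actual code.

    Sub PushHandlerIntoWorkbook(wb As Workbook)
        ' Inject a small Workbook_Open handler into the receiving workbook's ThisWorkbook module.
        Dim code As String
        code = "Private Sub Workbook_Open()" & vbCrLf & _
               "    Application.Run ""FastChartAddin.xlam!RegisterFastChart""" & vbCrLf & _
               "End Sub"
        With wb.VBProject.VBComponents("ThisWorkbook").CodeModule
            .InsertLines .CountOfLines + 1, code
        End With
    End Sub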

MFX Fairfax / Ironshore P&C Morristown, NJ (May 2013 to Sep 2015)


MFX Fairfax provides software and software services for the property and casualty insurance industry. In this role, as business, systems, and data analyst and Excel/VBA SQL developer/programmer, there was heavy collaboration with Ironshore P&C, New York City. Working closely and embedded with the Ironshore actuarial staff, I was tasked with the development and overhaul of a large, diverse set of casualty insurance risk/rating/pricing workbooks, preparing them for a process that persists key policy submission, underwriting, booking, rating, and pricing artifacts and supporting documents, essentially all relevant information, to a centralized enterprise database, itself in a state of development when I first came on board… Extensive modifications were needed to the embedded rating and pricing functions in each of the workbooks. These are large, sophisticated workbooks that undergo considerable modification in response to steadily changing business needs. Formats and layouts vary from insurance coverage to coverage, but much of the underlying key data is constant across all insurance products. The initial approach described to me during the interview was simply to map each cell address in a workbook to a specific database table and corresponding column/fieldname. I was asked during the interview how I might approach this project, mindful of an incomplete database design and of workbooks that are always in perpetual states of change. I cited some risks, starting with one example: something as simple as the insertion of a row or column in one of these workbooks, which is commonplace, could easily break this process. I underscored the value of using range names in this project (not necessarily for all Excel-based applications), adding that incorporating range names for key data, versus a mapping of rigid, non-scalable cell addresses, would eliminate ongoing (re)mapping and its IT support, along with other, larger advantages: the persisting process could be accomplished by simply walking the (range) Names collection with VBA, wherein each range name always accurately points to the actual cell to be persisted (even if the cell is moved elsewhere in the workbook). This could then feed, and indeed does feed, the SQL engine that updates a simple database table consisting of (policy-id, range-name, element-name, and value), a mapping that serves as the linchpin, essentially a database staging area that sits between the workbook and what would ultimately become the Policy Object Database. MFX liked this idea, believed I grasped the complexities of the task, hired me, and went with the concept I outlined, and this actually simplified the overall process. It allowed the database team, the Ironshore actuaries, and my work to have a reduced dependency on one another, in that we could independently work toward the same goal without getting in one another's way. I developed a number of Excel-based automation tools to help build this system. Entrusted with this responsibility, MFX and their client Ironshore believed I had the intuition for how this system would need to be put together, an imperative for all of us in order to meet the tight timelines and milestones along the way.
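
A minimal sketch of the Names-collection walk described above; the staging table, its columns, and the connection object are illustrative, not Ironshore's actual schema.

    Sub PersistNamedCells(wb As Workbook, policyId As String, cn As Object)
        ' cn is an open ADODB.Connection; each named cell is persisted as (policy_id, range_name, value).
        Dim nm As Name, v As Variant
        For Each nm In wb.Names
            On Error Resume Next
            v = Empty
            v = nm.RefersToRange.Cells(1, 1).Value       ' fails quietly for names that are not cell references
            On Error GoTo 0
            If Not IsEmpty(v) Then
                cn.Execute "INSERT INTO policy_staging (policy_id, range_name, cell_value) " & _
                           "VALUES ('" & policyId & "','" & nm.Name & "','" & Replace(CStr(v), "'", "''") & "')"
            End If
        Next nm
    End Sub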

TV Guide (ROVI), Philadelphia, PA (suburb) (June 11, 2012 through December 2012). Performed work for TV Guide (ROVI) on a global RAD initiative to automate the receipt of television schedules and programming information sent to ROVI by local broadcasting companies around the world in varied formats and languages, with exclusive use of Excel/VBA and SQL. All content is transformed into a common format (a massive ETL process) that previously required a large staff working manually and is now fully automated, a huge initiative that feeds all of this content to TV Guide's enterprise database, from which it is digitally re-distributed globally to every cable and set-top box in America, Europe, and Asia-Pacific, and someday, their goal, every cable and set-top box, everywhere... This URL describes ROVI as the most important company no one has ever heard of: https://www.businessinsider.com/the-most-important-media-tech-company-you-dont-know-rovi-2011-1

• Excel spreadsheets are the primary medium by which local broadcasters from around the world send TV programming and scheduling information to ROVI, where a staff of hundreds of people manually reformatted incoming program schedules and captioning information into a common format and then manually uploaded this information into ROVI's proprietary programming database.
• My role was to study this process and develop a strategy to automate this costly manual process.
• Asked to assume the role of Excel developer, I built the automation that now feeds all inbound Excel schedules, in varied formats, through a process that converts all input into a uniform format, using an Excel/VBA platform with an ADO/ODBC connection to ROVI's proprietary Oracle database. ROVI does not require inbound content to be in a standard or even a clean format, as doing so would place a large burden on the content providers (ROVI's valued customers), and this was seen as the key challenge to overcome for this project to be successful... This in itself was a significant technical challenge, one that I could not refuse, and it drew me to the project. The project was successful, and it really did require extensive, advanced VBA and worksheet-formula Excel skills to make it happen.
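
As an illustration only of the kind of header-mapping step such a conversion can use (sheet and range names here are hypothetical, not ROVI's actual design): a mapping sheet pairs each broadcaster's column heading with the canonical column it feeds, so a new layout is handled by adding a mapping row rather than new code.

    Sub NormalizeInboundSchedule(wbIn As Workbook, wsOut As Worksheet)
        Dim mapRange As Range, src As Worksheet
        Dim col As Long, outCol As Variant
        Set mapRange = ThisWorkbook.Worksheets("HeaderMap").Range("A2:B200")   ' heading -> canonical column number
        Set src = wbIn.Worksheets(1)

        For col = 1 To src.Cells(1, src.Columns.Count).End(xlToLeft).Column
            outCol = Application.VLookup(Trim$(src.Cells(1, col).Value), mapRange, 2, False)
            If Not IsError(outCol) Then                                        ' unmapped columns are simply skipped
                src.Range(src.Cells(2, col), src.Cells(src.Rows.Count, col).End(xlUp)).Copy
                wsOut.Cells(2, outCol).PasteSpecial xlPasteValues
            End If
        Next col
        Application.CutCopyMode = False
    End Sub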

Bank of Tokyo-Mitsubishi Securities, NYC (February 6, 2012 through early June 2012). As business, systems, and data analyst and lead developer, I joined the team well after the project was underway. Categorized as a RAD initiative, I helped the FP&A management team meet an aggressive target to go live with a new generation of financial reports targeted for the March 2012 month-end.
• Assisted the team in overcoming unforeseen technical issues; the project was at risk of missing a firm implementation date. PowerPivots, charts, and seemingly ordinary spreadsheets in a SharePoint environment were not operable. SharePoint and Excel Services, with their many features, had constraints with earlier versions of Excel that rendered the initial approach unworkable. No experiment or proof-of-concept had been undertaken earlier in the project, and the compatibility issues went undiscovered until mid-January 2012, less than two months before implementation, when the former consulting team began porting (sending) spreadsheets from client machines to the SharePoint environment…
• Spreadsheets worked perfectly from the desktop client but would fail when attempting to load or run in the SharePoint environment. Widely understood but overlooked, as two examples: in SharePoint and Excel Services, VBA code and validation lists are not compatible.
• Experienced with these matters, I was asked to help address and solve these issues quickly. What was originally accomplished with elaborate VBA, we were able to re-engineer and accomplish solely with the use of worksheet formulas. Functionality such as an alphabetic sort (not for the faint of heart) done with Excel array formulas, and not VBA, is deceptively complex; a sketch follows this list. There was a stakeholder requirement that sorting be fully automated, with no user intervention, meaning "do not force" the end users (the senior executive/management team) to use the standard sort facility on the menu ribbon… To further sidestep the use of VBA, which was a firm stakeholder requirement/constraint regardless of which versions of Excel were in use, we ultimately made use of Excel 2010 functionality (the Slicer), and this let us sidestep most of the VBA code that synchronized the filtering of the clusters of pivots and charts, aka the dashboard. Oddly, the Slicer was not known to the original team, heavily staffed with a consulting firm. The validation-list issues were also not a problem with Excel 2010. The only hitch I needed to overcome was persuading the management team to do an emergency upgrade from Excel 2007 to 2010, which was otherwise prohibitive this late in the project. But doing this allowed us to successfully address all show-stopping issues, which put us back on a glide path to produce the initial set of financial reports on time, as promised. I developed 12 of the 74 financial reports considered highest priority to meet this first deliverable target date.
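
As an illustration only (the bank's actual formulas are not reproduced here), one well-known pre-dynamic-array pattern for a fully automatic alphabetic sort uses COUNTIF to rank each entry and INDEX/MATCH to retrieve the entries in rank order. Entered with Ctrl+Shift+Enter and filled down, it assumes a duplicate-free, blank-free source list in A2:A50:

    =INDEX($A$2:$A$50, MATCH(ROWS($1:1), COUNTIF($A$2:$A$50, "<="&$A$2:$A$50), 0))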

Teleflex Medical Implant and Instrumentation Corp, Limerick, PA (Sept 19, 2011 to Feb 2012). As business, systems, and data analyst, developer, and programmer, using Excel/VBA, SQL, and an ODBC connection to a Microsoft Access DBMS via ADO, I was asked to develop an HRIS/HCM Human Resource Total Compensation Management application, one of the more challenging ETL initiatives in my career. This is an employee review, headcount, compensation, and bonus planning tool. Teleflex was a multi-national company with 15,000+ employees during the timeframe of this project. The SAP HRIS module was in early development and not available for their annual review, compensation, and bonus planning process, so Teleflex went with an Excel/Access interim alternative.
• This was a RAD project. Teleflex went with an Excel platform mostly because Excel was already installed on every desktop at every location globally, essentially already in place everywhere. Teleflex needed an application that would produce a Salary Planning spreadsheet/template for distribution and collection, containing employee census information, one spreadsheet for each and every location, where each was built to capture performance score, salary, bonus, and extra compensation data: 34 countries, every employee.

• The system was designed to combine employee census data into a central data warehouse, a significant challenge given the complexities of cross-border organizational relationships among executives, management, and staff, with a facility to capture each employee-to-manager (line/staff) relationship, and its inverse, from the bottom to the top of the hierarchical organization (org chart), all the way to the CEO and Chairman. This new collection process would become the single feed for all ad-hoc reporting requests and possibly the initial feed to an SAP HR module if and when SAP was eventually implemented… This was the company's first complete database of all employees, world-wide… Seen as a noteworthy milestone for Teleflex.

• Challenges included transliterations and the normalizing of diacritics (glyphs, aka accent marks), where names are the only key in the absence of employee IDs, and salaries in multiple currencies converted to USD and then back to each local currency.

• Developed many custom worksheet functions identified as reusable and determined to be popular company-wide... An example is =findHier(), a custom worksheet function that looks at each employee's next higher level of manager/supervisor, iteratively, to create threads that run from each employee up to the CEO, in some cases approaching 15 levels of management… Once assembled, a VBA routine takes the semicolon-delimited thread (returned from the findHier function), runs it through the VBA Split function into an array, and, using shell commands and following the hierarchy, creates a file-server folder structure that precisely mirrors Teleflex's org chart; a sketch follows below. The folder structure actually looks like an org chart. Each executive can drill down into their respective folder structure to see the detailed salary planning of their directors and managers. Security is built into the process, permitting a manager the ability to see only his or her own complete organization. In essence, the CEO can drill down from the very top of the folder structure, down any leg of folders, to view the very highest and lowest levels of salary planning data.
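
A minimal sketch of that folder builder, assuming each cell holds a semicolon-delimited thread returned by the (hypothetical) =findHier() function, e.g. "CEO;VP Ops;Plant Mgr;J Smith"; for simplicity this sketch uses VBA's own MkDir rather than shell commands, and does not handle characters that are invalid in folder names.

    Sub BuildOrgFolders(threads As Range, rootPath As String)
        Dim cell As Range, parts() As String, i As Long, path As String
        For Each cell In threads
            parts = Split(cell.Value, ";")
            path = rootPath
            For i = LBound(parts) To UBound(parts)
                path = path & "\" & Trim$(parts(i))
                If Dir(path, vbDirectory) = vbNullString Then MkDir path   ' create each level down the thread
            Next i
        Next cell
    End Sub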

• These spreadsheets, fairly sophisticated, i.e., the salary and bonus planning tool, incorporate heavy analytics that take into consideration issues of discrimination, bell curves, each business entity's contribution to profit, and overall corporate profit pro-ration rules, with a dashboard cluster of business graphics and KPIs to be used by the Senior Executive Salary Planning Committee, with capabilities to adjust pro-ration and bonus incentive parameters: essentially a tool that enables the executive committee to dial in and fine-tune the settings, a cool tool for calibrating the parameters in order to conform with the budgeted bonus line item. It's like having a set of dials, and as you fine-tune the parameters, the pie charts, line graphs, and bar charts change with each turn of a dial. A special effect we built without planning to, and a bit of a thrill to watch when the system was implemented. A really neat project for me.

• Hierarchy threads, via the =findHier() custom worksheet function, were integral to a company-wide initiative to re-rationalize the global organization... Another interesting challenge for me and the staff I was working with was building this facility to be flexible to organizational changes occurring while the process was being developed. The resulting database was to be used in the initial population, via ETL, of the SAP HR module then in its early planning stages. It was said that the functionality we built into these spreadsheets, as short-term as this project was, would be a tough act to follow for the SAP HR module, or for whatever HR application Teleflex ultimately goes with...

EmblemHealth Insurance Companies – 55 Water St, Lower Manhattan, NY (Jan 2010 to Sep 16, 2011). As Business/Data Analyst and Developer. Excel, VBA, ActiveX Controls, ADO, SQL, Hyperion, MS Access, Oracle/SAS/IOM (SAS's proprietary ODBC equivalent).

• Worked as lead developer and Actuary Architect, with Emblem's Senior Actuary as co-lead, reporting directly to the Chief Actuary, with a team made up of actuarial staff, working independently of IT but working directly with SAS technical staff to address the numerous technical issues in what was, for Emblem, their first experience using the SAS environment.

• As part of the actuarial team, I was initially assigned responsibility for re-engineering Emblem's aging legacy data collection in preparation for a new risk, rating, pricing, underwriting, valuation, and reserve projection process… Emblem's month-end manual preparation and consolidation of 120 ETL data feeds (CSV files) and scores of manual FTP downloads, millions of rows of claims data from numerous aging legacy systems (many still residing on mainframes from earlier mergers and acquisitions), had grown unreliable. Adding to the tedium was the recoding of claims data to a common coding scheme (HL7) and the culling of this data into separate cohorts ahead of the valuation process, which added complexity and considerable risk of human error. These cohorts were found to have poor homogeneity. This manual process, no matter how carefully it was done, was prone to data errors, and the staff that supported it worked under very difficult time constraints to prep all of this data by the 3rd workday, in many cases not achievable, ahead of Emblem's monthly number-crunching process, which itself could take 10 or more days. The result, the future claim reserve projections, is then fed to Emblem's accounting systems, also a manual process, and reserves are set aside to meet NY State's stringent reserve requirements… The first task was to fully automate and build sophisticated ETL for this process, where all claims data is now combined in a standardized format, all using a common coding scheme, with all data now stored centrally on a SAS/Oracle database platform.
• Once this was achieved, I performed the rigorous data analysis task described earlier, producing new sets of cohorts with high levels of homogeneity using techniques such as multiple regression, chi-square, and correlation coefficient (not unlike a stock market beta). My argument to Emblem for undertaking this was that "no matter how great our projection process was or wasn't, the accuracy of these projections would not be a function of the arithmetic nor our techniques, but more a function of the data it feeds from." To me this was a familiar situation: in a different initiative at Imperial Chemical Company years earlier, where I led the effort to build an ERP demand forecasting system, it was essential that we first smooth anomalies from the historical sales data. Without doing this, the noise and bumpy data may be seen as legitimate signals, trip up the model-fitting process, and produce bad forecasts. Rebuilding the cohorts with greater homogeneity gave us a significant advantage as a starting point to produce what have become Emblem's best reserve projections ever, far exceeding Emblem stakeholder expectations.

• As developer, I wrote the code for what is now a fully automated application that feeds directly from a new SAS database using ADO and IOM (a SAS proprietary ODBC equivalent). The SQL resides on worksheets and is ingested via VBA code, treated as though the SQL were parameters rather than hardcoded or embedded in the VBA, making each SQL statement easier to modify and maintain; the VBA subroutines build the connection string, issue the open, execute the SQL, and use the CopyFromRecordset option to drop the data into the receiving worksheet (a sketch of this pattern follows below). Describing some of this app's features, choosing one of its many pop-up menu capabilities: the owning actuary initiates retrieval of a specified cohort from SAS, with detailed membership claims data and numerous components of trend, a decomposition of overall trend that identifies the smaller signals that comprise it. Built-in facilities enable the actuary to calibrate, to dial in, an individual trend component's signal strength, which adds enormously to accuracy; the arithmetic product of the components of trend is then recalculated to represent overall trend. In addition, there are many more pop-up menus with numerous methods that remove noise and outliers in the claim data, using a triangle view of cohort data that feeds a process imputing either completion factors or development factors (the actuary may choose either method), where either demonstrates the rate of completion (medical cost obligations from the date of the medical event and the succeeding costs post-trauma), along with every best-known actuarial full and partial credibility and blending technique. Applied to one of numerous projection base methodologies, these combine to almost always produce a sensible initial set of projection reserve numbers for the next and future months... These reserve numbers are sent through an attribution process, built up from a PMPM (per member per month) using what are considered current membership numbers, to ultimately become an input (the future liability reserve projection) that is forwarded to Emblem's financial accounting systems. The projection module has 6,000+ lines of VBA code, is easy to maintain, but has considerable sophistication. I wrote this VBA code.
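
A minimal sketch of the worksheet-stored SQL pattern described in the bullet above; the sheet names, range name, DSN, and placeholder token are illustrative, and the actual SAS/IOM provider and connection string depend on the environment.

    Sub RefreshCohort(cohortId As String)
        ' The SQL text lives on a worksheet (so it can be maintained without touching VBA);
        ' the connection is built in code and CopyFromRecordset drops the result onto the sheet.
        Dim sql As String
        sql = Replace(ThisWorkbook.Worksheets("SQL").Range("ClaimsQuery").Value, "{COHORT}", cohortId)

        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=MSDASQL;DSN=CLAIMS_DSN;UID=actuary;PWD=*****;"   ' placeholder connection string
        Set rs = cn.Execute(sql)

        With ThisWorkbook.Worksheets("ClaimsDetail")
            .Cells.ClearContents
            .Range("A2").CopyFromRecordset rs
        End With

        rs.Close: cn.Close
    End Sub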

• Developed a Chief Actuary month-end dashboard that lists all 125 cohorts (the number of cohorts can vary) and, for each cohort, its status as either in-progress or complete, the owning actuary's name, and the reserve projection total dollar amount. The dashboard gives the Chief Actuary the ability to drill into each cohort's detail, the projection's arithmetic triangle, including the owning actuary's adjustments and the mandatory notation (documentation that explains the rationale for each adjustment), which is necessary at times to address an anomaly in the data... Each cohort feeds from its projection triangle. Each triangle is the product of a forecasting process, sophisticated algorithms, and is almost always unique to its cohort. The dashboard gives the Chief Actuary a full view of all cohorts and the ability to drill into each one, view the system-generated projection, the triangle with its vast array of numbers, all owning-actuary adjustments, and the justification for each adjustment (the accompanying notation). Using the dashboard, the Chief Actuary can review the owning actuary's adjustments and reasons, add or modify the verbiage of an adjustment's notations, override an adjustment outright, or recommend adjustments back to the owning actuary.

• When all cohorts are checkmarked, approved by the Chief Actuary, this sends the signal for the month-end closing process to begin... As each cohort is approved, the presence of a checkmark freezes (locks) the cohort; once locked, no additional actuary modifications are possible. With the use of sophisticated ETL, the membership (the subscriber membership count) is the divisor and the total cohort projection is the dividend, and the result (the quotient) is a PMPM (per member per month projected cost). Membership changes notably from month to month, and PMPM is the common denominator, so the journal entry that represents the projected total cash reserve needs to scale precisely with the most current membership and is then set aside as the projected claim expense for each cohort each month (a small worked example follows below). This was something that wasn't done all that well by the existing Emblem systems at that time. Upon doing this, A/P, G/L, and P&L (Hyperion) are updated to the net of the existing cash reserve account balances and adjusted accordingly… The entire risk, rating, pricing, and reserve projection process was written entirely on an Excel/VBA technology platform. From inception, the anticipated complexity of building this application was considered outside the abilities of Emblem IT and EmblemHealth's actuarial staff, which is largely the reason I was brought in to join the actuarial team. The broader EmblemHealth underwriting, actuarial, and overall operational communities were familiar with the Excel worksheet look-and-feel, but not so much with VBA and Excel's deeper capabilities; all things considered, it made sense to them to use this technology platform, since every employee has Microsoft Office on their desktop. It was a great project for me. Many technical challenges, and some features thought not doable, made this project that much more memorable and fulfilling for me…
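
The worked example promised above, with purely illustrative numbers: a $12,000,000 cohort projection over 40,000 members is a $300 PMPM; if membership moves to 41,500, the booked reserve scales to $12,450,000.

    Function ScaledReserve(cohortProjection As Double, baseMembership As Long, currentMembership As Long) As Double
        ' PMPM = projection / membership at projection time; the booked reserve re-scales to current membership.
        Dim pmpm As Double
        pmpm = cohortProjection / baseMembership
        ScaledReserve = pmpm * currentMembership
    End Function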

JPM/Bear Stearns – Midtown Manhattan, NY (Sept 2004 to Jan 2010). As Business, Systems, and Data Analyst, Developer.
Technologies: Bloomberg, Reuters, MS Access, Excel, VBA (macros), ADO, CDO, SQL, SQL Server, Sybase, Oracle, SharePoint, VB.Net.

• Worked as a member of a select team, an internal SWAT/RAD team, in the final days of Bear Stearns, responding to urgent financial data needs, particularly during the period leading up to and following the merger.

• Developed and distributed surveillance and governance reports of investor unwinds and positions held in the weeks leading to the collapse of Bear Stearns and the merger with JP Morgan. Made heavy use of reference data and DTCC Deriv/SERV during this timeframe.

• Produced a significant number of senior management and operational reports to track the formal trade confirmation process with all counterparties, including novation re-assignments and transfers/terminations of selected positions, during the transfer of derivative portfolio holdings from Bear to JPM, with use of ETL to and from Scrittura, Calypso, other internal systems, and DTCC Deriv/SERV.

• Developed numerous scorecards for senior management that tracked the progress of novation of Bear Stearns positions and portfolios into corresponding JP Morgan portfolios, a tedious process for JPM and Bear Stearns taking more than a year to complete. Deployed directly with Front Office, Middle Office, Back Office, Trading, Marketing, Equities, and Derivatives teams.

• Built many higher-functioning Excel dashboards and applications, many as RAD initiatives, some with 5,000 or more lines of VBA code. Heavy use of WebQuery and real-time links to live data. Other VBA techniques make use of an OLE-embedded Outlook object, OLM, and the Outlook mail envelope to automate the sending of email with attachments from within Excel, with VBA using CDO and SMTP mail servers and Excel SharePoint services via hyperlinks embedded in the email. We used this method to distribute reports globally in varied formats, including PDF, PPT, XLSX, XLSM, and DOCX files...
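
A minimal sketch of the CDO/SMTP distribution technique named above; the SMTP server, sender address, and message text are placeholders for the firm's internal relay and actual content.

    Sub SendReport(toAddr As String, subjectText As String, attachPath As String)
        Dim msg As Object
        Set msg = CreateObject("CDO.Message")
        With msg
            .Configuration.Fields("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2          ' send via network SMTP
            .Configuration.Fields("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "smtp.internal.example"
            .Configuration.Fields("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
            .Configuration.Fields.Update
            .From = "reports@example.com"
            .To = toAddr
            .Subject = subjectText
            .TextBody = "Attached: latest refresh."
            .AddAttachment attachPath
            .Send
        End With
    End Sub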

• Built and maintained several trading blotter apps in addition to my primary responsibilities as a derivatives FO/MO developer. Considered the alpha Excel go-to support person company-wide at that time, designated as Tier-3/Tier-4 support for all trading desks world-wide...

• Developed office automation with what came to be named theScheduler.xlsm. The need grew from a heavy workload where middle-office staff worked late into each evening preparing a growing stack (deck) of management reports and metrics (KPI/KRI) that quantify, identify, and prioritize outstanding unconfirmed trades in support of Bear Stearns' significant derivatives business. This scheduler, built with Excel/VBA, was the key component in the automation of those reports. It provided a facility to populate a to-do list of spreadsheet reports that are refreshed through automation (written in VBA) and distributed one or more times daily at varying intervals. Known to us as the to-do list, each row contains a spreadsheet's file name, file path, a start time, and the procedure (a subroutine, also with a path and filename) that performs the particular refresh at each designated start time; a sketch follows below. Alternatively, a spreadsheet refresh can be triggered when the file it feeds from is found to exist in what is generically referred to as an FTP drop-off folder. A VBA CDO (email) routine is then triggered that sends the reports/spreadsheets, in the form of a hyperlink or attachment, to the managing directors and execs (the global community) earmarked to receive them. This enabled us to bypass the file server as the delivery mechanism, where a report earmarked for Singapore, for example, sitting on a server in Delaware back in 2008, would take forever to download for a recipient in other parts of the world…
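
A minimal sketch of the to-do-list scheduling idea, assuming a hypothetical "ToDo" sheet whose columns hold the workbook name, its path, today's start time, and the fully qualified macro to run (e.g. 'C:\reports\Deck1.xlsm'!RefreshDeck); Application.OnTime queues each refresh.

    Sub ScheduleTodaysRefreshes()
        Dim ws As Worksheet, r As Long
        Set ws = ThisWorkbook.Worksheets("ToDo")
        For r = 2 To ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
            Application.OnTime EarliestTime:=ws.Cells(r, 3).Value, _
                               Procedure:=ws.Cells(r, 4).Value      ' the refresh routine named on the row
        Next r
    End Sub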

• Developed a derivatives workflow trade lifecycle management application (Excel/VBA/ActiveX) to support Bear Stearns' derivatives desks, a significant tool that drives much of the derivatives middle office by directing focus to T+3, R+3, problem trades. This supported Credit, Rates, FX, Equity, Base and Precious Metals Commodities, Energy derivatives, CDS, CDO, ABS, MBS, CLO, and Syndicated Loan products... The Excel application feeds from a data warehouse via SQL, which in turn feeds from the DTCC Deriv/SERV post-trade matching service, at that time an advanced ETL. Integral to this project was the design and development of a Derivatives Data Warehouse.

• I developed what was known as the master spreadsheet, a living document that receives updates of thousands of rows of reference data via this new data warehouse, which feeds directly from DTCC throughout each day via a VBA updating process triggered by a VBA OnTime timer event. Each refresh to the spreadsheet adds new trades, assignments, amendments, partial and full terminations, and updated break information. Metrics derived from the spreadsheets help management teams quickly spot throughput problems and in many cases led to improvements in best-practice procedures. The master spreadsheet undergoes a process that redirects trades, via an elaborate data-mining process, to the appropriate analyst using a pre-determined mapping of counterparty(s) to responsible middle-office analyst(s)… Analysts resolve breaks, make notations (in what we called the diary), and indicate status on their respective spreadsheets, which are subsets of the master spreadsheet; this data, in turn, is uploaded back to the master spreadsheet (a shared workbook) and then updated to the data warehouse, from which a scorecard, the KPI/KRI process, feeds, producing very elaborate (red/yellow/green) scorecard and heatmap reports and distributing them to the respective managing directors, along with metrics that indicate speed of throughput, highlighting, with the use of an aging process, how long a trade has been sitting in a problem status. These management reports did a great job of showing the most aged problem trades (stuck trades), blinking red, with the pertinent diary of actions and dialog with the counterparty resolver. Useful to the Sr. Managing Director during status meetings where management teams prioritize problem trades (breaks) stuck in the pipeline in order to meet T+3/R+3 throughput constraints, wherein the master spreadsheet contains the latest break information obtained through real-time connectivity with the DTCC post-trade clearinghouse.
• Worked with the On-Boarding Account Documentation Team in RAD mode to meet an aggressive target date. The goal was to automate a process that heretofore was manual and quite tedious, supporting the Know-Your-Customer (KYC) initiative by gathering the requisite legal documentation as part of what is known as a Customer Information Profile (CIP) for every client, in order to comply with 'Regulatory', a branch of risk management. A questionnaire was developed with Excel but looks nothing like Excel; essentially it is a rules engine. The rules give this application an innate understanding of jurisdictions, client entity business types such as C corporations, S corporations, LLCs, and LLPs, and the financial products and Bear entities they can be traded with. It produces a listing of the required legal documentation, which varies widely based on country, state, city, and province, wherein this documentation is a required prerequisite to trading and/or doing business with a new client.

• Developed a series of compliance/surveillance Key Risk Indicator (KRI) reports for the risk management group. The purpose is many-fold, but largely to identify anomalous or suspicious trading patterns, and irregularities in economics, AML, fees, and/or pricing data that present as skew against like trades having the same or similar tenor and occurring within the same contiguous 2- to 3-day window of time. Generally, these reports, developed specifically for the risk management, compliance, and SOX compliance teams, identify extraordinary trading patterns and trading support activity, including the middle-office confirmation and amendment processes.

Citigroup – Midtown Manhattan, NYC (Jan 2004 to Sept 2004). Sr. Business Analyst / Lead Developer (backfilled position to fill a staffing gap mid-project). Technologies: Excel, VBA (macros), ADO, CDO, Pivots, Dashboards, SQL, Access, SQL Server, Oracle...

• Worked with the Global Risk Management Group, reporting into the Global Chief Risk Officer.
• Developed numerous sophisticated spreadsheets with heavy analytics to centralize the risk oversight and management of Citi's global portfolio of credit card products, exceeding at that time tens of billions of dollars in total value and exposure. Heavy use of statistical techniques to produce forecasts (tactical and strategic) that feed from historical actuals.
• Developed sophisticated spreadsheet templates to capture global data containing both actual data and guidance (forecasts), collected monthly from 70+ countries, and maintained the histories of this data, both actuals and forecasts, in Microsoft Access to determine the accuracy of the forecasts.
• Built regional dashboards with impressive business graphics showing degrees of risk, the interval and amplitude of payments, latency, and the percentage of cardholders at or near their limit, making minimum payments, or making late and missed payments, indicating degrees of risk and default. Also a dashboard showing global hotspots, with point-and-click drilldown…
• Automated an existing process to build decks of pivot-oriented dashboards, distributed globally to local servers on every continent with each refresh. Before this, it required a large staff, sometimes one or more persons per region at Citigroup's NYC office, to produce manually, with all staff local to the Manhattan office. I was considered a top Excel resource at Citi.

Walmart (Bentonville, AR), Remote; Excel/VBA, Word/VBA during outbreaks of COVID-19 (May 2021 to July 2021)
Large-scale office automation, urgent short-term projects, with major retailer Walmart: Excel/VBA and extensive Word/VBA (a very rare skillset) and HTML with sophisticated labeling and bookmarking for a mass mail-merge sending many tens of thousands of individualized postal letters and literature to customers, buyers, vendors, business associates, and employees, a real challenge given the considerable time constraints. Also built a mechanism to send mass email via Outlook to 30,000+ employees and business associates, as well as buyers and vendors, intra-company, with individualized (personal) attachments, plus Outlook meeting invitations with meeting agenda attachments and exhibits, working with Walmart's Bentonville, Arkansas IT location, one of the world's largest IT departments. A high-pressure role with considerable time constraints during COVID-19 regional peaks, spikes, and outbreaks that made mass tactical correspondence essential… I did this to assist Walmart IT, surprisingly less skilled in this area, to get this done… I have done this type of work many times at companies such as Cenlar, a mortgage servicing company located in Pennington, NJ, and on an ad-hoc basis at many of the companies I have worked with throughout my career.
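
A minimal sketch of the individualized-attachment mailing described above, assuming a hypothetical "Recipients" sheet with name, address, and a per-person file path; this is an illustration of the Outlook automation technique, not Walmart's actual code.

    Sub SendPersonalizedMail()
        Dim ol As Object, mail As Object, ws As Worksheet, r As Long
        Set ol = CreateObject("Outlook.Application")
        Set ws = ThisWorkbook.Worksheets("Recipients")
        For r = 2 To ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
            Set mail = ol.CreateItem(0)                          ' 0 = olMailItem
            With mail
                .To = ws.Cells(r, 2).Value                       ' email address
                .Subject = "Update for " & ws.Cells(r, 1).Value
                .Body = "Dear " & ws.Cells(r, 1).Value & "," & vbCrLf & vbCrLf & "Please see the attached."
                .Attachments.Add ws.Cells(r, 3).Value            ' path to this person's individualized file
                .Send
            End With
        Next r
    End Sub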

Ingredion, Imperial Chemical Company (ICI) – Bridgewater, NJ. Sr. Business, Systems, and Data Analyst / Designer / Developer.
Excel, VBA (macros), (Power) Pivots, SQL Server, Sybase, Oracle, SAP, VB.Net. As consultant and lead Excel/VBA developer, a go-to person for Excel and VBA assistance across all operational areas. Conducted numerous advanced Excel training classes and lunch-and-learns for employees. Developed many Excel/VBA/SQL applications that feed from Access, Oracle, Sybase, and SQL Server using ADO/ODBC. Developed a budget and performance tracking system using Excel, VBA, Access, and an ETL process with feeds to and from SAP. A joint project with HRIS and IT: a complete global IT budgeting process including salary planning, vendor management, equipment, software licensing, other licensing, leasing, and departmental chargebacks. A system unto itself, with an employee review process, employee census, and new-hire planning. Dozens of separate departmental budget planning spreadsheet templates were built with interfaces to the G/L and COA; each worksheet is a template providing for FTEs, employee rating, bonus compensation rules, and multiple levels of pro-ration in part predicated on pre-defined KPI parameters, with bonuses determined by each operational area's contribution to profit. Templates perform local and US currency conversion. Spreadsheets reside on shared servers. Rollup and consolidation of budget items to G/L accounts, where each departmental budget spreadsheet's content is additionally consolidated (rolled up) to the corresponding business entity's spreadsheet; all business entity spreadsheets are then consolidated, a rollup to a corporate spreadsheet. Monthly budget-versus-spend reports use a dashboard format with clickable graphics for drilldown to the underlying departmental data, with heavy emphasis on UI/UX. The objective is to identify overspend and underspend and, with simulation, to project the year-end impact. The dashboard is designed so that a click on a datapoint shows the underlying data, possibly answering the question of what caused an anomaly, the skew of actual versus budget. Designed for the CIO, CTO, CFO, and CEO, it re-enforces the value of a good UI and UX…

Henkel, Unilever, National Starch and Chemical Company (NSC) – Bridgewater, NJ


As Senior Analyst, ERP Architect and Developer embedded with operational areas to include Supply and Demand Teams.
Developed RPA early-warning capability (aka, eWarning) with higher functioning custom UDF Excel functions, VBA, and SQL with
ADO/ODBC connectivity to acquire historical sales data, inventory, and factory production schedules from a Data Warehouse. ETL transfer
of CSV files, ie, feeds from SAP. This app has an ability to detect breaks in ordering patterns. This capability was named “eWarning”. Its
job was to alert the planning community, and customer service (CSRs) to reach out to a customer (customer refers to big Fortune companies)
when an order is not received when expected. In the ABC analysis of things, these are typically the ‘A’ customers, ie, the 10% of customers
that account for 90% of Unilever NSC’s business. Almost always at most inopportune times, a customer or its systems might fail to place
orders in timely manner for any number of reasons. Much of the time, the Unilever NSC CSR (the customer service rep) reaches out to the
customer and gets the order . Outcomes are fed back to demand and supply teams and forecasts are revised as needed. I.e., if customer
chooses to not place an order. Anecdotal stories from CSR recount how sometimes customer had simply forgotten to submit an order and
very happy that NSC could know this... Or the customer changed their production schedule and failed to communicate this with NSC. This
was a real bonus for NSC. The consequence of a customer that fails to place an order can shut-down customer’s own operations especially
those that operate with a just-in-time ordering strategy… Bad for them and bad for NSC… Re-tooling NSC factory to make unscheduled
product is expensive. Each category of product, its composition is different, causes factory down-time, costly time-consuming equipment
retooling, reconfiguration, to re-stage the factory and suppliers to stage different categories of product. Not easy to make changes to
production schedule without major disruption to NSC’s factory output . And this underscores the importance of an accurate forecast as a key
to efficient use of manufacturing capacity, ie, production planning and capacity planning.. Early Warning logic makes heavy use of interval,
amplitude, order rate and frequency.
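Purely to illustrate the interval/amplitude idea, here is a minimal VBA sketch that estimates a customer's next expected order date from past order dates and flags an overdue order; the function name, tolerance, and data layout are hypothetical assumptions, not the production eWarning logic.

' Illustrative sketch only: flag a customer as overdue when the gap since the last order
' exceeds the customer's average ordering interval plus a tolerance (in days).
' OrderDates is assumed to be a single-column range of historical order dates, oldest first.
Function IsOrderOverdue(OrderDates As Range, Optional ToleranceDays As Double = 5) As Boolean
    Dim n As Long
    Dim avgInterval As Double
    Dim lastOrder As Date

    n = OrderDates.Rows.Count
    If n < 2 Then                 ' not enough history to estimate an ordering interval
        IsOrderOverdue = False
        Exit Function
    End If

    ' Average interval = (newest date - oldest date) / number of gaps
    avgInterval = (OrderDates.Cells(n, 1).Value - OrderDates.Cells(1, 1).Value) / (n - 1)
    lastOrder = OrderDates.Cells(n, 1).Value

    ' Overdue if today is past the expected next order date plus the tolerance
    IsOrderOverdue = (Date > lastOrder + avgInterval + ToleranceDays)
End Function

Used as a worksheet UDF, e.g. =IsOrderOverdue(A2:A40, 5), a CSR watch list can then filter on TRUE to see which customers to call.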

Developed sophisticated forecasting engines with an automated model-fitting process using modeling techniques such as Winters' triple exponential smoothing, Holt's single and double exponential smoothing, Box-Jenkins, Bayesian linear techniques, multiple regression, triangle techniques borrowed from my actuarial insurance experience, and other forms of averaging, simple regression, and time-series techniques... Also invented and prototyped a completely new class of forecasting strategies, explained in part in the description above of the early-warning capability, built largely to protect the forecast and the downstream production and inventory plans derived from it... The entire application was built using Excel/VBA to write the ETL and ActiveX Data Objects (ADO) to pull data directly from SAP. The forecast outputs from this project, built from 4 or more years of ordering history as input, consistently demonstrated greater accuracy than SAP's. This new class of forecasts is uploaded back into SAP via ETL and ABAP… I demonstrated to the NSC Executive Committee that it was possible to produce a sensible forecast 100% of the time, which to this day is not achievable with SAP. Also produced a new class of forecasting accuracy and error measurements that, for lack of a better name, I dubbed Degrees-of-Accuracy; in many ways it exceeds the practical benefits of MAD, MAPE, APE, MSE, and ANOVA-style measures, whose components are essentially more complex variations of averaging techniques and are less good at choosing the best forecasting algorithm. Chi-square offers an alternative: it matches the signature of a forecast against the actuals and chooses the algorithm whose chi-square score falls closest to, and ideally within, one degree of freedom or a trusted standard deviation (1-sigma). But if just one datapoint is seriously skewed, it may disqualify the entire algorithm that produced the error, whereas Degrees-of-Accuracy sees it as 11 of 12 datapoints at 90%+ accuracy with one datapoint at 70% accuracy. With ANOVA/MAPE, that one datapoint spoils things: its single APE (comparing actual to forecast) is so bad that, when the many APE datapoints are averaged, the MAPE suggests the entire forecast was bad, and model-fitting goes with a weaker strategy. My time at NSC spans a 6-year timeframe, and I was entrusted as lead of many of Unilever NSC's largest IT development initiatives… My detailed predictive analytics RPA/ML experience, and how it works, is described in further detail below. References from team members on these projects are available on request…
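To give a concrete flavor of the smoothing techniques listed above, here is a minimal VBA sketch of Holt's (double) exponential smoothing; the function name, seeding, and data layout are illustrative assumptions rather than the engine actually built at NSC.

' Illustrative sketch only: double (Holt's) exponential smoothing.
' History is assumed to be a single-column range of actuals, oldest first; the function
' returns a forecast PeriodsAhead periods beyond the last actual.
Function HoltForecast(History As Range, Alpha As Double, Beta As Double, _
                      PeriodsAhead As Long) As Double
    Dim n As Long, t As Long
    Dim level As Double, trend As Double, prevLevel As Double
    Dim y As Double

    n = History.Rows.Count
    If n < 2 Then Exit Function   ' need at least two points to seed level and trend

    ' Seed: level = first actual, trend = first difference
    level = History.Cells(1, 1).Value
    trend = History.Cells(2, 1).Value - History.Cells(1, 1).Value

    For t = 2 To n
        y = History.Cells(t, 1).Value
        prevLevel = level
        level = Alpha * y + (1 - Alpha) * (level + trend)         ' smooth the level
        trend = Beta * (level - prevLevel) + (1 - Beta) * trend   ' smooth the trend
    Next t

    HoltForecast = level + PeriodsAhead * trend                   ' linear extrapolation
End Function

For example, =HoltForecast(B2:B49, 0.3, 0.1, 1) returns a one-period-ahead forecast; the alpha and beta dampening constants are the sort of parameters the model-fitting process described below searches over.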

The initial concept, and my reasoning for the importance of eWarning: if I was going to sell it to senior management, I needed to prove it would be value-added, prove that it would work. So I worked around the clock to create what became known as the "concept car", a working Excel application. The reason it was so successful, far exceeding everyone's expectations, is that this thing I had developed as the Unilever NSC lead architect initially had a purpose that had nothing to do with the overall efficiencies later realized by the many business areas that benefited from it. I built it to protect the forecast in the event that the assumptions that went into producing the forecast hadn't materialized. I invented eWarning to protect the forecasts, and because I led the project that built the forecasting system; jokingly I'd say my reputation was at stake. That was my sense of humor, but in truth eWarning raises a heads-up, giving planners time to react and, hopefully, to meaningfully re-use what otherwise would have been unused or mis-used manufacturing capacity: an unforced error. The eWarning app studies every customer's ordering and shipment patterns (their typical lead-time when placing orders), and in doing so it became a tactical but also strategically important operational tool and a competitive advantage for NSC. It warns planners and customer service with sufficient lead time and averts what are big logistics problems that had come to be accepted as the cost of doing business. This Excel workbook filled a gap that no one knew existed and solved a problem that was presumed not solvable. NSC-IT was unable to make a business case to recreate this process with their conventional technology platform and workbench; I recall IT saying that to reproduce what eWarning (the Excel workbook) could do would be a 5-year effort (a staff of 3 might be able to complete the project in 2 to 3 years). IT, not considered a fan of Excel and not positioned to take this task on, nonetheless embraced this Excel app, and so began a growing respect throughout IT for Excel and VBA. To predict the order date, eWarning uses a form of rate over time to estimate and anticipate the next order-date datapoint, with the use of algorithm detection. It studies order interval and amplitude (when to expect orders, i.e., the frequency, and in what amounts, the amplitude). We are talking about tank-truck and railcar-sized shipments of product, or barrels, or tote-bins; not getting it right is costly. But I built eWarning for selfish reasons, I would say. I was the architect of NSC's forecasting system, and I had faith in that system to always produce a sensible forecast, so I built eWarning to keep an eye on NSC customer buying patterns: if their ordering patterns changed, or something as simple as forgetting to place an order occurred, it would be costly both for NSC and the customer, possibly forcing a shutdown to reconfigure factories (costly, depending on the varying equipment needs of each production run). eWarning would in fact reliably raise the alert, giving the planner(s) sufficient time to adjust schedules with little to no disruption to factory output. I built eWarning to protect the forecasts. :) SAP to this day cannot support these capabilities. Forecasting is the Demand Planning team's primary planning tool. It drives when to make product, how much to make, and where, most efficiently, in which regional warehouse or cross-dock, to place product for its final leg of the journey to the customer. Forecasting produces a forecast, checks inventory positions and where the inventory is located in proximity to the customer, and adjusts production output to be net of current inventory positions. Transportation and shipping costs are expensive, even more so now, at this point in time. Not storing and warehousing product efficiently and logistically, geographically, means substantial added shipping costs. This was a great project. Embedded with the Demand and Supply teams, I was accepted as being one of them, part of their team. This meant the world to me…

I have an extensive background in Forecasting, Predictive Analytics, Marketing, Metrics, Inference, and modeling on an Excel/VBA technology platform, with RPA/ML and AI automation that drives model-fitting, i.e., the running of thousands of simulations where it, on its own, decides the best algorithm, choosing from time-series, simple regression, multiple regression, Bayesian linear techniques, and single, double, and triple exponential smoothing, and where, by itself, RPA drives a secondary process that determines the optimal alpha, beta, and gamma degrees of dampening. Sometimes it determines that "simple is better", that a moving average, weighted moving average, or simple average is best; this is RPA's way of punting when no algorithm, signal, or method exists within the data it is working with... Using the aforementioned algorithms, it iteratively runs these simulations for each of the thousands of sets of historical data it works with and chooses the one simulation, the winning model, the one algorithm that wins the contest from the thousand-plus simulations, the thousand-plus passes at each set of data. RPA chooses the best algorithm as the one simulation that most closely mirrors the signature, the signal, of the most recent 12 months of historical data: a proxy that represents the future, essentially the baseline for the purpose of doing these simulations. Essentially it uses a process commonly known as back-testing, where it looks back in time at 4 or more years of historical data (the actuals), then feeds from the oldest 3 of the 4 years, in this case data coming from SAP via an ETL handshake. RPA puts this data through its paces, the simulations, using every model and smoothing-constant combination, (10 + ((10+10)*10) + (10*10*10)) = 1,210 combinations, and, using these accuracy metrics, MAPE, APE, MAD, StdDev (1-sigma tolerance), least-squares MSE (derived from a sum of squared errors), and a method I invented, Degrees-of-Accuracy, it chooses the simulation that demonstrates the smallest variance, i.e., where, comparing its signal with the signal of the most recent 12 months of historical data (the proxy representing the future), both curves, the proxy and the winning simulation, appear to move in lockstep, collinear, in their trend, slope, and every bump, cycle, and seasonal feature… And voilà, this is model-fitting. This process almost never gets fooled by noisy data, the errant bump (the anomaly) that inevitably exists in historical data, i.e., the one-off events… The project was deemed necessary because SAP forecasts were not sufficiently sophisticated to achieve an acceptable degree of accuracy… Each month the process re-evaluates itself, A-F/A, on how it did in the previous month(s); it measures its own accuracy, produces a report for its administrator, a scorecard, and re-model-fits those data-sets where RPA thinks it can do a better forecast going forward... I have built many variations of this methodology dating to my days at U of P, spending considerable time at the University City Science Center at 3401 Market St, the West Philadelphia campus…
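As a toy illustration of the back-testing loop described above, and not the NSC engine itself, here is a minimal VBA sketch that grid-searches the alpha smoothing constant for simple exponential smoothing and scores each candidate by MAPE against a 12-period holdout; the names, grid, and data layout are assumptions.

' Illustrative sketch only: back-test simple exponential smoothing over a grid of alphas,
' scoring each candidate by MAPE on the most recent 12 periods (the holdout "proxy").
' Actuals is assumed to be a single-column range of nonzero history, oldest first, 24+ rows.
Function BestAlphaByMAPE(Actuals As Range) As Double
    Dim n As Long, holdout As Long, t As Long, i As Long
    Dim alpha As Double, level As Double
    Dim mape As Double, bestMape As Double, bestAlpha As Double

    n = Actuals.Rows.Count
    holdout = 12                          ' the most recent 12 periods act as the proxy
    bestMape = 1E+30

    For i = 1 To 9
        alpha = i / 10#                   ' candidate smoothing constants 0.1 .. 0.9

        ' Fit on the training window (everything before the holdout)
        level = Actuals.Cells(1, 1).Value
        For t = 2 To n - holdout
            level = alpha * Actuals.Cells(t, 1).Value + (1 - alpha) * level
        Next t

        ' Walk forward through the holdout, accumulating absolute percentage errors
        mape = 0
        For t = n - holdout + 1 To n
            mape = mape + Abs(Actuals.Cells(t, 1).Value - level) / Actuals.Cells(t, 1).Value / holdout
            level = alpha * Actuals.Cells(t, 1).Value + (1 - alpha) * level
        Next t

        If mape < bestMape Then           ' keep the winning simulation
            bestMape = mape
            bestAlpha = alpha
        End If
    Next i

    BestAlphaByMAPE = bestAlpha
End Function

The process described above extends this same loop across multiple algorithms, the alpha/beta/gamma grids, and the additional accuracy metrics, which is where the 1,210-combination count comes from.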

Developed several large-scale automatons... These are RPA/ML/AI applications, one of many projects working with RPA/ML and AI as the developer. This particular application distinguishes itself from other current-state RPA: a true automaton, a snap-on tool that attaches to existing workbooks, e.g., trading apps that feed from live data, local web-farms, Bloomberg, Reuters, and potentially numerous URLs among the many realtime internet information providers, and it does so exclusively on an Excel/VBA SQL technology platform. Several thousand lines of VBA code. Designed to self-install and attach to existing Excel apps; essentially an add-in, an xlam that self-installs in a matter of seconds. It uses VBA code that writes VBA code, which it embeds via the VBE inside the receiving workbook's activate event during the robot add-in's (xlam) open event. The app self-studies live data using ANOVA and qualitative, quantitative, algorithmic, stochastic, and other best-known statistical methods. This add-in is a true robot that operates as an employee. Essentially, with instantiation it is replicable, scalable general staffing: quants, analysts, middle-office staff, where each instance's role, as one example, is to monitor an Excel desktop looking at live ticker data with an ability to spot what it determines is an actionable event, e.g., a trading opportunity... The advantage versus a human is that this app never blinks: no bathroom breaks, a workaholic… It learns and studies on its own "what normal looks like". It gains perspective, a context; essentially it studies every digital signature, every change in the data, and gains an innate understanding of normal and the ability to identify an anomaly, a datapoint or series that exceeds a sigma tolerance, an outlier that deviates outside of what it knows to be normal variation, ergo a potential actionable event, i.e., a new-normal moment. Equipped with this ability, it communicates the observed event instantly to SMEs, stakeholders, actionaries (the decision makers), the traders, with relevant charts, essentially snapshots of cells (before/after), and artifacts that tell the story, via email, of what triggered the alert. It does this with the use of HTML and an Outlook object embedded in its VBA via OLE. Seriously advanced VBA that makes heavy use of Intersect() functions (just one example), to include jumping inside the data-series collection(s), properties hidden deeper inside an Excel chart or graph object. These collections are pointers that tell the chart in which cells to find the data it feeds on, which also helps the robot know which charts are relevant to the story. Snapshots of these charts, graphs, and dashboards (the artifacts) are sent either as attachments or embedded inside the heads-up email alerts, and it does this in fractions of seconds, i.e., sending the email to the actionary, a trader, a managing director, and/or it can send a pulse to a backoffice mechanism to make the trade, or initiate the trade by itself; generally speaking, it alerts the actionary, in this case a backoffice trading app, or sends a text message or email...
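For flavor only, here is a minimal VBA sketch of the sigma-tolerance check and email heads-up pattern described above; the sheet layout, threshold, and recipient are hypothetical, and the actual add-in is considerably more involved.

' Illustrative sketch only: flag the latest value in a live data column when it deviates more
' than SigmaTolerance standard deviations from the column's mean, then send a heads-up email
' (with a chart snapshot attached) via Outlook OLE automation. Names and layout are assumed.
Sub CheckLatestTickAndAlert()
    Dim ws As Worksheet, rng As Range
    Dim mean As Double, sd As Double, latest As Double
    Dim SigmaTolerance As Double
    Dim olApp As Object, mail As Object
    Dim chartPath As String

    SigmaTolerance = 3
    Set ws = ThisWorkbook.Worksheets("LiveTicks")            ' hypothetical live data sheet
    Set rng = ws.Range("B2", ws.Cells(ws.Rows.Count, "B").End(xlUp))

    mean = Application.WorksheetFunction.Average(rng)
    sd = Application.WorksheetFunction.StDev_S(rng)
    latest = rng.Cells(rng.Rows.Count, 1).Value

    If sd > 0 And Abs(latest - mean) > SigmaTolerance * sd Then
        ' Export the first chart on the sheet as a PNG artifact for the email
        chartPath = Environ$("TEMP") & "\alert_chart.png"
        ws.ChartObjects(1).Chart.Export chartPath

        ' Compose the heads-up email through Outlook (requires Outlook to be installed)
        Set olApp = CreateObject("Outlook.Application")
        Set mail = olApp.CreateItem(0)                       ' 0 = olMailItem
        mail.To = "trading.desk@example.com"                 ' hypothetical recipient
        mail.Subject = "Actionable event: latest tick outside " & SigmaTolerance & "-sigma band"
        mail.HTMLBody = "<p>Latest value " & latest & " vs mean " & Format(mean, "0.00") & _
                        " (sigma " & Format(sd, "0.00") & ").</p>"
        mail.Attachments.Add chartPath
        mail.Display                                         ' or .Send in a fully automated setup
    End If
End Sub

The production version described above goes much further: it self-installs as an xlam, injects monitoring code into the host workbook via the VBE, and inspects each chart's data-series collection to decide which charts belong in the alert.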

Morgan Stanley experience cont’d (more insight regarding the special dashboard project)
Also asked to re-engineer the many ongoing multiple-regression studies with the use of automation (RPA), machine self-learning, and AI, where some studies require 10 datasets of input, though there is no real limit to the number of datasets the process may be asked to ingest. It now decides on its own which datasets, from a large collection (some published by the US Dept of Commerce), are relevant for each study, determines itself which of the datasets are the dependent or independent variables, and on its own detects and determines lag. Datasets have numerous metric types, each with wide-ranging orders of magnitude, and each is a Y in the charting of things... Microsoft Excel supports as many Y inputs as you want, but its big constraint and limitation, albeit subtle, is that Excel gives you only 2 axes to work with. That's it, just 2: one primary, one secondary. The same constraint is found in most BI presentation tools, including Power-BI...
The questions I received during the initial interview centered on what happens when using 10 inputs (10 datasets), essentially 10 Y's with wide-ranging orders of magnitude. Y's with much greater or smaller magnitudes than the primary and secondary min-max scales cover will chart outside of the chart's viewing area. It is not uncommon to experience this from time to time, but this circumstance was very upsetting to senior management: a fatal bug (shortcoming) in what they deemed their most important CEO dashboard (visual). Believing there possibly was no solution, a senior managing director (a hands-on person) asked me during the interviewing process, specifically, whether I could solve this if I came on board. I said yes, given the many similar technical challenges I had encountered in my career. Now all Y's, regardless of magnitude, fit nicely within the chart viewing area. The concept of a tertiary Y allows a developer to define an unlimited number of Y axes, not just the 2 mentioned above… Each Y now has its own axis, its own min-max scale, and its own legend properties, and all are seated neatly, side by side, within the chart's viewing area... It was necessary for me to rewrite portions of the Excel Chart object behavior, not for the faint of heart. Projects like this do not come along all that often… Examples of wide-ranging orders of magnitude are bps, fractions, ratios, units, tens, thousands, millions, billions, and trillions. Seeing all of these datasets on one chart, superimposed with Morgan Stanley's KPI, each with its own signal, the expectation is that all 10 signals, all 10 digital signatures, all 10 KPI, are moving (tracking) in unison, in lockstep, all responding similarly to bumps, cycles, and seasonality. This one visualization of 10 Y's on one chart tells the story and demonstrates the degree of correlation. Visually, viscerally, it answers the CEO's question "How are we doing?"... This one chart, all by itself, is considered the most important BI visual. It gives context, a perspective that all of Morgan Stanley's important internal KPI are on par with, keeping or exceeding pace with, the greater economy and with what Morgan Stanley defines as the key marketplace KPI of the broader markets Morgan Stanley operates in… This one picture answers the CEO's most important questions, among them "How did we do and where can we do better?" What makes this chart special isn't the technology required to build it. Most KPI lack context. Typical dashboards enable one to track how well they did relative to their own internal historical data (how we did compared year over year or versus the month before), or relative to a forecast. KPI are generally intra-company, inward-centric, whereas Morgan Stanley wanted their KPI to sit side by side with the KPI of the overall marketplace Morgan Stanley operates in. Yes, they can see how they performed relative to their own historic data, but this chart tells them how they are performing relative to the greater marketplaces Morgan Stanley operates in. Dashboards generally don't do that, hence the importance to the senior management team of making this happen.
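The paragraph above describes rewriting portions of the Excel Chart object to give every Y its own axis; that code is not reproduced here. Purely as an illustrative alternative, a common workaround for the two-axis limit is to rescale each series onto a shared 0-1 band so that no line leaves the plot area, keeping each series' true min/max to label a synthetic per-series axis. The sketch below shows only that rescaling step, with all sheet names and layouts assumed.

' Illustrative alternative only (not the dashboard's actual implementation): rescale every
' series onto a common 0-1 band so all Y's, whatever their magnitude, stay inside the chart's
' viewing area. The original per-series min/max are kept in the header for synthetic axis labels.
Sub RescaleSeriesToCommonBand()
    Dim src As Worksheet, plot As Worksheet
    Dim colRng As Range
    Dim lastRow As Long, lastCol As Long, r As Long, c As Long
    Dim lo As Double, hi As Double

    Set src = ThisWorkbook.Worksheets("RawSeries")     ' raw datasets, one per column, headers in row 1
    Set plot = ThisWorkbook.Worksheets("PlotSeries")   ' rescaled values the chart feeds on

    lastRow = src.Cells(src.Rows.Count, 1).End(xlUp).Row
    lastCol = src.Cells(1, src.Columns.Count).End(xlToLeft).Column

    For c = 1 To lastCol
        Set colRng = src.Range(src.Cells(2, c), src.Cells(lastRow, c))
        lo = Application.WorksheetFunction.Min(colRng)
        hi = Application.WorksheetFunction.Max(colRng)

        ' Keep the true scale in the header so a synthetic axis/legend can display it
        plot.Cells(1, c).Value = src.Cells(1, c).Value & " [" & lo & " .. " & hi & "]"

        For r = 2 To lastRow
            If hi > lo Then
                plot.Cells(r, c).Value = (src.Cells(r, c).Value - lo) / (hi - lo)
            Else
                plot.Cells(r, c).Value = 0.5           ' flat series: park it mid-band
            End If
        Next r
    Next c
End Sub

A chart built on the PlotSeries sheet then shows all 10 signals in a lockstep-comparable form, and each series' original min-max (carried in the header) can be rendered as a side-by-side synthetic axis.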
