SAP Data Center



Data Center:
Data centers process, analyze, and save data nonstop. Learn why they are necessary and –
looking at SAP’s data center in St. Leon-Rot as an example – what they contain, and how they
are operated.

How a Data Center Works:

1. Heat Exchangers: Located on the data center’s roof, heat exchangers release excess heat
from the turbo-cooling units into the air. When outside temperatures are high, the
exchangers are sprinkled with water to increase the efficiency of heat dissipation.
2. Diesel Generators: When a power outage occurs, the diesel generators start up
automatically within seconds. While the generators go through a short start-up phase,
batteries deliver power so that operations can continue uninterrupted. The diesel
generators then take over and provide the complete power supply for the data center.
3. Server Room: Servers and storage units are located in SAP standard racks in especially
secured server rooms. The racks are kept in an enclosed area to enable optimal cooling.
Server rooms are entered only sporadically and for short periods of time.
4. Batteries: Batteries can provide power during short outages. When electricity fails
completely, power is delivered via this uninterruptible power supply (UPS) until the
emergency standby system is active. The UPS apparatus also compensates for voltage
fluctuations and distortions. However, batteries alone cannot bridge outages that last for hours or days; for those, the diesel generators take over.
5. Turbo-Cooling Units: High-efficiency cooling units remove the heat emitted by the air
conditioning system and release it into the outside air via heat exchangers on the roof.
6. Extinguishing Gas: Water, extinguishing foam, or powder fire suppression systems can
cause more damage in a data center than a charred cable. For that reason, special
extinguishing gases are preferred. INERGEN, an extinguishing gas, displaces the oxygen
content in the air, which smothers the fire source. It is harmless to people and the
equipment.
7. 300,000 Liters of Ice-Cold Water: If a cooling system should fail, the time until the
backup unit is operational must be covered. For this purpose, the data center houses six
tanks, each filled with 50,000 liters of ice-cold water (4°C), to absorb heat from the air-
conditioning system.
8. The wire to the outside world: Telecommunications connect the data center to public
data networks.
9. Control Station: Control stations for the IT and building security serve as command
central in the data center. All important information is displayed here on large screens.
Any variation from standard operation is promptly reported.
10. Video Cameras: Multiple high-resolution video cameras monitor the exterior premises
and the building. The cameras are arranged in such a manner that one camera is also
monitoring another one. This means that should a neighboring camera fail, continuous
monitoring is still assured. A lighting system is in place to provide illumination for video
monitoring and guard personnel.

The data center is the brain of a company and the place where its most critical processes run.
Large-scale computer systems have been around for a while, and many people are already
familiar with the term data center. In the 1940s, computers were so large that individual rooms
had to be specially set aside to house them. Even the steady miniaturization of the computer did
not initially change this arrangement because the functional scope increased to such an extent
that the systems still required the same amount of space. Even today, with individual PCs being
much more powerful than any mainframe system from those days, every large-scale operation
has complex IT infrastructures with a substantial amount of hardware – and they are still housed
in properly outfitted rooms. Depending on their size, these are referred to as “server rooms” or
“data centers.”
Data centers are commonly run by large companies or government agencies. However, they are
also increasingly used to provide a fast-growing cloud solution service for private and business
applications.
The basic characteristics are the same regardless of the data center's size, because every company's
success invariably depends on smooth software operations – and those have to be safeguarded.
Computers, of course, require electricity, as well as protection from theft and the accidental or
intentional manipulation of hardware. Put simply, one has to safeguard data centers against
external influences and provide them with sufficient cooling. After all, there is a lot of powerful
hardware sitting in one place.
In addition to these “hard” factors, one must also take into consideration organizational
measures, such as periodic backups that ensure operability. As a rule, the more extensive and
critical the hardware and software become, the more time and effort are required to provide
optimal protection.
For that reason, a data center preferably consists of a well-constructed, sturdy building that
houses servers, storage devices, cables, and a connection to the Internet. In addition, the center
also has a large amount of equipment associated with supplying power and cooling, and often
automatic fire extinguishing systems.
An indicator of the security level is provided by the “tier” rating as defined by the American
National Standards Institute (ANSI).
During the design of the SAP data center, the Tier 4 requirements were used as guiding principles.
The key to success lies in the robust design of every individual component and especially in the
redundancy of all critical components. This ensures that SAP can count on its “brain” at any
time, and SAP customers can rely on the contractually guaranteed availability of cloud
applications running in the data center.

Power supply:
The data center is connected to two separate grid sectors operated by the local utility company. If one sector fails, the second ensures that power is still supplied.
In addition, the data center has 13 diesel generators, which are housed in a separate building.
Together, they can produce a total of 29 megawatts, an output that is sufficient to cover the data
center’s electricity demand in an emergency. The diesel motors are configured for continuous
operations and are always in a preheated state so that they can be started up quickly in the event
of an incident. An outage in just one of the external grid sectors is enough to start the generators automatically.
Both the local utility company and the diesel generators deliver electricity with a voltage of 20
kilovolts (kV), which is then transformed in the data center to 220 or 380 volts.
Within the data center, block batteries ensure that all operating applications can run for 15
minutes. This backup system makes it possible to provide power from the time a utility company
experiences a total blackout to the time that the diesel generators start up.
The uninterruptible power supply (UPS) also ensures that the quality remains constant. It
compensates for voltage and frequency fluctuations and thereby effectively protects sensitive
computer electronic components and systems.
A redundantly designed power supply system is another feature of the data center. This enables
one to perform repairs on one network, for example, without having to turn off servers,
databases, or electrical equipment.
Several servers or storage units have multiple, redundant power supply units, which transform
the supply voltage from the two grid sectors to the operating voltage. This ensures that a failure
of one or two power supply units does not cause any problems.
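The redundancy story above can be made concrete with some rough, back-of-the-envelope math. The following sketch uses purely illustrative failure probabilities and start-up times; none of these figures come from SAP:

```python
# Hedged sketch: rough availability math for the redundant power path
# described above. All probabilities are illustrative assumptions.

def parallel_failure(p_each: float, n: int) -> float:
    """Probability that all n independent redundant units fail at once."""
    return p_each ** n

# Two independent grid sectors: both must fail before the UPS is needed.
p_grid_sector = 1e-3          # assumed chance a sector is down at a given moment
p_both_grids = parallel_failure(p_grid_sector, 2)

# A server with two redundant power supply units, one per grid sector.
p_psu = 1e-2                  # assumed single-PSU failure probability
p_both_psus = parallel_failure(p_psu, 2)

# UPS bridge: batteries carry the load for up to 15 minutes while the
# preheated diesel generators start within seconds.
battery_bridge_s = 15 * 60
generator_start_s = 30        # assumed worst-case start-up time
assert generator_start_s < battery_bridge_s   # the bridge covers the gap

print(f"P(both grid sectors down) ~ {p_both_grids:.0e}")
print(f"P(both PSUs down)         ~ {p_both_psus:.0e}")
```

The point of the sketch is the multiplication: independent redundant paths drive the joint failure probability down by orders of magnitude.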

Cooling:
All electronic components, especially the processors, generate heat during operation. If this heat is not dissipated, the processor's performance degrades and, in extreme cases, the component can fail. Therefore, cooling a data center is essential, and because of the
concentrated computing power, the costs to do so are considerable.
For this reason, servers are installed in racks, which basically resemble specially standardized
shelves. They are laid out so that two rows of racks face each other, thereby creating an aisle
from which the front side of the server is accessible. The aisles are covered above and closed off
at the ends by doors. Cool air set to a temperature of 24 to 26°C is blown in through holes in the
floor, flows through the racks, and dissipates the heat emitted by the servers.
Generally, a server room will contain several such “enclosed” server rows. The warm air from
the server room is removed by the air-conditioning system. Yet, even the air-conditioning system
has to dissipate the heat. When the outside temperature is below 12 to 13°C, outside air can be used to dissipate the heat absorbed by the air-conditioning systems.
At higher outside temperatures, the air-conditioning systems are cooled with water, made
possible by six turbo-cooling units. They are not all used to cool the data center, given that some
are used as reserve units. Should a cooling system fail, the time until the backup unit is
operational must be covered. To that end, 300,000 liters of ice-cold water (4°C) are available to
absorb the heat from the air-conditioning systems during this period.
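The sizing of this thermal buffer can be checked with the standard heat-capacity formula Q = m·c·ΔT. In the sketch below, the allowed outlet temperature and the assumed heat load are illustrative assumptions; only the 300,000 liters at 4°C come from the text:

```python
# Hedged sketch: how long 300,000 L of 4 °C water could bridge a cooling
# failure. Heat load and allowed outlet temperature are assumptions.

water_volume_l = 300_000          # six tanks x 50,000 L (from the text)
water_mass_kg = water_volume_l    # ~1 kg per liter
c_water = 4186                    # J/(kg*K), specific heat of water
t_start, t_max = 4.0, 16.0        # °C; 16 °C outlet is an assumed limit

# Total heat the buffer can absorb: Q = m * c * dT
q_joules = water_mass_kg * c_water * (t_max - t_start)   # ~15.1 GJ

cooling_load_w = 5_000_000        # assumed 5 MW heat load during the outage
bridge_minutes = q_joules / cooling_load_w / 60
print(f"Buffer absorbs ~{q_joules/1e9:.1f} GJ, "
      f"bridging ~{bridge_minutes:.0f} min at 5 MW")
```

Under these assumptions the tanks buy the operators roughly 50 minutes, comfortably more than the time a backup cooling unit needs to come online.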
To top it off, the turbo-cooling units also have to dissipate heat. There are 18 heat exchangers on
the data center’s roof for this purpose, which release hot air into the environment.
At outside temperatures above 26°C, the heat exchangers are sprinkled with water in order to
make heat dissipation more effective through evaporative cooling. The large amounts of water consumed in the summer are supplied by a waterworks dedicated to the data center; the municipal water supply system acts as a reserve and failsafe.
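The temperature-dependent strategy described in the last few paragraphs can be summarized as a small decision function. The thresholds follow the text (free air cooling below roughly 12°C, evaporative sprinkling above 26°C); treating them as hard cut-offs is a simplification:

```python
# Hedged sketch codifying the cooling strategy described above.
# The switchover thresholds are taken from the text; real plants
# use hysteresis and gradual transitions rather than hard cut-offs.

def cooling_mode(outside_temp_c: float) -> str:
    if outside_temp_c < 12.0:
        return "free cooling"             # outside air dissipates the heat
    if outside_temp_c <= 26.0:
        return "turbo-cooling"            # water-cooled chiller units
    return "turbo-cooling + evaporative"  # heat exchangers sprinkled with water

for t in (5, 20, 30):
    print(f"{t} °C -> {cooling_mode(t)}")
```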
Zero Emissions:
SAP has always aimed for energy savings in its data centers. Since 2014, the emissions of greenhouse gases have been reduced to zero.

When the SAP data center in St. Leon-Rot, Germany, was planned, energy efficiency was already top of mind for its designers. Through energy efficiency and the increased procurement of
green electricity, SAP has been able to reduce greenhouse gas emissions of its data centers to
zero globally.
Since the data center was completed, SAP has continuously sought to further increase its energy efficiency. These efforts have succeeded thanks to the fruitful collaboration between Facility Management, which runs the data center facilities and their technical features, and the IT department, which is responsible for the installed servers.
How did SAP save energy and greenhouse gas emissions?
Since 2010, TÜV Rheinland has conducted annual reviews of the SAP data center and has confirmed its increasing energy efficiency with the premium award for energy efficiency. The TÜV has repeatedly confirmed that the SAP data center is not only better than comparable data centers but has improved year over year.
In addition, SAP was recognized with the first prize of the 2014 German Data Center Award in
the category “Integrated Energy Efficiency in Data Centers” – another proof point that SAP is
successfully implementing new technologies.
Moreover, the energy consumption data tell their own tale: The electricity consumption has
remained flat in the past five years despite significant business growth. SAP now has 25% more
employees and 57% more revenue – it is obvious that efficiency has improved steadily.
How is it possible that SAP data centers cause zero greenhouse gas emissions?
Even though SAP’s efficiency measures have been very successful, every computer operation
needs power. And this is why it is not only important to care for efficiency but also for how this
power is produced.
SAP has continuously expanded its procurement of green energy. Since 2014, SAP's data centers have caused virtually no greenhouse gas emissions, because SAP purchases Renewable Energy Certificates (RECs) that cover the energy consumed by the data centers.
SAP’s customers benefit from this move as well: All SAP Cloud solutions are now running on a
“Green Cloud”. This means that customers that switch from their previous systems to the SAP
Cloud reduce their related greenhouse gas emissions accordingly.
Security

Access
The data center is monitored around the clock. Single-person access gates and mantrap systems admit only authorized individuals. Technicians can then enter special rooms using
custom-configured ID cards. High-sensitivity areas require authentication by means of biometric
scans.

Access to Data
An intrusion detection system monitors incoming data and identifies suspicious activities, while
firewalls made by different manufacturers protect the data in the data center. Data and backup
files are exchanged with customers in an encrypted format or transmitted via secure fiber-optic
cables.

Power Supply
Should the multiple-redundancy power supply system fail, batteries are automatically and
immediately actuated and supply electricity for up to 15 minutes. Within this time frame,
emergency power diesel generators are started up. They can then supply power to the data center
for an extended period.

Hardware
All virtual and physical servers, HANA databases, storage units and networks in use access a
pool of physical hardware. If individual components should fail, the load can be directly re-
allocated to other components without impairing system stability. If hardware fails due to a fire,
data can be recovered from the backup system.

Fire Protection
The data center is subdivided into many fire compartments. In addition, thousands of fire
detectors and aspirating smoke detectors (ASD) monitor all rooms. The ASDs pick up on the
emission of specific gases that stem from overheating electronic components and set off a
preliminary alarm. Should a fire break out, the affected room is flooded with extinguishing gas
(INERGEN) and the fire is smothered. Sprinklers are not used, as water would destroy sensitive
electronic devices. As a last resort, however, water or foam may still be used as an extinguishing
method by the fire department, which is automatically alerted to the emergency.

Building
The data center consists of 100,000 metric tons of reinforced concrete and rests on 480 concrete
pillars, each extending 16 meters into the ground. The exterior walls are 30 centimeters thick and
made of reinforced concrete. The server rooms are further surrounded by three concrete walls. This
design provides effective protection against storms and even a small airplane crash.

Data Privacy
SAP ensures compliance with data protection provisions. Data from cloud customers falls under
the jurisdiction selected by the customer and is not forwarded to third parties. SAP’s support
services ensure that data protection is also maintained during required maintenance operations.

Backup
Backups are carried out as disk-to-disk copies, which enables rapid backup creation and recovery. Besides daily full backups, interim versions are created several times per day and, like all backups, are archived at a second location for security purposes.
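The cadence described above, one full backup per day plus several interim copies replicated to a second location, can be sketched as a simple schedule generator. The six-hour interval is an illustrative assumption; the text only says "several times per day":

```python
# Hedged sketch of the backup cadence described above: one full
# disk-to-disk copy per day plus interim copies, all archived at a
# second site. Times and counts are illustrative assumptions.

from datetime import datetime, timedelta

def backup_plan(day_start: datetime, interim_every_hours: int = 6):
    """Yield (time, kind) pairs for one day of backups."""
    yield day_start, "full"
    t = day_start + timedelta(hours=interim_every_hours)
    while t < day_start + timedelta(days=1):
        yield t, "interim"
        t += timedelta(hours=interim_every_hours)

plan = list(backup_plan(datetime(2014, 1, 1, 0, 0)))
for when, kind in plan:
    print(when.isoformat(), kind, "-> replicate to second location")
```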
At regular intervals, TÜV, KPMG, and SAP itself test whether the technology and infrastructure
are operating smoothly. An overview of the most important checks is provided below.

Continual checks
Databases and servers are routinely checked in real time to ensure that they operate properly.
Batteries for the emergency power supply must always be charged. Thus, the condition of
batteries is continuously tested. If a battery’s maximum capacity decreases excessively, it is
replaced.
Gas cylinders containing the INERGEN fire-extinguishing gas must maintain a specific pressure. A pressure gauge on each cylinder electronically transmits any deviation from the standard value to the central gas distribution facility.
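These continual checks amount to simple threshold comparisons. The sketch below models the battery-capacity and cylinder-pressure rules with illustrative limit values; the real thresholds are not stated in the text:

```python
# Hedged sketch of the continual checks described above. The numeric
# limits are illustrative assumptions, not SAP's actual values.

def battery_needs_replacement(capacity_pct: float,
                              floor_pct: float = 80.0) -> bool:
    """Replace a UPS battery once its maximum capacity drops too far."""
    return capacity_pct < floor_pct

def pressure_alarm(pressure_bar: float, nominal_bar: float = 300.0,
                   tolerance_bar: float = 10.0) -> bool:
    """Report a cylinder whose pressure deviates from the standard value."""
    return abs(pressure_bar - nominal_bar) > tolerance_bar

assert battery_needs_replacement(72.5)     # degraded battery -> replace
assert not battery_needs_replacement(95.0)
assert pressure_alarm(280.0)               # deviation -> report centrally
assert not pressure_alarm(305.0)
```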
Monthly
The diesel engines are automatically started once per month to perform a full load test.
Every three months
An aspirating smoke detector (ASD) emits a preliminary alarm to the security department upon
the slightest signs of fire or smoke. A second fire detector then emits a piercing alarm in the
event of an emergency. An external company performs tests every three months using a smoke
device to determine whether the ASD and fire detectors are still active.
Every six months
The diesel engines’ switch control panels are checked twice annually by an external company.
The inspection ensures that, in a real power outage, the switchover will function and that power
is supplied to the servers.
Every year
Doors, windows, and ventilation systems are inspected annually. The TÜV (an international
safety certification organization) inspects all access points to the data center in accordance with
ISO 27001 specifications. The door check verifies what types of door locks (toggle locks or dead
bolt locks) are used and whether they comply with the ISO standard. In addition, doors may not
be kept open for too long. During the TÜV inspection visit, the door is left open for one minute
to see whether an alarm is triggered as per the standard.
KPMG goes one step further and inspects the data center’s “black box” according to the
international ISAE 3402 (or SSAE 16) certification standard. In other words, it checks the video
recordings made over the last 365 days that prove that doors were opened only for authorized
individuals. Inspectors refer to this measure as a “door effectiveness” check.
Access authorization: Records from log files, card scanners, and duty rosters of the security
service are checked by the TÜV once annually according to ISO 27001. Some of the items on the
TÜV checklist include: how the security service organizes its 24-hour surveillance; how access
cards are issued; and how the approval process is conducted.
For the “black building” test, a power outage is simulated once annually. The external power
supply is cut off, so that the emergency power supply is actuated. This procedure ensures that the
batteries can bridge the power failure as expected, the diesel motors start up automatically, and
an extended supply of electricity is provided. This test is conducted and recorded by the data
center operator. The reports are then submitted to the TÜV, which compares them to the ISO
27001 standards.
The assigned installation company regularly services the fire-extinguishing system and generates
reports on the operability of sensors, for example, or any possible gas emissions. The reports are
sent to the TÜV and KPMG. This annual inspection is part of the ISO 27001 and ISAE 3402 (or
SSAE 16) certification process.
An external company inspects construction measures along with the engineering and
architectural blueprints. This ensures that construction work on the data center does not
damage a critical power cable due to improper or careless installation, for example. SAP submits
the engineering and architectural blueprints to auditors once annually.
Fire protection: Ceilings, walls, and doors in the data center must provide 90 minutes of fire resistance, per the T90 and F90 fire-resistance classifications. The TÜV checks this capability using construction plans and an inspection of the premises, in accordance with the ISO 27001 specifications.
Air-conditioning system/temperature: As part of the annual inspection, the TÜV reviews the
maintenance records of the electronic systems and room temperature reports in accordance with
ISO 27001.

What about Data Protection?

Besides the physical security of a data center, which is ensured through structural measures and
various types of fail-safe equipment, customers must certainly wonder about data security; not
only to satisfy their own curiosity, but also to fulfill legal requirements.
A customer using any of SAP’s cloud solutions generally has the following questions:

Where is my data?
SAP cloud customers specify in their contract which data center they want to use as the “data
location.” For SAP Business ByDesign, for example, these locations are St. Leon-Rot (Germany)
or Newtown Square, Pennsylvania (USA). SAP HANA Enterprise Cloud is based in St. Leon-
Rot (Germany) and Amsterdam for European customers, and Sterling, Virginia, and Santa Clara,
California, for customers located in the USA.
The data's storage location, and thus its "service life," will not change unless the customer requests such a change.
Backups are always located in the same jurisdiction as the data that is used in day-to-day
operations, but for security reasons, the two are physically separated.
Is my data disseminated?
No. Third-party use of customer data is not part of SAP's business model for the cloud. In contrast to end-user cloud services (such as social networks), a high security level is at the core of SAP's cloud business.
SAP reserves the right to analyze and graphically map the utilization pattern of users in order to
increase availability and service security. However, SAP will never store personal data or
analyze customers’ business data.
All SAP employees are individually and contractually required to comply with data protection
and information privacy provisions.
How do SAP employees access my data?
This depends on the cloud solution the customer uses. SAP Line of Business cloud solutions and
SAP Business ByDesign follow this approach:
As the operator, SAP must grant its employees access to customer data when needed for
maintenance and fault-correction purposes. To this end, dedicated terminal servers are provided
for which employees receive individual accounts.
All access is limited to one hour.
Support-related access is only approved upon request and only then with a password that is
generated for each particular situation. This prevents the dissemination and use of conventional
passwords.
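The access model described here, an individually generated password per support case and a one-hour session limit, can be sketched as follows. The class and token format are illustrative, not SAP's actual implementation:

```python
# Hedged sketch of the support-access model described above: a password
# is generated per support case, and the session expires after one hour.

import secrets
import time

class SupportAccess:
    MAX_SESSION_S = 3600  # all access is limited to one hour

    def __init__(self, case_id: str):
        self.case_id = case_id
        # Unique per situation; conventional, reusable passwords are avoided.
        self.password = secrets.token_urlsafe(16)
        self.granted_at = time.time()

    def is_valid(self, now=None) -> bool:
        """True while the one-hour session window is still open."""
        now = time.time() if now is None else now
        return now - self.granted_at <= self.MAX_SESSION_S

access = SupportAccess("CASE-1234")
assert access.is_valid()                              # fresh session
assert not access.is_valid(access.granted_at + 3601)  # expired after 1 h
```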
For SAP HANA Enterprise Cloud different options are offered. The extent of support provided
by SAP differs by offering. SAP ensures that support personnel can only access customer data if
necessary and requested by the customer. There are no generic support users with unlimited
access authorization.
Who are the subcontractors?
The list of subcontractors is shared with customers prior to signing the contract. If there are any
changes, customers are promptly notified.
Among its contractual partners, SAP differentiates between those without data access
(“subcontractors”) and those with data access (“sub processors”). Sub processors located outside
of Europe are employed according to EU model clauses. All external employees must, regardless
of where they work, also sign individual confidentiality and privacy statements (CPS).
What verification options do I have?
The SAP cloud is inspected several times a year by external auditors, in accordance with various
standards (including ISO 27001, ISAE-3402, and SSAE-16) to ensure that the security
organization as well as all technical and organizational measures are implemented and reflect
state-of-the-art technology. These certificates and audit reports may be shared with customers,
although this may require signing a non-disclosure agreement (NDA).
Every year, SAP invests more than €500,000 to audit the SAP cloud. This expenditure is
necessary to meet the legally mandated audit assistance obligation.
Customer-specific, on-site audits extending beyond this scope can also be conducted.
SAP cloud customers can entrust their data to SAP with a clear conscience.
The data is not stored “somewhere in the cloud,” but in clearly agreed-upon locations. No one
has blanket access to the data, and comprehensive audits ensure that all technical and
organizational measures are complied with and implemented.

Certification for Security’s Sake

ISO 27001
This standard specifies the requirements for establishing, implementing, maintaining, and continually improving an information security management system. It takes a risk-based approach covering the confidentiality, integrity, and availability of the information to be managed. The SAP data center is a vital part of the annual internal and external surveillance audits.

SOC 1 / SSAE 16
The SSAE 16 or SOC 1 (Service Organization Controls) standards require a report on the
controls at a service organization which are relevant to user entities’ internal control over
financial reporting. The physical security perimeters of the SAP data center are therefore part of
the bi-annual audits.
SOC 2
The SOC 2 standard covers a service provider's controls relevant to security, availability, processing integrity, confidentiality, and privacy. The SAP data center is also in scope for these bi-annual audits.

ISO 22301
This standard, in the field of business continuity management (BCM), ensures continued operation in critical situations. It sets the requirements for a business continuity management system to protect against business disruptions and to ensure the organization can recover in the event of a disruption.
Whether the foe is fire, data breach, or hardware defect, data centers must be protected against
many hazards. A series of quality seals and certificates show exactly how compliant a given data
center is with all the necessary security precautions. The following description is based on the
example of the data center in St. Leon-Rot.
SAP ensures that the same or equivalent certificates are valid at every data center where cloud
solutions are run.
Data centers are sensitive entities that are exposed to hazards on many fronts. Imagine that all
your data was suddenly lost because of a hardware malfunction. For most users and data center
operators, this would represent a tremendous loss. For some, it would even spell their demise.
However, there’s no need to assume the worst right away. Location alone can make a data center
secure or not. For example, a nearby stream could pose a risk of flooding. Unauthorized access
could cause accidental or intentional damage. And equipment-related defects could result in
failures and downtimes.
Germany’s Federal Office for Information Security (BSI) has listed various hazard categories in
its manuals for basic IT security. Data centers are well-advised to take appropriate preventive measures against hazards such as:
Force majeure, for example, flooding, fire, and lightning;
Organizational defects, such as sloppy or inadequate access rules for areas requiring security;
Technical failure, such as a failure of the power supply or security equipment;
Deliberate acts, including theft, unauthorized entry, and sabotage.
Certifications offer security
In the same way that cars in Germany require a TÜV inspection for roadworthiness at certain predetermined intervals, data centers should also have to demonstrate their fitness for operation. Ultimately, this benefits both data center operators and users.
For example, data center operators would do well to understand that operating their technical
equipment, associated systems, and data in a proper environment has a direct bearing on their
economic existence. And users want to be able to count on the fact that their data is stored in a
safe and protected manner. In particular, data centers that function as outsourcing service
providers with responsibility for their customers’ data are obligated to maintain high security
standards.
Certifications help to objectively identify and professionally evaluate security risks.
To do so, the security level of a given IT infrastructure is systematically examined using a
variety of assessment criteria. If the data center passes the inspection, the operator is provided
with a conformity document, usually in the form of a certificate, stating that it is operating its
facility securely and reliably based on the latest technology.
Who certifies what
Behind every certification, there is an inspection of certain parameters or criteria. For example,
an inspection might test power supply, availability, or regulatory compliance (such as with the
German Digital Signature Act). The significance of any given certificate is only as strong as the
requirements outlined by the certification or attestation organization and the institution that
performs the inspection.
Besides evaluating data center security, cloud providers are also interested in protecting their
software and operations. Once the security of these two realms is assured, then customers can
entrust their needs and data to the service providers.
Many certification organizations perform their inspections in accordance with various standards.
Multiple auditing firms conduct audits based on national and international standards, such as ISO
27001, SOC 1 /SSAE 16 and SOC 2. The SAP data center is also audited according to these
standards. Once the audit is successfully passed, the data centers receive a certificate or
attestation report verifying their compliance with the respective standard.

SAP’s Certificates
From System R to the SAP HANA Enterprise Cloud

Early years and SAP R/2: software that processes data when required; integrated business functions in SAP R/2.

The SAP R/3 era: real-time reaches the desktop, a client-server evolution of SAP R/2.

Anywhere and anytime: real-time moves to the Web and beyond, to cloud and mobile; the SAP ERP Suite is launched.

SAP HANA: redefines the meaning of real-time and becomes the basis for all SAP products.
When SAP was founded in 1972, nobody could have known what cloud computing would mean.
One might have guessed it was related to space travel, as mankind had been visiting the moon for
some years with huge rockets.
Since its beginnings SAP has helped businesses run better through world-class software solutions
that solve complex problems to help invent, commercialize, and mainstream the products and
services of the global economy. Today SAP is the world’s largest provider of business software.
As a result of customer-inspired innovation SAP’s portfolio of solutions is currently used by
more than 248,000 of the world’s best-run businesses, touches more than 74% of the world’s
financial transaction revenue, and impacts more than 500 million people.
In 1972, nobody could have imagined the extent to which computers would affect most people's lives four decades later. Estimates at the time predicted that only a few hundred computers would exist worldwide. The understanding was that only large companies and governments would be able to afford such costly technology.
Today we know it turned out otherwise, and since its beginnings SAP has benefited from technological advances in computer science.

Real-time and Standardization


The first programs that were written from 1972 onwards emphasized the aspect of real-time.
“System R” used data entry via screen and keyboard rather than punch-cards. Thus data was
immediately available to support business processes and their evaluation.
The aspect of real-time has remained in focus ever since, whenever SAP has developed new software.
“System R,” later called R/1, was developed on customer-owned hardware, but thanks to the ongoing miniaturization of computers, the fast-growing SAP could soon invest in hardware of its own. The next evolution of real-time software, the mainframe system SAP R/2, became a huge success during the 1980s. It not only supported instant availability of data but also provided an integrated view of all relevant business areas.
A second important attribute of SAP's software is standardization. Standard software addresses many customers' needs with essentially identical code and a standard set of customizing options. This has many benefits: software development can be streamlined, upgrades can be better planned, customer support is more efficient, and it facilitates forming a stable network of partners.

Evolution
Keeping the core features of real-time and standardization, SAP profited from the newly emerged personal computers and created a client-server version of its established SAP R/2: SAP R/3, which was released in 1992 and became not only an even bigger success but an industry standard in its own right.
SAP enhanced SAP R/3 with various standard industry variants and closely attached additional software packages until SAP R/3 was incorporated into the SAP ERP Business Suite in 2004.
With real-time enabled standard software SAP was also entering the next steps in software
evolution: mobile, big data and cloud.
Big data requires efficient ways to deal with large data volumes; SAP’s answer is its home-grown in-memory computing platform, SAP HANA, which will become an integral part of SAP ERP.
For the cloud market, SAP developed a comprehensive portfolio of line-of-business solutions and a cloud ERP (Business ByDesign), and acquired important cloud players like SuccessFactors and Ariba.

Revolution
“HANA redefines the market for enterprise software – it’s only logical to take it to the cloud,”
said Hasso Plattner during the announcement of SAP HANA Enterprise Cloud (HEC) in May
2013. The goal is combining the power of real time with the simplicity of the cloud while
leveraging SAP’s 40 years of application experience with mission critical data centers. HEC will
make in-memory computing quickly available for a large customer base in a managed cloud
environment that includes a wide range of flexible options.

The market for public cloud services is expected to reach over US$200 billion by 2016. Source:
Gartner, August 2012
The Value of SAP HANA Enterprise Cloud
The business environment is changing fast. Speed is a competitive, strategic weapon, and
companies are driving constant and uncompromising change into their business to expand, grow
markets, develop new products, and so on.
For many years, IT has struggled to balance ever-increasing IT complexity against the growing expectations of the business. Demands for faster, bigger, and more flexible systems conflicted with the risk of ‘breaking’ something in an IT landscape that became more fragile with each new change.
To address these seemingly conflicting priorities and drivers, CIOs are examining ways to
reinvent their operations and drive business value by:
Core not context – Outsourcing SAP operations and application management to free up resources that can then focus on core competencies such as creating business value and strategic advantage, rather than operations, management, and maintenance.
Business relevance – Supporting the business’ needs for continuing innovation, change, expansion, and growth. Everything the business does today is enabled by IT.
Cost reduction – Driving cost out of the IT landscape and operation is a huge concern and focus for today’s CIO.
Strategic Drivers
Strategy 1: Form true partnerships with companies like SAP to enable ‘Closed-Loop Support’,
whereby the teams that develop, build and maintain the applications on which you depend,
become an extension of your IT department – your virtual support organization.
Strategy 2: Implement ‘Leveraged Innovation’, whereby customers can test and trial the latest innovations from SAP in a sandbox HEC environment to ensure the best fit and the maximum innovation potential before they purchase. All of this is pre-integrated and linked to the full breadth and depth of SAP’s entire solution set – including Ariba Network, Hybris WebChannel extension, API Management solutions, and so on – to minimize complexity-driven risk and cost.
Sources of HEC Value
The HEC provides numerous sources of value for both ‘HEC for Projects’ and ‘HEC for
Production’. These are outlined below:
Accelerate the Deployment of HANA – deliver the RTDP (Real Time Data Platform) benefits –
analytics, process performance, mission-critical, etc.
Free up Customer’s IT Resources – leverage the HEC and SAP’s Application Management Services (AMS) to free up the customer’s resources and IT staff so they can better drive value-creation activities, focus on business innovation, and drive revenue
Free-up IT staff
Reduce the Investment Capital Required
Reduce and Avoid Energy and Data Center Costs
Respond faster to the business’ need to innovate, change, and deploy new systems – Outside of the HANA deployment project already mentioned, the business will have literally hundreds of smaller projects each year – some large, some small. These can range from deploying a new system for a newly acquired company to expanding an existing system to new geographies, groups of users, and so on. Two separate topics that are integral to the ‘respond faster’ value driver are (i) agility and (ii) elasticity.
Agility refers to the capability of the IT department to best respond to the needs of the business
by delivering the systems and hardware the business needs to drive revenue.
Elasticity describes the customer’s IT landscape and how easily it can expand and contract to meet the needs of the business. For example, if the business needs a new system stood up for a new project or acquisition, does the existing hardware have sufficient capacity to enable that? Is the data center big enough? What about wiring and cooling requirements? These are all factors in the elasticity equation.
CAPEX to OPEX – the ability for a CIO to purchase HEC as an operating expenditure (OPEX) rather than as a capital expenditure (CAPEX) is a very significant win. Paying a monthly fee from the operating expense budget means the CIO won’t need to have capital approved – the process is far easier and carries less all-round risk.
HEC is the Low-Risk Deployment Option – For both business and IT executives, choosing the
right deployment is all about risk reduction and cost avoidance, and HEC enables customers to
avoid ‘all or nothing’ decisions. With HEC, the customer could avoid both the capital cost and its
resulting depreciation expense. This enables the financial risk to be avoided.
To Do It Yourself or Not?

Drive yourself
As an analogy to software operated by the customer, think of a personal car. You drive when and
how you like, choose the route, and take breaks whenever you need one. All of your individual
desires are taken into consideration, starting from the moment you choose and configure your
dream car. However, you also have to finance the complete cost of the car, even if you don’t use
it very often. And you might even have to pay for a garage. The same is true for software that
you operate yourself – maximum flexibility comes at a price, namely higher fixed costs.

Ride with someone else


As an analogy to the use of cloud services, think of a high-speed train. You don’t have to worry
about the route or cost of the train itself. You simply get on and pay for the actual distance that
you travel. If the train is canceled for some reason, a replacement is usually made available.
Cloud software works in a similar way – you can get started with an application without much
advance notice. But you are also less flexible when it comes to customized options and
functionalities.
To operate one’s own data center, or run applications on external servers? Better yet, why not
simply lease applications? The increasing diversity of cloud services on offer inevitably ends up
overwhelming customers. However, one thing is clear. In most branches of business, data center
operations do not fall under a given company’s core area of expertise.
The role of data centers today resembles the way power plants were used at the start of the electrified industrial age. In the late 19th century, almost all large industrial companies operated their own power plants because there was no uniform standard pertaining to the power supply, nor was there a reliable power grid.
Once both were available, companies were happy to forego generating their own power. The
newly created utility companies delivered the required electricity reliably and, because of the
economies of scale, more cost-effectively than when it was generated by their own power station.
Electricity became a commodity.
Cloud computing service providers can present a similar argument today. Operating hardware
and software is not a core area of expertise for many companies. In contrast, a specialized
provider can offer IT services, ranging from computing power and storage space all the way to
complete applications, better and for less money than their potential customers can. And that’s
not even taking into consideration the expertise needed to run a data center in a high-quality,
secure manner that many companies would be hard pressed to achieve.
Nowadays, companies can choose from an entire portfolio of cloud-based IT services. They
include: infrastructure as a service (IaaS), where (additional) hardware, computing power and
storage space in the cloud are leased on a temporary or long-term basis; platform as a service
(PaaS), where users can access a cloud platform equipped with a programming environment and
tools to develop and operate their own applications; and software as a service (SaaS), where
complete applications, including administration, operations, upgrades, and maintenance are
provided from the cloud.
Good Reasons for Cloud Computing
The most important reasons for customers to consider cloud-based computing:
Fast implementation time, providing quick access to functionality
Reduced IT efforts because both hardware and software operation and maintenance are the
provider’s responsibility
Flexibility due to subscription rather than licensing contracts and “pay-what-you-use” concepts
Scalability to support changing business needs and supporting growth
Even though cloud services have established themselves on the IT market over the last few years,
commonly agreed definitions of “what is cloud” are hard to find.

Comparison of On Premise, Hosting, and Cloud


On-premise systems are primarily the customer's responsibility.

SAP partners generally provide hosting services.


Cloud services are provided by SAP.
The line gets especially blurry at times between application hosting (which has been part of the
portfolio of many IT service providers in the last 20 years, especially in the realm of SAP
systems) and SaaS (which refers to operating applications in the cloud).
Lately, hosting has continued to evolve with the primary objective of offering “off the rack”
software — such as application hosting — in addition to the more conventional outsourcing of
hardware. From the customer’s perspective, this comes very close to being a cloud service. To
some extent, subscription models are even used for billing purposes, so that the software license
is actually held by the hosting provider.
From a customer’s point of view, the definition might also be of little relevance. As long as the quality is great, the service level convincing, and the price right, the customer need not worry about what the provider is doing behind the scenes.
SAP’s Cloud Strategy in a Nutshell
SAP’s strategy, announced in 2013, is to offer three flavors of cloud services:
Public cloud applications – e.g., to support human resources, customer and sales management, finance, or procurement – as well as ERP in the cloud
Private cloud: HANA Enterprise Cloud, which is a managed cloud offering
A marketplace to scale and extend innovation, for customers as well as the partner ecosystem
These three areas are supported by collaboration tools, both people-to-people and business-to-business – the business network.
SAP has also announced a common foundation underlying these offerings.
Technology for the Cloud
When talking about the cloud, it's hard not to mention the terms virtualization, multi-tenancy,
and adaptive computing. Each of these technologies plays an important role in providing high-
quality cloud services.
The Interview
In this interview, Bernd Himmelsbach gives a short introduction to the three technologies.
Improved hardware utilization, simplified use, and uninterrupted operations (even during upgrades): a smoothly run data center is more efficient. That’s essential for data center operators and also helps customers. Depending on the cloud solution, different technologies are relevant.
Just 10 years ago, data center architecture was still based on separate servers that each contained
individual applications. This arrangement usually resulted in poorly utilized systems that were
configured for peak loads and saw little action the rest of the time. One example is a company’s
e-mail server. It is typically used in the morning with great frequency, when employees get to
work and go through their e-mails.
To ensure that no wait time is created when working through their e-mails, the server has to be
configured for such peak loads.
The rest of the day, the server’s utilization is generally less than 25%. Now let us imagine that
this is the case at a company with several business locations throughout Europe. The company
could spare itself the cost of having a large e-mail server at each of its branches in Russia,
Germany, and Ireland – if only it could improve utilization. The same problem, but even more
relevant to the business realm, applies to ERP and accounting systems, which have to be
configured for peak loads during seasonal business periods or when quarterly and annual
financial statements are prepared. Most of the year, though, capacity is left untapped.
Not only is this costly, it also makes hardware maintenance and replacement difficult, due to
operational disruptions that have to be carefully planned. In addition, business-critical
applications require backup systems. These are also designed to handle peak loads and remain
unused the rest of the time.
Virtualization
The key technology that can help eliminate these problems is called virtualization. It is based on
the principle that hardware is decoupled from the operating system and applications. Virtual
machines (VMs) are installed on the physical servers and share the hardware environment. In
this way, the hardware of the individual applications no longer has to be configured to peak
loads. Instead the entire server and storage pool is available to all applications in the company.
Take the aforementioned company with branches in Russia, Germany, and Ireland as an
example. The three virtual e-mail servers could run on shared hardware, and since the
employees, working in different time zones, would access their e-mails at different times, one
single computer could be scaled to support the maximum utilization of all three e-mail servers,
one after the other.
The average utilization of the underlying hardware thus increases in comparison to individual
computer usage. While resources for individual systems are measured based on the maximum
expected load, one can assume that when it comes to shared use in a hardware pool, not all
virtual systems will be subjected to maximum loads simultaneously.
This set-up has several advantages:
Immediate reduction of hardware costs because system utilization is substantially improved.
Simplified hardware maintenance because virtual machines can be moved to other servers within
the server pool. This allows redundant hardware components and entire servers to be replaced
without disrupting operations.
Better investment protection since hardware no longer always has to be state-of-the-art; instead,
it can be retrofitted as needed.
Greater flexibility since new virtual server systems can be set up within minutes. In comparison,
configuring and setting up a physical server takes several hours if not days.
Increased availability and reliability due to the fact that virtual machines can be re-started on
another physical server in the server pool if a hardware malfunction occurs, with no significant
interruptions.
Mobility of virtual machines: Individual virtual machines can be moved from one hardware pool
to another.
Please note that HANA servers are typically not virtualized for technical reasons.
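The utilization gain from pooling the three regional e-mail servers can be illustrated with a small calculation. This is a hedged sketch, not SAP code: the hourly load numbers and time-zone offsets are made up for illustration, but they show why a shared pool only needs to be sized for the combined peak, not the sum of the individual peaks.

```python
# Illustrative sketch: why pooling workloads with staggered peaks
# raises utilization. Three regional e-mail servers peak at different
# hours because of time zones; a shared hardware pool only has to be
# sized for the combined hourly peak.

def load_profile(peak_hour, peak_load=100, base_load=20):
    """Hourly load for one server: high around peak_hour, low otherwise."""
    return [peak_load if abs(h - peak_hour) <= 1 else base_load
            for h in range(24)]

# Morning peaks shifted by time zone (hours in UTC; values assumed).
moscow = load_profile(peak_hour=6)
berlin = load_profile(peak_hour=8)
dublin = load_profile(peak_hour=9)

# Sizing each server separately: each must handle its own peak.
separate_capacity = max(moscow) + max(berlin) + max(dublin)

# Sizing one shared pool: only the combined hourly peak matters.
combined = [m + b + d for m, b, d in zip(moscow, berlin, dublin)]
pooled_capacity = max(combined)

print(f"separate: {separate_capacity}, pooled: {pooled_capacity}")
# separate: 300, pooled: 220
```

With these assumed numbers the pool needs roughly a quarter less capacity than three separately sized servers, which is exactly the effect described in the text: not all virtual systems hit their maximum load simultaneously.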

Adaptive Computing
While virtualization decouples hardware from the operating system, adaptive computing
decouples the operating system from applications. Normally, virtual machines must be shut
down to be rescaled or to have their operating systems updated, but adaptive computing enables
one to make changes to an operating system and virtual machines, while they are running
applications, with minimal downtimes.
To make this happen, one must first prepare a new VM with the required specifications. For
example, this could require increasing the performance of the storage device and processor,
upgrading the operating system, or replacing an over-sized VM with a smaller one. After this
preparatory step, the application is simply “moved” from the old VM to the new one.
Another benefit of this process is increased reliability. If the new system does not function
properly, one can always go back to the previous VM that has not yet been removed.
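The prepare-move-rollback pattern described above can be sketched in a few lines. All class and function names here are hypothetical illustrations, not an SAP or hypervisor API: a new VM is prepared with the desired specifications, the application is moved over, and the old VM is kept as a fallback until the new one is confirmed healthy.

```python
# Minimal sketch (hypothetical names) of the adaptive computing pattern:
# prepare a new VM with the required specs, move the application over,
# and fall back to the old VM if the new one does not function properly.

class VM:
    def __init__(self, name, cpus, os_version):
        self.name, self.cpus, self.os_version = name, cpus, os_version
        self.app = None

def relocate(app, old_vm, new_vm, healthy):
    """Move app to new_vm; roll back to old_vm if the health check fails."""
    new_vm.app, old_vm.app = app, None
    if not healthy(new_vm):
        # The old VM has not been removed yet, so rollback is a move back.
        old_vm.app, new_vm.app = app, None
        return old_vm
    return new_vm

old = VM("vm-old", cpus=8, os_version="11")
new = VM("vm-new", cpus=16, os_version="12")  # upgraded OS, more CPU
old.app = "ERP"

active = relocate("ERP", old, new, healthy=lambda vm: vm.cpus >= 16)
print(active.name)  # vm-new
```

The key design point mirrored here is that the old VM stays available until the move succeeds, which is the source of the increased reliability the text mentions.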
Multi-Tenancy
While virtualization and adaptive computing in a company’s data center ensure that business
applications run more efficiently and cost-effectively, cloud providers have greater requirements.
Their main concern is the ability to simultaneously run the applications of many different
customers on their virtualized data center architecture.
To achieve the greatest economies of scale, cloud providers run many customers on one
application instance, typically in the form of a virtualized server and storage infrastructure.
Customers are assigned a tenant; this is comparable to a client for on-premise applications.
Multi-tenancy occurs when many customers can be served on one instance. Depending on the
application’s size and the cloud software’s requirements, one system can accommodate more
than 100 tenants.
In the SAP HANA Enterprise Cloud, systems are typically too large to allow for multi-tenancy, so each customer is provided with its own system.
For smaller size applications, like most line-of-business cloud-solutions, the system is configured
in such a manner that the tenants of the individual customers are kept completely separate, in a
logical sense. Thus, a given customer is not aware of the others and is unable to access anyone
else’s data or functions. Besides having customer tenants, each system also has individually
configured administration tenants.
One can easily see that such multi-tenant environments, as operated by cloud providers, achieve
significantly greater economies of scale and can be run more cost-effectively than a data center
within a given company. Regardless of whether upgrades, patches, or hot fixes are involved, in
each case these activities are carried out for customers in one process, significantly reducing the
cost per customer. Cloud providers generally pass on the cost savings to the customers.
As a result, customers not only become more flexible, they also benefit from low subscription
prices and save on hardware, operating, and maintenance costs associated with the application.
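The logical tenant separation described above can be sketched as follows. This is an illustrative model only (the class and tenant names are invented, and real cloud platforms enforce isolation at the database and application layers): every read and write is scoped to the caller's tenant ID, so customers sharing one application instance never see each other's data.

```python
# Sketch of logical tenant isolation on one shared application instance.
# Each record carries a tenant ID, and every query is filtered by it.

class MultiTenantStore:
    def __init__(self):
        self._rows = []  # shared storage; each row carries its tenant ID

    def insert(self, tenant, record):
        self._rows.append({"tenant": tenant, **record})

    def query(self, tenant):
        # The tenant filter is applied by the platform, not the caller,
        # so a customer cannot opt out of it.
        return [r for r in self._rows if r["tenant"] == tenant]

store = MultiTenantStore()
store.insert("customer_a", {"customer_no": 1001, "name": "Alpha GmbH"})
store.insert("customer_b", {"customer_no": 2001, "name": "Beta Ltd"})

# Each tenant sees only its own rows, although both share one instance.
print(store.query("customer_a"))
print(store.query("customer_b"))
```

Upgrades and patches applied to `MultiTenantStore` itself would benefit every tenant at once, which is where the economies of scale in the text come from.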
Data – Underway in the Cloud
01: Access
02: Encryption
03: HTTPS
04: Transmission
05: Firewall
06: Data Decryption
07: Data Storage
08: Confirmation
01
The first step to using a cloud service is the user log-in. A connection with the service provider is
created and an encrypted key is issued via certificate. This key is used for the exchange of data
that follows: for example, when an SAP Business ByDesign user creates new master data for a
customer, including name, address, and terms of payment.
02
After new data is entered and the user clicks "save," the data must be sent to the SAP servers. Before this occurs, it is encrypted using the key issued at the very beginning of the session. Only the receiving end knows the key that can restore the encrypted data to its original form.
03
Data is sent using HTTPS – that is, by layering the hypertext transfer protocol (HTTP) on top of the encryption protocol transport layer security (TLS), the successor to secure sockets layer (SSL). This ensures that unauthorized individuals cannot gain access to the data and read it in plain text.
04
Encrypted data packets are sent over the Internet and ultimately arrive at the SAP data center.
05
Firewalls protect the data center from unauthorized access. Data packets, however, are directed
to a special address, and thus are able to pass through the firewalls.
06
An SAP Web dispatcher decodes the data and restores it to plain text.
07
Decoded data is matched to the particular cloud customer and saved in the correct tenant in the
database. At this time, using the previous example again, a customer number is assigned to the
newly created master data.
08
Many transactions end with a confirmation that the entry has been successfully created. In the
previous example, after the user has saved the new customer's master data, the confirmation
would come in the form of a customer number issued by the software. Naturally, the
transmission of this information is also encrypted. After the browser decodes the data, the user
sees the following message: "New customer with customer number xyz has been created."
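Steps 01 through 03 can be sketched from the client's side with Python's standard library. This is a hedged illustration: the endpoint URL and payload fields are made up, and the request is built but deliberately not sent. The point is that the application only constructs an HTTPS request; the TLS layer underneath handles certificate checking and encryption.

```python
# Sketch of the client side of steps 01-03: new master data is prepared
# as an HTTPS POST. The endpoint below is hypothetical and the request
# is not actually sent; urlopen(request, context=context) would perform
# the TLS handshake and transmit the data encrypted.
import json
import ssl
import urllib.request

# Default context: the client verifies the server's certificate chain
# and hostname before any application data is exchanged.
context = ssl.create_default_context()

# New customer master data, as in the Business ByDesign example above.
payload = json.dumps({
    "name": "Example AG",
    "address": "Musterstrasse 1, Walldorf",
    "terms_of_payment": "30 days net",
}).encode("utf-8")

request = urllib.request.Request(
    "https://cloud.example.com/api/customers",  # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(request.full_url.startswith("https://"))  # True
```

Everything security-relevant – key exchange, encryption, integrity checks – happens inside the TLS layer, which is why the application code above contains no cryptography of its own.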
In cloud computing, data doesn’t swirl around in a hazy cloud; it is transferred specifically to the
cloud provider’s data center and stored there. How does this work, and how secure is it? We
follow the data on its journey and show what happens behind the scenes.
Applications used in cloud computing run on the servers of the cloud provider. All data is stored
in their data centers and retrieved from there, too. Simply put, it is possible to use the programs
to store and load data from any given location. The benefit of such an arrangement is that besides
Internet access, the implemented PCs, laptops, tablets, smartphones, and thin clients do not need
any major programs. The end devices simply run a Web browser that gives users access to the
cloud providers’ applications and services.
One of the advantages of cloud computing is that employees, wherever they may be, can access
their work data.
For example, the cloud-based SAP Business ByDesign solution comprises CRM and ERP functionality that provides worldwide access to financial bookkeeping, customer account management, order processing, and claims management. The data is stored in the cloud and retrieved from there.
So how does that work? Essentially, the data transfer to the provider, as well as access to the data
stored at the provider’s location, takes place via an Internet connection. Connections via a secure
tunnel like a virtual private network (VPN) are more the exception than the rule. Consequently,
public cloud systems have open interfaces to the Internet.
That is why security is a top priority and why cloud providers and customers have to protect
these interfaces from unauthorized use.
Medium-sized companies with few employees and financial resources can often sign up for
security features in the form of flexibly retrievable cloud services.
Encrypted data transmission
To use a cloud application such as SAP Business ByDesign, employees simply log in to the
server in the cloud provider’s data center from their workstations, home offices, or when
traveling. In the most straightforward situation, authentication is performed by entering a user
name and password; however, more complex and secure procedures can also be instituted.
One very secure process makes use of public key infrastructures (PKI). It grants access only after first successfully confirming a user’s identity via smart cards with a signature function, biometric procedures, or one-time passwords. Cloud providers can also issue certificates via their own trust center.
If the values are correct and if the system “finds” the user, that person is granted authorization.
Data and programs to which individuals have valid access are now unblocked. After successfully
signing in, a connection is automatically set up to the destination device in the data center.
Since data is not meant to be read or changed along the way, it arrives encrypted at the provider’s location. The data is also encrypted on its path from the provider back to the user, for which secure sockets layer (SSL) encryption is used.
TCP/IP disassembles the data
Once users enter new customer data or modify existing master data in SAP Business ByDesign,
it is sent to SAP via the Internet. The encrypted data is transmitted according to the rules of the
transmission control protocol (TCP)/Internet protocol (IP). The TCP breaks the information
down into small data packets and sends each one to the same target IP address via potentially
different routes.
Each packet contains details regarding the address to which it is being sent and a sequence number that indicates its position within the transmission. The IP protocol takes care of the addressing task, ensuring that the packets really do arrive at the cloud provider’s location. Once at the provider’s data center, TCP uses the sequence numbers to reassemble the individual packets in the right order and forwards them as a file to the server.
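The split-and-reassemble behavior just described can be demonstrated with a toy simulation. This is a simplification, of course: real TCP segments carry byte offsets, checksums, and acknowledgments, but the core idea of numbered packets surviving out-of-order delivery is the same.

```python
# Toy illustration of TCP/IP packetization: the payload is split into
# numbered packets, the packets may arrive in any order, and the
# receiver uses the sequence numbers to restore the original data.
import random

def packetize(data: bytes, size: int):
    """Split data into (sequence_number, chunk) packets."""
    return [(i, data[i:i + size]) for i in range(0, len(data), size)]

def reassemble(packets):
    """Sort by sequence number and join the chunks back together."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"New customer master data for Example AG"
packets = packetize(message, size=8)

random.shuffle(packets)      # simulate out-of-order delivery
restored = reassemble(packets)

print(restored == message)   # True
```

Because each chunk carries its position, the order in which the packets travel through the network is irrelevant to the receiver.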
Generally, providers operate entire server farms, consisting of dozens (sometimes even
hundreds) of interconnected computers that may often be set up in a decentralized manner in a
cluster. Virtualization software known as a hypervisor can subdivide each server into multiple
virtual machines that are typically used by various customers. The software can also combine
individual servers to form a large-scale system.
Multi-tenant capability keeps customer data separate
When storing data, providers ensure that their customers’ data is kept strictly separate. All data
packets must be assigned to the right customers, which use different tenants of an application. As
a result, each customer has its own completely isolated environment on a logical level.
Data is usually stored in relational database systems. The software can assign individual users
certain roles and rights that determine who can access what data. Relational database systems
offer features such as authentication mechanisms and encrypted storage.
This makes it impossible to view someone else’s data or user management processes.
To protect the data, the provider also ensures that customer data is regularly backed up. If data is
lost, it can easily be recovered.
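The strict per-customer separation in a relational database can be sketched with sqlite3 standing in for the provider's database system. The table layout and tenant names are invented for illustration; the point is that every row carries a tenant column and every query is bound to the caller's tenant ID.

```python
# Sketch of tenant separation in a relational database (sqlite3 used
# here as a stand-in). Rows from different customers share one table,
# but access is always scoped to a single tenant.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        tenant  TEXT NOT NULL,
        number  INTEGER,
        name    TEXT
    )
""")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [("tenant_a", 1001, "Alpha GmbH"),
     ("tenant_b", 2001, "Beta Ltd")],
)

def rows_for(tenant):
    # The application layer always binds the caller's tenant ID here;
    # users never issue unscoped queries against the shared table.
    cur = conn.execute(
        "SELECT number, name FROM customers WHERE tenant = ?", (tenant,))
    return cur.fetchall()

print(rows_for("tenant_a"))  # [(1001, 'Alpha GmbH')]
```

In a production system this filter would be enforced by the platform's authorization layer and database roles rather than by application code alone, which is what makes viewing someone else's data impossible.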
Its Own First Customer

SAP as Early Adopter


SAP Runs SAP is an initiative within SAP IT. It turns SAP into an early adopter and reference
customer for its own applications, solutions and products.
The Interview
In this interview Martin Heisig explains why SAP runs data centers and how customers benefit.
From mobile apps, to SAP Business Suite powered by SAP HANA, to virtualization in the data
center: New technology at SAP is first tested and used by internal users.
The practice goes by many names: eating your own dog food, drinking your own champagne, or
in the case of the Walldorf-based software company, “SAP Runs SAP.” Companies in the
software business are notoriously loyal users of their own products. This custom certainly helps
inspire confidence in the brand, but it’s not just a marketing gimmick.
By using software internally both before and after making it available to customers, employees are not only able to alert developers to potential problems in an application and make suggestions for improvement, they are also the first to enjoy the benefits.
Employee-driven development
SAP takes it one step further – its Global IT team works directly with the lines of business to
actively seek out innovative technologies and initiate internal proof of concept projects in those
areas. It was this process that drove the deployment of 1,000 iPads to SAP employees in 2010
and made e-mail, analytics, and reporting available anywhere and anytime. Thanks to a secure
mobile platform, this first group of mobile workers was more connected to SAP data than ever
before. Today SAP deploys 25,000 iPads and 27,000 iPhones; in addition, there are more than 14,000 BlackBerry and 7,000 Android devices.
But when it came to storing those reports and saving e-mail attachments on their mobile devices, employees turned to external, publicly available apps for sharing files in the cloud. Convenient, certainly, but lacking the security standards most businesses require when transmitting data.
What happened next embodies the core spirit of “SAP Runs SAP”: Global IT recognized the
business need behind the bad habit and responded by developing its own version of the service:
an internal file sharing application.
Now, employees are able to securely store, access, and share files containing internal information
on their mobile devices, and after the successful internal release of the SAP Mobile Documents
app, it is also available for customers.
SAP runs SAP HANA in the cloud
Continuous improvements to existing offerings, like in the example mentioned above, are the
bread and butter of “SAP Runs SAP.” But the program also plays an important role in the
development of large-scale, strategic innovations, such as the SAP HANA platform. The first
company in the world to use the in-memory technology, back in 2010, was none other than the
global software enterprise that created it.
SAP HANA was initially deployed at SAP in a so-called “side-by-side” implementation, in
which data is replicated from an existing database to the SAP HANA database, where it is stored
in main memory. This configuration drastically reduces the time it takes to analyze and create
reports on huge volumes of data, such as the sales pipeline for an entire company. Thanks to the data compression of the SAP HANA database, SAP has been able to reduce its data volume from 7.1 TB to just 1.8 TB.
In fact, SAP Sales Pipeline Analysis analytic content was one of the first real-time applications
made available internally. Rather than taking six hours to analyze current sales figures,
executives and sales managers were able to do so in a matter of seconds. And since the solution
became generally available, customers have been able to do the same.
In the meantime, employees have seen three further developments of SAP HANA: as the
primary data store for the SAP NetWeaver Business Warehouse (SAP NetWeaver BW)
application, as the underlying platform for new applications, and now, finally, as the integrated,
in-memory platform powering core applications in SAP Business Suite. For each of these
implementations, SAP employees acted as guinea pigs, testing functionality in real business
scenarios and identifying new use cases.
As cloud computing became an ever more important topic, SAP recognized the trend and reacted early by moving its core applications into its own cloud environment powered by SAP HANA. Customers now have the opportunity to accelerate the deployment of their SAP HANA projects within the SAP HANA Enterprise Cloud (SAP HEC), where infrastructure is combined with managed services tailored to each customer’s individual needs. This enables customers – including SAP as its own first customer – to benefit from SAP’s latest innovations and application technologies, as well as from a large increase in implementation speed, and with it a better time-to-value. Thanks to these benefits and the added flexibility, SAP can now improve its products and applications even faster while reducing the total cost of ownership (TCO).
As of 2014, SAP HANA Enterprise Cloud operates the largest SAP HANA systems worldwide,
including SAP’s own single global SAP Enterprise Resource Planning (SAP ERP), SAP
Customer Relationship Management (SAP CRM), and SAP NetWeaver BW applications. The
ERP on HANA currently has over 67,000 business users; 15,000 CRM users now benefit from improved search performance – it is 250 times faster within the HANA Enterprise Cloud – and more than 4,500 BW users gain from a reduced implementation time of only five months.
So now "SAP Runs SAP" in the HANA Enterprise Cloud and – of course – all the experience gained from SAP's own implementations also benefits its customers.
Stories like these show the constant fine-tuning of SAP software that occurs both before it is
made available to customers and after it’s on the market. But in order to use innovations like
mobile apps and in-memory technology in the business in the first place, the technical foundation
has to be sound.
1 ERP for 67,000 users
Less flashy than the iPad, perhaps, but just as important to the “SAP Runs SAP” campaign, is the
fact that the company runs a single instance of the SAP ERP application for 67,000 employees,
in 220 locations, and in 70 countries. It also has single instances for many other applications,
including SAP Supplier Relationship Management (SAP SRM), SAP CRM, and SAP
NetWeaver BW applications. This means that every region, line of business, and acquired
company within SAP uses one version of the software.
Having only one version of data within any given application is the most convincing argument
for single instance installations. It increases transparency in financial reports, eliminates
duplicate information and errors, makes it easier to spot new revenue opportunities, and speeds
up the implementation of innovations. Still, the approach isn’t without its detractors. Some fear a
never-ending, large-scale implementation. Others are wary of the downtime that an upgrade
could cause when applied on a company-wide level. So how does SAP manage to maintain its
single instance installations?
In large part, it comes down to excellence in the data center. SAP has one of the highest
virtualization rates among its peer group.
At the end of 2011, 66% of its servers were running virtual machines; by early 2014, SAP had reached a virtualization rate of over 80% on its servers.
This allows the company to carry out software upgrades without disrupting users and to flexibly
manage the changes in business need that occur at different times of the day, quarter, or year.
Scalability is another factor, and nothing can match the virtually unlimited scalability of
applications operated in the cloud. Since its acquisitions of SuccessFactors in February 2012 and
Ariba in May 2012, the software company from Walldorf has rapidly expanded its use of cloud-
based applications internally. SAP employees started using the SAP Cloud for Travel and
Expense solution and the SuccessFactors human capital management suite. Meanwhile, the acquired
cloud companies are putting SAP Cloud for Travel and Expense as well as the SAP Cloud for Sales,
SAP Cloud for Service, and SAP Business ByDesign solutions – all of which run exclusively
in SAP's own data center – to work.
And who ensures that the SAP data center meets the highest standards for security, availability,
and scalability, so that the company has a solid foundation for implementing innovative
technologies quickly? You guessed it: SAP Runs SAP.
SAP Runs SAP is active in five categories:

Cloud: Global IT will grow in these four areas: HR, FI, CRM, SRM

Mobility: Employees are enabled to connect anytime, from anywhere

Database: SAP HANA – employees have access to powerful features

Applications: Drive lower TCO, higher performance, and more flexibility

Analytics: Real-time information – SAP employees can perform better