How to Manage Large Amounts of Data
with Salesforce
Data & Integration Considerations for Architects
Paul McCollum
Cloud Architect
pmccollum@sensecorp.com
www.sensecorp.com/contact
@SenseCorp
facebook.com/SenseCorpSC/
linkedin.com/company/sense-corp
It’s just Data, right?
• Map my data scale (Migration and Ongoing)
• Will my application fit (long term) in my Target Architecture?
• Do I have any “Design Skews”?
• Where will my pain points be?
• Pain points and solution patterns for:
– Lots of data (LDV*)
– Lots of connections
– Lots of connections to lots of data sources
• In this session, we focus on identifying data and integration issues in the design phase and discuss some specific solution platforms.
GOAL: Learn Architect Design Planning Patterns
“A ‘large data volume’ is an imprecise, elastic
term. If your deployment has tens of thousands of
users, tens of millions of records, or hundreds of
gigabytes of total record storage, you have a large
data volume…”
~Salesforce
https://developer.salesforce.com/docs/atlas.en-us.salesforce_large_data_volumes_bp.meta/salesforce_large_data_volumes_bp/ldv_deployments_introduction.htm
The Science of Architecture
• As soon as you build an application (sometimes before you are even done), it takes on a life of its own.
• Architects need to see the present and predict the future.
• Good architecture is responsible for the ultimate success of the application or project.
• In relative dollars, it costs:
– $1 to effectively design and architect a solution
– $10 to re-design it due to technical debt if not architected well
– $100 to completely scrap and re-tool
– $____ to repair your reputation after a write-off
It’s just Data, right?
• We create:
– Pages
– Buttons
– Flows
– Validation Rules*
• Talking to:
– Objects
– Fields
• Combined with:
– Authentication: Active Directory, SSO
– HR Profile Data, Medical History, Order History
We’re all used to building Applications in Salesforce.
*Salesforce has imposed a limit of 100 Validation Rules per Object.
Ever try to create test data on an object with 100 Validation Rules?
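One mitigation pattern worth noting here (our sketch, not from the deck): have every validation rule check a "bypass" hierarchy custom setting so test setup can switch the rules off. A minimal Apex sketch, where Bypass__c and Skip_Validation__c are hypothetical names and each rule is assumed to start with AND(NOT($Setup.Bypass__c.Skip_Validation__c), ...):

@isTest
static void makeTestData() {
    // Flip the hypothetical bypass flag for the running user, then insert freely.
    insert new Bypass__c(SetupOwnerId = UserInfo.getUserId(), Skip_Validation__c = true);
    insert new Account(Name = 'Test Account'); // all 100 rules short-circuit on the flag
}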
Building Applications Based on Data Scale
1:1:1 or 101 : 101 : 101*
• We try to understand our application's consumption of data relative to the complexity and magnitude of downstream data, not because data is bad but because handling lots of anything has a cost.
• For our purposes, "Design Skew" is defined as having components out of proportion with each other.
• Skew can cause both system and operational degradation.
• The following slides show examples of how to start documenting your data interactions and how to predict and mitigate possible skew complications.
*Obviously, there are more than three elements to a solution; the key is balance.
Master Data
Be aware that if you are building or defining relationships for Master Data, you may also be creating duplicate "Master Logic".
Rules, logic, and relationships for data that is not your application's Master Data should be delegated upstream to the Master Data owner.
For example: product price and tax rate calculation should not be done in multiple systems. Differences in rounding rules can lead to discrepancies that are hard to reconcile.
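The rounding point is easy to demonstrate. In this Apex sketch (values arbitrary), two systems rounding the same amount under different rules end up a cent apart:

// The same amount under two common rounding rules disagrees by a cent.
Decimal tax = 0.125;
System.debug(tax.setScale(2, RoundingMode.HALF_UP));   // 0.13
System.debug(tax.setScale(2, RoundingMode.HALF_EVEN)); // 0.12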
Common Types of Design Skews
How do I map my application?
[Diagram: User → Salesforce (Objects, Services) → Data]
Start with this simple diagram and sketch:
• User counts
• User types
• Functions
• Connectivity
• Firewall
• Domains
• Data sources
Share it with all teams and stakeholders.
• Attach it to every deploy and code review
• Any changes must be ratified
Architecture Worksheet
[Diagram: User → Salesforce (Objects, Services) → Data]
Worksheet columns: Users | Functions | Objects | Records | Connections | Dependencies | Data Sources | Master Logic | Master Data
Reference Architectures
Any time the application has a 1:>1:>1 ratio, target a growth-enabled company infrastructure: scalable, regular patterns, repeatable, mature support.
[Diagram: Salesforce Objects connected through MuleSoft/Jitterbit to Kafka (streaming), NoSQL, an ESB, a DAM, SFTP, ETL, a mainframe, a Data Lake (object storage) holding multiple data sets, and Snowflake.]
Step 1: Dream Big
Architects: Visualize Success
• How big will this get?
• How many people will use this?
• If it gets to size X, will I have to rebuild?
• Define Growth Boundaries with recommended changes.
• Create Transactional Warning systems inside triggers and functions.
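For example, a minimal Apex sketch of such a warning, where Perf_Settings__c and PerfAdmin__c are hypothetical names for a custom setting that stores the admin's address: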
// Alert the perf admin when this transaction passes 70% of the DML row limit.
if (Limits.getDmlRows() > 0.7 * Limits.getLimitDmlRows()) {
    Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
    mail.setToAddresses(new String[] { Perf_Settings__c.getOrgDefaults().PerfAdmin__c });
    mail.setSubject('Transaction passed 70% of the DML row limit');
    Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
}
Step 2: Dream Bigger
• Does this data have value?
– The answer is always: Yes.
• The larger the volume of data created, the more likely there will be
value in analyzing that data.
– Trends
– Next Best Action
– Inefficiencies
– Agent Performance
– ROI
• As you create data, think of the many ways it could be used.
– Plan accordingly!
Too Many Records: Schema Overload
• Trying to keep the data model a perfect twin to reality
• Not having a plan to make data "mature out" of the system
• Avoiding data skew
• Is every application responsible for forensics?
[Diagram: a single Salesforce org crowded with many Objects and Services around one data store.]
https://developer.salesforce.com/blogs/engineering/2012/04/avoid-account-data-skew-for-peak-performance.html
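As a quick check for the account data skew described in that post (our sketch; ~10,000 children per parent is the commonly cited threshold), an aggregate query can flag parents getting close:

// Flag Accounts approaching the ~10,000-child data-skew threshold.
for (AggregateResult ar : [
        SELECT AccountId parentId, COUNT(Id) kids
        FROM Contact
        GROUP BY AccountId
        HAVING COUNT(Id) > 9000]) {
    System.debug(ar.get('parentId') + ' has ' + ar.get('kids') + ' child contacts');
}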
When to use Big Objects
https://trailhead.salesforce.com/content/learn/modules/big-data-strategy/choose-the-right-big-data-solution
Salesforce Object Storage
https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object.htm
Big Objects (NoSQL)
• When to use? When data is only recorded for forensics or broad reporting and is not accessed very often; when companies don't already have a reporting ODS available. Big Objects are read/write optimized, but do not bring data into Salesforce JUST to perform calculations on it, and do not store data in Big Objects if you expect it to grow infinitely.
• What you need to be successful? Operational and administrative diligence managing size, load, and license cost.
• Licensing? Licensing through Salesforce is 'extra' but reasonable.

Data Lake / Cloud Data Warehouse
• When to use? When data assets from Salesforce need to be mashed up with other enterprise data; many 3rd-party integrations; data that can be reused or resold; organizations that have a strong or evolving cloud analytics strategy; when ETL or external data is required for context; once you get to 1+ TB of enterprise data. These platforms offer transact and read/write optimizations and reduce security headaches for consumers.
• What you need to be successful? A healthy cloud management presence or engagement.
• Licensing? Varies, but can be much cheaper from a bulk storage perspective.

Maturity path: Objects → Big Objects → Data Lake/CDW
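As a concrete example of moving data down that path, archiving closed cases into a Big Object might look like the following Apex sketch (Case_History__b and its fields are hypothetical):

// Copy closed Cases into a hypothetical Case_History__b Big Object.
List<Case_History__b> rows = new List<Case_History__b>();
for (Case c : [SELECT Id, Subject, ClosedDate FROM Case WHERE IsClosed = true LIMIT 200]) {
    rows.add(new Case_History__b(
        Case_Id__c = c.Id, Subject__c = c.Subject, Closed_On__c = c.ClosedDate));
}
Database.insertImmediate(rows); // Big Object DML uses insertImmediate, not insert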
Leading API Orchestration Technology Platforms
When to use?
• If your application relies on a service or data provider that is also changing, strongly consider an investment in an API framework like MuleSoft or Jitterbit.
• These frameworks allow Services and ETL steps to be performed at the translation layer, and they can be versioned and tracked.
• Vitally important if your application needs to be UP while it is being changed or while source systems are changing.
What you need to be successful? Still new; bespoke management is required. Dividends are clearly paid in uptime and efficiency.
Licensing? Varies.
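From the Salesforce side, the payoff is that callouts target the orchestration layer's versioned endpoint instead of each raw backend. A minimal Apex sketch (the Mule_Gateway named credential and /v2/orders path are hypothetical):

// Call a versioned endpoint exposed by the API layer, not the backend directly.
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:Mule_Gateway/v2/orders'); // named credential, hypothetical
req.setMethod('GET');
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode() + ' ' + res.getBody());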
Leading Analytics Technology Platforms
• Dream big: what is the future of your data scale?
• Consider architecting for data offload to an ODS like Snowflake early.
• Snowflake is great for near-real-time processing using the Snowpipe feature.
• Storage and compute resources are separated.
• It can scale quickly if you need more compute power to process queries.
• It handles both structured and semi-structured data.
Cloud Data Warehouse (e.g., Snowflake)
• When to use? Sales trends, customer demographics, area performance, product returns, sales, discounts, and bundles: all data that provides insight once accumulated. When you are going to run heavy calculations with near-real-time data. When you need the ability to share information across different organizations without having to transfer files.
• What you need to be successful? If metrics and data are going to play a strategic role in your business, invest early. Competency in modern cloud analytics architecture.
• Licensing? Cloud consumption-based model.

BI and Visualization Platforms
• When to use? Reporting and dashboarding against a data lake or cloud data warehouse; your needs are subject to change and scalability is a concern; multiple data sources.
• What you need to be successful? Data literacy, a BI skillset, and data visualization and UI/UX skills.
• Licensing? Traditional or consumption-based.
On the Horizon?
• With Salesforce's acquisition of Tableau and closer dealings with Snowflake, things could change rapidly in this area.
• Most likely this will tip the scales toward investments in both, if you are planning to continue or grow your Salesforce investment.
Thanks For Joining Us
We hope you enjoyed the presentation.
If you’d like to learn more about how we use Salesforce
to help transform your organization, contact us.
https://sensecorp.com/salesforce-sense-corp/
CONTACT US
www.sensecorp.com | marketing@sensecorp.com
Questions?
Editor's Notes
  • #2: The ability of Salesforce to handle large workloads and participate in high-consumption, mobile-application-powering technologies continues to evolve. Pub/sub models and the investment in adjacent properties like Snowflake, Kafka, and MuleSoft have broadened the development scope of Salesforce. Solutions now range from internal, in-platform applications to fueling world-scale mobile applications and integrations. Unfortunately, the extended capabilities are still not well understood or well documented. Knowing when your solution needs to move to a higher-order solution is an important architect skill. In this webinar, Paul McCollum, UXMC and Technical Architect at Sense Corp, shares an overview of data and architecture considerations. Attend to learn how to identify reasons and guidelines for updating your solutions to larger-scale modern reference infrastructures, as well as when to introduce products like Big Objects, Kafka, MuleSoft, and Snowflake.
  • #4: It’s time to Level-Up and start designing for full lifecycle and fit. http://virtualdreamin.com/top-9-considerations-when-transitioning-from-a-developer-to-an-architect/ https://developer.salesforce.com/docs/atlas.en-us.salesforce_large_data_volumes_bp.meta/salesforce_large_data_volumes_bp/ldv_deployments_introduction.htm
  • #5: Consequences! Let's look at some of the typical types of design and their potential problems, and why this is important. Worst agile design oversight ever: continuous slight changes to the architecture without foresight led to a surprise 10x increase in cost to the client, which led to a surprise "escort out of the building" and termination of the contract.
  • #6: Maximum of 100 validation rules on an object. Ever try to create test data on an object with 100 validation rules? Think the designer PLANNED for 100 rules? Probably not. Audience participation: what would you have to do to work around 100 validation rules? Let's take phone numbers. You need some validation rules: 10 digits; xxx-xxx-xxxx format; area code matches a valid area code for the Account.State lookup; number not on a no-call list; number not clearly fake (repeated digits). I'm exhausted at 5; can you imagine what it would take to hit 101? You know why there's a maximum of 100? Because some maniac tried to do business with 200!
  • #7: https://developer.salesforce.com/docs/atlas.en-us.salesforce_large_data_volumes_bp.meta/salesforce_large_data_volumes_bp/ldv_deployments_introduction.htm
  • #8: Is the schema too big to understand? Is the application too complex to be supported?
  • #9: Intro: simplification of Enterprise solution components Next: Patterns and Pitfalls
  • #10: Intro: simplification of Enterprise solution components Next: Patterns and Pitfalls
  • #11: Mature, Manageable, Scalable Architecture We start with this architecture in mind and very seldom devolve to simple (albeit cheaper) models. Our goal is to build a long-term thriving set of applications inside Salesforce and other subscription-based systems. Licensing a suite with only 1 application is a massive waste of funds. Build for scale in systems that are Built and Billed to scale.
  • #13: Story: Slashdotting
  • #14: When you let a business user design the schema
  • #15: Salesforce has put forth their own decision tree on big objects.
  • #16: Data Lakes store and allow access to multiple copies of source data in a single location (often with cheaper per-byte options). They reduce security headaches for consumers. Consider Data Lakes a prerequisite if your organization is in any phase of a cloud migration. Cloud Data Lakes provide cloud-consumer-ready access for all platforms. If anything is going to "the cloud", your integration data will probably need to go too. Best to execute once and not reintegrate later, again.
  • #17: Table
  • #18: Mention data exchange and data marketplace as moving across boundaries. The second you want to use more than one source of data, go Snowflake (and a terabyte).
  • #19: Courtesy Chris Rosser