

5 Tips for Managing Salesforce Cloud Costs

This guest post is presented by Gilad David Maayan, a technology writer who has worked with over 150 technology companies including SAP, Imperva, Samsung NEXT, NetApp and Check Point, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership. Today he heads Agile SEO, the leading marketing agency in the technology industry.


Understanding and effectively managing Salesforce cloud costs can significantly impact a company’s bottom line. Salesforce, as a leading CRM platform, offers a broad range of functionality that caters to diverse business needs. However, the more of this functionality a business uses, the higher the ongoing cost of the platform.


From user licenses to API calls, from data storage to custom developments, each element plays a role in the overall expense structure. To make the most of Salesforce without overspending, businesses must be proactive in their approach to cost management, as part of a holistic cloud cost management strategy. This article delves into the details of Salesforce cloud costs and provides actionable strategies to keep these costs under control.

Factors Affecting Salesforce Cloud Costs 

User Licenses

When it comes to Salesforce cloud costs, user licenses are one of the significant contributors. Salesforce offers different types of licenses, each with its own pricing. The more licenses you purchase, the higher your costs will be.

Note that some licenses offer more features and functionalities but come with a higher price tag. Pricing ranges from $25 / user / month for the Starter tier, all the way up to $500 / user / month for the Unlimited tier. Hence, you need to strike a balance between the number and type of licenses to manage your Salesforce costs effectively.

API Calls

Another factor affecting Salesforce cloud costs is API calls. Salesforce provides APIs to integrate with other systems and applications. However, API usage is metered: each edition includes a daily allocation of calls, and heavy usage beyond that allocation can add to your costs. The more API calls your business makes, the higher your costs can be.

By managing your API calls, you can control your Salesforce costs. You can identify and eliminate unnecessary API calls, optimize the usage of APIs, and align your API strategy with your business objectives.
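
To make that monitoring concrete, here is a minimal Python sketch (using the requests library) that reads the org’s daily API request allocation from the REST limits resource. The instance URL, access token, and API version are placeholders you would replace with your own values; treat this as a sketch, not a finished tool.

```python
# Minimal sketch: check how much of the daily API request allocation remains.
# Assumes you already have an OAuth access token and your org's instance URL;
# the REST "limits" resource returns org-wide limits, including DailyApiRequests.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v57.0/limits",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

api_limits = resp.json()["DailyApiRequests"]
used = api_limits["Max"] - api_limits["Remaining"]
print(f"Daily API requests: {used} used of {api_limits['Max']} "
      f"({api_limits['Remaining']} remaining)")
```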

Data Storage

Data storage is another factor that influences Salesforce cloud costs. Salesforce provides data storage for your records, files, and other data. However, the more data you store, the higher your costs will be.

By managing your data storage, you can reduce your Salesforce costs. You can identify and delete redundant or obsolete data, optimize your data management practices, and ensure that your data storage costs are in line with your business needs and budget.

Custom Development

The last factor affecting Salesforce cloud costs is custom development. Salesforce offers a highly customizable platform. You can develop custom apps, features, or integrations to meet your specific business needs. However, custom development comes with its costs.

By managing your custom development costs, you can control your Salesforce expenses. You can prioritize your development projects, leverage reusable components, and ensure that your custom development efforts are cost-effective and aligned with your business goals.

5 Tips for Managing Salesforce Cloud Costs 

Let’s dive into five actionable tips that can help you manage your Salesforce cloud costs effectively.

Regularly Review User Licenses

Your Salesforce subscription is primarily based on the number of user licenses. It’s essential to regularly review and adjust these licenses to ensure you’re not paying for more than you need.

Each user license equates to a seat in your Salesforce organization. You pay for these seats whether they are occupied or not. If you have unused licenses, you’re essentially wasting money. Regularly reviewing your user licenses and deactivating unused or unnecessary ones is a simple yet effective way to manage your Salesforce costs.

Also evaluate the types of licenses used by your users. Make sure you’re using the right type of license for each user. Don’t pay for high-end licenses for users who only need basic features.
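
As a starting point for such a review, the hedged sketch below queries for active users who haven’t logged in for 90 days, which are likely candidates for deactivation or a cheaper license type. It reuses the same placeholder instance URL and access token as the earlier snippet and ignores result pagination for brevity.

```python
# Minimal sketch: list active users who haven't logged in for 90 days,
# as candidates for deactivation or a cheaper license type.
# Assumes the same ACCESS_TOKEN / INSTANCE_URL placeholders as the earlier snippet.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder

soql = ("SELECT Id, Username, Profile.Name, LastLoginDate "
        "FROM User WHERE IsActive = true AND LastLoginDate < LAST_N_DAYS:90")

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v57.0/query",
    params={"q": soql},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for user in resp.json()["records"]:
    print(user["Username"], user["Profile"]["Name"], user["LastLoginDate"])
```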

Optimize Data Storage

Data storage is another significant factor in Salesforce cost. Salesforce provides a certain amount of data storage per user license, and once you exceed this, you need to pay extra. Therefore, optimizing your data storage can help manage your Salesforce costs:

  • Ensure you’re only storing necessary data: Regularly review your data and delete or archive anything that’s not needed (see the sizing sketch after this list).
  • Use efficient data structures: Salesforce has various types of data storage, each with its own storage limit. By using the right type of storage for each piece of data, you can optimize your storage usage.
  • External storage solutions: If you have large amounts of data that don’t need to be on Salesforce, moving them to an external storage solution can significantly reduce your Salesforce data storage costs.
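
To size up how much data is a candidate for deletion or archiving, a rough sketch like the following can help. It simply counts closed Cases and completed Tasks older than two years via SOQL COUNT() queries; the objects, cut-off, and credentials are illustrative assumptions, not a prescription for your org.

```python
# Minimal sketch: size up candidates for deletion/archiving by counting
# closed Cases and completed Tasks older than two years.
# Assumes ACCESS_TOKEN / INSTANCE_URL placeholders as in the earlier snippets.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

candidates = {
    "old closed cases": "SELECT COUNT() FROM Case "
                        "WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2",
    "old completed tasks": "SELECT COUNT() FROM Task "
                           "WHERE IsClosed = true AND ActivityDate < LAST_N_YEARS:2",
}

for label, soql in candidates.items():
    resp = requests.get(f"{INSTANCE_URL}/services/data/v57.0/query",
                        params={"q": soql}, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    print(f"{label}: {resp.json()['totalSize']} records")
```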

Monitor API Calls

Salesforce limits the number of API calls you can make in a 24-hour period, based on your user licenses. Exceeding these limits can lead to additional costs. Therefore, monitoring your API calls is an important part of cost management.

  • Understand your API usage: Identify which processes generate the most API calls and determine if they are necessary. You may find that some processes can be optimized or eliminated to reduce API calls.
  • Consider using batch processes: Batch processes allow multiple records to be processed in a single API call, reducing the total number of API calls (see the sketch below).
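
For illustration, the sketch below updates up to 200 Contact records in a single call using the REST sObject Collections (composite/sobjects) resource, rather than issuing one call per record. The record IDs, field values, and credentials are placeholders.

```python
# Minimal sketch: update up to 200 Contact records in a single API call using the
# sObject Collections ("composite/sobjects") resource instead of one call per record.
# Record IDs and field values are placeholders; ACCESS_TOKEN / INSTANCE_URL as above.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder

records = [
    {"attributes": {"type": "Contact"}, "id": "0031x00000AAAAAAA1", "MailingCity": "Boston"},
    {"attributes": {"type": "Contact"}, "id": "0031x00000AAAAAAA2", "MailingCity": "Austin"},
    # ... up to 200 records per request
]

resp = requests.patch(
    f"{INSTANCE_URL}/services/data/v57.0/composite/sobjects",
    json={"allOrNone": False, "records": records},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for result in resp.json():
    status = "ok" if result["success"] else result["errors"]
    print(result.get("id"), status)
```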

Utilize Native Features Before Third-Party Integrations

Salesforce offers a wide range of native features that can meet most business needs. Before resorting to third-party integrations, which can add to your costs, consider if you can achieve your goals using Salesforce’s native features.

Using native features can also improve your overall Salesforce experience. Native features are designed to work seamlessly with Salesforce, ensuring optimal performance and user experience.

Implement Governance Policies

Lastly, consider implementing governance policies to manage your Salesforce costs. Governance policies can help ensure your Salesforce usage aligns with your business goals and budget.

A good governance policy should cover usage guidelines, user licenses management, data storage optimization, API usage, and third-party integrations. It should also include regular reviews and audits to ensure compliance.

Implementing a governance policy may seem like a daunting task, but it’s an investment that can yield significant returns in terms of cost management.

Salesforce is a powerful platform that can drive your business success. However, without prudent cost management, it can become a costly endeavor. By regularly reviewing your user licenses, optimizing your data storage, monitoring your API calls, utilizing native features, and implementing governance policies, you can keep your Salesforce costs under control while getting full value from the platform.

Salesforce is Retiring Their Data Recovery Service. Here’s What You Should Know.

Mike Melone is a Content Marketing Manager at OwnBackup, a global leader in SaaS business continuity and data protection solutions. New to the Salesforce ecosystem, Mike brings a decade of experience in copywriting and marketing to OwnBackup, where he curates original content related to all things Salesforce backup & recovery.


Salesforce is the most secure and available platform in the industry. Yet data protection remains a shared responsibility as it does with all modern SaaS platforms. Customers are responsible for preventing user-inflicted data and metadata loss and corruption and for having a plan in place to recover if it happens. That’s why Salesforce recommends using a partner backup solution that can be found on the AppExchange.

Despite this recommendation, our 2020 State of Salesforce Data Protection survey found that 88% of companies are lacking a comprehensive backup and recovery solution, which may be especially risky with many companies increasing their remote workforce. Furthermore, 69% say their company may be at significant risk of user-inflicted data loss.

Salesforce Data Recovery Service Is Being Retired

Last year, Salesforce announced that it will retire its last-resort Data Recovery service, effective July 31, 2020, because the service has not met its high standards of customer success and trust. For customers who aren’t proactively backing up their data, Salesforce currently offers this last-resort service; however, at a cost of over $10,000 and a 6-8 week recovery time, it is not an adequate option for many customers. The upcoming retirement is a good reminder that a proactive backup and recovery strategy has always been a recommended best practice. Salesforce offers a number of native backup options to minimize your business risk of user-inflicted data loss.

Native Salesforce Options

Weekly Export and Report Exports

Users with “Data Export” profile permissions can generate backup .CSV files of their data on a weekly basis (once every 6 days) or on-demand via user-generated reports. The export can be scheduled and then manually downloaded when ready. Weekly Export backups can include images, documents, attachments, and Chatter files. In the event of a user-inflicted data loss, users must manually restore by uploading their .CSV files in the correct order. Because this process can be challenging, OwnBackup has created a free eBook with a step-by-step process for Recovering Lost Data Using Salesforce Weekly Export.
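
To illustrate why restore order matters, here is a hedged sketch of re-importing a backup in dependency order: Accounts first, then Contacts remapped to the newly created Account IDs. The file names, CSV column headers, and credentials are illustrative assumptions; a real restore would cover many more objects and fields.

```python
# Minimal sketch: re-import a weekly-export backup in dependency order
# (parents before children), remapping old Account IDs to the new ones so that
# Contacts stay attached to the right Accounts. File names, columns, and the
# ACCESS_TOKEN / INSTANCE_URL values are illustrative placeholders.
import csv
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
BASE = f"{INSTANCE_URL}/services/data/v57.0/sobjects"

def create(sobject, fields):
    """Insert one record and return its new Salesforce ID."""
    resp = requests.post(f"{BASE}/{sobject}", json=fields, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]

# 1. Restore parents first and remember old ID -> new ID.
account_id_map = {}
with open("Account.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        new_id = create("Account", {"Name": row["NAME"]})
        account_id_map[row["ID"]] = new_id

# 2. Restore children, pointing them at the newly created parents.
with open("Contact.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        create("Contact", {
            "LastName": row["LAST_NAME"],
            "AccountId": account_id_map.get(row["ACCOUNT_ID"]),
        })
```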

Data Loader

Data Loader is a client application for the bulk import or export of data. It can be used through the user interface to insert, update, upsert, delete, or export Salesforce records as .CSV files. The Data Loader command line can also be used to move data to or from a relational database, such as Oracle or SQL Server. Restoration of lost or corrupted data would still be a manual process.
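
As a scripted stand-in for the kind of export Data Loader performs (this is not Data Loader itself), the following sketch pages through a SOQL query via the REST API and writes the results to a .CSV file. The object, field list, and credentials are placeholders.

```python
# Minimal sketch: export an object to a .CSV file via the REST query API, a scripted
# stand-in for the kind of export Data Loader performs (this is not Data Loader itself).
# Object, fields, and the ACCESS_TOKEN / INSTANCE_URL values are placeholders.
import csv
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

fields = ["Id", "Name", "Industry"]
soql = f"SELECT {', '.join(fields)} FROM Account"
url = f"{INSTANCE_URL}/services/data/v57.0/query"
params = {"q": soql}

with open("Account_backup.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    while url:
        resp = requests.get(url, params=params, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        for rec in payload["records"]:
            writer.writerow({k: rec.get(k) for k in fields})
        # Follow pagination until all records have been written.
        next_url = payload.get("nextRecordsUrl")
        url = f"{INSTANCE_URL}{next_url}" if next_url else None
        params = None  # query string is only needed on the first request
```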

These native methods include data, but no metadata. In order to proactively protect your Salesforce platform, you should back up metadata and attachments as well. Without this vital piece, putting the relationships between your Salesforce data objects back in place can become a painstaking process.
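
One lightweight way to capture at least part of that picture is to snapshot an object’s schema description alongside your data exports. The hedged sketch below saves the REST “describe” result for an object to disk; this is a partial schema snapshot, not a full Metadata API backup, and the object name and credentials are placeholders.

```python
# Minimal sketch: snapshot an object's field/relationship metadata by saving its
# REST "describe" result to disk. This is a lightweight schema snapshot, not a full
# Metadata API backup; object name and credentials are placeholders.
import json
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v57.0/sobjects/Account/describe",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

with open("Account_describe.json", "w", encoding="utf-8") as f:
    json.dump(resp.json(), f, indent=2)
```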

Without the ability to maintain relationships, you’ll only have partial restore capabilities. If an account is accidentally deleted, all of its contacts, cases, tasks, and other records will be deleted as well. If you only restore the account, but not its dependent child records, your recovery provides only partial coverage.

Having a copy of your data is important to meet the minimum standards of a backup. The real challenge is the ability to restore data back into Salesforce. Plan for regular data recovery testing to ensure that you are prepared for any unexpected data events. You must test your strategy so you know what will actually happen if you experience a data loss or corruption. When you test, you may find that you are unable to recover in the time, or to the extent, that you require. When testing, check:

  • Are you able to recover specific versions of documents, data, or metadata?
  • Are you able to minimize data transformation during the restoration process?
  • How does your strategy handle different types of restore processes?
  • What is the performance and time to restore?

How to Trust in Your Salesforce Data Recovery Plan

At the beginning of this article, we mentioned that Salesforce recommends using a partner backup solution from the AppExchange. The reason for this recommendation is almost entirely the challenges most customers experience when attempting to recover lost or corrupted data, and the fact that metadata cannot currently be backed up using native backup methods.

OwnBackup, a top-rated partner on the AppExchange, is here to help you identify the five concepts that really matter when it comes to a comprehensive backup AND recovery.

1. Recovery Point Objective (RPO) and Recovery Time Objective (RTO)

For many companies, having unresolved data loss or corruption for over one week could equate to millions of dollars in lost revenue. As you construct your data recovery strategy, you will need to define your recovery point objective (RPO) and recovery time objective (RTO).

The RPO is the amount of data a company can afford to lose before it begins to impact business operations. Therefore, the RPO is an indicator of how often a company should back up its data. For example, if you can only afford to lose 24 hours of changes, a daily backup is the minimum frequency that can meet that RPO.

The RTO is the timeframe by which both applications and systems must be restored after data loss or corruption has occurred. The goal here is for companies to be able to calculate how fast they need to recover, by preparing in advance.

Your actual recovery time and recovery point are deeply influenced by backup frequency, backup retention, your ability to compare current and past data, and your ability to restore just the data that has been impacted. RPO and RTO are two parameters that help minimize the risks associated with user-inflicted data loss.

2. Data Integrity

Data integrity means that you should have a complete backup of your Salesforce environment. Backing up a few critical fields or records isn’t enough. Salesforce is a relational database that doesn’t allow admins to manipulate or populate record IDs, and all of the relationships in a Salesforce environment are based on these record IDs. In the case of a data loss, a cascade delete effect will remove all of the child records associated with the deleted parent record, such as an account.

This is one of the big reasons why an audit trail is not a backup and recovery solution. You never know what the scope of a future data loss will be. For that reason, relying on field-level changes on a subset of objects as a backup solution is a fundamentally flawed strategy.

Therefore, the best practice is to have a full backup of all of your Salesforce data, metadata, and attachments. You may also want a partial, high-frequency backup for critical objects that change more than a few times each day.

Another best practice for maintaining data integrity is having restore capabilities that allow both full and granular restores, to prevent accidentally overwriting legitimate changes.

Finally, the integrity of the backups themselves is critical, particularly for companies with internal data security policies or regulatory compliance concerns.

3. Security

Your company likely has to meet internal and regulatory security requirements, and your backup and recovery process shouldn’t be exempt from them. Saving .CSV files of your Salesforce data on a company hard drive or your laptop should not be considered a best practice. Ideally, your backup data should be saved to trusted cloud storage. To meet compliance requirements, the data should also be encrypted both in transit and at rest.

You should also consider the permission settings and accessibility of your backups. With your current backup process, could an inappropriate user access and delete data?

4. Reliability

A reliable backup and recovery plan includes automated change identification, scheduled backups, proactive data change monitoring, and the ability to reach out for technical support if needed.

You should have the ability to automatically identify changes that have occurred in your schema. A tool that can identify what has changed between backups is important for data, metadata, and attachments.

Your backups should run daily at a minimum, on a set schedule that is easy to define and monitor.

With so many end-users with read-write and delete permissions or merge abilities, proactive monitoring is crucial. A reliable data backup and recovery plan allows you to quickly find out when unusual changes have occurred or if a backup has failed.

5. Accessibility

Storing data backups outside of Salesforce is a best practice for accessibility. If Salesforce were to become temporarily unavailable, you would still have access to your data and metadata.

Data backup portability is a must for an optimal data backup and recovery plan. Data movement should be able to take place securely and conveniently. Whether you’re looking to conduct more complex reporting on large data sets with BI and ETL tools or want to maintain legacy business continuity and data recovery procedures, your data should be yours to use as you please.

Data accessibility is also a key requirement of GDPR and CCPA. In support of these regulations, you should have full transparency into your data, including your backups. To comply, you need to be able to search for where a data subject’s data, including attachments, resides within your backups. Additionally, you must be able to rectify or forget an individual’s data in your backups for full compliance.

 

Protect Your Data and Maintain Business Continuity with Comprehensive Backup and Recovery

By making the tough decision to retire their Data Recovery service, Salesforce has reiterated their commitment to customer success and trust. Now is the perfect time to reassess your current backup and recovery strategy. Were you completely relying on Salesforce’s Data Recovery service? Why not start backing up with their native Weekly Export service instead? Any backup strategy is better than doing nothing at all.

We’d also encourage you to explore Salesforce AppExchange partner solutions, such as OwnBackup. OwnBackup brings ROI to its over 1,700 customers every day by helping them protect their Salesforce data. OwnBackup customers are almost 3x more likely to notice a data loss or corruption and they feel 3x more prepared to recover.

Now is the perfect opportunity to take a step back and put a comprehensive backup and recovery plan in place to ensure that your Salesforce data has the same level of protection as your other critical systems.

To request a demo and learn more about OwnBackup and our solutions, click here.

Salesforce Data Management 101: Know Your Storage

Today’s guest post is delivered by Gilad David Maayan, a technology writer who has worked with over 150 technology companies including SAP, Samsung NEXT, NetApp and Imperva, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership.


When developing an app, you need to know how data is stored, structured, and organized. This information is crucial when building, maintaining, and updating your software. It can also help you understand the capabilities of your build, how far you can take it, and when it will need to be scaled up.

In Salesforce, storage is divided into two types, data and file, and file storage itself offers five methods designed for specific use cases: Files, CRM content, Documents, Attachments, and Knowledge. In this article, you will learn how storage works in Salesforce, including tips to help you avoid hitting your storage limits.

How Data is Stored in Salesforce

When working with Salesforce, there are several reliable and efficient ways to store your data. This includes media files, customer profiles, documents, and presentations. This storage is broken down into two types — data and file. 

Data storage includes many fields, such as accounts, cases, custom objects, events, opportunities, and notes. This data is automatically stored within the Salesforce database and you do not have individual control over where specific items go. 

File storage includes attachment files, customer content, media, Chatter files, documents, and custom files in Knowledge articles. You can control this content individually, depending on how it is created and attached. Below are the five methods you can use for file storage.

Files

Salesforce Files is a storage location you can use to store any type of file. Salesforce has positioned it to replace most of the following methods as it offers more features and functionality. Files enables you to follow specific files, generate links, share files with users or groups, and collaborate on files. In Files, each file can be up to 2GB. 

Customer relationship management (CRM) content

Salesforce CRM content is where you can store files that you want to publish and share with coworkers and customers, such as presentations or content packs. This can include marketing files, document templates, media, or support files. This storage type supports files up to 2GB, although the limit drops to 10MB for some upload methods.

Documents

Documents storage enables you to store a variety of web resources, including logos, email templates, and Visualforce materials. When files are stored here, you do not have to attach data to specific records. In Documents, files can be up to 5MB.

One thing to keep in mind: if you are using an older version of Salesforce, Documents storage is still available. However, in Lightning Experience this functionality has been replaced by Files, and when you upgrade you need to convert your Documents to Files before you can access that data.

Attachments

Attachments is a storage area you can use for files that you want to attach to specific records. For example, marketing campaigns, cases, or contact information. The downside of Attachments is that you can’t share files with links and do not have access to version control. In Attachments, files can be up to 25MB and feeds can be up to 2GB.

Knowledge

Knowledge is a storage area you can use to create and store knowledge base articles. These files can be searched by internal users and shared with customers through your portals or Lightning Platform Sites. In Knowledge, each article can be up to 5MB.

How to Avoid Hitting Your Storage Limits in Salesforce

Regardless of how you store and manage your files in Salesforce, you need to be aware of what your storage limits are and how to make the most of those limits. You should also be aware of what alternative options you have to expand your storage. 

Storing data outside of Salesforce

Sometimes, the most practical option is to store some of your data outside of Salesforce. One reason for this is your storage limits. In Salesforce you are allowed:

  • Data storage: 10GB of base storage plus 20MB per user. On the Performance or Unlimited editions, per-user storage is 120MB instead. The Developer, Personal, and Essentials editions follow different rules, with no per-user data storage and 5MB, 20MB, and 10GB of total data storage, respectively (a toy calculation follows this list).
  • File storage: for most plans you get 10GB per organization and from 612MB to 2GB per user. The Developer and Personal plans get 20MB, and Essentials gets 1GB, with no per-user file storage.
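
To make these figures easier to reason about, here is a toy calculation that estimates an org’s data storage allowance from its edition and seat count, using only the numbers quoted above. Actual allowances depend on your edition and contract, so treat this as an illustration rather than an official calculator.

```python
# Toy calculation based on the figures listed above: estimate an org's data storage
# allowance from its edition and seat count. Figures mirror the bullet list and may
# differ from your contract; treat this as an illustration, not an official calculator.

PER_USER_MB = {"performance": 120, "unlimited": 120}  # other listed editions: 20 MB

def data_storage_allowance_mb(edition: str, users: int, base_gb: int = 10) -> int:
    """Return the approximate data storage allowance in MB."""
    per_user = PER_USER_MB.get(edition.lower(), 20)
    return base_gb * 1024 + users * per_user

print(data_storage_allowance_mb("enterprise", 200))  # 10 GB + 200 * 20 MB  = 14240 MB
print(data_storage_allowance_mb("unlimited", 200))   # 10 GB + 200 * 120 MB = 34240 MB
```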

Even if your data is still within storage limits, keeping redundant or unnecessary data in Salesforce can cause issues, including:

  • Degraded performance
  • Inaccurate reporting
  • Inefficient searches

To avoid these issues and ensure that your limits are not exceeded, you might consider adopting a cloud storage service. These services can provide scalable, low-cost storage that you can connect to your Salesforce system via APIs or third-party extensions.

For example, Azure File Storage by NetApp can provide a standard file system format that you can use from anywhere, including hybrid systems. Or, AWS S3 services can be connected for unstructured storage and any type of data. 

Cleaning up unwanted data

Maybe you do not want to store data outside of Salesforce or you have already moved data but still want to improve storage efficiency. In these cases, you can focus on cleaning your data. You can do this either manually or automatically depending on the type of data you’re trying to eliminate. 

For manual clean-up, Salesforce provides a native deletion wizard. You can use this wizard to eliminate old accounts, contacts, activities, leads, or cases. To identify data that is safe to remove, you can run a report to see when data was last used and eliminate records older than a certain date. Or, you can delete data individually as users inform you it’s no longer accurate.
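
A hedged sketch of that “eliminate records older than a certain date” approach is shown below: it queries old, closed Tasks and deletes them in batches of 200 via the REST sObject Collections resource. The object, cut-off date, and credentials are illustrative; always back up what you delete and test in a sandbox first.

```python
# Minimal sketch: find Tasks that were closed before a cut-off date and delete them in
# batches of 200 via the sObject Collections resource. Run against a sandbox first and
# back up anything you delete; the cut-off and placeholders below are illustrative.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "00D...!AQ..."                            # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

soql = ("SELECT Id FROM Task "
        "WHERE IsClosed = true AND LastModifiedDate < 2020-01-01T00:00:00Z")
resp = requests.get(f"{INSTANCE_URL}/services/data/v57.0/query",
                    params={"q": soql}, headers=HEADERS, timeout=30)
resp.raise_for_status()
# Note: only the first page of query results is handled here, for brevity.
ids = [rec["Id"] for rec in resp.json()["records"]]

for i in range(0, len(ids), 200):  # collections delete accepts up to 200 IDs per call
    batch = ids[i:i + 200]
    del_resp = requests.delete(
        f"{INSTANCE_URL}/services/data/v57.0/composite/sobjects",
        params={"ids": ",".join(batch), "allOrNone": "false"},
        headers=HEADERS,
        timeout=30,
    )
    del_resp.raise_for_status()
    for result in del_resp.json():
        if not result["success"]:
            print("failed:", result.get("id"), result["errors"])
```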

Another option is to use extract, transform, load (ETL) tools to pull your data, process it (removing unnecessary data), and load the remaining data back in. This option enables you to script clean-up based on whatever parameters you’d like. However, it can be a lengthy process and requires the help of external tools, such as Salesforce Data Loader or Informatica.

Archiving data

During your data downsizing, you will probably find data that you no longer need in your system but don’t want to delete. For example, old client files you must keep for compliance, historical customer reports, or knowledge base articles for legacy products or services.

If you have data like this that you want ‘just in case’, archiving is your best option. Archiving enables you to export data from your system, compress it for efficiency, and store it wherever you prefer. 

Often, the previously mentioned cloud services are a good option for this. Many services have cold storage tiers available that are much cheaper than on-premise storage. These services enable you to store large volumes of data that you rarely need to access and can eliminate worries about data corruption or loss due to hardware failure. 
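
As one possible implementation, the sketch below compresses an exported .CSV with gzip and uploads it to an S3 archive storage class using boto3. The bucket, key, and file names are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch: compress an exported .CSV and push it to an S3 cold-storage tier with
# boto3. Bucket name and file names are placeholders; AWS credentials are assumed to be
# configured in the environment (e.g. via AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY).
import gzip
import shutil

import boto3

SOURCE = "Account_backup.csv"        # placeholder export file
ARCHIVE = "Account_backup.csv.gz"
BUCKET = "my-salesforce-archive"     # placeholder bucket

# Compress the export before uploading to save on storage costs.
with open(SOURCE, "rb") as src, gzip.open(ARCHIVE, "wb") as dst:
    shutil.copyfileobj(src, dst)

# Upload straight into an archive storage class.
s3 = boto3.client("s3")
s3.upload_file(ARCHIVE, BUCKET, f"salesforce/{ARCHIVE}",
               ExtraArgs={"StorageClass": "GLACIER"})
```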

Conclusion

Salesforce comes with a specific data management structure that you need to work within. The two basic storage types are data and files, and file storage is sorted further into five organizational types: Files, CRM content, Documents, Attachments, and Knowledge. However, you do not have to use all of these; a recent Salesforce change enables you to store most of these elements as Files.

Whichever structure you choose, be sure to continually monitor and optimize your storage. Adding monitoring on a regular basis can help you optimize both performance and billing. To avoid hitting your storage limit, you can store data outside of Salesforce, clean up unwanted data, and archive cold data. 
