Big Objects

Do we need to purchase new Big Objects in order to use DataArchiva?

Yes, for larger data volumes the customer needs to purchase additional Big Object storage. If you only want to archive up to 2 GB of data, the default 1 million record allocation is sufficient and you do not have to buy any extra Big Object storage.

Can backups be created of the Big Objects used to archive data?

Yes. Big Object data backups can be set up using any third-party ETL tool, since Big Objects support the same APIs as custom objects (for example, Data Loader).
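As a rough illustration, here is a minimal Python sketch of such a backup using the open-source simple-salesforce library. The Big Object name Archived_Contact__b, its fields, and the credentials are hypothetical placeholders; Big Object SOQL also has index-field restrictions, so adapt the query to your object's index and consider the Bulk API for very large archives.

```python
import csv

from simple_salesforce import Salesforce

# Credentials are placeholders; use your own org's login details.
sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Export the archived rows. Big Object SOQL has index-field restrictions,
# and very large archives are better read with the Bulk API or in chunks
# keyed on the index fields.
result = sf.query_all(
    "SELECT Account_Id__c, Archived_Date__c, Email__c FROM Archived_Contact__b"
)

fields = ["Account_Id__c", "Archived_Date__c", "Email__c"]
with open("big_object_backup.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    for record in result["records"]:
        writer.writerow({k: record.get(k) for k in fields})
```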

How is archived data access configured in Big Objects? How do you ensure the right users can access Big Objects when role hierarchy and sharing rules are not available?

Big Objects do not support role hierarchy or sharing rules. DataArchiva retains the share records of each archived record in the Big Objects and evaluates them to determine who can see a record. In addition, the archived data view page uses the describeLayout metadata API to render the page dynamically.
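To make the share-record idea concrete, below is a small, purely illustrative Python sketch of how retained share entries could be evaluated at view time. The class and field names are invented for this example; it is not DataArchiva's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class ArchivedRecord:
    record_id: str
    payload: dict


@dataclass
class ArchivedShare:
    record_id: str         # archived record this share entry protects
    user_or_group_id: str  # user or group granted access


def visible_records(records, shares, user_id, group_ids):
    """Return only the archived records the viewing user may see."""
    allowed = {
        s.record_id
        for s in shares
        if s.user_or_group_id == user_id or s.user_or_group_id in group_ids
    }
    return [r for r in records if r.record_id in allowed]
```

Each archived record is displayed only if a retained share entry matches the viewing user or one of their groups.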

Is data stored in Big Objects reliable and scalable?

Yes. Big Object storage is highly scalable and reliable.

Is data stored in Big Objects safe and secure?

Yes. Your data will be highly secure, as Big Objects are a native Salesforce storage system.

Can I store my archived data in external systems other than Big Objects?

Yes, you can. DataArchiva offers a connector called “DataConnectiva” that lets you store your archived data in any cloud or on-premise storage system. For more information, please get in touch with us.

What is the storage limit of Big Objects?

Big Objects are a big data store, so very large volumes of data, on the order of terabytes, can be stored. Salesforce does not officially state a size limit for Big Objects.

Do I need to pay extra for Big Object storage in addition to data storage?

By default, Salesforce provides free storage for up to 1 million Big Object records; anything beyond this is chargeable. Big Object storage costs significantly less than regular Salesforce object storage.