Yes, multiple filter criteria can be set to archive data using an archive API.
Yes, users can access the archived records using the DataArchiva related list component.
Yes, before archiving the records, the archiving engine checks for metadata changes and, if any are found, syncs them to the relational database automatically.
Yes, the customer may need to purchase additional Big Objects storage from Salesforce to use DataArchiva, depending on data volume. If you want to archive data up to 2 GB, the 1 million records of Big Objects storage available by default is sufficient; you don’t have to buy extra Big Objects storage.
Big Objects data backup can be set up using any third-party ETL tool, as Big Objects supports the same APIs as a custom object (e.g., Data Loader).
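For illustration, here is a minimal Apex sketch of how a Big Object behaves like a custom object for writes and reads; the object name Case_Archive__b and its fields are assumptions, not DataArchiva's actual schema.

```apex
// Hypothetical archive Big Object: Case_Archive__b with an index whose
// leading field is Account__c. Names here are assumptions for illustration.

// Big Objects are written with insertImmediate (or insertAsync), not standard DML.
Case_Archive__b row = new Case_Archive__b(
    Account__c       = '001000000000001AAA',      // hypothetical Account Id
    Archived_Date__c = Datetime.now(),
    Subject__c       = 'Archived support case'
);
Database.insertImmediate(row);

// Reads use plain SOQL filtered on the leading index field -- the same kind of
// query an ETL tool or a Data Loader export job would issue against the object.
List<Case_Archive__b> rows = [
    SELECT Account__c, Archived_Date__c, Subject__c
    FROM Case_Archive__b
    WHERE Account__c = '001000000000001AAA'
];
```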
As of now, it will not allow the user to restore. We have a change request (CR) in progress where it will fall back to an active user defined/set up by the admin (via custom settings).
Yes, to move archived records DataArchiva makes use of Batch Apex. The archive engine fully honors Salesforce platform governor limits. The archive can be scheduled during business hours as well, but it is recommended to schedule it outside business hours.
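As a rough illustration of the Batch Apex pattern referred to above (DataArchiva's actual batch classes are not public, and the object and criteria below are assumptions):

```apex
// Illustrative Batch Apex skeleton; not DataArchiva's actual implementation.
global class ArchiveClosedCasesBatch implements Database.Batchable<SObject> {

    // Scope: closed Cases older than two years (assumed archiving criteria).
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id FROM Case WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2'
        );
    }

    // Records arrive in chunks, so governor limits apply per chunk rather than per job.
    global void execute(Database.BatchableContext bc, List<Case> scope) {
        // ... copy the chunk to the archive store, then remove or flag the originals ...
    }

    global void finish(Database.BatchableContext bc) {
        // ... write an audit entry or send a completion/failure notification ...
    }
}
```

Such a job could be kicked off with Database.executeBatch(new ArchiveClosedCasesBatch(), 200) and run outside business hours through a Schedulable wrapper and System.schedule.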
By default, it is a soft delete. We also offer a configurable option for hard delete.
Almost none. Most of the major functionalities are the same irrespective of the storage, though there would be a minor change while setting up the app for the first time.
Big Objects does not support role hierarchy and sharing rules. DataArchiva retains the share records of each archived record in Big Objects to evaluate who can see the records. Moreover, the archived data view page uses the describe layout metadata API to render the page dynamically.
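As a minimal sketch of what dynamic rendering from describe metadata can look like (illustrative Apex only, not DataArchiva's actual page controller):

```apex
// Build the list of fields to display for an object at runtime, honoring
// field-level security for the running user. Case is used here only as an example.
Schema.DescribeSObjectResult dsr = Case.SObjectType.getDescribe();
List<String> visibleFieldLabels = new List<String>();
for (Schema.SObjectField f : dsr.fields.getMap().values()) {
    Schema.DescribeFieldResult dfr = f.getDescribe();
    if (dfr.isAccessible()) {                 // skip fields the user cannot read
        visibleFieldLabels.add(dfr.getLabel());
    }
}
System.debug(visibleFieldLabels);
```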
Salesforce Shield doesn’t support Big Objects or outside data stores (e.g., AWS Cloud). So, we built our own encryption tool as part of the product offering using the AES-256 algorithm with key rotation. This solution has been aligned with and validated by the Salesforce ISV tech team, and it supports both Big Objects and outside storage out of the box.
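For context, AES-256 with a managed IV is available natively in Apex through the Crypto class; the sketch below only illustrates the algorithm, not the product's own key management or rotation.

```apex
// Minimal AES-256 round trip using the platform Crypto class (illustration only).
Blob aesKey    = Crypto.generateAesKey(256);             // 256-bit key
Blob plaintext = Blob.valueOf('Sensitive field value');

// encryptWithManagedIV generates a random IV and prepends it to the ciphertext.
Blob ciphertext = Crypto.encryptWithManagedIV('AES256', aesKey, plaintext);
Blob decrypted  = Crypto.decryptWithManagedIV('AES256', aesKey, ciphertext);

System.assertEquals('Sensitive field value', decrypted.toString());
```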
It does not use Salesforce Connect. It is a custom-built part of the product offering called DataConnectiva, which works similarly to Salesforce Connect. The customer doesn’t need to buy any extra add-on.
Yes. Large data volumes can be archived using a migration utility, which is a one-time activity done initially. Our support team can help you with this.
No. Data integrity remains intact between the live and archived data, which can be viewed from the parent record page in Salesforce without having to restore.
Fields of archived data can be encrypted at rest for standard/custom objects. It uses the highest standard of encryption technology and also supports a key rotation policy.
Yes. It does support Chatter/Feeds.
Archived data can be used for analytics. You can apply Einstein Analytics on the archived data and get useful business insights. Archiving the data using DataArchiva also ensures 100% legacy data accessibility which can be used for compliance demands.
DataArchiva takes care of attachments by re-linking relationships internally to retain their content even after the parent data is archived. The files/attachments still remain in Salesforce, but we do have another solution called XfilesPro to move files to an external cloud in case your file storage requirements are high.
Yes. DataArchiva has a feature called Field Audit Trail (FAT) which helps in archiving field history data for years. DataArchiva even supports System Audit Trail.
No. Big Objects doesn’t offer data re-write.
DataArchiva is an archiving solution, not a backup solution. So, the data will be archived and stored in Big Objects. This means your data will not be in your primary storage.
Yes. You can archive the tasks from your Salesforce Org & still have them available at a record level.
Yes. DataArchiva is fully compatible with Lightning Experience.
Yes, a single record can be archived manually.
We convert all the read-only data into JSON format and store it in a field on the object that you are trying to restore.
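A minimal sketch of that JSON round trip is shown below; the specific field names and values are assumptions for illustration, not the product's actual schema.

```apex
// Serialize read-only values into JSON so they can be kept on the restored record.
Map<String, Object> readOnlyValues = new Map<String, Object>{
    'CreatedDate' => Datetime.now(),
    'CreatedById' => UserInfo.getUserId(),
    'CaseNumber'  => '00001234'
};
String payload = JSON.serialize(readOnlyValues);

// Later, the stored payload can be parsed back for display or auditing.
Map<String, Object> parsed = (Map<String, Object>) JSON.deserializeUntyped(payload);
System.debug(parsed.get('CaseNumber'));
```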
If anything fails, you will get an email notification from the application. However, you can always go and check the audit log section in DataArchiva.
Yes. DataArchiva is GDPR Compliant.
Yes, it does.
There is a Restore button available in the application, using which you can restore your archived data with a single click.
Yes. Big Objects is highly scalable and reliable.
Yes. Your data will be highly secured as Big Objects is a native Salesforce storage system.
As DataArchiva is a native solution, your archived data will remain in the Salesforce ecosystem. This won’t impact any of the ongoing business operations.
Using the DataArchiva Usage Analyzer feature, you can figure out which objects are consuming more data storage space and select them for archiving.
Yes, DataArchiva offers seamless support to any internal or external files & attachments.
Yes, you can. DataArchiva offers a connector called “DataConnectiva” using which you can choose any Cloud/On-premise storage system to store your archived data. Currently, DataConnectiva supports various Cloud/On-premise database service providers such as Amazon, Google, Azure, and Heroku, including many databases like Postgres, Redshift, MySQL, Oracle, MS SQL, etc.
For more information, please get in touch with us.
As long as you have access to those objects, you can archive them.
DataArchiva offers data type mapping; due to Big Objects data type limitations, picklist values are stored as text.
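Conceptually, such a mapping can be pictured as a lookup from Salesforce field describe types to archive-friendly types; the entries below are assumptions for illustration, not DataArchiva's actual mapping.

```apex
// Hypothetical data type mapping from source field types to archive field types.
Map<Schema.DisplayType, String> archiveTypeFor = new Map<Schema.DisplayType, String>{
    Schema.DisplayType.Picklist      => 'Text',           // picklists become plain text
    Schema.DisplayType.MultiPicklist => 'Long Text Area',
    Schema.DisplayType.Currency      => 'Number'
};
System.debug(archiveTypeFor.get(Schema.DisplayType.Picklist)); // Text
```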
No. Big Objects is Salesforce’s big data based storage system.
Yes, you can, but it would be under the related list section of its parent object.
We have built DataArchiva with a Visualforce page, which will help you view the archived data.
No. Our Metadata sync feature will take care of that.
You can see the details of your archived data in the DataArchiva Dashboard.
We don’t want you to go, but if you uninstall DataArchiva, we will ensure our technical team helps you out in getting back your already archived data.
Yes, you can, but you have to purchase XfilesPro & Encryptik licenses.
Please email us at sales@dataarchiva.com with the details; our sales rep will reach out to you within 24 hours.
That’s it, nothing else.
Being big data storage, it can hold large volumes of data, in the order of terabytes. There is no official size limit for Big Objects mentioned by Salesforce.
By default, Salesforce provides free storage of up to 1 million records for Big Objects; beyond this, it is chargeable. The cost of Big Objects storage is way cheaper than regular additional data storage.
No additional software or hardware is required for DataArchiva.
DataArchiva archives data into Salesforce’s big data storage called Big Objects. DataArchiva has a layer on top of Big Objects that takes care of handling object relationships, data types, indexes, etc. under the hood.
DataArchiva is the ONLY native archiving solution available on the AppExchange. This means the archived data won’t go out of the Salesforce ecosystem. Being native to Salesforce, there are many advantages with respect to data security, throughput, platform scalability, hardware, etc.
Yes, the archived data can always be encrypted.
No, adding or modifying indexes on existing Big Objects is not supported.
Yes, you can archive child records along with the parent record.
DataArchiva is a 100% Native Salesforce data archiving solution available in the AppExchange which can periodically archive your historical Salesforce data. With Auto-scheduler, Custom Archiving, Encryption, Integrity & Restore features, DataArchiva saves 85%+ storage costs, improves CRM performance & drives better governance.