A long time ago, I implemented duplicate checking in Apex for custom object Foo__c. I decided it was time to use SFDC’s out-of-the-box Duplicate Rules so I could move the logic into point-and-click configuration and clean up the code.
Good idea eh?
Well, sort of. There are some considerations before you jump into this.
Starting condition:
My existing Apex logic checked for duplicates intrabatch as well as extrabatch. Meaning, if two Foo__c’s with the same key appeared in the same trigger set, they were both flagged as duplicate errors. Similarly, if any Foo__c within the batch matched an existing Foo__c outside of the batch, it would be flagged as an error.
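For reference, the old handler looked roughly like the sketch below. Foo__c is real, but the class name, method name, and the matching key field Key__c are assumptions made for illustration; the actual code isn't shown here.

public with sharing class FooDuplicateChecker {
    // Called from the Foo__c before insert / before update trigger
    public static void checkForDuplicates(List<Foo__c> newFoos) {
        // Intrabatch: group the incoming records by key; any key seen more than once is an error
        Map<String, List<Foo__c>> foosByKey = new Map<String, List<Foo__c>>();
        for (Foo__c foo : newFoos) {
            if (!foosByKey.containsKey(foo.Key__c)) {
                foosByKey.put(foo.Key__c, new List<Foo__c>());
            }
            foosByKey.get(foo.Key__c).add(foo);
        }
        for (List<Foo__c> sameKey : foosByKey.values()) {
            if (sameKey.size() > 1) {
                for (Foo__c foo : sameKey) {
                    foo.addError('Another Foo in this batch has key ' + foo.Key__c);
                }
            }
        }
        // Extrabatch: one SOQL query against existing Foos that share a key with the batch
        for (Foo__c existing : [SELECT Id, Key__c FROM Foo__c WHERE Key__c IN :foosByKey.keySet()]) {
            List<Foo__c> incoming = foosByKey.get(existing.Key__c);
            if (incoming == null) continue; // SOQL matching is case-insensitive; the map is not
            for (Foo__c foo : incoming) {
                if (foo.Id != existing.Id) {
                    foo.addError('A Foo with key ' + foo.Key__c + ' already exists: ' + existing.Id);
                }
            }
        }
    }
}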
Consideration (coding)
- Unfortunately, SFDC duplicate rules won’t block intrabatch duplicates. This is documented in the Help.
- Doubly unfortunate, once you insert duplicate Foos in a batch, if you edit one of the duplicate Foos without changing the matching key field, SFDC won’t check it against the other Foo with the same key. For example, if you bulk uploaded two Foos in one batch, each with key ‘Bar’, SFDC doesn’t detect duplicates. When you go to edit one of the Foos with key ‘Bar’ and change any field other than the matching key, SFDC won’t tell you that Foo i with key ‘Bar’ is the same as existing Foo j with key ‘Bar’. The sketch below illustrates this gap.
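Here is that scenario, assuming the matching key field is Key__c and using Some_Other_Field__c as a stand-in for any non-key field (both field names are hypothetical):

List<Foo__c> batch = new List<Foo__c>{
    new Foo__c(Key__c = 'Bar'),
    new Foo__c(Key__c = 'Bar')
};
// Both records arrive in the same batch, so the duplicate rule compares neither
// against the other and both inserts succeed.
insert batch;

// Editing one of them later without touching the matching key does not re-run
// matching, so the duplicate still goes undetected.
batch[0].Some_Other_Field__c = 'changed';
update batch[0];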
That said, you do get to eliminate any Apex code that does SOQL to check for duplicates extrabatch.
Workaround
If you really want to block Foos with the same key from getting into the database, you have to implement Apex verification in your domain layer (i.e. trigger). No SOQL is required because all the records that have to be checked will be in Trigger.new.
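A minimal sketch of that trigger-level check, again assuming the matching key field is Key__c; the Duplicate Rule is left to handle matching against records already in the database:

trigger FooDuplicateGuard on Foo__c (before insert, before update) {
    // Intrabatch check only: every record that needs comparing is already in Trigger.new
    Map<String, List<Foo__c>> foosByKey = new Map<String, List<Foo__c>>();
    for (Foo__c foo : Trigger.new) {
        if (!foosByKey.containsKey(foo.Key__c)) {
            foosByKey.put(foo.Key__c, new List<Foo__c>());
        }
        foosByKey.get(foo.Key__c).add(foo);
    }
    // Flag every record whose key appears more than once in this transaction
    for (List<Foo__c> sameKey : foosByKey.values()) {
        if (sameKey.size() > 1) {
            for (Foo__c foo : sameKey) {
                foo.addError('Another Foo in this batch has key ' + foo.Key__c);
            }
        }
    }
}

Grouping by key keeps the check to a single pass over Trigger.new and consumes no SOQL queries.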
Consideration (deployment)
As of V37.0 (Summer 16), there is no way to deploy Duplicate Rules through ant or Change Sets. You have to manually add the Duplicate Rules in your target orgs. You can deploy MatchingRules via ant or Change Sets but that won’t do you much good as they have to be bound to a Duplicate Rule. There is an Idea worth voting up.