Category Archives: apex

Schedulable Jobs – Constructors and execute()

This issue threw me for a loop for some time.

Suppose you have a Schedulable class and, when it runs, the behavior of the execute() method doesn’t match what you expect. It looks as if there is memory of previous execute() calls.

Here is a specific example that distills the issue:

public class MySchedulable implements Schedulable {

   Account[] accts = new List<Account>();

   public void execute(SchedulableContext sc) {
      accts.addAll([select id, name from Account where YTD_Total__c < 1000.0]);
      // do something interesting with this 
   }
}

You run the job weekly. On week n, where n > 1, you observe that Accounts are being processed that do not currently have YTD_Total__c < 1000. That is, Accounts with YTD_Total__c > 1000 are being processed (?!?)

Explanation

  • When the job is scheduled, the class constructor is called (here, the implicit constructor) and the object variable accts is initialized empty.
  • On week 1, when execute() is called, the accounts with YTD_Total__c < 1000 as of week 1 are added to list accts.
  • On week 2, when execute() is called, the accounts with YTD_Total__c < 1000 as of week 2 are added to list accts.

Note that the class constructor is not reinvoked for each execute(). Hence, the accts list grows and grows.

Solution
Reinitialize all object variables within the execute().

public class MySchedulable implements Schedulable {

   public void execute(SchedulableContext sc) {
      Account[] accts = new List<Account>();
      accts.addAll([select id, name from Account where YTD_Total__c < 1000.0]);
      // do something interesting with this 
   }
}
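If you prefer to avoid instance state on the Schedulable altogether, another option (a minimal sketch, not from the original post; the class names are made up) is to delegate the work to a fresh Queueable on every run:

// Sketch only: keep the Schedulable free of instance state by handing the
// work to a new Queueable instance each time execute() fires.
public class MyStatelessSchedulable implements Schedulable {

   public void execute(SchedulableContext sc) {
      // A new worker instance is created each week, so nothing accumulates
      System.enqueueJob(new WeeklyAccountWorker());
   }

   public class WeeklyAccountWorker implements Queueable {
      public void execute(QueueableContext qc) {
         Account[] accts = [select id, name from Account where YTD_Total__c < 1000.0];
         // do something interesting with this
      }
   }
}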

SObject method isClone() Nuance

I discovered something while writing a unit test today.

SObject class method isClone() does not return true unless the clone source SObject exists in the database.

Account a = new Account(name = '00clonesrc');
insert a;
Account aClone = a.clone(false,true,false,false);
system.debug(LoggingLevel.Info,'isClone='+aClone.isClone());

The debug line shows as: isClone=true

But omit the insert, as in this example:

Account a = new Account(name = '00clonesrc');
Account aClone = a.clone(false,true,false,false);
system.debug(LoggingLevel.Info,'isClone='+aClone.isClone());

The debug line shows as: isClone=false

Normally, this might not be an issue but I was unit testing a service layer method and passing cloned sobjects in as arguments without doing the database operation in order to make the tests run faster. This is one place where the DML is required in order to get isClone() to work as expected.

Update 2016-10-04

Per a suggestion by Adrian Larson, I retried using a dummy ID:

Account a = new Account(id = '001000000000000000', name = '00clonesrc');
Account aClone = a.clone(false,true,false,false);
system.debug(LoggingLevel.Info,'isClone='+aClone.isClone());

The debug line shows as: isClone=true

Database allOrNone and exceptions

In my batch classes, I’m a big user of the allOrNone argument to Database.insert or Database.update. This is mostly because some orgs have dirty data and dealing with those as non-fatal exceptions is better for the business than rolling back all DML from a batch execute() that might be dealing with 199 good records and only one bad one.

So, the normal pattern would be

Database.SaveResult[] srList = Database.update(listOfSobjs, false);
for (Integer i = 0; i < listOfSobjs.size(); i++) {
  if (!srList[i].isSuccess()) {
      // log the error somewhere for later admin action - typically to a persistent sobj
  }
}

But what if you had this coding fragment where the allOrNone argument was a variable, sometimes true, sometimes false?

Database.SaveResult[] srList = Database.update(listOfSobjs, allOrNoneVbl);
for (Integer i = 0; i < listOfSobjs.size(); i++) {
  if (!srList[i].isSuccess()) {
      // log the error somewhere for later admin action - typically to a persistent sobj
  }
}

Well, and again the doc isn’t completely clear on this: if allOrNoneVbl is true, no Database.SaveResults are returned and an exception is thrown instead. Here’s proof:

try {
   Database.SaveResult[] srList = Database.insert(new List<Account>{
                                                    new Account(),
                                                    new Account()},
                                                 true);
   system.assert(false,'allOrNothing=true does not throw exception');
}
catch (Exception e) {
    system.assert(false,'allOrNothing=true does throw exception');
}

Debug log:
DML_END|[2]
EXCEPTION_THROWN|[2]|System.DmlException: Insert failed. First exception on row 0;
first error: REQUIRED_FIELD_MISSING,
Required fields are missing: [Name]: [Name]
EXCEPTION_THROWN|[6]|System.AssertException: Assertion Failed: allOrNothing=true does throw exception
FATAL_ERROR|System.AssertException: Assertion Failed: allOrNothing=true does throw exception

Conclusion: If your batch execute() is intended to log errors and you sometimes use allOrNone as true and sometimes as false in the same execute() (because you are doing multiple DML operations), your logging code becomes more complex, because the source of the error message lives in different places (the Database.SaveResult method getErrors() versus the caught DmlException’s getDmlMessage()).
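One way to keep the logging in a single place (a minimal sketch, not the original batch code; logError() is a hypothetical helper that writes to your persistent log SObject):

// Sketch: log DML errors uniformly whether allOrNone is true or false.
static void updateWithLogging(List<SObject> recs, Boolean allOrNoneVbl) {
    try {
        Database.SaveResult[] srList = Database.update(recs, allOrNoneVbl);
        // allOrNone=false (or no errors): errors arrive in the SaveResults
        for (Integer i = 0; i < recs.size(); i++) {
            if (!srList[i].isSuccess()) {
                for (Database.Error err : srList[i].getErrors()) {
                    logError(recs[i], err.getMessage());   // hypothetical helper
                }
            }
        }
    }
    catch (DmlException e) {
        // allOrNone=true: no SaveResults; errors arrive in the exception
        for (Integer i = 0; i < e.getNumDml(); i++) {
            logError(recs[e.getDmlIndex(i)], e.getDmlMessage(i));
        }
    }
}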

REST SOQL OFFSET and LIMIT – 2000 records

I recently ran into an issue with an Apex batchable class that executed a REST SOQL query from SFDC org A against SFDC org B.

The batch Apex code in org A was written to handle REST responses that looked like this:

{"totalSize":12,
"done":true | false,
"nextRecordsUrl" : "/services/data/v30.0/query/01g8000001J2eXvAAJ-2000",

"records":[
{"attributes":{
"type" : "the sobject",
"url" : "/services/data/v30.0/sobjects/the sobject/the id"
},
"field0 in query" : "value of field 0",
"field1 in query" : "value of field1",
...},
next record ...
]
}

If "done" : false was present, then, once the iterator reached the end of the retrieved SObjects, it would do an HTTP Request against the value in nextRecordsUrl .

Hence, the batch program could retrieve a large number of records.
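In outline, the queryMore handling is just a loop that follows nextRecordsUrl until done comes back true (a sketch, assuming a hypothetical sendGet() helper that performs the authenticated REST callout and returns the response body, and a queryUrl built from the SOQL above):

// Sketch of the queryMore loop; sendGet() and queryUrl are hypothetical.
List<Object> allRecords = new List<Object>();
Map<String, Object> page = (Map<String, Object>) JSON.deserializeUntyped(sendGet(queryUrl));
while (true) {
    allRecords.addAll((List<Object>) page.get('records'));
    if ((Boolean) page.get('done')) {
        break;                      // no more segments to fetch
    }
    // not done: follow nextRecordsUrl to get the next segment (max 2000 records)
    page = (Map<String, Object>) JSON.deserializeUntyped(
               sendGet((String) page.get('nextRecordsUrl')));
}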

And your problem was what?

Turns out my query string used in the original REST GET was of this form:

select id,name from Opportunity where closeDate = THIS_QUARTER Limit 10000 OFFSET 0

Even though there were more than 2000 Opportunities in Org B and even though the coding for the batchable class’s iterator could deal with the queryMore logic, the total number of records retrieved was exactly 2000.

Analysis

Turns out, if you code OFFSET 0, the maximum value returned in the totalSize JSON property of the REST response body is 2000 – regardless of there being more than 2000 rows in the queried org.

Solution
As there was already logic to deal with queryMore(), I simply removed the OFFSET clause. I verified there was no logic to increment OFFSET and repeat the query.

Why was there queryMore() logic in the code in the first place – why not implement LIMIT and OFFSET per SFDC spec?

Turns out, if the retrieved payload from a given query exceeds the maximum transfer size per REST response, SFDC will break up the query results into segments, indicated by the presence of "done": false. These segments will be at most 2000 records but could be fewer (and in my case were, due to payload size). The queryMore logic had to be added anyway. Once this was present, the OFFSET portion of the SOQL became superfluous and should have been removed everywhere. Unfortunately, a vestigial OFFSET 0 was present, causing the REST query to retrieve only 2000 rows before declaring itself done.

Partial success and trigger firing

Yesterday, I was baffled by the following observed behavior:

  1. Insert six Contacts using Database.insert(cList,false), five of which had some validation error, one which succeeded
  2. The successful Contact, in the before insert trigger, derives the value of accountId, if null in the inserted record
  3. The system.assert to verify the defaulting of accountId failed, even though the debug log clearly showed that it was set

Further investigation of the debug log showed the before insert trigger for the successful Contact was executing twice, and in the second iteration, the defaulting of the accountId was not occurring.

Before insert trigger executed twice? Huh?

After some reductionist testing, I verified that this only happened when using optAllOrNothing = false in Database.insert(cList,false), that is, when allowing partial successes. According to the documentation (in a section I had never read or paid attention to):

When errors occur because of a bulk DML call that originates from the SOAP API with default settings, or if the allOrNone parameter of a Database DML method was specified as false, the runtime engine attempts at least a partial save:

(1) During the first attempt, the runtime engine processes all records. Any record that generates an error due to issues such as validation rules or unique index violations is set aside.

(2) If there were errors during the first attempt, the runtime engine makes a second attempt that includes only those records that did not generate errors. All records that didn’t generate an error during the first attempt are processed, and if any record generates an error (perhaps because of race conditions) it is also set aside.

(3) If there were additional errors during the second attempt, the runtime engine makes a third and final attempt which includes only those records that didn’t generate errors during the first and second attempts. If any record generates an error, the entire operation fails with the error message, “Too many batch retries in the presence of Apex triggers and partial failures.”

It is the second point that is interesting – “the runtime engine makes a second attempt that includes only those records that did not generate errors” – ahhh, that is why the debug log showed two executions of before insert on the same record.

Now, why did the second attempt not default the Contact’s accountId using my Apex logic?

Answer: At the end of my before insert trigger handler, I set a static variable to prevent the trigger handler from re-executing (sort of a reflex action to avoid unnecessary SOQLs). Hence, when the second attempt at before insert was made, the static variable prevented the defaulting and the record saved successfully, but with a null AccountId. Hence the assertion failed.
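The guard looked roughly like this (a reconstruction with made-up names and a simplified derivation, not the actual handler):

// Reconstruction of the problematic pattern: a static "already ran" flag in the
// trigger handler. On the partial-success retry the flag is still true, so the
// accountId defaulting is silently skipped.
public class ContactTriggerHandler {
    public static Boolean hasRun = false;            // static: lives for the whole transaction

    public static void onBeforeInsert(List<Contact> newContacts, Id defaultAccountId) {
        if (hasRun) {
            return;                                   // the retry attempt exits here
        }
        for (Contact c : newContacts) {
            if (c.AccountId == null) {
                c.AccountId = defaultAccountId;       // the derivation that gets skipped on retry
            }
        }
        hasRun = true;                                // set at the end of the first attempt
    }
}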

A different portion of the SFDC Apex doc states:

When a DML call is made with partial success allowed, more than one attempt can be made to save the successful records if the initial attempt results in errors for some records. For example, an error can occur for a record when a user-validation rule fails. Triggers are fired during the first attempt and are fired again during subsequent attempts. Because these trigger invocations are part of the same transaction, static class variables that are accessed by the trigger aren’t reset. DML calls allow partial success when you set the allOrNone parameter of a Database DML method to false or when you call the SOAP API with default settings. For more details, see Bulk DML Exception Handling.

Note the sentence: “Because these trigger invocations are part of the same transaction, static class variables that are accessed by the trigger aren’t reset”.

So, while governor limits are not affected by the retry, the static variables persist, and hence the trigger handler’s derivation logic remained switched off after the initial before insert attempt completed.

Removing the setting of the static variable solved the issue.

Versioning woes – ApexPages.addMessage

I recently updated my Util class from V26 to V31. Unfortunately, this broke some VF page test methods. After some experimentation, I traced the problem down to this:

The ApexPages.Message class has several constructor forms. The one I was using is:

public Message(ApexPages.Severity severity, String summary, String detail)

I was using a pattern like this in some of my VF controllers that did some substantial DML:

public PageReference fooActionMeth() {
  String step;
  try {
    step = '01 step';
    // do some work 'A' using various Util methods that can throw exceptions
    // do some other work 'B' using various Util methods that can throw exceptions
    step = '02 step';
    // do some more work using various Util methods that can throw exceptions
  }
  catch (Exception e) {
   ApexPages.addMessage(new ApexPages.Message(ApexPages.Severity.ERROR,
                                              'some error at step:'+step,
                                              e.getMessage()));
  }
  return null;
}

My testmethod was structured as shown below. I like this pattern because I can test all the error conditions (usually) in sequence within one testmethod rather than going through the overhead of multiple testmethods, one per error case.

 // ...
 Test.startTest();
 Test.setCurrentPage(Page.Foo);
 PageReference resPg;
 // set up conditions to test step 1 and verify exception for work 'A' occurs
 resPg = fooActionMeth();
 System.assert(getMostRecentMsg().contains('expected error code'));
 // set up conditions to test step 1 and verify exception for work 'B' occurs
 resPg = fooActionMeth();
 System.assert(getMostRecentMsg().contains('expected error code')); // fails at V31
 // set up conditions to test step 2 and verify exception occurs
 resPg = fooActionMeth();
 System.assert(getMostRecentMsg().contains('expected error code'));
  // ...
 
  // utility method
  private static String getMostRecentMsg() {
     return ApexPages.hasMessages() 
	? ApexPages.getMessages()[ApexPages.getMessages().size()-1].getSummary() + 
          ' ' + 
          ApexPages.getMessages()[ApexPages.getMessages().size()-1].getDetail() 
	: null;
   }

But, when I upgraded the version of the Util worker code to V31 from V26, the testmethod stopped working for the second assert. Why?

To resolve, I broke the problem down to its essential elements using a separate test class:

@isTest
private static void testAddVFMsg() {    
    Test.startTest();
    Test.setCurrentPage(Page.Foo);
    ApexPages.addMessage(
      new ApexPages.Message(ApexPages.Severity.INFO,'MsgSummary','00detail'));
    System.assertEquals(1,ApexPages.getMessages().size());

    ApexPages.addMessage(
      new ApexPages.Message(ApexPages.Severity.INFO,'MsgSummary','01detail'));
     // passes at V28 and earlier, fails at V29+
    System.assertEquals(2,ApexPages.getMessages().size()); 

    Test.stopTest();
}

Note the arguments to the ApexPages.Message constructor between the two calls. Same summary (second argument), different detail (third argument). I then tried it at V31, V30, …until it passed.

Conclusion
ApexPages.addMessage() behavior varies by SFDC version if the i+1st message has the same summary as the ith message, even if the detail argument is different between i+1st and ith message.

If the summary values are the same, the i+1st message won’t get added to the page messages. This change happened as of V29.

Seems to me that if you call addMessage(..), the resulting getMessages() should include the message you added.

Versioning Woes – No such column – Field is valid

Here’s a tip to help resolve a System.QueryException ‘No such column’ when the field does exist in the schema.

Turns out I was generating a SOQL Select in class A using Schema.Describe to get all fields from OpportunityLineItem resulting in:

Select createdbyid, hasrevenueschedule, 
    description, isdeleted, quantity, systemmodstamp, 
    totalprice, hasschedule, createddate, subtotal, discount, 
    lastmodifiedbyid, currencyisocode, pricebookentryid, 
    lastmodifieddate, id, unitprice, servicedate, 
    listprice, sortorder, hasquantityschedule
from OpportunityLineItem

Yet, when this query was executed by class B, I got a System.QueryException: ‘No such column subtotal’.

‘subtotal’ is not a field I normally use in OpportunityLineItem so I wondered if it had been removed or added in the most recent SFDC Version (31). Nope.

Here was the issue:

  • Class A, that generated the SOQL from the Schema.Describe was at V27.0, recently upgraded from V13.0
  • Class B, that executed the generated SOQL from the Schema.Describe was at V16.0 (!)
  • Test Class T, that executed Class A and B was at V27.0

So, the Schema.describe in class A found all the fields known to SFDC version 27.0 which included OpportunityLineItem.subtotal but when executed by a much older class B (version 16), the subtotal field wasn’t part of the schema at that point in history so QueryException ensued.

Updating class B to V27.0 resolved the issue.
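For reference, the describe-driven query generation in class A looked conceptually like this (a sketch, not the original code; the field list it produces depends on the API version of the class that runs the describe, which is exactly where the mismatch came from):

// Sketch: build a "select all fields" SOQL string from a describe call.
static String buildSelectAll(Schema.SObjectType sobjType) {
    Schema.DescribeSObjectResult dsr = sobjType.getDescribe();
    List<String> fieldNames = new List<String>(dsr.fields.getMap().keySet());
    return 'select ' + String.join(fieldNames, ', ') + ' from ' + dsr.getName();
}

// hypothetical usage:
// String soql = buildSelectAll(OpportunityLineItem.SObjectType);
// List<OpportunityLineItem> rows = Database.query(soql);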

Object access from SOQL results

I was reading the second edition of Dan Appleman’s Advanced Apex Programming and looking at the code samples. In one sample, Dan showed me something I didn’t know you could do in Apex – something that had eluded me for six years – and so basic. I was so chagrined, but with shame comes resolution to help others avoid the same fate.

Let’s look at a typical SOQL query on Opportunity that references both parent and child objects:

    Opportunity o = [select account.id, account.name,
                            id, name,
                            (select id, quantity
                              from OpportunityLineItems)
                      from Opportunity where id = :someId];

Well, we know we can reference the child objects easily with:

   List<OpportunityLineItem> oliList = o.opportunityLineItems;

But here is what I did not know: you can also reference the parent Account object.

   Account a = o.account;  //sheesh

This is handy if you need to update the lookup (parent) object, as it avoids an unnecessary SOQL query to fetch it.
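For example (a minimal sketch), the parent Account pulled back through the relationship can be updated directly:

// Sketch: update the parent Account without a second SOQL query.
// Only the parent fields selected in the original query (here Id and Name)
// are populated on o.Account; any other field must be set before use.
Account a = o.Account;
a.Name = a.Name + ' (reviewed)';
update a;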

Salesforce to Salesforce Using REST – Part II

A bit of a temporary setback here.
I naively thought that whatever you could do in the SFDC SOAP API you could do in the REST API. Specifically, that it would be possible for APEX code in the source product catalog org to do mass inserts, updates, or upserts in the target org. NOPE.

The REST API only allows Create-Update-Delete operations on single resources – that is, single records. You can query (Read) to your heart’s content over many records. Teach me to not fully read the documentation.

Alternatives

  1. I could have the source org use the SOAP API. I looked into this and it is kind of a heavyweight solution wherein you generate a massive Apex class from the Partner or Enterprise WSDL. I tried this and realized that the Enterprise WSDL was too large to import and thus would need to be trimmed before I could generate the Apex bindings to it. This would mean maintenance issues. The Partner WSDL isn’t strongly typed and thus is more trouble than it’s worth to work with.
  2. I could use the SFDC Bulk API from the source org. This has other issues notably the input has to be either XML or CSV so I’d need to do conversions and, more importantly, it is asynchronous making the Visualforce page flow control complex with polling. Yuck.
  3. I could write a custom REST service that would be deployed on all target orgs that would handle the upserts through its own logic. This looked like the best way forward as the source org code would be simple – create a payload in the POST body for the Product2, PricebookEntry, and Pricebook2 SObjects. The target org custom REST service would figure out what needed to be done, cross-referencing source org data with existing target org data matching on Product2.ProductCode.

In many ways, option 3 is best, as it will make testing easier through separation of responsibilities and keeps the Test Mock classes on the source org’s Apex class much, much simpler. A skeleton of such a service is sketched below.
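Here is a skeleton of what the option-3 custom REST service on the target org could look like (a sketch only; the URL mapping, payload class, and matching logic are illustrative, and the PricebookEntry/Pricebook2 handling is omitted):

// Sketch of a custom Apex REST endpoint that accepts a catalog payload and
// upserts Product2 records, matching on ProductCode. Illustrative only.
@RestResource(urlMapping='/catalogSync/*')
global with sharing class CatalogSyncService {

    global class CatalogPayload {
        public List<Product2> products;
        // PricebookEntry and Pricebook2 portions omitted in this sketch
    }

    @HttpPost
    global static String doPost() {
        CatalogPayload payload = (CatalogPayload) JSON.deserialize(
            RestContext.request.requestBody.toString(), CatalogPayload.class);

        // Cross-reference incoming products with existing ones by ProductCode
        Set<String> codes = new Set<String>();
        for (Product2 p : payload.products) {
            codes.add(p.ProductCode);
        }
        Map<String, Id> existingByCode = new Map<String, Id>();
        for (Product2 p : [select Id, ProductCode from Product2 where ProductCode in :codes]) {
            existingByCode.put(p.ProductCode, p.Id);
        }
        for (Product2 p : payload.products) {
            p.Id = existingByCode.get(p.ProductCode);   // null => insert, non-null => update
        }
        upsert payload.products;
        return 'Processed ' + payload.products.size() + ' products';
    }
}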

Salesforce to Salesforce Using REST – Part I

Here was the business problem:

Migrate Product2-PricebookEntry-Pricebook2 data from a sandbox to a staging sandbox and then on to production.

My previous brute force solution was to use the Excel Connector, querying rows from the Pricebook2, PricebookEntry, and Product2 tables using a connection to the sandbox, then successively inserting rows into the target org while replacing ID fields in the upload PricebookEntry dataset to match the just uploaded Pricebook2 and Product2 datasets. (PricebookEntry is a junction record between Pricebook2 and Product2). This involved Excel VLOOKUP functions and lots of care to make sure that I didn’t reorder rows by accident.  Doing this once wasn’t so bad but the business changes the product line regularly and I was finding this tedious.

Solution Outline

Why not build an admin VF page in my org that could read from my source dataset sandbox and insert rows into the target sandbox/prod org?  To make this happen, the Visualforce controller code would need to be able to do SOQL and DML against a different org.  The way to do this is by exploiting the SFDC REST API and Apex HTTP Callouts.

The SFDC REST API documentation is written assuming that the calling application is not a Salesforce org but there is no restriction that this be so.

Step 1 – OAuth Authentication, SFDC to SFDC

The relevant SFDC REST API authentication documentation is here. I chose the username-password method as I’m the admin and would use this to authenticate to the target orgs using my own username and password on those orgs. I also opted for OAuth 2.0 authentication as I wanted to try this rather than using session IDs.

public with sharing class HttpRest {

	//	Class to handle HTTP Rest calls to other SFDC instances
	//
	//	[HTTP-00]	- Reached limit on callouts
	//	[HTTP-01]	- Unable to get OAuth 2.0 token from remote SFDC
	//	[HTTP-02]	- Error in SOQL REST query

	String	accessToken;					// OAuth 2.0 access token
	String	sfdcInstanceUrl;				// Endpoint URL for SFDC instance
						
	private HttpResponse send(String uri,String httpMethod) {
		return send(uri,httpMethod,null);
	}
		
	private HttpResponse send(String uri, String httpMethod, String body) {
		
		if (Limits.getCallouts() == Limits.getLimitCallouts())
			throw new MyException('[HTTP-00] Callout limit: ' + Limits.getCallouts() + ' reached. No more callouts permitted.');
		Http		h		= new Http();
		HttpRequest	hRqst	= new HttpRequest();
		hRqst.setEndpoint(uri);						// caller provides, this will be a REST resource
		hRqst.setMethod(httpMethod);				// caller provides
		hRqst.setTimeout(6000);	
		if (body != null) 
			hRqst.setBody(body);					// caller provides
		if (this.accessToken != null)				// REST requires using the token, once obtained for each request
			hRqst.setHeader('Authorization','Bearer ' + this.accessToken);
		return h.send(hRqst);					// make the callout
	}	
	//	----------------------------------------------------------------------
    //	authenticateByUserNamePassword		: Returns a map of <String,String> of the OAuth 2.0 access token; required before REST calls on SFDC instances can be made 
    //	----------------------------------------------------------------------
    public void authenticateByUserNamePassword(String consumerKey, String consumerSecret, String uName, String uPassword, Boolean isSandbox) {
    	// Reference documentation can be found in the REST API Guide, section: 'Understanding the Username-Password OAuth Authentication Flow'
    	// OAuth 2.0 token is obtained from endpoints:
    	//	PROD orgs	: https://login.salesforce.com/services/oauth2/token
    	//	SANDBOX orgs: https://test.salesforce.com/services/oauth2/token
    	
 		//	OAuth 2.0 access token contains these name/values:
 		//		access_token		: used in subsequent REST calls
 		//		instance_url		: to form the REST URI
 		//		id					: identifies end user
 		//		issued_at			: When signature was created
 		//		signature			: HMAC-SHA256 signature signed with private key - can be used to verify the instance_url	 

		String uri 			= 'https://' + (isSandbox ? 'test' : 'login') + '.salesforce.com/services/oauth2/token';
		String clientId 	= EncodingUtil.urlEncode(consumerKey,'UTF-8');
		String clientSecret = EncodingUtil.urlEncode(consumerSecret,'UTF-8');
		String username		= EncodingUtil.urlEncode(uName,'UTF-8');
		String password		= EncodingUtil.urlEncode(uPassword,'UTF-8');

		String body =	'grant_type=password&client_id=' + clientId + 
						'&client_secret=' + clientSecret +
						'&username=' + username + 
						'&password=' + password; 

		HttpResponse hRes = this.send(uri,'POST',body);
		if (hRes.getStatusCode() != 200) 
			throw new MyException('[HTTP-01] OAuth 2.0 access token request error. Verify username, password, consumer key, consumer secret, isSandbox?  StatusCode=' +
												 hRes.getStatusCode() + ' statusMsg=' + hRes.getStatus());
			
		
		System.debug(FlowControl.getLogLevel(),'response body =\n' + hRes.getBody());	// FlowControl is a custom logging utility; LoggingLevel.INFO works here too
		

		Map<String,String> res = (Map<String,String>) JSON.deserialize(hRes.getBody(),Map<String,String>.class);

		this.accessToken		= res.get('access_token');		// remember these for subsequent calls
		this.sfdcInstanceUrl 	= res.get('instance_url');
		
		
    }


    //	-----------------------------------------------------------------------
    //	doSoqlQuery: Executes a REST query on a remote SFDC and returns a list of SObjects
    //	-----------------------------------------------------------------------
    public List<SObject> doSoqlQuery(String query) {
    	List<Sobject>	res;		
		PageReference 	urlPg	= new PageReference(this.sfdcInstanceUrl + '/services/data/v29.0/query');
		urlPg.getParameters().put('q',query); 

		String uri 				= urlPg.getUrl();				// let APEX do the URL encoding of the parms as necessary
		HttpResponse hRes = this.send(uri,'GET');
		if (hRes.getStatusCode() != 200) 
			throw new MyException('[HTTP-02] Error in query ' + uri + ' StatusCode=' +
												 hRes.getStatusCode() + ' statusMsg=' + hRes.getStatus());
		
		// Response body comes back as:
		// {"totalSize":10,
		//	"done":true,
		//	"records":[
		//				{"attributes":{
		//					"type"	: "the sobject",
		//					"url"	: "/services/data/v29.0/sobjects/the sobject/the id"
		//				},
		//				"field0 in query"	: "value of field 0",
		//				"field1 in query"	: "value of field1",
		//				...},
		//				next record ...
		//				]
		//	}
		JSONParser jp = JSON.createParser(hRes.getBody());
		do{
			jp.nextToken();
		} while(jp.hasCurrentToken() && !'records'.equals(jp.getCurrentName()));
		jp.nextToken();  // token is 'records'
		res = (List<SObject>) jp.readValueAs(List<SObject>.class);		// Let caller cast to specific SObject
		return res;
    }
 
 
 	//	--------------------------------------------------------------
    // 	TEST METHODS - MOCKS
    //	---------------------------------------------------------------	
	public class authenticateByUserNamePasswordMock implements HttpCalloutMock {
    	
    	Boolean	isMockResponseSuccessful;
    	
    	public authenticateByUserNamePasswordMock(Boolean isMockResponseSuccessful) {
    		this.isMockResponseSuccessful	= isMockResponseSuccessful;
    	}
    	
    	public HttpResponse respond(HttpRequest rqst) {
    		HttpResponse hResp		= new HttpResponse();
    		if (this.isMockResponseSuccessful) {
    			hResp.setStatusCode(200);
    			hResp.setBody('{' +			// data copied from SFDC example
    							' "id":"https://login.salesforce.com/id/00Dx0000000BV7z/005x00000012Q9P", ' +
								' "issued_at":"1278448832702",' +
								' "instance_url": "https://na1.salesforce.com",' +
								' "signature":"0CmxinZir53Yex7nE0TD+zMpvIWYGb/bdJh6XfOH6EQ=",' +
								' "access_token": "00Dx0000000BV7z!AR8AQAxo9UfVkh8AlV0Gomt9Czx9LjHnSSpwBMmbRcgKFmxOtvxjTrKW19ye6PE3Ds1eQz3z8jr3W7_VbWmEu4Q8TVGSTHxs"' 
							+'}');			
    		}
    		else {
    			hResp.setStatusCode(400);
    			hResp.setStatus('Bad request');
    		}
    		return hResp;
    	}
    }
    
    public class doSoqlQueryMock implements HttpCalloutMock {
    	
    	Boolean	isMockResponseSuccessful;
    	
    	public doSoqlQueryMock(Boolean isMockResponseSuccessful) {
    		this.isMockResponseSuccessful	= isMockResponseSuccessful;
    	}
    	
    	public HttpResponse respond(HttpRequest rqst) {
    		HttpResponse hResp		= new HttpResponse();
    		if (this.isMockResponseSuccessful) {
    			hResp.setStatusCode(200);
    			hResp.setBody('{' +			// data copied from SFDC example
    							' "totalSize": 2,' 		+ 
								' "done" 	: true,' 	+
								' "records" : ['		+
								'              { 	"attributes" 	: {' +
								'										"type"			: "Account",' +
								'										"url"			: "/services/data/v29.0/sobjects/Account/00id"' +
								'									},' +
								'				"ID"	: "00id"' +
								'				},' +
								'              { 	"attributes" 	: {' +
								'										"type"			: "Account",' +
								'										"url"			: "/services/data/v29.0/sobjects/Account/01id"' +
								'									},' +
								'				"ID"	: "01id"' +
								'				}' +
								'			]' +
								'}');			
    		}
    		else {
    			hResp.setStatusCode(404);
    			hResp.setStatus('Not found');
    		}
    		return hResp;
    	}
    }
    
    
   	//	--------------------------------------------------------------
    // 	TEST METHODS - tests
    //	--------------------------------------------------------------- 
	@isTest
    private static void testAuthenticateByUserNamePassword() {
    	HttpRest hr = new HttpRest();
    	try {
    		Test.setMock(HttpCalloutMock.class, new authenticateByUserNamePasswordMock(true));	
    		hr.authenticateByUserNamePassword('somekey','somesecret','someusername@somedomain','somepw+token',false);
    		System.assertEquals('00Dx0000000BV7z!AR8AQAxo9UfVkh8AlV0Gomt9Czx9LjHnSSpwBMmbRcgKFmxOtvxjTrKW19ye6PE3Ds1eQz3z8jr3W7_VbWmEu4Q8TVGSTHxs',hr.accessToken);
    		System.assertEquals('https://na1.salesforce.com',hr.sfdcInstanceUrl);
    	}
    	catch (Exception e) {System.assert(false,'shouldnt. Exception:' + e.getMessage() + e.getStackTraceString());}
    	try {
    		Test.setMock(HttpCalloutMock.class, new authenticateByUserNamePasswordMock(false));
    		hr.authenticateByUserNamePassword('somekey','somesecret','someusername@somedomain','somepw+token',false);
    		System.assert(false,'shouldnt happen, mock should produce error');
    	}
    	catch (Exception e) {System.assert(e.getMessage().contains('[HTTP-01]'),e.getMessage());}
    }

	@isTest
    private static void testDoSoqlQuery() {
    	HttpRest hr = new HttpRest();
    	List<Sobject>	sobjResList;
    	try {
    		Test.setMock(HttpCalloutMock.class, new doSoqlQueryMock(true));	
    		sObjResList = hr.doSoqlQuery('select id from Account');
    		System.assertEquals(2,sObjResList.size());
    	}
    	catch (Exception e) {System.assert(false,'shouldnt. Exception:' + e.getMessage() + e.getStackTraceString());}
    	try {
    		Test.setMock(HttpCalloutMock.class, new doSoqlQueryMock(false));
    		sObjResList = hr.doSoqlQuery('select id from Account');
    		System.assert(false,'shouldnt happen, mock should produce error');
    	}
    	catch (Exception e) {System.assert(e.getMessage().contains('[HTTP-02]'),e.getMessage());}
    }


}
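Typical usage from the source org’s controller would then be (a sketch; consumerKey, consumerSecret, and pwPlusToken are placeholders, obtained as described in note 2 below):

// Sketch: authenticate against the target org, then query it over REST.
HttpRest remote = new HttpRest();
remote.authenticateByUserNamePassword(consumerKey,            // target org Connected App key (placeholder)
                                      consumerSecret,          // target org Connected App secret (placeholder)
                                      'admin@target.example',  // username on the target org (placeholder)
                                      pwPlusToken,             // password + security token (placeholder)
                                      true);                   // true = target is a sandbox
List<SObject> products = remote.doSoqlQuery(
    'select Id, Name, ProductCode from Product2 where IsActive = true');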

Implementation Notes

  1. You’ll need your own Exception class for this to compile. The example calls it MyException but I usually name these after the project or business unit wherein my code is deployed.
  2. Note that method authenticateByUserNamePassword takes five arguments. You will need to set up the consumerKey and consumerSecret on the target org using Setup | Develop | Remote Access. Starting in V29.0, this is now called a ‘Connected App’. Your source org will need access to these values – perhaps through custom settings, entered directly into your VF page, or, for more security, as encrypted fields in a custom SObject.
  3. Your source org will need Remote Site Settings (Setup | Security Controls | Remote Site Settings) for both the OAuth 2.0 authentication access token provider(s) and the target instance(s).
Connected App Setup in the target org

There are several OAuth scopes available to you. I chose ‘Access and manage your data (api)’ as that met my requirements.

Remote Site settings in the source org

In my source org, I set up four Remote Sites – two for the production and sandbox OAuth 2.0 access token providers and one each for the target PROD and target sandbox.