Pinned topic What is the best way to retrieve all the data from a SalesForce table?
The Query Objects activity returns data if I limit the result to 1 by specifying an Id, but I need to retrieve all of the records. If I do not limit the result set, it hangs for a long time and then errors out without ever putting anything in the result-set variable. Even if the Query Objects activity did return the full result set, it still wouldn't be good for my use case, because it's tedious to write out a long SOQL query when there are so many columns.
The Retrieve Objects activity finds all of the fields dynamically and says it doesn't have a limit to how many records it returns, but it requires Ids as input, and since I'm trying to get the rows for the first time, of course I don't have them.
What I'm trying to do here is create a mechanism for doing a full refresh from an HTTP POST if needed. We used Demand Tools for exports from our SalesForce tables, and it seems to have no problem exporting an entire table at once. How can this be done with Cast Iron?
Use the query operation
2009-06-30T20:41:20Z in response to SystemAdmin
You can query objects, but constrain the data to a fixed range so that you're not pulling so much data at once. For example, you could use an algorithm that gets the data in a date range, writes it to the database, then gets the next batch for the next date range.
Start by getting the earliest date: select lastModifiedDate from Account order by lastModifiedDate asc limit 1
Use this date in a select statement and limit the result set to 1000 rows, for example: select id, lastModifiedDate from Account where lastModifiedDate > ? order by lastModifiedDate asc limit 1000
Get the maximum date from the result set with a Filter Recurring Nodes operation in a map, using the XPath predicate "position() = last()".
Use the largest date to feed the next batch of records, and repeat the process until the number of rows returned is less than 1000.
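The batching loop described above can be sketched as follows. This is a minimal simulation, not actual Cast Iron configuration: query_batch stands in for the Query Objects activity running the SOQL shown above, and the in-memory RECORDS table, the batch size, and all names are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical stand-in for the Salesforce table: (id, lastModifiedDate)
# rows with unique, ascending timestamps.
RECORDS = [(f"00{i:04d}", datetime(2009, 1, 1) + timedelta(hours=i))
           for i in range(2500)]

BATCH_SIZE = 1000

def query_batch(after):
    """Simulates: select id, lastModifiedDate from Account
       where lastModifiedDate > ? order by lastModifiedDate asc limit 1000"""
    return [r for r in RECORDS if r[1] > after][:BATCH_SIZE]

def fetch_all():
    # Start from an epoch value so the first batch includes the earliest
    # record (the reply instead finds the earliest date with "limit 1").
    cursor = datetime(1970, 1, 1)
    fetched = []
    while True:
        batch = query_batch(cursor)
        if not batch:
            break
        fetched.extend(batch)
        # Equivalent of the "position() = last()" XPath filter: take the
        # largest date in the batch (already sorted ascending).
        cursor = batch[-1][1]
        if len(batch) < BATCH_SIZE:
            break
    return fetched

print(len(fetch_all()))  # prints 2500
```

One caveat: because each batch starts strictly after the last date seen, rows that share the boundary timestamp could be skipped; that is essentially the objection raised in the next reply.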
I can't use dates to chunk
2009-07-20T23:50:12Z in response to SystemAdmin
I can't use dates to chunk the data because there are cases where 70,000 records were modified or created in the same day. Demand Tools has no problem quickly exporting large SalesForce tables, and DBAmp makes short work of it as well. There must be a way to do it with web service calls with Cast Iron.
The other problem I'm running into is that when I'm using the Get Updated Objects control on a table with 276 columns, it tells me that my SOQL query is failing because it exceeds 10,000 characters. When I build a query string from the field names myself, it comes out to 7,766 characters, which is well below 10,000. Demand Tools and DBAmp can query this table. How can I do this with Cast Iron without using two Get Updated Objects activities for the same table?
query objects
2010-02-04T19:50:39Z in response to SystemAdmin
Why not write the long query? If you have eclipse with the force.com plugin and use the query builder, you can simply select the object and it will write out the query for you. Then all you have to do is copy and paste it.
Query large data sets from SF...
2010-09-22T13:53:13Z in response to SystemAdmin
I typically handle this by running a query that returns only the IDs; I'm able to get a very large number of records back with this approach. I then batch the IDs with a split and retrieve the data in chunks. If the SOQL query exceeds 10,000 characters (please complain to SF about this one, as it's a ridiculous limit), I break the query into two queries, returning half of the fields in one and the other half in the second. Then I use a merge in CI (by ID) to join the data back together. It's not exactly pretty, but it gets the job done.
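The two-query workaround in this reply can be sketched as follows. This is a toy illustration: the field names are made up, and in-memory result sets stand in for the two SOQL calls and the CI merge step.

```python
# Simulated result set of query 1: select Id, Name, Industry from Account
query_a = [
    {"Id": "001A", "Name": "Acme", "Industry": "Manufacturing"},
    {"Id": "001B", "Name": "Globex", "Industry": "Energy"},
]
# Simulated result set of query 2: select Id, Phone, Website from Account
query_b = [
    {"Id": "001A", "Phone": "555-0100", "Website": "acme.example"},
    {"Id": "001B", "Phone": "555-0199", "Website": "globex.example"},
]

def merge_by_id(left, right):
    """Join two result sets on Id; rows missing from either side are dropped."""
    right_by_id = {row["Id"]: row for row in right}
    merged = []
    for row in left:
        other = right_by_id.get(row["Id"])
        if other is not None:
            merged.append({**row, **other})
    return merged

for row in merge_by_id(query_a, query_b):
    print(row["Id"], row["Name"], row["Phone"])
```

Because both queries select Id, the join key is guaranteed to be present in both halves, and each merged row carries the full set of fields.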