While importing a large number of nodes, I get lots of OVER_QUERY_LIMIT errors. Here is a patch to sleep a small amount of time before each query, and optionally retry a configurable number of times, increasing the sleep time for each retry.
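A rough sketch of the idea, not the attached patch itself (the function name and defaults below are illustrative only):

  <?php
  /**
   * Illustrative only: run a geocoding callback, retrying with an
   * increasing delay whenever Google answers OVER_QUERY_LIMIT.
   */
  function example_geocode_with_retry($callback, $retries = 3, $delay_ms = 100) {
    for ($attempt = 0; $attempt <= $retries; $attempt++) {
      // Sleep a little before every query; usleep() takes microseconds.
      usleep($delay_ms * 1000);
      $result = $callback();
      if (empty($result['status']) || $result['status'] != 'OVER_QUERY_LIMIT') {
        return $result;
      }
      // Back off: double the delay before the next attempt.
      $delay_ms *= 2;
    }
    return FALSE;
  }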


Comments

gnucifer’s picture

Status: Active » Needs review
FileSize
8.65 KB
kevinquillen’s picture

Are there better ways of avoiding OVER_QUERY_LIMIT? I think I went over my limit on Saturday, and 1,000 coordinates were wiped out.

gnucifer’s picture

Better how? You have a limited number of queries per day if you choose not to pay for more. The problem for me is that there also seems to be a query frequency threshold that the current implementation rapidly exceeds. Sleeping 100ms or so between queries resolves this, but if you have run out of total queries this patch will not help you. The only solution in that case is to wait 24 hours or purchase extra queries.

kevinquillen’s picture

Hitting the limit is fine; what I want to prevent is replacing valid data with NULL when I am over the query limit, which is what appeared to happen to me a few days ago.

Is there no way to skip geocoding when the target field (Address) has not changed?

steinmb’s picture

If you know you are doing a lot of queries for the same addresses (reimporting content, etc.) you could cache the results with Varnish. I did that on a dev system where I had a few migrate tasks that were pulling content from an older legacy system into Drupal.

kevinquillen’s picture

How did you do that?

steinmb’s picture

Requests to maps.googleapis.com are mostly HTTP GET, so they are not hard to cache, but you need to change your local DNS or hosts file to point maps.googleapis.com to your local proxy server.
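A different way to get a similar effect, sketched here as an assumption rather than anything posted in this thread, is to cache results keyed by address inside Drupal itself with the Drupal 7 cache API (cache_get()/cache_set()); example_geocode_remote() below is a hypothetical stand-in for the real Google lookup:

  <?php
  /**
   * Illustrative only: cache geocode results per address in Drupal's cache.
   */
  function example_cached_geocode($address) {
    $cid = 'example_geocode:' . md5($address);
    if ($cached = cache_get($cid)) {
      return $cached->data;
    }
    // Hypothetical stand-in for the real request to maps.googleapis.com.
    $result = example_geocode_remote($address);
    if ($result) {
      // Keep the result for a week so reimports never hit Google again.
      cache_set($cid, $result, 'cache', REQUEST_TIME + 604800);
    }
    return $result;
  }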

Rob_Feature’s picture

Status: Needs review » Needs work

This no longer applies to the latest dev but is a much-needed patch. Changing to needs work for updating.

joelstein’s picture

Status: Needs work » Needs review
FileSize
1.53 KB

Here's a patch similar to the one rolled for the Location module (#697468: Add configurable delay for Google geocoding to avoid 620 errors), which adds an option to configure the milliseconds of delay between geocoding requests. That patch was committed to Location over a year ago, so I think we can safely use the same solution here.

You can read more about Google's usage limits (and the OVER_QUERY_LIMIT response) here.

The setting defaults to 0 (no delay). A value of 200ms seems to keep my site under the rate limit (no more than 5 requests per second). After applying the patch, you can enter "200" in the new field at admin/config/content/geocoder.
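For context, the heart of such a patch is usually just a usleep() call driven by the saved setting; this is only a sketch, and the variable name geocoder_google_delay is an assumption, not necessarily the name used in the actual patch:

  <?php
  // Sketch only; the variable name is assumed.
  $delay_ms = variable_get('geocoder_google_delay', 0);
  if ($delay_ms > 0) {
    // The setting is in milliseconds; usleep() expects microseconds.
    usleep($delay_ms * 1000);
  }
  // ... then perform the request against the Google geocoding API ...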

joelstein’s picture

My patch could probably be improved by adding the exponential delays suggested in comment #1. I'm not sure whether that would make a difference to Google, but maybe so.

pedrop’s picture

The patch in #9 worked for me, thanks.
There is a 2,500-queries-per-hour limit in the Google documentation, so a delay over 1.44 sec (3600 s / 2500 ≈ 1.44 s) seems to be the safe solution if you have to re-geocode 2,500 or more geofields.

Cauliflower’s picture

Unfortunately, the limit is 2500 requests per day :-(.

kevinquillen’s picture

Yeah, it is per day, and Google doesn't play around.

fonant’s picture

For a feature request describing an alternative solution, more useful where site visitors perform arbitrary proximity searches, see #1949954: Client-side geocoding for proximity searches, to avoid Google's new 2500-per-day limit.

greggles’s picture

Title: Avoiding OVER_QUERY_LIMIT in google geocode » Avoiding OVER_QUERY_LIMIT due to more than 5 requests in a second in google geocode
Status: Needs review » Reviewed & tested by the community

Updating the title to be more descriptive of the potential problems. Google returns OVER_QUERY_LIMIT in two scenarios, and this patch is meant to fix only one of them. The other (2,500 per 24-hour period) can only be fixed by paying money to Google or by using a different backend. Discussing that issue here is off-topic.

I believe the solution in #9 will work for situations where only one process is doing geocoding at a time. If you have a URL on the site that does geocoding and 6 people hit it at the same instant, then this won't protect you. That's probably OK for most sites, but I wanted to note that limitation.
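One way to also cover that multi-process case, not part of the patch here, would be a site-wide lock around each request via Drupal's lock API (lock_acquire(), lock_wait(), lock_release()); everything else in this sketch is illustrative:

  <?php
  // Illustrative only: serialize geocoding requests across processes.
  $lock_name = 'example_geocode_request';
  while (!lock_acquire($lock_name, 5)) {
    // Another process holds the lock; wait for it to finish.
    lock_wait($lock_name, 5);
  }
  // Keep the whole site under roughly 5 requests per second.
  usleep(210000);
  // ... perform the geocoding request here ...
  lock_release($lock_name);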

I'm using Geocoder on a page where I actually use it 11+ times in a row to serve a single page (geocoding a ZIP, geocoding up to 10 addresses). I get these errors pretty consistently. I deployed this patch, increased the delay to 201ms, and it seems to prevent the problem quite well. I'll humbly suggest it is RTBC.

greggles’s picture

I can now say with greater certainty that since I set it to 210ms there have been no more problems with OVER_QUERY_LIMIT in the last 7 hours, whereas before I would get 1 or 2 per hour.

Brandonian’s picture

Status: Reviewed & tested by the community » Fixed

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.