Hey there,

We are working with the D7 development snapshot of this module and have it working well on a site we're looking to put into production in a few months. It's a website for a large non-profit, and the site gets a lot of traffic. We are trying to understand in which cases the proximity module hits the Google Maps Geocoding API, and how often.

According to this documentation, the Geocoding API limits usage to 2,500 requests a day, and the search we're working on could very possibly get more than 2,500 searches a day.

Can you help me answer:

- In the case of a view that applies a proximity search filter to its results, does this module require the site to hit the Google Maps Geocoding API for every single search (i.e., every time the view is loaded with a zip code argument)? Or does the search happen within the proximity table that is built at module install?

Comments

jenlampton’s picture

From looking through the code, it looks like the Views proximity filter does hit the Google API every time a search is run. Running the same search (same zip) over and over also hits the Google API each time.

The openlayers_proximity table in the db seems to hold just a node ID and a lat/lon pair in each row - no actual proximity data for zip codes (or search strings).

In essence, the openlayers_proximity module provides the same functionality as the geocode/geocoder modules, except that it uses only Google's API instead of allowing admins a choice.

The OpenLayers Proximity module also provides Views fields and filters based on its own data, which is stored in that table - this is something that's currently missing from Geocoder, but see #1469958: Add views filters for proximity to Geocoder API

pixelsweatshop’s picture

@elly, since your project is for a non-profit, you can also look into the Google Earth Outreach Grants program http://www.google.com/earth/outreach/grants/software/index.html

Non-profits and applications deemed in the public interest (as determined by Google at its discretion) are not subject to these usage limits. For example, a disaster relief map is not subject to the usage limits even if it has been developed and/or is hosted by a commercial entity. In addition we recommend that eligible Non-profits apply for a Maps API for Business license through the Google Earth Outreach Grants program, which provides a number of benefits.

kafitz’s picture

True, every time a search is run, the module hits the Google API. This also happens in the D7 version of openlayers_proximity, sometimes even more than once for a single search. A solution could be to implement Drupal caching, which would reduce the number of Google API hits. (Example of how to implement D7 caching: http://www.lullabot.com/articles/beginners-guide-caching-data-drupal-7.)
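The check-cache-first idea suggested above looks roughly like this. This is a generic sketch in plain JavaScript, not the actual module code: a real D7 implementation would use `cache_get()`/`cache_set()` in PHP against a cache table, whereas the `geocodeCache` object and `cachedGeocode` function here are hypothetical names for illustration.

```javascript
// Hypothetical sketch of cache-first geocoding. In Drupal 7 the store
// would be cache_get()/cache_set(); here it is just an in-memory object.
var geocodeCache = {};

function cachedGeocode(address, geocodeFn) {
  // Normalize the key so "94110" and " 94110 " share one cache entry.
  var key = String(address).trim().toLowerCase();
  if (geocodeCache.hasOwnProperty(key)) {
    // Cache hit: a repeated search costs no API call.
    return geocodeCache[key];
  }
  // Cache miss: call the rate-limited geocoder once and remember the result.
  var result = geocodeFn(address);
  geocodeCache[key] = result;
  return result;
}
```

The point is that identical searches (same zip) stop counting against the daily quota after the first lookup.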

goron’s picture

I've opened an issue for caching, with a patch: #1713552: Cache geocoded addresses

ipwa’s picture

Maybe we should consider adding support for other geocoding services that don't have these limits, like MapQuest or Yahoo.

pixelsweatshop’s picture

ipwa, OpenLayers Proximity is "Maintenance fixes only"; no new features will be added. New users are encouraged to use Geofield 2.x, which now supports proximity searches and multiple third-party services.

fonant’s picture

FWIW, the "proper" way to do this is to do the geocoding client-side and have the client send the lat/lng to the server. Then you get 2,500 geocodes per visitor IP, rather than 2,500 for the whole site.

https://developers.google.com/maps/articles/geocodestrat#client says:

When to use Client-Side geocoding

The basic answer is "almost always."
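A minimal sketch of that client-side approach, assuming the Google Maps JavaScript API is already loaded on the page. The `extractLatLng` helper and the `onSuccess` callback are hypothetical names for illustration; the server would just receive the resulting lat/lng pair (e.g. via hidden form fields) instead of doing its own geocoding.

```javascript
// Pull a plain {lat, lng} pair out of the geocoder's results array.
function extractLatLng(results) {
  var loc = results[0].geometry.location;
  return { lat: loc.lat(), lng: loc.lng() };
}

// Hypothetical sketch: geocode in the visitor's browser, then hand only
// the coordinates to the server. Assumes google.maps is loaded on the page.
function geocodeClientSide(address, onSuccess) {
  var geocoder = new google.maps.Geocoder();
  geocoder.geocode({ address: address }, function (results, status) {
    if (status === google.maps.GeocoderStatus.OK) {
      onSuccess(extractLatLng(results)); // e.g. fill hidden lat/lng fields
    }
  });
}
```

Because the request originates from the visitor's browser, the quota is counted against the visitor's IP, which is the point fonant makes above.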

pinkonomy’s picture

Is it also possible to do client-side geocoding with this library?
https://developers.google.com/maps/documentation/javascript/places#place...
thanks!

aprice42’s picture

Our website is still using D6, and will be for another year, so we cannot utilize the geofield module.

I am wondering if someone could comment on where we would need to modify this module to shift from server-side geocoding to client-side geocoding, and whether this change would require a major rewrite of the module or just some modifications.

We are definitely interested in pursuing a change of this nature, as our site relies on this search tool, and currently it seems the only alternative is to pay Google $10K/year to increase our call limit.

Any direction would be greatly appreciated!

jpstrikesback’s picture

I would love to see this here and in Geofield. I don't have time to work on it right now, but if I get a chance I'll submit a patch to Geofield first and then post back to see if anyone wants to port it to OLP. That said, if we can get an RTBC patch here before then, I'm happy to commit it.

urlM’s picture

I've been running into this issue as well, and I know we haven't been hitting the 2,500 limit through human traffic. Could spam bots be constantly submitting the form, pinging Google's Geocoder each time? Is there a way to implement zip code / address validation before the views exposed form (search proximity) is submitted? I've tried adding a Honeypot field (http://drupal.org/project/honeypot) to prevent the spam, but am still running into the same issue. It seems that the form still hits Google's Geocoder even when Honeypot disallows the form submission. I've been using the OpenLayers Locator feature here: http://drupal.org/project/ol_locator
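One cheap guard along those lines: validate the exposed-form input before it ever reaches the geocoder, so empty submits and bot junk never cost an API call. A hypothetical check for a zip-only filter (US formats; an address filter would need a looser rule):

```javascript
// Hypothetical pre-geocode check: only let plausible US zip codes through.
// Empty strings and random bot text are rejected before any Geocoder call.
function looksLikeUsZip(value) {
  return /^\d{5}(-\d{4})?$/.test(String(value).trim());
}
```

The same test could run client-side on submit or server-side in a form validation handler; either way, inputs that fail it should skip geocoding entirely.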

And/Or... could this be submitting the views exposed form multiple times since AJAX is enabled in the view?

urlM’s picture

Well, it actually looks like every time someone visited the page, it was geocoding an empty string. Then when someone actually entered a zip/address/random text, it sent six more API calls server-side on top of the initial call. So, Geocoder calls were adding up pretty quickly. Caching results per this thread might help things a bit: https://drupal.org/node/1515372#comment-7281948

urlM’s picture

Issue summary: View changes

removing some irrelevant questions

jide’s picture