Boost works great.

If I use Secure Pages and select the option "Switch back to http pages when there are no matches", I get the following error message when I go to admin/reports/status:
"Boost crawler did not get a 200 response; 302 returned instead."

Is this a bug or a feature? Should I be bringing this up over at securepages?

#1: boost-891620.patch (1.09 KB) by mikeytown2


Status: Active » Needs review

Upped the redirect counter from the default of 3 to 10.

Status: Needs review » Fixed


Status: Fixed » Needs work

This did not solve the problem for me.

Well, a 302 is a redirect, so this might be a redirect loop...
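To make the failure mode concrete: a crawler that caps its redirect count will give up on a loop no matter how high the cap is raised, which would explain why bumping the counter from 3 to 10 didn't help. A minimal Python sketch (illustrative only, not Boost's actual PHP; the example.com URLs are hypothetical):

```python
def fetch(url, responses, max_redirects=3):
    """Follow redirects up to max_redirects.

    responses maps url -> (status, location). Returns the final status
    code, or None if the redirect cap is exhausted.
    """
    for _ in range(max_redirects + 1):
        status, location = responses[url]
        if status != 302:
            return status
        url = location  # follow the redirect
    return None  # redirect limit reached without a final response

# A simple http -> https redirect resolves in one hop:
chain = {
    'http://example.com/boost-crawler': (302, 'https://example.com/boost-crawler'),
    'https://example.com/boost-crawler': (200, None),
}

# But if the https page redirects back to http, raising the cap changes nothing:
loop = {
    'http://example.com/boost-crawler': (302, 'https://example.com/boost-crawler'),
    'https://example.com/boost-crawler': (302, 'http://example.com/boost-crawler'),
}
```

With `chain`, the fetch ends in a 200; with `loop`, it returns None whether the cap is 3 or 10, which matches the "did not solve the problem" report above.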

Status: Needs work » Closed (cannot reproduce)

Agreed; not sure how to fix, but it's probably not a Boost problem, so marking it closed.

I am getting the exact same error. Has anyone solved this issue?

Status: Closed (cannot reproduce) » Active

Having the same problem.

Over at Secure Pages they are saying this is a Boost problem or an .htaccess problem.

Any ideas?

Quite easy to fix:

diff --git a/sites/all/modules/performance/boost/boost.module b/sites/all/modules/performance/boost/boost.module
index 59e9a3a..85af3c9 100644
--- a/sites/all/modules/performance/boost/boost.module
+++ b/sites/all/modules/performance/boost/boost.module
@@ -98,7 +98,7 @@ define('BOOST_MAX_THREADS',          8);
// Requires Boost Functions or global scope, Define These Last
global $base_url;
-define('BOOST_CRAWLER_SELF',         $base_url . '/' . 'boost-crawler?nocache=1&key=' . variable_get('boost_crawler_key', FALSE));
+define('BOOST_CRAWLER_SELF',         str_replace('https://', 'http://', $base_url) . '/' . 'boost-crawler?nocache=1&key=' . variable_get('boost_crawler_key', FALSE));
define('BOOST_FILE_PATH',            BOOST_MULTISITE_SINGLE_DB ? boost_cache_directory(NULL, FALSE) : variable_get('boost_file_path', boost_cache_directory(BOOST_HOST, FALSE)));
define('BOOST_GZIP_FILE_PATH',       implode('/', array_filter(explode('/', str_replace(BOOST_ROOT_CACHE_DIR . '/' . BOOST_NORMAL_DIR, BOOST_ROOT_CACHE_DIR . '/' . BOOST_GZIP_DIR . '/', BOOST_FILE_PATH)))));
define('BOOST_PERM_GZIP_FILE_PATH',  implode('/', array_filter(explode('/', str_replace(BOOST_ROOT_CACHE_DIR . '/' . BOOST_NORMAL_DIR, BOOST_ROOT_CACHE_DIR . '/' . BOOST_PERM_GZ_DIR . '/', BOOST_FILE_PATH)))));

Not sure if that is the best fix, but I don't think all URLs should hit https from localhost every time...
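For anyone wanting to see what the one-line change in the patch above does: it just forces the crawler's self-URL back to plain http, so the crawler never requests an https page that Secure Pages might bounce. A quick Python equivalent of that str_replace (the base URL and key are hypothetical):

```python
def crawler_self_url(base_url, key):
    """Mimic the patched BOOST_CRAWLER_SELF: force http even when base_url is https."""
    http_base = base_url.replace('https://', 'http://')
    return http_base + '/boost-crawler?nocache=1&key=' + key

# An https base URL is rewritten; an http one passes through unchanged.
print(crawler_self_url('https://example.com', 'abc123'))
```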

Subscribing. For my similar issue, look here:

Would love to see @Fabianx patch applied.

Status: Active » Needs review

I get the same error message with the latest Boost dev (Dec 23) and the securepages module. The patch in #8 removes the error on the Status Report page, but no cached pages are created.

Boost with boost crawler works fine with uc_ssl module (with one minor issue), but not securepages.

Any guidance to get the boost crawler working with securepages?

Adding "boost-crawler" to the list of "Ignore pages:" did the trick for me. I'm using 6.x-1.18, though I suspect it would work just as well for 6.x-1.x-dev.

Thanks boosh! Adding "boost-crawler" to the list of "Ignore pages:" in the Secure Pages configuration took care of the error for me too on 6.x-1.20.

I wish I could say the same for suggestion #13...

boost 1.20
securepages 2.x-dev

Any other ideas?

Ya know, after re-reading the initial issue, I've realized the problem I have isn't directly related...

Here is the issue for the problem I am experiencing:

Status: Needs review » Fixed

I'd rather avoid the patch in #8 because of the testing it would take to be sure it does not break other sites.

Since we can solve this with a configuration change in securepages, and it is specific to securepages, I added a "requirement" check instructing people to review their configuration and make sure they have "boost-crawler" in the ignored pages.

Committed to 6.x-1.x, will be in Boost >= 6.x-1.21.

The #13 fix did it for me with Boost 6.x-1.9 and Secure Pages 6.x-1.2. Thank you!

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.

Experienced this same problem. The fix in #13 appeared to work at first, but then we had the same issue again. The only foolproof solution was to downgrade to securepages 6.x-1.9.