This is a feature needed to let spiders browse your website without having to log in.
I don't know how insecure this is, but I do know that websites such as ft.com allow Google to crawl their sites by recognizing its IP address.
http://drupal.org/node/16370

Comments

Uwe Hermann’s picture

It's quite insecure if you ask me; IP addresses can easily be faked.

It always depends on what information you want to give to logged-in users and hide from anonymous users, I guess. But making information available to Google is equivalent to making it available to the whole world. So why even bother with passwords then?

Just my 2 cents.

Sergio Beristain’s picture

Component: other » user.module

Look at how the FT has implemented it: old pages are not accessible, while new pages are. Is the FT secure? I would say yes; getting at old content is almost impossible.
Making content available to Google (News) is not the same as making it available to the whole world. Google does not cache those pages, so you keep control of what your users see.
As an administrator, I advocate: give the choice to the administrator of the web site.

BTW, I have tried implementing this, and so far I have not had any success.
ip_access() returns an array of the IP address and the user related to that address (to control permissions):

function user_access($string, $account = NULL) {
  global $user;
  static $perm = array();

  // User #1 has all privileges:
  if ($user->uid == 1) {
    return 1;
  }

  // If the requesting IP address maps to a user account, act as that user.
  $ipok = ip_access($string);
  if ($ipok) {
    $account = $ipok;
  }
  elseif (is_null($account)) {
    $account = $user;
  }

  // To reduce the number of SQL queries, we cache the user's permissions
  // in a static variable.
  if (!isset($perm[$account->uid])) {
    $result = db_query('SELECT DISTINCT(p.perm) FROM {role} r INNER JOIN {permission} p ON p.rid = r.rid INNER JOIN {users_roles} ur ON ur.rid = r.rid WHERE ur.uid = %d', $account->uid);

    $perm[$account->uid] = '';
    while ($row = db_fetch_object($result)) {
      $perm[$account->uid] .= "$row->perm, ";
    }
  }

  return strstr($perm[$account->uid], "$string, ");
}
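
For reference, the ip_access() helper called above is not part of Drupal core and is never shown in this thread; a minimal sketch of what it might look like, assuming a custom {spider_ips} table mapping trusted IP addresses to user IDs (the table name and its columns are hypothetical), could be:

function ip_access($string) {
  static $ip_users = array();
  $ip = $_SERVER['REMOTE_ADDR'];
  if (!isset($ip_users[$ip])) {
    // Look up the account granted to this IP, if any; cache the answer
    // so repeated permission checks in one request cost one query.
    $result = db_query("SELECT uid FROM {spider_ips} WHERE ip = '%s'", $ip);
    $row = db_fetch_object($result);
    // user_load() fetches the full account object for that uid;
    // FALSE means the IP is not on the trusted list.
    $ip_users[$ip] = $row ? user_load(array('uid' => $row->uid)) : FALSE;
  }
  return $ip_users[$ip];
}

This relies on the Drupal 4.x API (db_query, db_fetch_object, user_load), so it only runs inside a bootstrapped Drupal site.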

Since I do not have a spider, I have tested this through a PHP page that fetches my web site into a variable and displays it. My PHP page displays:
warning: file_get_contents(http://www.info-europa.com/clientarea//?q=node/330): failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden
in /var/www/html/clientarea/test.php on line 15.
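
The test page described above can be as small as the following sketch (the URL is the one from the warning; adjust it to your own site):

<?php
// Fetch the protected URL as an anonymous client and display whatever the
// server returns. A 403 here means the IP-based grant is not taking effect.
$url = 'http://www.info-europa.com/clientarea/?q=node/330';
$html = @file_get_contents($url);
if ($html === FALSE) {
  echo "Request failed (likely 403 Forbidden for this IP).\n";
}
else {
  echo $html;
}

Note that file_get_contents() returns FALSE and raises the warning shown above when the server responds with an error status.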

forngren’s picture

Version: 4.5.0 » 4.7.3
magico’s picture

Version: 4.7.3 » x.y.z
motoservo’s picture

Version: x.y.z » 6.1

Showing a bot something different from what you show a user who is *not* logged in will get you banned by most search engines. And it defeats the purpose of hiding your content, because engines may show a cached version of the page as an alternate link.

In a nutshell: what the engine has cached is what is supposed to be shown at the link the engine provides.

What you are attempting is called cloaking, and it is black hat.

davidwhthomas’s picture

Hi, I've made a module to allow login by IP Address.
See: http://drupal.org/project/ip_login
cheers,
DT

Anonymous’s picture

Hi,

does this module work if the user is behind NAT?

THX!

Zsigmond

pasqualle’s picture

Title: Automatic login based on an IP address » NAT?
Project: Drupal core » IP Login
Version: 6.1 » 5.x-1.0
Component: user.module » Code
Category: feature » support
davidwhthomas’s picture

Status: Active » Closed (fixed)

closing ancient thread for unrelated module