183

I have a PHP file which I will be using exclusively as an include. Therefore I would like to throw an error instead of executing it when it's accessed directly by typing in the URL, rather than being included.

Basically I need to do a check as follows in the php file:

if ( $REQUEST_URL == $URL_OF_CURRENT_PAGE ) die ("Direct access not permitted");

Is there an easy way to do this?

Antony
Alterlife
  • 13
    instead of the die() you should test 'header("HTTP/1.1 404 File Not Found", 404); exit;'. This will (at least on apache) make the server return the normal 404 page. – gnud Jan 04 '09 at 17:08
  • 1
    Here are two easy methods I have explain to disable direct access in PHP included files - https://www.codespeedy.com/disable-direct-access-to-the-php-include-file/ – Faruque Ahamed Mollick Jul 18 '17 at 21:06

33 Answers

219

Add this to the page that you want only to be included:

<?php
if(!defined('MyConst')) {
   die('Direct access not permitted');
}
?>

Then, on the pages that include it, add:

<?php
define('MyConst', TRUE);
?>
Amal Murali
UnkwnTech
  • 3
    I really need to learn to type quicker. This is the same way I would suggest, as its more secure than a method that uses a variable to check. Since with some PHP setups it may be possible to override the variable. – Mark Davidson Jan 03 '09 at 18:17
  • 3
    This is how a few 'mainstream' applications handle it. I know Joomla does it this way and I think Wiki, Wordpress, and others as well. – UnkwnTech Jan 03 '09 at 18:20
  • 1
    Maybe the message is too helpful for a hacker (no real user would find these pages), you can simply send a redirect header and stop the php processing. – bandi Jan 04 '09 at 10:05
  • 7
    Just send a 404 header and exit -- the error page will look identical to normal 404 pages (at least on Apache). – gnud Jan 04 '09 at 13:24
  • 1
    If someone knows your code and creates a page that defines your secret constant before including your files, does that bypass this security protection? – Smile.Hunter Apr 29 '13 at 18:21
  • 4
    @Smile.Hunter: this is about blocking access to viewing your include/library script files directly, the answer works. If they created `somefile.php` on your server and added your define in it, that still doesn't let them directly access the include file. It will let them "include" your library files, but if they get far enough to be creating files on your server and knowing your define/include scripts, you have other issues that likely negate writing their own file with your define in the first place. – James Sep 10 '13 at 16:43
  • 3
    A shorter way to deny the access with the same result would be: `defined('MyConst') || die('Direct access not permitted');` – MDeuerlein Jan 10 '16 at 02:43
  • Good Solution, Liked it. Thank you – krupesh Anadkat Aug 30 '18 at 20:16
187

The easiest way for the generic "PHP app running on an Apache server that you may or may not fully control" situation is to put your includes in a directory and deny access to that directory in your .htaccess file. To save people the trouble of Googling, if you're using Apache, put this in a file called ".htaccess" in the directory you don't want to be accessible:

Deny from all

If you actually have full control of the server (more common these days even for little apps than when I first wrote this answer), the best approach is to stick the files you want to protect outside of the directory that your web server is serving from. So if your app is in /srv/YourApp/, set the server to serve files from /srv/YourApp/app/ and put the includes in /srv/YourApp/includes, so there literally isn't any URL that can access them.
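
For illustration, a minimal sketch of that layout (the file name functions.php is a placeholder, not something from the answer):

<?php
// /srv/YourApp/app/index.php -- the document root is /srv/YourApp/app/,
// so nothing under /srv/YourApp/includes/ is reachable by any URL.
require dirname(__DIR__) . '/includes/functions.php'; // hypothetical include file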

Chuck
  • 1
    Thanks, since I do have full control over the server where I run this app, this is the answer I went with. – Alterlife Jan 03 '09 at 18:23
  • 27
    If you have full control of the server, it is better to put the config in a Directory directive in the virtual host config file. Apache reads that only once on startup, whereas .htaccess is read on every access and slows down the server. – Eineki Jan 03 '09 at 19:43
  • 23
    It'd be nice to have an example .htaccess file as part of this answer. – Graham Lea Jun 05 '12 at 11:47
  • 8
    `` `Order Allow,Deny` `Deny from All` `` – Dracorat Sep 27 '12 at 16:53
  • What if you use nginx? :P – Matt Sep 06 '13 at 23:09
  • 12
    @James: Also, not everybody feels that Stack Overflow should be a "plz send teh codez" site. If it answers the question clearly, then it is a good answer. Providing an example where none is needed only encourages copy-and-paste coding. – Chuck Sep 10 '13 at 21:03
  • 1
    @Chuck: Interesting response. Except that PHP is the language of the masses. StackOverflow is the go to "about PHP" site of the masses (other than php.net, of course). Most of the masses have a somewhat, er, "dimmed" understanding of website security on account of them not being evil hackers. AND setting up .htaccess *CORRECTLY* is actually massively tricky. I, for one, will not think any less of you for aiding and abetting C&P coding... just this once o.O PS: Put some caching stuff in there too cos these noobs don't know nothing! It's embarrassing. Real coders (like us) are *born* knowing! :P –  Dec 06 '13 at 10:21
  • And to explain my downvote :( PHP coders (especially) need to become a lot more mindful of site security AND code security. They are not the same thing. We should not be encouraging single strategy solutions - at least not without mentioning the pitfalls (see my answer below). –  Dec 06 '13 at 10:55
  • @HighPriestessofTheTech: No hard feelings. You should downvote answers you don't think do a good enough job. To explain my perspective: I don't feel I'm up to the task of making bad coders (not newbies, but bad coders) write good code. I just do my best to help people who are willing to try and learn. As for multiple strategies, I feel squicky about doing the same thing several times. Most of the time, it smacks of voodoo programming to me — "I'm not sure if I'm doing this right, so I'll do two possibly wrong things and hopefully one works!" But I still think your answer is good, so +1 to you. – Chuck Dec 06 '13 at 18:49
  • At 55200+ view, I believe this is where Google takes everyone ^_^ so for the edit +1 from me - the "killing two birds with one stone" award ;) Thank you kindly for the compliment :) Now, if you'll excuse me, back to the voodoo! –  Dec 07 '13 at 03:54
  • @Chuck I've tried to put the following in my httpd.conf file (I'm using CentOS) and my webpage is now completely inaccessible: `AllowOverride All` `Allow from all` for /var/www/html/include, which was the path I wanted to restrict so that people cannot get into my files through the URL by accessing example.com/include. How might I go about finding out what I've done wrong? Also, I tried to delete the lines and my webpage still can't be accessed through the URL. – moto Jun 11 '14 at 20:13
  • 1
    Is this answer really correct ? I'll need `AllowOverride All` in the Apache configs, and this is by far not default on any major OS setups. – Sliq Oct 21 '14 at 15:42
  • 3
    I tried this and if you do `Deny from all` it won't even work in an include, it will 404. – Michael Rogers Jul 03 '17 at 09:53
  • I tried the `deny from all`, but it is also preventing me from making fetch requests to those files, so what do you suggest I do? – Curtis Crentsil Aug 08 '20 at 00:41
123

I have a file that I need to act differently when it's included vs. when it's accessed directly (mainly a print() vs. a return()). Here's some modified code:

if(count(get_included_files()) ==1) exit("Direct access not permitted.");

The file being executed is always counted among the included files, hence the == 1.
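
For instance, a minimal sketch of that print-vs-return idea (assuming PHP 5+, where the directly requested script itself counts as an included file):

<?php
// report.php -- hypothetical include that can also be hit directly as an endpoint
$data = array('status' => 'ok');

if (count(get_included_files()) == 1) {
    // Requested directly: print the result.
    print json_encode($data);
} else {
    // Included by another script: hand the result back to the includer.
    return $data;
}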

OCDev
null
  • 13
    That's actually a hell of an idea, checking the included file count. I wonder which is better: using defines, or using this method? This seems more self-contained. – Akoi Meexx Jun 08 '11 at 16:40
  • First time I've ever seen anyone come up with this. I don't know why though, because it seems as self contained as can be, and it's directly measuring what you actually want to know (if is included or not) rather than measuring something assumed to be associated (like a certain constant or a certain location banned by .htaccess). Beautiful. – Jimbo Jonny Aug 08 '12 at 12:34
  • This one is really cool, because using .htaccess to block all .php files may not always be possible, as there can be files in the same directory that need to be called directly, or even by JavaScript. Thanks for this great idea! – Anuj Sep 18 '12 at 20:55
  • 4
    This only works in PHP5 and up. Pre-PHP5 you want to compare again 0 instead of 1. – jmucchiello Jun 10 '13 at 01:26
  • This is a really clever solution and allows you to control direct access, not just block it - that's what I like. I usually include unit testing in the files themselves, and this way I can wrap my unit testing in this if statement. Wonder how efficient it is.. – whiteatom Jul 12 '13 at 16:19
  • Clever. I use this method in my application. – Tom Aug 07 '13 at 23:09
  • Haven't tested it in pre-php5 but i suppose the code would be `if( count(get_included_files()) == ((version_compare(PHP_VERSION, '5.0.0', '>='))?1:0) ) exit("Direct access not permitted.");` – Glitch Jan 17 '17 at 05:21
  • Clever solution. – Ahmet Sep 28 '17 at 16:34
  • Finally someone answers the question clearly!!! I wasted three days searching for answers, somehow found yours, and it worked great!!! Thanks man! – Burhan Kashour Oct 07 '17 at 18:08
  • So many years and still useful ;) Actually, I needed to swap the condition, i.e. do not allow including the file, only direct access permitted. Worked great – Sebastian Kaczmarek Dec 04 '19 at 08:57
  • So simple, so minimal, so local and yet transferable. Clean. – andiOak Feb 22 '21 at 17:05
  • This is the cleverest way to do it that I've found, thanks for that. But for security reasons I prefer to send a 404 with `http_response_code(404)`, because the exit will tell people that the file actually exists. – iguypouf Jun 16 '21 at 06:48
43

1: Checking the count of included files

if( count(get_included_files()) == ((version_compare(PHP_VERSION, '5.0.0', '>='))?1:0) )
{
    exit('Restricted Access');
}

Logic: PHP exits if the minimum include count isn't met. Note that prior to PHP5, the base page is not considered an include.


2: Defining and verifying a global constant

// In the base page (directly accessed):
define('_DEFVAR', 1);

// In the include files (where direct access isn't permitted):
defined('_DEFVAR') or exit('Restricted Access');

Logic: If the constant isn't defined, then the execution didn't start from the base page, and PHP would stop executing.

Note that for the sake of portability across upgrades and future changes, making this authentication method modular would significantly reduce the coding overhead as the changes won't need to be hard-coded to every single file.

// Put the code in a separate file instead, say 'checkdefined.php':
defined('_DEFVAR') or exit('Restricted Access');

// Replace the same code in the include files with:
require_once('checkdefined.php');

This way additional code can be added to checkdefined.php for logging and analytical purposes, as well as for generating appropriate responses.

Credit where credit is due: The brilliant idea of portability came from this answer. However, there is one drawback to this method: files in different folders may need different relative paths to reach this file, and server-root-based addressing may not work if the current website is running from a subfolder of the main site.


3: Remote address authorisation

// Call the include from the base page(directly accessed):
$includeData = file_get_contents("http://127.0.0.1/component.php?auth=token");

// In the include files (where direct access isn't permitted):
$src = $_SERVER['REMOTE_ADDR']; // Get the source address
$auth = authoriseIP($src); // Authorisation algorithm
if( !$auth ) exit('Restricted Access');

The drawback of this method is isolated execution, unless a session token is provided with the internal request. Verify against the loop-back address in the case of a single-server configuration, or against an address white-list for a multi-server or load-balanced infrastructure.
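
A possible shape for the authoriseIP() helper used above (a sketch only; the white-list entries are placeholders):

// Allow the loop-back address for a single-server setup,
// or a white-list of internal addresses for multi-server setups.
function authoriseIP($ip) {
    $whitelist = array('127.0.0.1', '::1'); // add your internal server IPs here
    return in_array($ip, $whitelist, true);
}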


4: Token authorisation

Similar to the previous method, one can use GET or POST to pass an authorization token to the include file:

if($key!="serv97602"){header("Location: ".$dart);exit();}

A very messy method, but also perhaps the most secure and versatile at the same time, when used in the right way.
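
A slightly tidier sketch of the same token idea (assuming PHP 5.6+ for hash_equals(); the auth parameter name and the 403 response are illustrative choices, not part of the original snippet):

// Call the include from the base page (directly accessed):
$includeData = file_get_contents("http://127.0.0.1/component.php?auth=serv97602");

// In the include file (where direct access isn't permitted):
if (!isset($_GET['auth']) || !hash_equals('serv97602', $_GET['auth'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Restricted Access');
}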


5: Webserver specific configuration

Most servers allow you to assign permissions for individual files or directories. You could place all your includes in such restricted directories, and have the server configured to deny them.

For example, in Apache the per-directory configuration is stored in the .htaccess file.

Note, however, that I don't recommend server-specific configuration, because it hurts portability across different web servers. In cases like content management systems, where the deny algorithm is complex or the list of denied directories is rather long, it can make reconfiguration sessions rather gruesome. In the end it's best to handle this in code.


6: Placing includes in a secure directory OUTSIDE the site root

Least preferred because of access limitations in server environments, but a rather powerful method if you have access to the file-system.

//Your secure dir path based on server file-system
$secure_dir=dirname($_SERVER['DOCUMENT_ROOT']).DIRECTORY_SEPARATOR."secure".DIRECTORY_SEPARATOR;
include($secure_dir."securepage.php");

Logic:

  • The user cannot request any file outside the htdocs folder as the links would be outside the scope of the website's address system.
  • The php server accesses the file-system natively, and hence can access files on a computer just like how a normal program with required privileges can.
  • By placing the include files in this directory, you can ensure that the php server gets to access them, while hotlinking is denied to the user.
  • Even if the webserver's filesystem access configuration wasn't done properly, this method would prevent those files from becoming public accidentally.

Please excuse my unorthodox coding conventions. Any feedback is appreciated.

Glitch
42

The best way to prevent direct access to files is to place them outside of the web-server document root (usually, one level above). You can still include them, but there is no possibility of someone accessing them through an http request.

I usually go all the way, and place all of my PHP files outside of the document root aside from the bootstrap file - a lone index.php in the document root that starts routing the entire website/application.
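
As a rough sketch of that single-entry-point layout (the paths and the bootstrap file name are illustrative):

<?php
// public_html/index.php -- the only PHP file inside the document root.
// Everything else lives one level above, where no HTTP request can reach it.
require dirname(__DIR__) . '/app/bootstrap.php'; // hypothetical bootstrap file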

Eran Galperin
  • 3
    This is a great solution if you are able to do so. I only recently had to start working with shared webhosts and discovered one of many annoyances to be that everything must be inside the docroot. – Beau Simensen Jan 03 '09 at 19:29
  • 3
    In every hosting provider I worked with I always had access to (exactly) one level above the document root. – Eran Galperin Jan 03 '09 at 21:21
  • 3
    At some hosts (including my current one), you can point your domain to whichever folder you wish. – Dinah Jun 23 '09 at 15:43
  • 1
    This is a good alternative as well, using preg_match: `if ( preg_match("~globalfile\.php~i", $_SERVER['PHP_SELF'] ) ) { die('Appliance Security Alert - Direct Access Disallowed! Your IP has been logged!'); } // Stop further execution` – MarcoZen Nov 30 '17 at 15:13
  • That url https://devzone.zend.com/node/view/id/70 is a 404 now. The answer should include the code that was originally used from that non-existant url. – Funk Forty Niner May 08 '18 at 13:22
31

An alternative (or complement) to Chuck's solution would be to deny access to files matching a specific pattern by putting something like this in your .htaccess file

<FilesMatch "\.(inc)$">
    Order deny,allow
    Deny from all
</FilesMatch>
NullUserException
Kevin Loney
14

Actually my advice is to do all of these best practices.

  • Put the documents outside the webroot OR in a directory denied access by the webserver AND
  • Use a define in your visible documents that the hidden documents check for:
      if (!defined('INCL_FILE_FOO')) {
          header('HTTP/1.0 403 Forbidden');
          exit;
      }

This way if the files become misplaced somehow (an errant ftp operation) they are still protected.

jmucchiello
9

I had this problem once, solved with:

if (strpos($_SERVER['REQUEST_URI'], basename(__FILE__)) !== false) ...

but the ideal solution is to place the file outside of the web server's document root, as mentioned in another answer.

mati
8

I wanted to restrict access to the PHP file directly, but also be able to call it via jQuery $.ajax (XMLHttpRequest). Here is what worked for me.

// Only allow this script when it is called via XMLHttpRequest (e.g. jQuery $.ajax).
if (empty($_SERVER["HTTP_X_REQUESTED_WITH"]) || strtolower($_SERVER["HTTP_X_REQUESTED_WITH"]) != "xmlhttprequest") {
    if (realpath($_SERVER["SCRIPT_FILENAME"]) == __FILE__) { // direct access denied
        header("Location: /403");
        exit;
    }
}
Regolith
krasenslavov
7

You'd better build the application with a single entry point, i.e. all files should be reached from index.php.

Place this in index.php

define('A', true);

This check should run in each linked file (via require or include)

defined('A') or die(header('HTTP/1.0 403 Forbidden'));
user2221806
5

My answer is somewhat different in approach but includes many of the answers provided here. I would recommend a multipronged approach:

  1. .htaccess and Apache restrictions for sure
  2. defined('_SOMECONSTANT') or die('Hackers! Be gone!');

HOWEVER the defined or die approach has a number of failings. Firstly, it is a real pain in the assumptions to test and debug with. Secondly, it involves horrifyingly, mind-numbingly boring refactoring if you change your mind. "Find and replace!" you say. Yes, but how sure are you that it is written exactly the same everywhere, hmmm? Now multiply that with thousands of files... o.O

And then there's .htaccess. What happens if your code is distributed onto sites where the administrator is not so scrupulous? If you rely only on .htaccess to secure your files you're also going to need a) a backup, b) a box of tissues to dry your tears, c) a fire extinguisher to put out the flames in all the hatemail from people using your code.

So I know the question asks for the "easiest", but I think what this calls for is more "defensive coding".

What I suggest is:

  1. Before any of your scripts require('ifyoulieyougonnadie.php'); (not include() and as a replacement for defined or die)
  2. In ifyoulieyougonnadie.php, do some logic stuff - check for different constants, calling script, localhost testing and such - and then implement your die(), throw new Exception, 403, etc. (a rough sketch follows after this list)

    I am creating my own framework with two possible entry points - the main index.php (Joomla framework) and ajaxrouter.php (my framework) - so depending on the point of entry, I check for different things. If the request to ifyoulieyougonnadie.php doesn't come from one of those two files, I know shenanigans are being undertaken!

    But what if I add a new entry point? No worries. I just change ifyoulieyougonnadie.php and I'm sorted, plus no 'find and replace'. Hooray!

    What if I decided to move some of my scripts to do a different framework that doesn't have the same constants defined()? ... Hooray! ^_^
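
A rough sketch of what ifyoulieyougonnadie.php could contain (the entry-point names are the two mentioned above; the rest is illustrative, not the author's actual code):

<?php
// ifyoulieyougonnadie.php -- required at the top of every script.
$entryPoints = array('index.php', 'ajaxrouter.php'); // the known entry points
$included = get_included_files();
$first = basename($included[0]); // the script execution actually started from

if (PHP_SAPI !== 'cli' && !in_array($first, $entryPoints, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}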

I found this strategy makes development a lot more fun and a lot less:

/**
 * Hmmm... why is my netbeans debugger only showing a blank white page 
 * for this script (that is being tested outside the framework)?
 * Later... I just don't understand why my code is not working...
 * Much later... There are no error messages or anything! 
 * Why is it not working!?!
 * I HATE PHP!!!
 * 
 * Scroll back to the top of my 100s of lines of code...
 * U_U
 *
 * Sorry PHP. I didn't mean what I said. I was just upset.
 */

 // defined('_JEXEC') or die();

 class perfectlyWorkingCode {}

 perfectlyWorkingCode::nowDoingStuffBecauseIRememberedToCommentOutTheDie();
5
debug_backtrace() || die ("Direct access not permitted");
Unirgy
4

The easiest way is to set some variable in the file that calls include, such as

$including = true;

Then in the file that's being included, check for the variable

// empty() avoids an undefined-variable notice when the file is accessed directly
if (empty($including)) exit("direct access not permitted");
Kyle Cronin
3
<?php       
$url = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
  if (false !== strpos($url,'.php')) {
      die ("Direct access not permitted");
  }
?>
Matt Bettiol
3

What Joomla! does is define a constant in a root file and check whether it is defined in the included files:

defined('_JEXEC') or die('Restricted access');

Alternatively, you can keep all files out of reach of HTTP requests by placing them outside the webroot directory, as most frameworks such as CodeIgniter recommend.

You can also prevent direct access by placing an .htaccess file within the include folder and writing deny rules.

Antony
3

Besides the .htaccess way, I have seen a useful pattern in various frameworks, for example in Ruby on Rails. They have a separate pub/ directory in the application root, and the library directories live in directories at the same level as pub/. Something like this (not ideal, but you get the idea):

app/
 |
 +--pub/
 |
 +--lib/
 |
 +--conf/
 |
 +--models/
 |
 +--views/
 |
 +--controllers/

You set up your web server to use pub/ as document root. This offers better protection to your scripts: while they can reach out from the document root to load necessary components it is impossible to access the components from the internet. Another benefit besides security is that everything is in one place.

This setup is better than just adding checks to every single included file, because an "access not permitted" message is a clue to attackers, and it is better than an .htaccess configuration because it is not white-list based: even if you get the file extensions wrong, nothing in the lib/, conf/, etc. directories will be visible.

bandi
  • After a long time I just want to comment that the model you describe above is called the MVC (Model-View-Controller) pattern. If you please, check Google and add some more info to your answer. Also, MVC is supported not only by Ruby on Rails and ASP.NET applications, but also by PHP frameworks (see Laravel, CakePHP). –  Feb 15 '16 at 11:45
2

To be more precise, you should use this condition:

if (array_search(__FILE__, get_included_files()) === 0) {
    echo 'direct access';
}
else {
    echo 'included';
}

get_included_files() returns an indexed array containing the names of all included files (if a file is being executed, then it was included and its name is in the array). So, when the file is accessed directly, its name is the first one in the array; all the other files in the array were included.

Oleg Lokshyn
2

Storing your include files outside the web accessible directory has been mentioned a few times, and is certainly a good strategy where possible. However, another option I have not yet seen mentioned: ensure that your include files don’t contain any runnable code. If your include files merely define functions and classes, and have no code other than that, they will simply produce a blank page when accessed directly.

By all means allow direct access to this file from the browser: it won’t do anything. It defines some functions, but none of them are called, so none of them run.

<?php

function a() {
    // function body
}

function b() {
    // function body
}

The same applies to files which contain only PHP classes, and nothing else.


It’s still a good idea to keep your files outside of the web directory where possible.

  • You might accidentally deactivate PHP, in which case your server may send content of the PHP files to the browser, instead of running PHP and sending the result. This could result in your code (including database passwords, API keys, etc.) leaking.
  • Files in the web directory are squatting on URLs you may want to use for your app. I work with a CMS which cannot have a page called system, because that would conflict with a path used for code. I find this annoying.
TRiG
1
if (basename($_SERVER['PHP_SELF']) == basename(__FILE__)) { die('Access denied'); };
andy
1
<?php
// eregi() was removed in PHP 7; use a case-insensitive preg_match() instead.
if (preg_match("~YOUR_INCLUDED_PHP_FILE_NAME~i", $_SERVER['PHP_SELF'])) { 
 die("<h4>You don't have the right permissions to access this file directly.</h4>");
}
?>

Place the code above at the top of your included PHP file.

ex:

<?php
// eregi() was removed in PHP 7; use a case-insensitive preg_match() instead.
if (preg_match("~some_functions\.php~i", $_SERVER['PHP_SELF'])) {
    die("<h4>You don't have the right permissions to access this file directly.</h4>");
}

    // do something
?>
  • if ( preg_match("~globalfile\.php~i", $_SERVER['PHP_SELF'] ) ) { die('Appliance Security Alert - Direct Access Disallowed! Your IP has been logged!'); } // Stop further execution (where ~ is the delimiter) – MarcoZen Nov 30 '17 at 15:12
1

The following code is used in the Flatnux CMS (http://flatnux.altervista.org):

if ( strpos(strtolower($_SERVER['SCRIPT_NAME']),strtolower(basename(__FILE__))) )
{
    header("Location: ../../index.php");
    die("...");
}
JohnRDOrazio
1

Do something like:

<?php
if ($_SERVER['SCRIPT_FILENAME'] == '<path to php include file>') {
    header('HTTP/1.0 403 Forbidden');
    exit('Forbidden');
}
?>
kmkaplan
1

I found this PHP-only, environment-independent solution, which works with both HTTP and CLI:

Define a function :

function forbidDirectAccess($file) {
    $self = getcwd()."/".trim($_SERVER["PHP_SELF"], "/");
    (substr_compare($file, $self, -strlen($self)) != 0) or die('Restricted access');
}

Call the function in the file you want to prevent direct access to :

forbidDirectAccess(__FILE__);

Most of the solutions given above for this question do not work in CLI mode.

Ka.
0

I suggest not relying on $_SERVER for this, for security reasons.
You can set a variable like $root = true; in the first file, the one that includes the other.
Then check isset($root) at the beginning of the second (included) file.

M Rostami
0

What you can also do is password-protect the directory and keep all your PHP scripts in there, except of course the index.php file. No password is needed at include time, since it is only required for HTTP access. This also gives you the option to access those scripts directly if you ever want to, because you will have the password for that directory. You will need to set up an .htaccess file for the directory and a .htpasswd file to authenticate the user.

You can also use any of the solutions provided above if you feel you don't need to access those files normally, because you can always get to them through cPanel etc.

Hope this helps

RohitG
0

The easiest way is to store your includes outside of the web directory. That way the server has access to them but no outside machine does. The only downside is that you need to be able to access that part of your server. The upside is that it requires no setup, configuration, or additional code/server stress.

0

I didn't find the suggestions involving .htaccess that good, because they may block other content in that folder which you might want users to be able to access. This is my solution:

$currentFileInfo = pathinfo(__FILE__);
$requestInfo = pathinfo($_SERVER['REQUEST_URI']);
if($currentFileInfo['basename'] == $requestInfo['basename']){
    // direct access to file
}
talsibony
0

The earlier-mentioned solution, with a PHP version check added:

    $max_includes = version_compare(PHP_VERSION, '5', '<') ? 0 : 1;
    if (count(get_included_files()) <= $max_includes)
    {
        exit('Direct access is not allowed.');
    }
  • 2
    I don't really understand how this can prevent direct access – Adam Lindsay Oct 07 '16 at 14:54
  • @adam-lindsay `get_included_files` returns the included files as an array, and `count` turns that array into a number. There should be more than one included file; if there is only one, it means this file was called directly. – a55 Dec 04 '21 at 23:34
0

You can use phpMyAdmin Style:

/**
 * block attempts to directly run this script
 */
if (getcwd() == dirname(__FILE__)) {
    die('Attack stopped');
}
namco
0

You can use the method below, although it does have a flaw: it can be faked, unless you add another line of code to make sure the request comes only from your server, for example by using JavaScript. You can place this code in the body section of your HTML code, so the error shows there.

<?php
if(!isset($_SERVER['HTTP_REQUEST'])) { include ('error_file.php'); }
else { ?>

Place your other HTML code here

<?php } ?>

End it like this, so the output of the error will always show within the body section, if that's how you want it to be.

Charming Prince
  • I understand that all HTTP_* server headers are not to be trusted, so you better not use this method. – andreszs Apr 03 '17 at 15:01
-1
if ( ! defined('BASEPATH')) exit('No direct script access allowed');

will do the job smoothly.

Varshaan
  • 3
    Copy-paste from CodeIgniter. That's cool, but it actually doesn't do anything on its own. The `BASEPATH` constant is set in an `index.php` file that sits at the root of the tree structure. CI rewrites the URLs, so there is no need to access the scripts directly anyway. – jimasun Oct 28 '16 at 15:37
  • I know there is no need, but just in case anyone tries to do so. – Varshaan Oct 28 '16 at 15:43
-1

Redirect from that file to some other page (like index.html).

.htaccess:

Redirect 301 LINK_TO_YOUR_PHP LINK_TO_INDEX.HTML
Benur21
-7

You can also try renaming the document you don't want people to be able to access. You could rename it to 47d8498d3w.php, for instance; just make up something that people are unlikely to type as an HTTP request. If you include the file with SSI or PHP, the user won't be able to see the name of the document anyway.