
I would like to know if it's possible to make pages invisible to users with .htaccess or something else.

Some background info and some other questions:

I'm building a web application that does a lot of dynamic content loading. At the moment I'm just making AJAX calls to an ajax.php file, passing a POST parameter named action. Depending on this action, I call a method of a class, and that class then echoes JSON or HTML.
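
For illustration, a minimal sketch of such a dispatcher, with hypothetical class and method names, could look like this:

    <?php
    // ajax.php - dispatcher sketch (all names here are hypothetical)

    // Whitelist of allowed actions mapped to class/method pairs, so a
    // request can't invoke arbitrary code by guessing names.
    $actions = array(
        'getclientinfo' => array('ClientHandler', 'getClientInfo'),
    );

    $action = isset($_POST['action']) ? $_POST['action'] : '';

    if (!isset($actions[$action])) {
        header('HTTP/1.1 400 Bad Request');
        exit;
    }

    list($class, $method) = $actions[$action];
    require_once $class . '.class.php'; // e.g. ClientHandler.class.php

    $handler = new $class();
    $handler->$method(); // the method echoes JSON or HTML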

But I'm getting to the point where I have functions that return almost a complete HTML page (with SQL and PHP mixed in), which looks kind of dirty. Is it a better option to build separate .php pages for this content, and then load the content from those pages? (I use jQuery, so that would be the .load function as a callback.)

The problem I see with this method is that people could find those .php pages and access them directly, and then see information that's not relevant and has no layout at all. So I want to block those pages. How do I do this with .htaccess?

Now if you think I'm doing things completely wrong, just tell me. A basic situation in my implementation: fill in a client number in a search form -> press enter -> AJAX call with POST parameters action: getclientinfo, id: the value in the field -> ajax.php picks the correct class and method -> the .class.php executes its method and checks whether the input is OK; if so, it constructs a page with content from the database -> all of this gets returned as JSON, something like:

{ "success": true, "content": "the complete content" }

-> then with jQuery I check the success boolean and load the content into the current page.
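
As a sketch of the handler side of that flow (the class name, table, columns, and PDO connection details below are all hypothetical placeholders):

    <?php
    // ClientHandler.class.php - hypothetical sketch of the flow above

    class ClientHandler
    {
        public function getClientInfo()
        {
            header('Content-Type: application/json');

            $id = isset($_POST['id']) ? (int) $_POST['id'] : 0;
            if ($id <= 0) {
                echo json_encode(array('success' => false, 'content' => ''));
                return;
            }

            // Assumed PDO connection and table layout.
            $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
            $stmt = $pdo->prepare('SELECT name, email FROM clients WHERE id = ?');
            $stmt->execute(array($id));
            $client = $stmt->fetch(PDO::FETCH_ASSOC);

            if (!$client) {
                echo json_encode(array('success' => false, 'content' => ''));
                return;
            }

            // Build the HTML fragment that jQuery will inject into the page.
            $html = '<h2>' . htmlspecialchars($client['name']) . '</h2>'
                  . '<p>' . htmlspecialchars($client['email']) . '</p>';

            echo json_encode(array('success' => true, 'content' => $html));
        }
    }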

Is this a good method, or am I doing things badly? The thing I'm not comfortable with is the ajax.php page. What's usually done?

Thanks in advance. I've gotten many answers from this website; this is the first time I'm asking a question myself.

Anonymous
  • Are you trying to prevent users from snooping around and accessing information they shouldn't be privileged to have - or - are you trying to stop robots from scraping your site? – pp19dd Mar 13 '12 at 20:26
  • If I made separate PHP pages, they wouldn't have a header or a nice layout. So I just don't want people to see them. But I'm also interested to know how to block robots. I saw something about a robots.txt? – Anonymous Mar 13 '12 at 20:29
  • Gotcha. Seems like you're dealing with an organizational problem. Let me see if I can compose a longer reply below. – pp19dd Mar 13 '12 at 21:21

1 Answer


Seems like you have two problems: access control and organization.

It'd be best to lock access to pages programmatically with a rudimentary authentication level (in PHP, using session data). Otherwise, the most basic form of access control is to restrict with Apache's .htaccess file, but rules for query strings / POST data can be confusing. See this example: Is it possible to redirect post data?
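
A minimal sketch of that programmatic check, assuming a login script that sets a hypothetical $_SESSION['user_id'] key:

    <?php
    // auth_check.php - session guard sketch; include it at the top of any
    // page that shouldn't be publicly reachable. Assumes your login code
    // already sets $_SESSION['user_id'] (a hypothetical key).

    session_start();

    if (empty($_SESSION['user_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied.');
    }

Each protected page then starts with require_once 'auth_check.php'; before producing any output.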

Since that kind of locking down is complex, it might be logical to dump AJAX and render full, complete pages instead.

Either way, what you seem to be conceptually illustrating is a need to separate content from logic. This is good. However, it's time to take your development process to the next level.

Rewrite your code so that only logic and data querying take place in your PHP files. Then output all the HTML in your pages using the Smarty library: http://www.smarty.net/docs/en/

The Smarty templating system is pretty extensible. It has inheritance capabilities so you can produce page variations, template includes, and otherwise run circles around HTML. A handful of Smarty statements can make your life so much easier.
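
As a rough sketch of that split (file names and data here are placeholders), the PHP file queries and assigns data, while a .tpl file owns the markup:

    <?php
    // client_page.php - logic only; the markup lives in client.tpl
    // (file names and example data are placeholders)

    require_once 'libs/Smarty.class.php';

    $smarty = new Smarty();

    // ...query the database here, then hand the result to the template...
    $client = array('name' => 'Example Client', 'email' => 'client@example.com');

    $smarty->assign('client', $client);
    $smarty->display('client.tpl'); // the template prints {$client.name} etc.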

pp19dd
  • I reduced my AJAX and now almost always render complete pages. I also protected my "admin part" with .htaccess; is that a bad way to protect my admin content, or is it pretty safe as long as the password is safe? And Smarty is pretty cool, thanks ;) Accepted your answer – Anonymous May 14 '12 at 02:35