
I'm having trouble finding a script like this; it should be simple, but I can't find one.

All I want is a script that takes every image in a folder and displays it on a PHP page, so the script would sit on that page and show the images there too.

But I would like it to grab them from a web server other than mine, using a link.

So let's say I wanted to grab all the images from "http://www.sitename/images/".

The script would take everything in that images folder and place it onto the page, showing the images side by side for any and all image types in that one folder. If you open the page in a browser two days later and there are new images, it should show the new ones; if some have been deleted, it should no longer show those. Basically, it should show the folder's current images, rendered on the PHP page as actual images.

I can't seem to find anything like it, but it looks like a small, simple script.

Does anyone know how I can achieve this?

hakre
Lane

2 Answers


If I understand you correctly you're asking for a pointer on how to get the directory listing of a remote folder.

There are many options:

  • If you have FTP/SFTP access, you could use it to get the list of files in that folder.
  • If you have access but don't like the idea of FTPing every time, you could upload a (PHP) script that returns the directory's contents in JSON or XML format (or a simple format with just one filename per line).
  • If you don't have access but the web server provides a directory listing, then you have to parse the HTML output.
  • If the web server does not provide a directory listing of the contents, then there is no way to get the file list (short of hacking into the web server).
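The second option above can be sketched as a pair of small scripts. Everything here is an assumption for illustration: the folder name `images`, the script names `list.php` and `gallery.php`, and the placeholder host from the question. The consumer also assumes `allow_url_fopen` is enabled on your server.

```php
<?php
// list.php — upload this to the REMOTE server, next to its images folder.
// It emits the image filenames in that folder as a JSON array.
header('Content-Type: application/json');
$files = array();
foreach ((array) glob(__DIR__ . '/images/*.{gif,jpg,jpeg,png}', GLOB_BRACE) as $path) {
    $files[] = basename($path);
}
echo json_encode($files);
```

```php
<?php
// gallery.php — runs on YOUR server; fetches the JSON list and prints <img> tags.
// Each page load re-fetches the list, so new/deleted files show up automatically.
$base = 'http://www.sitename/images/';                       // placeholder from the question
$json = @file_get_contents('http://www.sitename/list.php');  // hypothetical listing script
$list = ($json !== false) ? json_decode($json, true) : array();
foreach ((array) $list as $name) {
    printf('<img src="%s" alt="">', htmlspecialchars($base . rawurlencode($name)));
}
```

This only works if you can put `list.php` on the remote host; it does not help for a server you have no access to.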
vstm
  • no, not like that. I would like to point it at a link like (real link here) "http://www.itsover9000.net/DigiChat/DigiClasses/Resources/Default/userIcons/", which is a link to a folder with nothing but images in it (.gif, .jpeg, .png and so forth). I want the PHP script to go to that link, grab the images, and place them on a PHP page as actual images, from first to last, for whatever link is inserted in the script – Lane Aug 28 '11 at 07:32
  • 1
    you cannot do that ``Forbidden You don't have permission to access /DigiChat/DigiClasses/Resources/Default/userIcons/ on this server.``, read 4th line of @vstm explanation! – Mihai Iorga Aug 28 '11 at 07:35
  • That link doesn't work. And I wish SO would notify me when new answers are posted like it used to... is something broken? – DaveRandom Aug 28 '11 at 07:36
  • I've seen this before, but it was a program: you add the link and it saves the folder's images into a file on your PC. I want the same thing, but instead of saving to a file, print the data on my PHP page – Lane Aug 28 '11 at 07:36
  • For example, if I were to use that program on that link, it would grab the images and save them to my desktop. I don't see why there can't be a PHP script that prints the data on a PHP page instead of saving it to a folder on my desktop – Lane Aug 28 '11 at 07:38
  • 2
    that program may have parsed the HTML on the site to obtain a list of images which is a) very hard and time consuming, and b) not guaranteed to get every image. – DaveRandom Aug 28 '11 at 07:39
  • @Lane, well there is the possibility of scraping the whole website for images, like `wget -r`. But this is a time-intensive process; see the answer to [How to implement a web scraper in PHP?](http://stackoverflow.com/questions/26947/how-to-implement-a-web-scraper-in-php). Edit: if I had to do this I would do it in a cron script and run it daily (depending on how "fresh" you need the content) – vstm Aug 28 '11 at 07:42

You can only grab a list of files from another site if you know the names of the files, or if the server provides some form of API to allow you to obtain a list of all the file names.

This is because HTTP provides no built-in method to list the files in a directory. It is not designed as a direct alternative to FTP and has fewer 'commands' (only 8 request methods are defined by the RFC, one of them very vaguely, and only 2 of them, GET and POST, are commonly used by browsers). None of these will allow you to list the files in a directory.

Many web servers will return an HTML page that lists the files in a directory by default if you request the root of a directory. In this case, you could parse this page to obtain a list of files, and then download the files.
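A minimal sketch of that parsing approach is below, assuming the server does return an auto-index page (the question's own example host returns 403 Forbidden, in which case this cannot work). The URL is the question's placeholder; the extension filter is an assumption covering the types Lane mentioned. Because the listing is re-fetched on every request, the page always reflects the folder's current contents.

```php
<?php
// Fetch a directory-index page and render every image it links to.
$dir = 'http://www.sitename/images/';          // placeholder from the question

$html = @file_get_contents($dir);              // requires allow_url_fopen
if ($html === false) {
    die('Could not fetch the directory listing (no listing, or access forbidden).');
}

$doc = new DOMDocument();
@$doc->loadHTML($html);                        // suppress warnings from sloppy index markup

foreach ($doc->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    // Keep only links that look like image files.
    if (preg_match('/\.(gif|jpe?g|png)$/i', $href)) {
        printf('<img src="%s" alt="">' . "\n",
               htmlspecialchars($dir . ltrim($href, '/')));
    }
}
```

Note this depends entirely on the server's listing format; a server that disables auto-indexing (as in the comments above) defeats it.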

Alternatively, if you have FTP access to the folder, it would be possible to use PHP to obtain the list of files and then download them.
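If you do have FTP credentials, PHP's built-in FTP functions can list the folder directly. A sketch, with placeholder host, credentials, and folder path:

```php
<?php
// Sketch: list a remote folder over FTP. Host, user, password, and
// the '/images' path are placeholders — substitute your own.
$conn = ftp_connect('ftp.example.com');
if ($conn === false || !ftp_login($conn, 'user', 'password')) {
    die('FTP connection or login failed.');
}
ftp_pasv($conn, true);                  // passive mode works better behind NAT/firewalls

$files = ftp_nlist($conn, '/images');   // plain name listing of the remote folder
ftp_close($conn);

if ($files === false) {
    die('Could not list the remote directory.');
}
foreach ($files as $f) {
    echo basename($f), "\n";            // strip any leading path the server includes
}
```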

If you already have the file list and you just need to download them, have a look at the cURL extension.
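For the download step, a minimal cURL sketch, given one already-known file URL (the URL below is a placeholder built from the question's example):

```php
<?php
// Download one known file with cURL and save it under its own filename.
$url = 'http://www.sitename/images/photo.jpg';   // placeholder — a URL you already know

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
$data   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($data !== false && $status === 200) {
    file_put_contents(basename($url), $data);    // e.g. saves as photo.jpg
}
```

For the original goal of just *displaying* the remote images, no download is needed at all: printing `<img src="…">` tags pointing at the known URLs is enough.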

DaveRandom
  • http://www.itsover9000.net/DigiChat/DigiClasses/Resources/Default/userIcons/ << for a link like this, could the images be grabbed from this page even though it's forbidden? Because I can grab the images with an image-grabbing program even though it's forbidden – Lane Aug 28 '11 at 07:45
  • @Lane: could you use your existing image-grabbing program on your web server? Or could you upload the output of your existing program so that your PHP script can use it? Otherwise you could contact the author of the program – maybe he's giving out his secret sauce? – vstm Aug 28 '11 at 08:09
  • http://forusoftware.com/help/ie_picture_downloader.htm << the program I used to download all the images in any web folder. – Lane Aug 28 '11 at 08:18