Is anyone using a cloud to provide access to their RETS MLS images so that the public can access them at will, without a password? I currently store the images on my server for direct public access through my website, but I need a lot more space and am just not sure how to get the images into the cloud and make them viewable through my website. I download them from the RETS server every 15 minutes and store them on my server. So, what I need is the procedure to convert this storage to a cloud service, knowing there may be some password requirements to access the cloud.
-
You could upload them to Amazon S3, but you will need to check your MLS provider's policies to confirm that storing the RETS images on S3 or another cloud provider complies with their rules. – Andrew Briggs Dec 28 '16 at 17:09
-
I am aware that it can be done. I am just looking for suggestions from those who are doing it, like what's involved. Is it really manageable to have 60,000 directories, since each image needs its own directory? That can get real nasty fast since you must delete them frequently. But thanks for the suggestion. – Tom Chambers Dec 29 '16 at 23:13
-
Why don't you use S3 and have a key prefix (which acts like a folder; S3 limits how many actual buckets an account can have) for each MLS ID? The prefix contains the pictures. Every time there is a photo update for that MLS ID, delete the objects under the prefix and upload the new photos. Then, in the same update script, store the URLs of the S3 image links in the database. – Andrew Briggs Dec 30 '16 at 23:25
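To make that update cycle concrete, here is a minimal sketch using boto3 (the AWS SDK for Python). The bucket name, region, and key layout are assumptions for illustration, not anything from this thread:

```python
def public_url(bucket, region, key):
    """Build the public (virtual-hosted-style) HTTPS URL for an S3 object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def photo_key(mls_id, index):
    """Assumed key scheme: one prefix per MLS ID, e.g. 'photos/12345678/0.jpg'."""
    return f"photos/{mls_id}/{index}.jpg"

def sync_listing_photos(mls_id, photos, bucket="example-mls-photos", region="us-east-1"):
    """Delete the old objects under the listing's prefix, upload the new
    photos, and return the public URLs to store in the database."""
    import boto3  # imported here so the pure helpers above work without the SDK
    s3 = boto3.client("s3", region_name=region)
    prefix = f"photos/{mls_id}/"

    # 1. Delete everything currently under the listing's prefix.
    listed = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    if listed.get("Contents"):
        s3.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": o["Key"]} for o in listed["Contents"]]},
        )

    # 2. Upload the fresh photos and collect their public URLs.
    urls = []
    for i, data in enumerate(photos):
        key = photo_key(mls_id, i)
        s3.put_object(Bucket=bucket, Key=key, Body=data, ContentType="image/jpeg")
        urls.append(public_url(bucket, region, key))

    # 3. The caller persists these URLs in the listings database.
    return urls
```

Note that `list_objects_v2` returns at most 1,000 keys per call, which is plenty for a single listing's photos; a larger cleanup job would need pagination.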
2 Answers
We are using Azure Storage (https://learn.microsoft.com/en-us/azure/storage/) to cache all of the images for one of our products. We sort of "lazy load" the images into it: if a request is made for an image, we pull it from our cloud storage (where the image is made public) and stream it from there much faster than over a RETS media request. If we ever have to make an on-demand request to a RETS server for an image, we immediately cache it. We'll also pre-fetch images for slower MLSes. Images have been a severe performance bottleneck when working with RETS systems for us.
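The "lazy load" flow described above is a cache-aside pattern. A rough sketch, with a plain dict standing in for the cloud blob store and a stub for the slow RETS media request (all names here are illustrative, not from our actual product):

```python
class LazyImageCache:
    """Cache-aside: serve from cloud storage if present, otherwise fetch
    from the RETS server once and cache the result for next time."""

    def __init__(self, fetch_from_rets, store=None):
        self.fetch_from_rets = fetch_from_rets   # slow path (RETS media request)
        self.store = store if store is not None else {}  # stand-in for Azure Blob / S3
        self.misses = 0

    def get(self, image_key):
        blob = self.store.get(image_key)
        if blob is None:                         # cache miss: hit the RETS server...
            self.misses += 1
            blob = self.fetch_from_rets(image_key)
            self.store[image_key] = blob         # ...and cache it immediately
        return blob


def slow_rets_fetch(image_key):
    # Stub; real code would issue a RETS GetObject request here.
    return b"jpeg-bytes-for-" + image_key.encode()

cache = LazyImageCache(slow_rets_fetch)
cache.get("mls123/photo0")   # first request: fetched from RETS, then cached
cache.get("mls123/photo0")   # second request: served straight from the cache
```

Pre-fetching for slow MLSes is just calling `get` ahead of user demand so the first real request already hits the fast path.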
This also lets us do some cool stuff, like resizing images on the fly pretty quickly for our customers, so they don't have to waste bandwidth downloading full-size images if that's not what they want.
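On-the-fly resizing mostly comes down to preserving the aspect ratio; the actual pixel work would be handled by an imaging library such as Pillow. A small helper for the dimension math (the function name and bounding-box semantics are my own illustration, not part of our product):

```python
def fit_within(width, height, max_w, max_h):
    """Scale (width, height) down to fit inside a (max_w, max_h) bounding
    box, preserving aspect ratio; never scale up."""
    scale = min(max_w / width, max_h / height, 1.0)
    return max(1, round(width * scale)), max(1, round(height * scale))

# A 4000x3000 original requested as an 800x800 thumbnail:
fit_within(4000, 3000, 800, 800)  # → (800, 600)
```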
Here is a link to our open source SDK, showing how our customers end up using the image service: https://github.com/timitek/getrets-php-sdk#imageurl
Amazon S3 is a better solution for us. We have around 18 TB of data in it, and, like you said, "each image needs its own directory": we have millions of directories for 400+ MLSes.
Everything works fine, there is no delay, and it's scalable too.
Note: we receive the raw binary data for the MLS images, write it directly to S3, and construct the image URLs from the object keys.
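Writing the raw bytes straight through looks roughly like this with boto3; the bucket name, region, and key layout are assumptions for the sketch, not our production values:

```python
def object_url(bucket, region, key):
    """Virtual-hosted-style public URL for an S3 object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def store_image(raw_bytes, mls_id, photo_index,
                bucket="example-mls-photos", region="us-east-1"):
    """Write the binary image data returned by the RETS server directly
    to S3 and return the public URL to save alongside the listing."""
    import boto3  # deferred so object_url stays usable without the SDK installed
    key = f"{mls_id}/{photo_index}.jpg"
    boto3.client("s3", region_name=region).put_object(
        Bucket=bucket,
        Key=key,
        Body=raw_bytes,
        ContentType="image/jpeg",  # so browsers render the image inline
    )
    return object_url(bucket, region, key)
```

Setting `ContentType` at upload time matters: without it, S3 serves the object as `binary/octet-stream` and some browsers will download rather than display it.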