I have a form set up that, when submitted, uses an ajax call to retrieve data via a PHP file that in turn scrapes data from a given URL based on the input field value.

Everything is working perfectly, but what I'd like to do now is implement a couple of additional features.

1) After the initial form submission, I'd like it to auto-update the query at set intervals (chosen by the end user). If possible, I'd like to prepend the new results above the old results.

2) When new results are returned, I'd like a notification in the page title to inform the user (think Facebook and its notification alerts).

Current jQuery/Ajax code:

form.on('submit', function(e) {
  e.preventDefault(); // prevent default form submit

  $.ajax({
    url: 'jobSearch.php',   // form action url
    type: 'POST',           // form submit method get/post
    dataType: 'html',       // expected response type html/json/xml
    data: form.serialize(), // serialize form data
    beforeSend: function() {
      alert.fadeOut();
      submit.val('Searching....'); // change submit button text
    },
    success: function(data) {
      $('#container').css('height', 'auto');
      alert.html(data).fadeIn(); // fade in response data
      submit.val('Search!');     // reset submit button text
    },
    error: function(err) {
      console.log(err);
    }
  });
});

I'm not too sure how I'd go about this; could anyone give me some insight? I'm not asking for somebody to complete it for me, just for a bit of guidance on what approach I should use.

EDIT - jobSearch.php

<?php
error_reporting(E_ALL);
include_once("simple_html_dom.php");

// Build the search URL from the posted query (if one was supplied)
$sq = isset($_POST['sq']) ? $_POST['sq'] : '';
$sq = str_replace(' ', '-', $sq);

if (!empty($sq)) {
    $url = 'http://www.peopleperhour.com/freelance-' . $sq . '-jobs?remote=GB&onsite=GB&filter=all&sort=latest';
} else {
    $url = 'http://www.peopleperhour.com/freelance-jobs?remote=GB&onsite=GB&filter=all&sort=latest';
}

// Fetch and parse the page with simple_html_dom
$html = file_get_html($url);

// Result summary shown above the listings
$jobs = $html->find('div.job-list header aside', 0);
echo $jobs . "<br/>";

// Echo each job listing as raw HTML
foreach ($html->find('div.item-list div.item') as $div) {
    echo $div . '<br />';
}
?>
2 Answers


If I understand correctly...

For updating, you can try something like this:

var refresh_rate = 2500; // milliseconds between refreshes

function refresh_data() {
    // - - - do some things here - - -
    setTimeout(refresh_data, refresh_rate); // pass the function reference, not the result of calling it
}

You can read more about setTimeout here.
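For instance, a rough sketch along those lines, assuming the question's $.ajax call gets pulled out into its own function (called fetchJobs() here, purely for illustration) and the end user picks the interval from a hypothetical #refreshInterval select:

var refresh_rate = 2500; // default interval in milliseconds

// hypothetical element where the end user picks the refresh interval
$('#refreshInterval').on('change', function() {
    refresh_rate = parseInt($(this).val(), 10);
});

function refresh_data() {
    fetchJobs(); // the existing AJAX request, moved into its own function
    setTimeout(refresh_data, refresh_rate); // schedule the next run
}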

Hope I helped you.


Question 1:

You can wrap your current ajax code in a setInterval(), which will let you keep polling jobSearch.php for results. Something like:

function refreshPosts(interval) {
  return setInterval(pollData, interval); 
}

function pollData() {
  /* Place current AJAX code here */
}

var timer = refreshPosts(3000);

This has the added benefit that you can call clearInterval(timer) to stop auto-updating.
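A quick usage sketch of that stop/start cycle (the 10000 is just an example value a user might pick):

// stop auto-updating
clearInterval(timer);

// restart later with a different interval
timer = refreshPosts(10000);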

Appending the data instead of replacing the data is trickier. The best way, honestly, requires rewriting your screen scraper to return JSON objects rather than pure HTML. If you were to return an object like:

{
  "posts": [
    // filled with strings of HTML
  ]
}

You now have an array that can be sorted, filtered, and indexed. This gives you the power to compare one post to another to see if it is old or fresh.
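As a rough client-side sketch of that comparison, assuming the request now uses dataType: 'json', the response is shaped like the object above, and alert is the results container from the question (keying on the raw HTML string is crude, but it shows the idea):

var seenPosts = {};  // post strings we've already rendered
var freshCount = 0;  // running count of unseen posts, handy for the title below

function handleResponse(response) {
  // response.posts arrives as a JavaScript array when dataType is 'json'
  var fresh = $.grep(response.posts, function(postHtml) {
    return !seenPosts[postHtml];
  });

  $.each(fresh, function(i, postHtml) {
    seenPosts[postHtml] = true;
    alert.prepend(postHtml); // newest results go above the old ones
  });

  freshCount += fresh.length;
}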

Question 2:

If you rewrite things as I suggested above, then this is as easy as keeping a count of the fresh posts and rewriting the title:

$('title').html(postCount + ' new job postings!');
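And if you want the Facebook-style behaviour of clearing the alert once the user comes back to the tab, a small sketch (the base title 'Job Search' is just a placeholder, and freshCount is the counter from the sketch above):

function updateTitleNotification(count) {
  // e.g. "(3) Job Search" while new posts are waiting
  document.title = count > 0 ? '(' + count + ') Job Search' : 'Job Search';
}

// reset the notification once the user looks at the page again
$(window).on('focus', function() {
  freshCount = 0;
  updateTitleNotification(freshCount);
});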

Hope that helps!

  • I've hit a brick wall. I'm not that clued up on JSON. I've managed to get the PHP file to respond with a JSON array, but I'm stumped as to how to decode that JSON client-side. [link](http://pastebin.com/SUa3NycF) – MikeF Dec 10 '13 at 10:31
  • You can parse the JSON client side with `JSON.parse(/*response string here*/)`. If your browser is modern or >= IE8 there is a native JSON global object; otherwise you have to include the lib, which can be found [here](https://github.com/douglascrockford/JSON-js) – srquinn Dec 10 '13 at 18:22
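For what it's worth, a minimal sketch of both options from that comment (rawResponse is just a placeholder for whatever string the server returned):

// Option 1: let jQuery parse the response for you
$.ajax({
  url: 'jobSearch.php',
  type: 'POST',
  dataType: 'json',              // jQuery parses the body as JSON
  data: form.serialize(),
  success: function(response) {
    console.log(response.posts); // already a JavaScript array
  }
});

// Option 2: parse a raw string yourself
var response = JSON.parse(rawResponse);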