
I have a callback function which hits a WordPress API multiple times and concatenates an array of posts of a specific category from the responses. It works fine, however when it gets to the last call it adds some duplicates to the array, which I think is causing problems further down the line.

I've tried to remove these duplicates using a Set, but for some reason the duplicates are still being returned.

Below I'm logging the length of the array.

At the time of writing there are:

40 unique posts of category blog

257 unique posts of category features

however for the features category the array still contains 300 items.

Does anyone have any idea why this might be happening?

function getPostsOfType(category) {
  return new Promise((resolve, reject) => {
    const getData = (category, number = 0, page = 0) =>
      fetch(`https://public-api.wordpress.com/rest/v1/sites/www.accessaa.co.uk/posts?category=${category}&number=${number}&page=${page}&order_by=date`)
        .then(res => res.json());

    const found = (category) => getData(category).then(json => json.found);

    found(category)
      .then((value) => {
        return Math.ceil(value / 100);
      })
      .then((callsToMake) => {
        const tasks = [];
        for (let i = 0; i < callsToMake; i++) {
          tasks.push(getData(category, 100, i)); // <-- fill the tasks array with promises that will eventually resolve to a page of posts
        }
        return Promise.all(tasks); // <-- run these requests in parallel and return an array of the resolved values of the N promises
      })
      .then((arrOfPosts) => {
        let allPosts = [];
        for (const elem of arrOfPosts)
          allPosts = allPosts.concat(elem.posts);
        console.log(Array.from(new Set(allPosts)).length);
        resolve(Array.from(new Set(allPosts)));
      })
      .catch((err) => {
        console.log(err);
        reject(err);
      });
  });
}

getPostsOfType('blog') // logs 40
getPostsOfType('features') // should log 257(ish) but still logs 300
synj
    The object references will not be identical, even though the data inside may be - hence why you're getting duplicates in a Set. You'll need to map them yourself based on id or some other property that determines whether the object is unique or a duplicate – mhodges Mar 28 '18 at 17:45
  • Possible duplicate of [Remove duplicates from an array of objects in JavaScript](https://stackoverflow.com/questions/2218999/remove-duplicates-from-an-array-of-objects-in-javascript) – mhodges Mar 28 '18 at 18:02
  • Thanks for your help. I've tried solving the problem using [lodash.uniqBy](https://lodash.com/docs/4.17.5#uniqBy) like so: `uniqBy(allPosts, 'ID')`, however now I'm only getting 200 posts back. – synj Mar 28 '18 at 18:23
  • Try `_.uniqWith(allPosts, _.isEqual);` That will do a deep comparison to determine if all values of the objects are equal. Are you sure you have more than 200 unique IDs? You can check that easily with `new Set(allPosts.map(post => post.ID)).size` – mhodges Mar 28 '18 at 19:43
  • Also, if your posts for different categories are stored in different DB tables, they can very well have duplicate IDs. i.e. `{id: 1, category: 'blog'}, {id: 1, category: 'features'}`. If that is the case, you will have to do a uniq by id & category (or any other fields that guarantee uniqueness) – mhodges Mar 28 '18 at 19:45
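To illustrate the point raised in the first comment: a `Set` compares objects by reference (SameValueZero equality), so two fetches that return the same post produce two distinct objects and both survive deduplication. A minimal sketch of the behavior and a `Map`-keyed workaround (the post objects here are made up for illustration, not real API responses):

```javascript
// Two objects with identical data are still distinct references,
// so a Set keeps both of them.
const a = { ID: 1, title: 'Hello' };
const b = { ID: 1, title: 'Hello' };
console.log(new Set([a, b]).size); // 2, not 1

// To deduplicate by a property such as ID, key a Map on that property;
// later entries with the same ID overwrite earlier ones.
function uniqueById(posts) {
  return Array.from(new Map(posts.map(p => [p.ID, p])).values());
}

console.log(uniqueById([a, b]).length); // 1
```

This is essentially what `uniqBy(allPosts, 'ID')` does, which is why it collapses posts sharing an ID across pages.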

0 Answers