
One of the most popular tools for generating static sites is Sphinx, which is widely used in the Python community to document code. It converts .rst files into other formats such as HTML, PDF, and more. But how can static documentation consisting of plain HTML files be searchable without a performance hit?

I guess it's done by creating an index (a JSON file, for example) that is loaded via AJAX and interpreted by something like lunr.js. But many major projects in the Python world have huge documentation (like the Python docs themselves). So how is it possible to offer such a good search without creating a gigantic index file that has to be loaded?
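Roughly what I have in mind, as a purely illustrative sketch (the page names and contents are made up, and this is not Sphinx's or lunr.js's actual code):

    # Purely illustrative: build a tiny inverted index at build time and
    # dump it as JSON, so the browser only has to fetch it and look terms up.
    import json
    import re
    from collections import defaultdict

    # Made-up pages standing in for the generated HTML documents.
    pages = {
        "intro.html": "Sphinx converts reStructuredText files into HTML pages",
        "search.html": "The search index is generated once at build time",
    }

    index = defaultdict(list)  # term -> list of pages containing it
    for page_id, text in pages.items():
        for term in set(re.findall(r"[a-z]+", text.lower())):
            index[term].append(page_id)

    with open("searchindex.json", "w") as f:
        json.dump(index, f)  # shipped next to the static HTML files

A client could then fetch searchindex.json via AJAX and intersect the lists for the query terms, so no server-side code is needed. My question is how this approach stays fast and small for very large documentation.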

user3147268
  • There are many techniques that make indexing and searching efficient, including tools like Bloom filters and n-gram tokenization (see the sketch below the comments). Your question seems a bit too vague and open-ended for SO. – patrys Aug 11 '15 at 12:13
  • I've slightly edited the post to make it more general. Your point focuses on natural language processing, but I'm looking for a (combination of) tools that already does this job for me, rather than analyzing the content myself. – user3147268 Aug 11 '15 at 21:14
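For reference, the n-gram tokenization patrys mentions boils down to something like the following sketch (illustrative only, not tied to any particular library):

    # Purely illustrative: split a term into fixed-size character windows so a
    # static index can answer prefix/substring queries without storing every
    # full word form.
    def ngrams(term, n=3):
        term = term.lower()
        if len(term) <= n:
            return [term]
        return [term[i:i + n] for i in range(len(term) - n + 1)]

    print(ngrams("documentation"))
    # ['doc', 'ocu', 'cum', 'ume', 'men', 'ent', 'nta', 'tat', 'ati', 'tio', 'ion']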

1 Answer


You can use Google Custom Search Engine to bring Google's search power to your site. It is difficult to customize, yet powerful. There is another reference in this question.

JrBenito