
I want to go to a web page, extract the information I need, write it to a text file, and download it. I couldn't find a way to download or copy the text.

I made an extension that can download text from the popup script and has the text I need in the content script, but how do I connect the two?

content script:

let wordsList = [];
let meaningList = [];
// grab the word and meaning elements from the page
let words = document.querySelectorAll(".status-m > div.word, .status-n > div.word");
let meaning = document.querySelectorAll(".status-m > div.meaning, .status-n > div.meaning");

// convert the NodeLists to arrays of their HTML contents
wordsList = [...words].map(word => word.innerHTML);
meaningList = [...meaning].map(m => m.innerHTML);

// pair each word with its meaning
let obj = {};
for (let i = 0; i < wordsList.length; i++) {
    obj[wordsList[i]] = meaningList[i];
}

console.log(obj);

popup script:

$(function () {
    $('.btn').click(function () {
        // placeholder text; the data from the content script needs to end up here
        const blob = new Blob(['text to download'],
            { type: "text/plain; charset=utf-8" });
        saveAs(blob, "static.txt");
    });
});
  • You should pass the data to a background script and let that do the work of saving it. – Andy Oct 19 '21 at 01:42
  • Simply generate the download in the content script, [examples](https://stackoverflow.com/questions/3749231/download-file-using-javascript-jquery). – wOxxOm Oct 19 '21 at 08:06
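A minimal sketch of the message-passing approach suggested in the comments, assuming the content script above has already built `obj` and the popup loads the same `saveAs` helper it already uses; the message type `"getWords"` and the filename are made up for illustration:

// content script: answer requests from the popup with the scraped data
chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
    if (message.type === "getWords") {
        sendResponse(obj); // the word -> meaning object built above
    }
});

// popup script: ask the content script of the active tab for the data, then download it
$('.btn').click(function () {
    chrome.tabs.query({ active: true, currentWindow: true }, (tabs) => {
        chrome.tabs.sendMessage(tabs[0].id, { type: "getWords" }, (response) => {
            const blob = new Blob([JSON.stringify(response, null, 2)],
                { type: "text/plain; charset=utf-8" });
            saveAs(blob, "words.txt");
        });
    });
});

Alternatively, per the second comment, the content script could build the blob and trigger the download itself, which avoids messaging entirely.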

1 Answer


You may use the Powerpage command line tool to scrape a web page. The program can be downloaded at https://github.com/casualwriter/powerpage

// crawl innerHTML to html
powerpage.exe /url={link-web-page} /save={local-file.html} /select=".status-m > div.word, .status-n > div.word"

// crawl innerText to html
powerpage.exe /url={link-web-page} /save={local-file.html} /select="@.status-m > div.word, .status-n > div.word"