10

I have installed ownCloud on the server.

How can I download a shared file from the console with wget or curl, using the link provided by the web UI?

I tried to download it from the console with the following commands, but without success:

wget http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883

wget http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download -O file.tar.gz

I can download this file from a web browser successfully.

We are using ownCloud 7.0.4, set up with the Chef cookbook https://github.com/onddo/owncloud-cookbook.

vskubriev
  • What do you mean by "this is not successful"? What kind of errors or unexpected behaviour do you get? – gturri Dec 26 '14 at 11:54
  • I mean that I cannot download from the console with these commands. It's not as easy as with Dropbox. – vskubriev Dec 27 '14 at 13:47

3 Answers

9

To download a list of numbered, password-protected files, use the Chrome developer tools "Copy as cURL" feature (http://www.lornajane.net/posts/2013/chrome-feature-copy-as-curl) to get a cURL command that includes the session cookie, then paste in the first file's name and turn it into a download script:

#!/bin/bash
# Fetch 37 numbered parts; seq -w pads the numbers to equal width (01 ... 37).
for number in $(seq -w 37)
do
    # Replace the URL and the -H header with the ones from "Copy as cURL";
    # only the $number part of the file name changes between requests.
    curl -o "file.part0$number.zip" "<URL of first file including $number>" \
        -H '<Cookie header from "Copy as cURL">'
done
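
For illustration only, a filled-in version might look roughly like this. The URL shape (with &download and &path=) is borrowed from the question and the other answers on this page; the share token and the cookie value are placeholders, not real values:

#!/bin/bash
# Hypothetical, filled-in sketch: download 37 numbered parts of a
# password-protected public share. Replace the token and the Cookie header
# with the values from your own "Copy as cURL" output.
for number in $(seq -w 37)
do
    curl -o "file.part0$number.zip" \
         "https://owncloud.example.com/public.php?service=files&t=SHARETOKEN&download&path=//file.part0$number.zip" \
         -H 'Cookie: PASTE_SESSION_COOKIE_HERE'
done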
Gerrit Griebel
6

Use wget to download from an ownCloud public link (a combined example follows the list of options below):

$ wget --no-check-certificate --content-disposition "https://owncloud/link"
  • --content-disposition (take the file name from the Content-Disposition response header instead of from the URL)

  • --no-check-certificate (skip SSL certificate verification errors)

  • -O file.name (to specify a different local file name)
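
Putting those options together with the old-style public link from the question (the token is just the example one from the question; keep the URL quoted so the shell does not split it at the & characters):

wget --no-check-certificate --content-disposition \
     "http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download"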

curl

$ curl -J -O "https://link/path"
  • -J (use the remote file name from the Content-Disposition header)

  • -O (write the output to a local file named after the remote file)
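
For the same public link as in the question, the curl version would look roughly like this; -k is curl's counterpart to wget's --no-check-certificate and is only needed for self-signed certificates:

curl -k -J -O \
     "http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download"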

niainaLens
  • For me the `curl -J -O "https://example.com/s//download"` variety worked. Copied the link by right-clicking the `Download` button at the right hand side of the top frame, effectively downloading the content of a full shared directory structure. Instead, you can also get the download link of individual files, by right-clicking the file name. I have not found a way to get such a link for only one directory within the folder structure. – dlaehnemann Apr 11 '19 at 12:01
  • I have now found a way to download only a particular folder (but all files within that folder): you can take the link of any of the contained files and just delete everything starting from the `&files=` onwards. – dlaehnemann Apr 11 '19 at 12:08
5

Something like the following worked for me:

wget --no-check-certificate "http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download&path=//file.tar.gz"

Note the double quotes around the download link.
The URL was copied from the download entry in Chrome's Downloads list.
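
The quotes are what make the difference: without them the shell treats each & as a command separator, so wget only ever sees the URL up to ?service=files. A quick comparison (token as in the question; -O picks the local file name):

# Unquoted: the shell backgrounds wget at the first & and tries to run the
# rest of the URL as separate commands, so the download fails.
wget http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download

# Quoted: the full query string reaches wget, and -O sets the local file name.
wget --no-check-certificate -O file.tar.gz \
     "http://owncloud.example.com/public.php?service=files&t=par7fec5377a27f19654cd0e7623d883&download&path=//file.tar.gz"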

Jai