
First, I don't know whether I should talk about STDIN or STDOUT, but this is what I want to achieve:

There's a program that exports a database from a distant server and sends the output as gzipped content.
I want to unzip the content and parse it.
If it's OK, then import it; otherwise print an error message. I don't want to write any temporary file to disk, so I want to handle things directly from STDIN/STDOUT.

someExportCommand > /dev/stdin #seems not work
#I want to write a message here
echo "Export database done"
cat /dev/stdin > gunzip > /dev/stdin
echo "Unzip done"

if [[ "$mycontentReadFromSTDIN" =* "password error" ]]; then
    echo "error"
    exit 1
fi

#I want to echo that we begin import
echo "Import begin"
cat /dev/stdin | mysql -u root db
#I want to echo that import finished
echo "Import finish"

The challenge here is not to write to a physical file. It would be easier if I did, but I want to do it the hard way. Is it possible, and how?

Thanh Trung
  • Heh. Part of what makes this tricky is that you need the content to be binary-safe. If it were just text, we could just use command substitution to capture content into a regular shell variable. – Charles Duffy Aug 15 '18 at 15:06
  • ...that stdin is completely unsuitable for any of this -- you don't know where it's attached, or if it's attached to anything at all, and nothing guarantees you the ability to write to it. – Charles Duffy Aug 15 '18 at 15:07
  • Also, not storing this in a file means you're storing it in RAM. If the export is large, that's crazy inefficient. Are you **sure** that's what you want to do? – Charles Duffy Aug 15 '18 at 15:09
  • The export program does an echo, so I just want to parse that echoed content and pipe it through another program like gunzip or mysql. At the same time I want to echo some messages in between to the console to tell the user that everything is OK. – Thanh Trung Aug 15 '18 at 15:16
  • @Charles no, I don't want a memory problem. Just trying to parse anything on STDIN or STDOUT. – Thanh Trung Aug 15 '18 at 15:17
  • If the export program is well-written, on errors, it will not just write a diagnostic message to stderr (not stdout, which is supposed to be used only for output and not for logging!), but will also exit with a nonzero exit status. It's better practice to detect the exit status than to detect the log message, which is meant for human consumption rather than for machines. – Charles Duffy Aug 15 '18 at 15:18
  • To be clear about what `/dev/stdin` *is* -- that filename, for any given process, refers to whatever is directed to the input of that process. So if you run `foo` … – Charles Duffy Aug 15 '18 at 15:26
  • ...so, whenever you run `foo | bar`, any write to the `/dev/stdout` of `foo` is a write to the `/dev/stdin` of `bar`. – Charles Duffy Aug 15 '18 at 15:26
  • The thing is the export program is also a pipe! `mysqldump -u root db | gzip`. Do you think we could check its error code in the client program? – Thanh Trung Aug 15 '18 at 15:41
  • Yes, you can retrieve the exit status from a pipeline. My answer shows using `set -o pipefail` to make exit status flow through, and there's also `PIPESTATUS`. – Charles Duffy Aug 15 '18 at 15:46
  • See [pipe output and capture exit status in bash](https://stackoverflow.com/questions/1221833/pipe-output-and-capture-exit-status-in-bash) – Charles Duffy Aug 15 '18 at 15:48
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/178089/discussion-between-charles-duffy-and-thanh-trung). – Charles Duffy Aug 15 '18 at 15:53
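
As discussed in the last few comments, bash can report the exit status of each stage of a pipeline via `PIPESTATUS`. A minimal sketch (using the question's placeholder `someExportCommand`, and discarding the data, since this only demonstrates the status check):

#!/usr/bin/env bash
# Check each stage of the pipeline instead of grepping its output for errors.
someExportCommand | gunzip -c > /dev/null
status=( "${PIPESTATUS[@]}" )   # copy right away; the next command overwrites PIPESTATUS

if (( status[0] != 0 )); then
  echo "export failed with status ${status[0]}" >&2
  exit 1
elif (( status[1] != 0 )); then
  echo "gunzip failed with status ${status[1]}" >&2
  exit 1
fi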

2 Answers


A literal implementation of what you're asking for (not a good idea, but doing exactly what you asked) might look like the following:

This is a bad idea for several reasons:

  • If a database is large enough to be worth bothering with, trying to fit it in memory, and especially in a shell variable, is a bad idea.
  • In order to fit binary data into a shell variable, it needs to be encoded (as with base64, uuencode, or other tools). This makes it even larger than it was before, and also adds performance overhead.

...however, the bad-idea code, as requested:

#!/usr/bin/env bash

set -o pipefail # if any part of a command fails, count the whole thing a failure

if exportOutputB64=$(someExportCommand | base64); then
  echo "Export database done" >&2
else
  echo "Export database reports failure" >&2
  exit 1
fi

if exportOutputDecompressedB64=$(base64 --decode <<<"$exportOutputB64" | gunzip -c | base64); then
  echo "Decompress of export done" >&2
else
  echo "Decompress of export failed" >&2
  exit 1
fi
unset exportOutputB64

if grep -q 'password error' < <(base64 --decode <<<"$exportOutputDecompressedB64"); then
  echo "Export contains password error; considering it a failure" >&2
  exit 1
fi

echo "Import begin"
mysql -u root db < <(base64 --decode <<<"$exportOutputDecompressedB64")

If I were writing this myself, I'd just set up a pipeline that processes the whole thing in-place, and uses pipefail to ensure that errors in early stages are detected:

set -o pipefail
someExportCommand | gunzip -c | mysql -u root db

The important thing about a pipeline is that all parts of it run at the same time. Thus, `someExportCommand` is still running when `mysql -u root db` starts. Consequently, there's no need for a large buffer anywhere (in memory, or on disk) to store your database contents.
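
If you also want the in-between status messages from the question, one option (a sketch, not something the answer above requires) is to print them to stderr from inside the pipeline, so they never mix with the data flowing through it:

#!/usr/bin/env bash
set -o pipefail

# "Export database done" prints only after someExportCommand exits successfully;
# stderr keeps the messages out of the data stream feeding gunzip and mysql.
if { someExportCommand && echo "Export database done" >&2; } \
     | gunzip -c \
     | mysql -u root db; then
  echo "Import finish" >&2
else
  echo "Export, decompress or import failed" >&2
  exit 1
fi

Note that, as explained above, all stages run concurrently, so "Export database done" may appear only shortly before the import itself finishes.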

Charles Duffy
  • So storing in a variable is the only way? Can we just do like `exportProgram | echo "finish" && gunzip | mysql...`? – Thanh Trung Aug 15 '18 at 15:35
  • No, `echo` doesn't read standard input, so you can't pipe to it. – tripleee Aug 15 '18 at 15:47
  • Though `exportProgram | ( echo "$0: finish" >&2; gunzip | mysql )` should work. – tripleee Aug 15 '18 at 15:54
  • @tripleee, ...unless they mean that "finish" to imply that `exportProgram` finished, of course, since the `echo` will run right when `exportProgram` is starting. – Charles Duffy Aug 15 '18 at 15:55
  • Tee hee, true. (-: – tripleee Aug 15 '18 at 15:56
  • Yes, finish is to say the export finished. Also I want to add that gunzip is done and start the import process. But it seems one line is not doable and echo cannot be used within a pipe. – Thanh Trung Aug 15 '18 at 16:03
  • If you don't want to need to store a lot of content either in RAM or on disk, you need to pipeline your content. In a pipeline, all parts run at the same time, so with `foo | bar`, `bar` starts at the same time `foo` does, so logging that `bar` started doesn't tell you that `foo` is done. – Charles Duffy Aug 15 '18 at 16:09
  • I'll mark your answer as correct because it helps solve the problem by storing in a variable. – Thanh Trung Aug 15 '18 at 18:40

The requirement to not use a temporary file seems extremely misdirected; but you can avoid it by reading into a shell variable, or perhaps an array.

Any error message is likely to be on stderr, not stdin. But you should examine the program's exit status instead of looking for whether it prints an error message.

#!/bin/bash
result=$(someExportCommand) || exit

At this point, the script will have exited if there was a failure; otherwise, `result` contains the command's output.

Now, similarly to error messages, status messages, too, should be printed to standard error, not standard output. A common convention is also to include the name of the script in the message.

echo "$0: Import begin" >&2

Now, pass the variable to mysql.

mysql -u root db <<<"$result"

Notice that the `<<<"here string"` syntax is a Bash feature; you can't use it with `/bin/sh`. If you need the script to be portable to `sh`, the standard solution is still to use a pipe:

printf '%s\n' "$result" | mysql -u root db

Finally, print the status message to stderr again.

echo "$0: Import finished" >&2

Using a shell variable for a long string is not particularly efficient or elegant; capturing the output into a temporary file is definitely the recommended approach.
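
For completeness, a minimal sketch of that recommended temporary-file approach (assuming the export arrives gzipped, as in the question), with the file cleaned up automatically on exit:

#!/bin/bash
tmpfile=$(mktemp) || exit
trap 'rm -f "$tmpfile"' EXIT   # remove the temporary file when the script exits

someExportCommand > "$tmpfile" || exit
echo "$0: Export database done" >&2

gunzip -c "$tmpfile" | mysql -u root db || exit
echo "$0: Import finished" >&2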

tripleee
  • The export command emits raw binary data, so you can't store it in a regular string shell variable. All the base64 awfulness in my answer is there for a reason. – Charles Duffy Aug 15 '18 at 15:46
  • Oh right, missed that requirement. But if you change `someExportCommand` so it doesn't `gzip` anything, this should work. – tripleee Aug 15 '18 at 15:49
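
A short sketch of that variant, using the `mysqldump -u root db` command mentioned in the question's comments as a stand-in for the un-gzipped export (in the question this would actually run on the remote side), so the captured content is plain SQL text rather than binary:

#!/bin/bash
# Plain-text output is safe to keep in a shell variable.
result=$(mysqldump -u root db) || exit
echo "$0: Export database done" >&2

printf '%s\n' "$result" | mysql -u root db || exit
echo "$0: Import finished" >&2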