323

The following Perl script (my.pl) can read either from the file named in its command-line arguments or from standard input (STDIN):

while (<>) {
   print($_);
}

perl my.pl will read from standard input, while perl my.pl a.txt will read from a.txt. This is very handy.

Is there an equivalent in Bash?

Dagang (edited by Peter Mortensen)

22 Answers

526

The following solution reads from a file if the script is called with a file name as the first parameter $1 and otherwise from standard input.

while read line
do
  echo "$line"
done < "${1:-/dev/stdin}"

The substitution ${1:-...} takes $1 if it is defined; otherwise, /dev/stdin, the standard input of the current process, is used as the file name.
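The same idiom, combined with the IFS= and -r safeguards suggested in the comments below, gives a whitespace- and backslash-safe variant (a sketch; the read_lines function name is my own, not part of the answer):

```shell
# read_lines: print each line of the file named in $1, or of stdin if no
# argument is given. IFS= preserves leading/trailing whitespace and -r
# keeps backslashes literal. (Function name is illustrative.)
read_lines() {
  while IFS= read -r line; do
    printf '%s\n' "$line"
  done < "${1:-/dev/stdin}"
}
```

Called as read_lines file.txt it reads the file; in a pipeline (e.g. some_cmd | read_lines) the parameter substitution falls through to /dev/stdin.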

Fritz G. Mehner (edited by Peter Mortensen)
  • 1
    Nice, it works. Another question is why you add a quote for it? "${1:-/proc/${$}/fd/0}" – Dagang Aug 13 '11 at 08:02
  • 18
    The filename you supply on the command line could have blanks. – Fritz G. Mehner Aug 13 '11 at 09:12
  • pretty and useful +1/+1 – racic Nov 26 '13 at 00:29
  • 7
    Is there any difference between using `/proc/$$/fd/0` and `/dev/stdin`? I noticed the latter seems to be more common and looks more straightforward. – knowah Jan 14 '15 at 23:24
  • @knowah: Not only is `/dev/stdin` more straightforward, it's also _more portable_ (doesn't require the `/proc` filesystem): `< "${1:-/dev/stdin}"` – mklement0 Feb 28 '15 at 22:17
  • 27
    Better to add `-r` to your `read` command, so that it doesn't accidentally eat `\ ` chars; use `while IFS= read -r line` to preserve leading and trailing whitespace. – mklement0 Feb 28 '15 at 23:34
  • @mklement0 More portable? I get `/dev/stdin: No such file or directory` using Ubuntu 14.04.3 LTS ... – NeDark Aug 19 '15 at 19:30
  • 1
    @NeDark: That's curious; I just verified that it works on that platform, even when using `/bin/sh` - are you using a shell other than `bash` or `sh`? – mklement0 Aug 19 '15 at 19:47
  • 1
    this doesn't work for me only when my script reading from `/dev/stdin` is fed via pipe from within another script. I get the `No such file or directory` error. The script works fine called from command line though. – hilcharge Mar 25 '16 at 00:57
  • 3
    In a vast majority of cases, you should avoid this. If all you want to do is echo the input back to output, `cat` does that already. Very often the processing could take place in an Awk script and the shell `while read` loop just complicates matters. Obviously, there are situations where you do need to process a line at a time from a file in a shell loop, but if you just found this answer in Google, you should be aware that this is a common newbie antipattern. – tripleee Jul 27 '16 at 08:54
  • 1
    I based on this answer to process info from pipe and or file. Also, I need to merge things from the piped input (or file), so I just did `cat < "${1:-/dev/stdin}" > ${INPUT_FILE}` in order to use in more than one ocassion in the script. – Sebastian Aug 02 '17 at 19:18
  • On the topic of the portability of `/dev/stdin` and `/proc/$$/fd`: [this Unix & Linux answer](https://unix.stackexchange.com/a/340291) has a lot of information. – Leonardo Dagnino Dec 08 '20 at 04:18
155

Perhaps the simplest solution is to redirect standard input with a merging redirect operator:

#!/bin/bash
less <&0

Standard input is file descriptor zero. The above sends the input piped to your bash script into less's standard input.

Read more about file descriptor redirection.
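As a small sketch of the same fd-0 redirection (the first_line function is my illustration, not from the answer): a command given <&0 sees exactly the stream a plain invocation would, because invoked commands inherit the script's stdin.

```shell
# first_line: print only the first line of whatever is piped in.
# '<&0' explicitly redirects our stdin (file descriptor 0) into head's
# stdin; plain 'head -n 1' would behave identically, since head
# inherits stdin anyway.
first_line() {
  head -n 1 <&0
}
```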

Ryan Ballantyne (edited by Peter Mortensen)
  • 19
    There is no benefit to using `<&0` in this situation - your example will work the same with or without it - seemingly, tools you invoke from within a bash script by default see the same stdin as the script itself (unless the script consumes it first). – mklement0 Feb 28 '15 at 23:43
  • @mkelement0 So if a tool reads half the input buffer, will the next tool I invoke get the rest? – Asad Saeeduddin May 03 '17 at 16:28
  • 1
    "Missing filename ("less --help" for help)" when I do this... Ubuntu 16.04 – OmarOthman Jul 31 '17 at 17:11
  • 6
    where is the "or from file" part in this answer? – Sebastian Aug 02 '17 at 19:13
  • Just what I needed! I sent any CLI arguments too: `less "$@" <&0` – KingBob Feb 07 '18 at 00:22
  • @AsadSaeeduddin Not sure on that one... `less -+S ; cat` I do get stdin piped to less, which is nice - but nothing from cat. `<&0` Seems like an potentially useful trick if you want to make sure some command gets piped...however I'm not sure in which situation this would be useful, as my experiments seemed to indicate the first program gets stdin and it doesn't share or split between invoked programs. – smaudet Jun 27 '19 at 13:52
  • just a note: I was trying this with space between < and & and it didn't work. Learnt it the hard way about shell whitespaces :\ – Yogesh lele Oct 17 '19 at 16:35
  • This solution also works in situations where users have almost no privileges (i.e. can't use `cat` or access to `/dev/stdin`). – Jacopo Pace Mar 30 '23 at 20:56
138

Here is the simplest way:

#!/bin/sh
cat -

Usage:

$ echo test | sh my_script.sh
test

To assign stdin to a variable, you may use STDIN=$(cat -), or simply STDIN=$(cat), as the - operand is not necessary (per @mklement0's comment).


To parse each line from the standard input, try the following script:

#!/bin/bash
while IFS= read -r line; do
  printf '%s\n' "$line"
done

To read from the file or stdin (if argument is not present), you can extend it to:

#!/bin/bash
file=${1--} # POSIX-compliant; ${1:--} can be used as well.
while IFS= read -r line; do
  printf '%s\n' "$line" # Or: env POSIXLY_CORRECT=1 echo "$line"
done < <(cat -- "$file")

Notes:

- read -r - Do not treat a backslash character in any special way. Consider each backslash to be part of the input line.

- Without IFS=, sequences of spaces and tabs at the beginning and end of each line are trimmed by default.

- Use printf instead of echo, so that a line consisting of a single -e, -n or -E is not swallowed as an option. Alternatively, the workaround env POSIXLY_CORRECT=1 echo "$line" executes the external GNU echo, which prints such lines correctly. See: How do I echo "-e"?

See: How to read stdin when no arguments are passed? on Stack Overflow.
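The echo pitfall in the notes is easy to reproduce (a small illustration of my own; the exact echo behavior varies by shell, which is precisely why printf is preferred):

```shell
line='-e'
echo "$line"          # bash's builtin echo consumes -e as an option
printf '%s\n' "$line" # always prints the literal two characters: -e
```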

kenorb
  • 1
    You could simplify `[ "$1" ] && FILE=$1 || FILE="-"` to `FILE=${1:--}`. (Quibble: better to avoid all-uppercase _shell_ variables to avoid name collisions with _environment_ variables.) – mklement0 Feb 28 '15 at 23:14
  • My pleasure; actually, `${1:--}` _is_ POSIX-compliant, so it should work in all POSIX-like shells. What won't work in all such shells is process substitution (`<(...)`); it'll work in bash, ksh, zsh, but not in dash, for instance. Also, better to add `-r` to your `read` command, so that it doesn't accidentally eat `\ ` chars; prepend `IFS= ` to preserve leading and trailing whitespace. – mklement0 Feb 28 '15 at 23:40
  • Hmm... `while IFS= read -r line` _should_ work - what shell do you use? Note the space after the `=`; what this does is set `$IFS` to the empty string _for the `read` command only_; that is, the global `$IFS` value doesn't change, which is the intent. By contrast, your version - `while IFS=; read -r line` - _does_ change it _globally_ (because `IFS=;` is a separate _command_, not a per-command temporary-environment-variable-setting preamble). – mklement0 Feb 28 '15 at 23:51
  • Glad to hear it; yup, the unquoted `$line` in the `echo` command was the problem. – mklement0 Mar 01 '15 at 00:02
  • 5
    In fact your code still breaks because of `echo`: if a line consists of `-e`, `-n` or `-E`, it won't be shown. To fix this, you must use `printf`: `printf '%s\n' "$line"`. I didn't include it in my previous edit… too often my edits are rollbacked when I fix this error `:(`. – gniourf_gniourf Mar 01 '15 at 00:02
  • 1
    Nope it doesn't fail. And the `--` is useless if first argument is `'%s\n'` – gniourf_gniourf Mar 01 '15 at 00:06
  • 2
    Your answer's fine by me (I mean there are no bugs or unwanted features I'm aware of anymore)—though it doesn't treat multiple arguments as Perl does. In fact, if you want to handle multiple arguments, you'll end up writing Jonathan Leffler's excellent answer—in fact yours would be better since you'd use `IFS=` with `read` and `printf` instead of `echo`. `:)`. – gniourf_gniourf Mar 01 '15 at 00:17
  • For some reason, the variables which i `read` into stay empty. What could be the reason for that? – Cadoiz Jun 24 '22 at 10:20
24

I think this is the straightforward way:

$ cat reader.sh
#!/bin/bash
while read line; do
  echo "reading: ${line}"
done < /dev/stdin

--

$ cat writer.sh
#!/bin/bash
for i in {0..5}; do
  echo "line ${i}"
done

--

$ ./writer.sh | ./reader.sh
reading: line 0
reading: line 1
reading: line 2
reading: line 3
reading: line 4
reading: line 5
Amir Mehler (edited by Peter Mortensen)
  • 4
    This doesn't fit the requirement by the poster for reading from either stdin or a file argument, this just reads from stdin. – nash Aug 07 '14 at 20:53
  • 3
    Leaving @nash's valid objection aside: `read` reads from stdin _by default_, so there's _no need_ for `< /dev/stdin`. – mklement0 Feb 28 '15 at 21:44
18

The echo solution adds new lines whenever IFS breaks the input stream. @fgm's answer can be modified a bit:

cat "${1:-/dev/stdin}" > "${2:-/dev/stdout}"
David Souther
  • Could you please explain what you mean by "echo solution adds new lines whenever IFS breaks the input stream"? In case you were referring to `read`'s behavior: while `read` _does_ potentially split into multiple tokens by the chars. contained in `$IFS`, it only returns a _single_ token if you only specify a _single_ variable name (but trims and leading and trailing whitespace by default). – mklement0 Feb 28 '15 at 23:19
  • @mklement0 I agree 100% with you on the behavior of `read` and `$IFS` - `echo` itself adds new lines without the `-n` flag. "The echo utility writes any specified operands, separated by single blank (` ') characters and followed by a newline (`\n') character, to the standard output." – David Souther Mar 01 '15 at 16:36
  • Got it. However, to emulate the Perl loop you _need_ the trailing `\n` added by `echo`: Perl's `$_` _includes_ the line ending `\n` from the line read, while bash's `read` does not. (However, as @gniourf_gniourf points out elsewhere, the more robust approach is to use `printf '%s\n'` in lieu of `echo`). – mklement0 Mar 01 '15 at 17:01
  • 1
    This is a fantastic solution that grants your script the ability to be used as `script.sh example` OR `cat example | script.sh` – shmup Aug 18 '22 at 05:12
11

The Perl loop in the question reads from all the file name arguments on the command line, or from standard input if no files are specified. The answers I see all seem to process a single file or standard input if there is no file specified.

Although often (and accurately) derided as a Useless Use of cat (UUOC), there are times when cat is the best tool for the job, and it is arguable that this is one of them:

cat "$@" |
while read -r line
do
    echo "$line"
done

The only downside to this is that it creates a pipeline running in a sub-shell, so things like variable assignments in the while loop are not accessible outside the pipeline. The bash way around that is Process Substitution:

while read -r line
do
    echo "$line"
done < <(cat "$@")

This leaves the while loop running in the main shell, so variables set in the loop are accessible outside the loop.
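A quick demonstration of the difference (the count variable is illustrative; run under bash):

```shell
#!/bin/bash

# Pipeline form: the while loop runs in a subshell, so the increments
# are lost and count is still 0 afterwards (in bash's default mode).
count=0
printf 'a\nb\n' | while read -r line; do count=$((count + 1)); done
echo "after pipeline: $count"

# Process-substitution form: the loop runs in the current shell,
# so count is 2 afterwards.
count=0
while read -r line; do count=$((count + 1)); done < <(printf 'a\nb\n')
echo "after process substitution: $count"
```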

Jonathan Leffler
  • 1
    Excellent point about _multiple_ files. I don't know what the resource and performance implications would be, but if you're not on bash, ksh, or zsh and therefore can't use process substitution, you could try a here-doc with command substitution (spread across 3 lines) `>>EOF\n$(cat "$@")\nEOF`. Finally, a quibble: `while IFS= read -r line` is a better approximation of what `while (<>)` does in Perl (preserves leading and trailing whitespace - though Perl also keeps the trailing `\n`). – mklement0 Feb 28 '15 at 23:21
8

With the code given in the OP, Perl's behavior is to accept zero or more arguments, and an argument that is a single hyphen - is understood as stdin. Moreover, the current filename is always available in $ARGV. None of the answers given so far really mimics Perl's behavior in these respects. Here's a pure Bash possibility. The trick is to use exec appropriately.

#!/bin/bash

(($#)) || set -- -
while (($#)); do
   { [[ $1 = - ]] || exec < "$1"; } &&
   while read -r; do
      printf '%s\n' "$REPLY"
   done
   shift
done

Filename's available in $1.

If no arguments are given, we artificially set - as the first positional parameter. We then loop on the parameters. If a parameter is not -, we redirect standard input from filename with exec. If this redirection succeeds we loop with a while loop. I'm using the standard REPLY variable, and in this case you don't need to reset IFS. If you want another name, you must reset IFS like so (unless, of course, you don't want that and know what you're doing):

while IFS= read -r line; do
    printf '%s\n' "$line"
done
gniourf_gniourf
  • This is right answer - I recently learned how to use exec to re-route stdout to a designated file, I should have known it could be used to route a file to stdin. Thanks for sharing your answer, sorry its not getting the love it deserves ! – David Farrell Jul 14 '21 at 22:46
3
#!/usr/bin/bash

if [ -p /dev/stdin ]; then
       #for FILE in "$@" /dev/stdin
    for FILE in /dev/stdin
    do
        while IFS= read -r LINE
        do
            echo "$@" "$LINE"   #print line argument and stdin
        done < "$FILE"
    done
else
    printf "[ -p /dev/stdin ] is false\n"
     #dosomething
fi

Running:

echo var var2 | bash std.sh

Result:

var var2

Running:

bash std.sh < <(cat /etc/passwd)

Result:

root:x:0:0::/root:/usr/bin/bash
bin:x:1:1::/:/usr/bin/nologin
daemon:x:2:2::/:/usr/bin/nologin
mail:x:8:12::/var/spool/mail:/usr/bin/nologin
vcatafesta (edited by Peter Mortensen)
3

Please try the following code:

while IFS= read -r line; do
    echo "$line"
done < file
Webthusiast (edited by gniourf_gniourf)
  • 2
    Note that even as amended, this won't read from standard input, or from multiple files, so it isn't a complete answer to the question. (It is also surprising to see two edits in a matter of minutes more than 3 years after the answer was first submitted.) – Jonathan Leffler Mar 01 '15 at 00:07
  • @JonathanLeffler sorry for editing such an old (and not really good) answer… but I couldn't stand seeing this poor `read` without `IFS=` and `-r`, and the poor `$line` without its healthy quotes. – gniourf_gniourf Mar 01 '15 at 00:12
  • 1
    @gniourf_gniourf: I dislike the `read -r` notation. IMO, POSIX got that wrong; the option should enable the special meaning for trailing backslashes, not disable it — so that existing scripts (from before POSIX existed) would not break because the `-r` was omitted. I observe, however, that it was part of IEEE 1003.2 1992, which was the earliest version of the POSIX shell and utilities standard, but it was marked as an addition even then, so this is bitching about long-gone opportunities. I've never run into trouble because my code does not use `-r`; I must be lucky. Ignore me on this. – Jonathan Leffler Mar 01 '15 at 00:27
  • 1
    @JonathanLeffler I really agree that `-r` should be standard. I agree that it's unlikely to be in cases where not using it leads to trouble. Though, broken code is broken code. My edit was first triggered by that poor `$line` variable that badly missed its quotes. I fixed the `read` while I was at it. I didn't fix the `echo` because that's the kind of edit that gets rolled back. `:(`. – gniourf_gniourf Mar 01 '15 at 00:34
  • How does it work? What is that `IFS=` thing? Why is it necessary? There is some information [in a comment](https://stackoverflow.com/questions/6980090/how-to-read-from-a-file-or-stdin-in-bash#comment45851623_6982423). – Peter Mortensen Jun 21 '21 at 12:51
  • The edits made it identical to [sorpigal's (later) answer](https://stackoverflow.com/questions/6980090/how-to-read-from-a-file-or-standard-input-in-bash/6982423#6982423). – Peter Mortensen Jun 21 '21 at 12:53
3

More accurately...

while IFS= read -r line ; do
    printf "%s\n" "$line"
done < file
sorpigal
  • 5
    I assume this is essentially a comment on http://stackoverflow.com/a/6980232/45375, not an answer. To make the comment explicit: adding `IFS=` and `-r` to the `read` command ensures that each line is read _unmodified_ (including leading and trailing whitespace). – mklement0 Feb 28 '15 at 22:54
2

The following works with standard sh (tested with Dash on Debian) and is quite readable, but that's a matter of taste:

if [ -n "$1" ]; then
    cat "$1"
else
    cat
fi | commands_and_transformations

Details: If the first parameter is non-empty then cat that file, else cat standard input. Then the output of the whole if statement is processed by the commands_and_transformations.
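For instance (a hypothetical pipeline of my own), commands_and_transformations could be a simple tr stage, giving a filter that upper-cases either a named file or stdin:

```shell
# to_upper: upper-case the contents of the file named in $1,
# or of stdin when no argument is given. (Name is illustrative.)
to_upper() {
    if [ -n "$1" ]; then
        cat "$1"
    else
        cat
    fi | tr '[:lower:]' '[:upper:]'
}
```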

Notinlist (edited by Peter Mortensen)
  • 1
    IMHO the best answer so because it points to the true solution: `cat "${1:--}" | any_command`. Reading to shell variables and echoing them may work for small files but does not scale so well. – Andreas Spindler Feb 07 '17 at 17:49
  • 1
    The `[ -n "$1" ]` can be simplified to `[ "$1" ]`. – agc May 01 '17 at 02:06
2

I combined all of the above answers and created a shell function that would suit my needs. This is from a Cygwin terminal of my two Windows 10 machines where I had a shared folder between them. I need to be able to handle the following:

  • cat file.cpp | tx
  • tx < file.cpp
  • tx file.cpp

Where a specific filename is specified, I need to use the same filename during the copy. Where the input stream has been piped in, I need to generate a temporary filename from the hour, minutes, and seconds. The shared main folder has subfolders for the days of the week. This is for organizational purposes.

Behold, the ultimate script for my needs:

tx ()
{
  if [ $# -eq 0 ]; then
    local TMP=/tmp/tx.$(date +'%H%M%S')
    while IFS= read -r line; do
        echo "$line"
    done < /dev/stdin > "$TMP"
    cp "$TMP" //$OTHER/stargate/$(date +'%a')/
    rm -f "$TMP"
  else
    [ -r "$1" ] && cp "$1" //$OTHER/stargate/$(date +'%a')/ || echo "cannot read file"
  fi
}

If there is any way that you can see to further optimize this, I would like to know.

daparic (edited by Peter Mortensen)
2

Two principal ways:

  • Either pipe the argument files and stdin into a single stream and process that like stdin (stream approach)
  • Or redirect stdin (and argument files) into a named pipe and process that like a file (file approach)

Stream approach

Minor revisions to earlier answers:

  • Use cat, not less. It's faster and you don't need pagination.

  • Use "$1" to read from the first argument file (if present) or "$@" to read from all the argument files (if present). If no arguments are given, read from stdin (like cat does)

    #!/bin/bash
    cat "$@" | ...
    

File approach

Writing into a named pipe is a bit more complicated, but this allows you to treat stdin (or files) like a single file:

  • Create pipe with mkfifo.

  • Parallelize the writing process; otherwise, if the named pipe is not read from, the writer may block.

  • For redirecting stdin into a subprocess (as necessary in this case), use <&0 (unlike what others have been commenting, this is not optional here).

      #!/bin/bash
      mkfifo /tmp/myStream
      cat "$@" <&0 > /tmp/myStream &         # separate subprocess (!)
      AddYourCommandHere /tmp/myStream       # process input like a file, 
      rm /tmp/myStream                       # cleaning up
    

File approach: Variation

Create named pipe only if no arguments are given. This may be more stable for reading from files as named pipes can occasionally block.

#!/bin/bash
FILES=$*
if [ -z "$FILES" ]; then                      # if $FILES is empty
   mkfifo /tmp/myStream
   cat <&0 > /tmp/myStream &
   FILES=/tmp/myStream
fi
AddYourCommandHere $FILES     # do something ;)
if [ -e /tmp/myStream ]; then
   rm /tmp/myStream
fi

Also, it allows you to iterate over files and stdin rather than concatenate all into a single stream:

for file in $FILES; do
    AddYourCommandHere $file
done
Chiarcos
1

The code ${1:-/dev/stdin} only understands the first argument. To handle any number of arguments (falling back to - for stdin when none are given), you can use this:

[ $# -eq 0 ] && set -- -
cat -- "$@" | while IFS= read -r line
do
   printf '%s\n' "$line"
done
Peter Mortensen
1

Reading from stdin into a variable or from a file into a variable.

Most examples in the existing answers use loops that immediately echo each line as it is read from stdin. This might not be what you really want to do.

In many cases you need to write a script that calls a command which only accepts a file argument. But in your script you may want to support stdin also. In this case you need to first read full stdin and then provide it as a file.

Let's see an example. The script below prints the certificate details of a certificate (in PEM format) that is passed either as a file or via stdin.

# cert-print script

content=""
while read line
do
  content="$content$line\n"
done < "${1:-/dev/stdin}"
# Remove the last newline appended in the above loop
content=${content%\\n}

# Keytool accepts certificate only via a file, but in our script we fix this.
keytool -printcert -v -file <(echo -e "$content")

# Read from file

cert-print mycert.crt

# Owner: CN=....
# Issuer: ....
# ....


# Or read from stdin (by pasting)

cert-print
#..paste the cert here and press enter
# Ctl-D

# Owner: CN=....
# Issuer: ....
# ....


# Or read from stdin by piping to another command (which just prints the cert(s) ). In this case we use openssl to fetch directly from a site and then print its info.


echo "" | openssl s_client -connect www.google.com:443 -prexit 2>/dev/null \
| sed -n -e '/BEGIN\ CERTIFICATE/,/END\ CERTIFICATE/ p' \
| cert-print

# Owner: CN=....
# Issuer: ....
# ....

Marinos An
0

I don't find any of these answers acceptable. In particular, the accepted answer only handles the first command line parameter and ignores the rest. The Perl program that it is trying to emulate handles all the command line parameters. So the accepted answer doesn't even answer the question.

Other answers use Bash extensions, add unnecessary 'cat' commands, only work for the simple case of echoing input to output, or are just unnecessarily complicated.

However, I have to give them some credit, because they gave me some ideas. Here is the complete answer:

#!/bin/sh

if [ $# = 0 ]
then
        DEFAULT_INPUT_FILE=/dev/stdin
else
        DEFAULT_INPUT_FILE=
fi

# Iterates over all parameters or /dev/stdin
for FILE in "$@" $DEFAULT_INPUT_FILE
do
        while IFS= read -r LINE
        do
                # Do whatever you want with LINE here.
                echo "$LINE"
        done < "$FILE"
done
Gungwald (edited by Peter Mortensen)
0

This one is easy to use on the terminal:

$ printf '1\n2\n3\n' | while read -r; do echo "$REPLY"; done
1
2
3
cmcginty
0

As a workaround, you can use the stdin device in the /dev directory:

....| for item in `cat /dev/stdin` ; do echo $item ;done
Peter Mortensen
0

With...

while read line
do
    echo "$line"
done < "${1:-/dev/stdin}"

I got the following output:

Ignored 1265 characters from standard input. Use "-stdin" or "-" to tell how to handle piped input.

Then decided with for:

Lnl=$(cat file.txt | wc -l)
echo "Last line: $Lnl"
nl=1

for num in `seq $nl +1 $Lnl`;
do
    echo "Number line: $nl"
    line=$(cat file.txt | head -n $nl | tail -n 1)
    echo "Read line: $line"
    nl=$((nl + 1))
done
Peter Mortensen
0

Just test the number of arguments to your script, and test whether the first argument ($1) is a file. If not, read from stdin (-):

#!/bin/bash
[ $# -ge 1 -a -f "$1" ] && input="$1" || input="-"
cat "$input"

See this question.

Youjun Hu
0

@gniourf_gniourf's answer is the correct one but uses a lot of bashisms. Since this question is the top Google result, here is a POSIX compliant version:

#!/bin/sh

if [ $# -eq 0 ]; then
    set -- -
fi

for f in "$@"; do
    if [ "$f" = - ] || exec < "$f"; then
        while IFS= read -r line; do
            printf '%s\n' "$line"
        done
    fi
done

or if you want to be terse:

#!/bin/sh

[ $# -gt 0 ] || set -- -
for f; do
    { [ "$f" = - ] || exec < "$f"; } &&
    while IFS= read -r line; do
        printf '%s\n' "$line"
    done
done
JM0
-1

Use:

for line in `cat`; do
    something($line);
done
Peter Mortensen
  • The output of `cat` will be placed into the command line. The command line has a maximum size. Also this will not read line by line, but word by word. – Notinlist May 01 '17 at 11:16