if [ x"$@" = x"" ] ; then cmtc='-nc' ; elif ...; then ...; else cmtc='-c'; fi

if diff -wu f1 f2 > $tf.diff 2>&1; then echo a; else echo b; fi
*Tags*: cmd:if, cmd:while
for ii in `jot $n`; do

for attr in 0 1 4 5 7 ; do
while [ "${DIRSTACK[1]}" ] ; do popd +1 > /dev/null 2>&1 ; done
while expr `netstat -a | grep ":www" | wc -l` '>' $ipm ; do :; done

while :; do ...
while read line; do
    echo postinstall: $line
done
Note: no ";" after 'then' & 'do'!
/usr/bin/cal |
    # bold today
    sed -e "s/^/ /; s/$/ /; s/ $4 /$bold $4 $unbold/"
— Look, you can even put comments between piped commands!
documented on: 02-06-99
Do not use the form
condition_yes && condition_no || echo ok
Use
condition_yes && ! condition_no && echo ok
Conditions can be evaluated consecutively.
For example, if I want to do 'echo ok' based on two consecutive conditions, i.e., only when condition 1 and condition 2 are both true, I'll write
condition1 && condition2 && echo ok
The evaluation is short-circuited, i.e., if condition1 fails, condition2 won't be executed:
$ set -x
$ true && true && echo ok
+ true
+ true
+ echo ok
ok

$ false && true && echo ok
+ false

$ true && false && echo ok
+ true
+ false
Caution: the algorithm is different from C when there is a || in the evaluation:
$ true && false || echo ok
+ true
+ false
+ echo ok
ok
Seems to be working fine, but watch this:
$ false && true || echo ok
+ false
+ echo ok
ok
Oops, that's not what you intended!
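A minimal sketch of the pitfall and the safe alternative: the `||` branch runs whenever the preceding `&&` chain fails for *any* reason, while an explicit if/else states the intent unambiguously.

```shell
# The one-liner runs the || branch even though the first condition failed:
false && true || echo 'oops: ran anyway'

# An explicit if/else has no such surprise:
if true && false; then
    echo yes
else
    echo 'no (as intended)'
fi
```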
documented on: 2004.05.18
case @$1 in @-*) flag=$1; shift;; esac
while list
do list
done

until list
do list
done
The while command continuously executes the do list as long as the last command in list returns an exit status of zero. The until command is identical to the while command, except that the test is negated; the do list is executed as long as the last command in list returns a non-zero exit status. The exit status of the while and until commands is the exit status of the last do list command executed, or zero if none was executed.
documented on: 1999.12.01
$ [ -f TiOn.txt ] && { echo a; echo b; }
[ -f TiOn.txt ] && { echo a; echo b; }
+ '[' -f TiOn.txt ']'
+ echo a
a
+ echo b
b
Note the difference in syntax:
(echo $pwd_base; echo $pwd; echo *)
{ echo $pwd_base; echo $pwd; echo *;}
With braces, the list is executed in the current (that is, parent) shell; with parentheses, it is executed in a subshell. The { must be followed by a space, and } must be preceded by a ';'
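A small sketch of the practical consequence: assignments made inside parentheses die with the subshell, while assignments inside a brace group survive.

```shell
x=old
( x=sub )        # subshell: the assignment vanishes with it
echo "$x"        # still prints: old
{ x=cur; }       # brace group: runs in the current shell
echo "$x"        # now prints: cur
```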
$ [ -f TiOn.txt ] && { echo a; echo b }
[ -f TiOn.txt ] && { echo a; echo b }
>
— need the ';' at the end!
$ [ -f TiOn.txt ] && { echo a; echo b;}
[ -f TiOn.txt ] && { echo a; echo b;}
+ '[' -f TiOn.txt ']'
+ echo a
a
+ echo b
b
— the space before the closing '}' is not important.
documented on: 1999.11.05 Fri 09:54:18
l(){ if [ "$TERM" = 'xterm' ]; then echo aaa; fi; }
$ if [ "$TERM" = 'xterm' ]; then echo aaa; fi
aaa
— ok
l(){ if [ "$TERM" = 'xterm' ]; then echo aaa; fi }

— syntax error near unexpected token `}'

l(){ if [ "$TERM" = 'xterm' ]; then echo aaa; fi
}

— ok
l(){ if [ "$TERM" = 'xterm' ]; then echo aaa; fi; }
— ok
# return the status of last executed command
return $?   # 0: success, others: failure
e.g.:
grep ...
return $?   # 0: found, 1: not found
To test the return value, use -eq / -ne instead of = / !=, for numeric equality.
[ $? -ne 0 ] && failure_handling
funcdemo(){
    # Demonstrate how a function returns a string instead of
    # a return value
    echo "something about $1 and $2"
}
result=`funcdemo aaa bbb`
sh:  if [ -t 0 ]; then ... fi
C:   if (isatty(0)) { ... }
All of the shells from the Bourne shell category (including rc) use the "." command. All of the shells from the C shell category use "source".
Although this may not be a complete listing, this provides the majority of information.
csh
    Some versions have system-wide .cshrc and .login files. Every version
    puts them in different places.

    Start-up (in this order):
        .cshrc   - always; unless the -f option is used.
        .login   - login shells.

    Upon termination:
        .logout  - login shells.

    Others:
        .history - saves the history (based on $savehist).
If the shell is a login shell, this is the sequence of invocations: First, commands in /etc/.login are executed. Next, commands from the .cshrc file in your home directory are executed. Then the shell executes commands from the .login file in your home directory.
tcsh
    Start-up (in this order):
        /etc/csh.cshrc - always.
        /etc/csh.login - login shells.
        .tcshrc        - always.
        .cshrc         - if no .tcshrc was present.
        .login         - login shells.

    Upon termination:
        .logout        - login shells.

    Others:
        .history       - saves the history (based on $savehist).
        .cshdirs       - saves the directory stack.
sh
    Start-up (in this order):
        /etc/profile - login shells.
        .profile     - login shells.

    Upon termination:
        any command (or script) specified using the command:
            trap "command" 0
ksh
    Start-up (in this order):
        /etc/profile      - login shells.
        .profile          - login shells; unless the -p option is used.
        $ENV              - always, if it is set; unless the -p option is used.
        /etc/suid_profile - when the -p option is used.

    Upon termination:
        any command (or script) specified using the command:
            trap "command" 0
bash
    Start-up (in this order):
        /etc/profile  - login shells.
        .bash_profile - login shells.
        .profile      - login shells, if no .bash_profile is present.
        .bashrc       - interactive non-login shells.
        $ENV          - always, if it is set.

    Upon termination:
        .bash_logout  - login shells.

    Others:
        .inputrc      - Readline initialization.
documented on: 02-21-99 11:46:23
Note: eval has to be followed by a command name!
$ type eval
eval is a shell builtin
$ cmd='cat -n'
$ echo aaa | $cmd
     1  aaa
$ cmd='echo $$; cat -n'
$ echo aaa | $cmd
$$; cat -n
$ cmd='{ echo $$; cat -n; }'
$ echo aaa | $cmd
bash: {: command not found
$ cmd='cat -n; echo $$; '
$ echo aaa | $cmd
+ echo aaa
+ cat '-n;' echo '$$;'
cat: invalid option -- ;
: Can anyone find an example in which eval would make a difference in this
: code:
: a)
:   read -r cmd < $cmdlist
:   eval "$cmd"
: b)
:   read -r cmd < $cmdlist
:   $cmd
#!/bin/sh
cat > foo.$$ <<-eof
echo a| echo b
eof
cmdlist=foo.$$
echo eval
read cmd < $cmdlist
eval "$cmd"
echo \!eval
read cmd < $cmdlist
$cmd
rm foo.$$
# gives:
# eval
# b
# !eval
# a| echo b
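The difference can be sketched more directly: without eval, the ';' in the stored command is just an ordinary character after word splitting; with eval, the string is re-parsed as shell syntax.

```shell
cmd='echo a; echo b'
$cmd           # word splitting only: prints "a; echo b" literally
eval "$cmd"    # re-parsed: runs both commands, printing "a" then "b"
```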
documented on: 02-05-99 23:59:04
>How can I use "heredoc" to list something and then feed it into a pipe,
The syntax would be:
cat <<EOF | xargs -i echo {} {}
file1
file2
file3
EOF
Ken Pizzini
Another way to keep the EOF string on a line alone:
{ cat <<EOF
file1
file2
file3
EOF
} | xargs -i echo {} {}
John Savage
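The same idea works with the capital -I option (the POSIX spelling; the lowercase -i used above is a deprecated GNU synonym). A sketch assuming a reasonably modern xargs:

```shell
# Each here-document line is substituted for {} in the echo command.
cat <<EOF | xargs -I{} echo {} {}
file1
file2
EOF
```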
documented on: 06-11-99 20:47:27
Newsgroups: comp.unix.shell
>I came across the following usage:
>
>LD_LIBRARY_PATH=... soffice
>
>which sets a variable before running a program.
>
>Can you explain it a little bit? I.e., the syntax, where to use,
>where to find man...
It is documented in the man page for your (Bourne-like) shell. In the bash man page it is in the section "SIMPLE COMMAND EXPANSION", where the relevant sentences state:
If no command name results, the variable assignments affect the current shell environment. Otherwise, the variables are added to the environment of the executed command and do not affect the current shell environment.
Ken Pizzini
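A minimal sketch of that behavior (MYVAR is a made-up name): the prefix assignment reaches the child's environment but not the current shell.

```shell
unset MYVAR
MYVAR=hello sh -c 'echo "child:  $MYVAR"'   # the child sees hello
echo "parent: $MYVAR"                       # the parent sees nothing
```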
documented on: 2000.08.13 Sun 21:30:01
Newsgroups: comp.unix.shell
> I want to pass a switch taken from the command line into subroutines.
> E.g., for the following code, I want to pass the command line switch
> -t and its parameter into sub2:
[snip]
> You can see, the first time it works fine, but the second time it does
> not. And the output of the echo (with "!!" as comment) is wrong too.
>
> The reason I can't pass directly is that I also want to do some
> processing along the passing. This is a general question about how
> to pass switches along to subroutines with proper quoting and in a
> proper way.
>sub1(){
>
>    [ "x$1" = "x-t" ] && {
>        file_title_s="-t '$2'"
>        shift 2
>    }
>    sub2 $file_title_s "$@" more from sub1
>}
Is there a reason that you are restricted to one variable? It's simpler if you use two:
sub1(){
    case $1 in
    -t) file_title_s=-t
        file_title_o=$2
        shift 2
        ;;
    esac
    sub2 "$file_title_s" "$file_title_o" "$@" more from sub1
}
Ken Pizzini
this one works fine.
$ cat /tmp/test
sub1(){
    echo sub1:$#
    [ "x$1" = "x-t" ] && {
        file_title_s="-t \"$2\""
        shift 2
    }
    args=
    for arg in "$@"; do args="$args \"$arg\""; done
    eval set -- $file_title_s $args
    sub2 "$@" "5 6"
}
sub2(){
    echo sub2:$#
    [ "x$1" = "x-t" ] && {
        file_title="$2"
        shift 2
    }
    echo "$1 should be: 2nd"
    echo $file_title
    echo "$@"
}
[ "x$1" = "x-t" ] && {
    file_title_s="-t \"$2\""
    shift 2
}
echo sub0:$#
args=
for arg in "$@"; do args="$args \"$arg\""; done
eval set -- $file_title_s $args
sub1 "$@" "3 4"

$ ast-ksh -x /tmp/test -t '1 2' 2nd 3rd
+ [ x-t = x-t ]
+ file_title_s='-t "1 2"'
+ shift 2
+ echo sub0:2
sub0:2
+ args=''
+ args=' "2nd"'
+ args=' "2nd" "3rd"'
+ eval set -- -t '"1' '2"' '"2nd"' '"3rd"'
+ set -- -t '1 2' 2nd 3rd
+ sub1 -t '1 2' 2nd 3rd '3 4'
+ echo sub1:5
sub1:5
+ [ x-t = x-t ]
+ file_title_s='-t "1 2"'
+ shift 2
+ args=''
+ args=' "2nd"'
+ args=' "2nd" "3rd"'
+ args=' "2nd" "3rd" "3 4"'
+ eval set -- -t '"1' '2"' '"2nd"' '"3rd"' '"3' '4"'
+ set -- -t '1 2' 2nd 3rd '3 4'
+ sub2 -t '1 2' 2nd 3rd '3 4' '5 6'
+ echo sub2:6
sub2:6
+ [ x-t = x-t ]
+ file_title='1 2'
+ shift 2
+ echo '2nd should be: 2nd'
2nd should be: 2nd
+ echo 1 2
1 2
+ echo 2nd 3rd '3 4' '5 6'
2nd 3rd 3 4 5 6
Cyrille.
Though it seems complicated, the above really is the minimum number of steps. Here is an attempt at a simpler version, and its failure record:
Note: copied exactly from above
sub1(){
    echo sub1:$#
    [ "x$1" = "x-t" ] && {
        file_title_s="-t \"$2\""
        shift 2
    }
    args=
    for arg in "$@"; do args="$args \"$arg\""; done
    eval set -- $file_title_s $args
    sub2 "$@" "5 6"
}
[ "x$1" = "x-t" ] && {
    file_title_s="-t \"$2\""
    shift 2
}
Note: neglecting the for loop
eval set -- $file_title_s "$@"
sub1 $file_title_s more parameters
Note: !!
$ sh -x test_script0 -t "This is a test" 2nd 3rd
+ [ x-t = x-t ]
file_title_s=-t 'This is a test'
+ shift 2
+ sub1 -t 'This is a test' more parameters
+ echo sub1:7
sub1:7
+ [ x-t = x-t ]
file_title_s=-t "'This"
+ shift 2
args=
args= "is"
args= "is" "a"
args= "is" "a" "test'"
args= "is" "a" "test'" "more"
args= "is" "a" "test'" "more" "parameters"
+ eval set -- -t "'This" "is" "a" "test'" "more" "parameters"
+ set -- -t 'This is a test' more parameters
+ sub2 -t 'This is a test' more parameters 5 6
+ [ x-t = x-t ]
file_title='This
+ shift 2
> Is there any way to nest backticks? What I mean is as follows:
> localhost% echo "You have `expr `frm | wc -l` - 1` messages."
Get rid of csh
in ksh or sh, you might try:
echo "You have `expr \`frm | wc -l\` - 1` messages."
If you _must_ use csh, then put this into a script:
#!/bin/sh
echo "You have `expr \`frm | wc -l\` - 1` messages."
For ksh, there is a more readable syntax (to me, anyway)…
$ echo "You have $(expr $(frm | wc -l) - 1) messages."
Dave
There's also the $() notation, found in POSIX-compliant shells:
$ echo You have $(expr $(frm | wc -l) - 1) messages.
Neater, simpler, and washes whiter.
Simon
> Are there any differences between $() and backticks, except from how easy > they are to nest?
Backticks are available in all shells, i.e., sh, ksh, bash, csh, …, while $(…) can be found in ksh, bash, and other POSIX shells.
You're always on the safe side when you use backticks; however, this can be a pain in the neck when nesting.
Jens M. Felderhoff
The backtick form does an extra level of quote processing before executing the contained command, and the $() form doesn't, so you can test a command outside of $() and not have to worry about adding quotes. Certain kinds of commands work poorly if at all inside backticks; the examples of this I've seen all involve here-documents. In addition, though it's not as big a deal, $(<filename) does the same thing as `cat filename` without creating another process.
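The nesting difference can be sketched in two lines: backticks need one backslash per nesting level, while $() nests as-is.

```shell
# Both print "deep"; only the backtick form needs escaping.
echo "`echo \`echo deep\``"
echo "$(echo $(echo deep))"
```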
Eric Amick
documented on: 1999.11.04 Thu 16:52:36
$ set -vx
$ echo `/bin/echo \`pwd\` `
echo `/bin/echo \`pwd\` `
/bin/echo `pwd`
pwd
+++ pwd
++ /bin/echo /home/users/tongsun/txtHist
+ echo /home/users/tongsun/txtHist
/home/users/tongsun/txtHist
$ echo `/bin/echo \`basename \\\`pwd\\\`\` `
echo `/bin/echo \`basename \\\`pwd\\\`\` `
/bin/echo `basename \`pwd\``
basename `pwd`
pwd
++++ pwd
+++ basename /home/users/tongsun/txtHist
++ /bin/echo txtHist
+ echo txtHist
txtHist
$ echo $(/bin/echo $(basename $(pwd)))
echo $(/bin/echo $(basename $(pwd)))
/bin/echo $(basename $(pwd))
basename $(pwd)
pwd
++++ pwd
+++ basename /home/users/tongsun/txtHist
++ /bin/echo txtHist
+ echo txtHist
txtHist
> How can I specify two (or more) commands for xargs. E.g.:
> How can I make the two echos both work for the following command:
> ls | xargs -i echo {}\; echo {} {}
ls | xargs -i sh -c 'echo {};echo {} {}'
Dave
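A variant sketch of the same sh -c trick: passing the item as the inner shell's $0 instead of substituting {} into the command string avoids surprises when file names contain quotes (assumes an xargs with the POSIX -I option):

```shell
# Each input line becomes $0 of the inner sh, which runs two echos.
printf '%s\n' a b | xargs -I{} sh -c 'echo "$0"; echo "$0 $0"' {}
```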
documented on: 1999.09.17 Fri 13:38:17
>What I'm trying to do: I have a list of users that I need to send an email
>to in a file. The problem is that I need to customize the subject
>line. So what the input file looks like is this:
>
>email <tab> subject <cr>
>email <tab> subject <cr>
>...
>
>I want to loop through the file and use the mail command
>
>mail $email -s $subject < email.txt
while read email subject; do
    mail -s "$subject" "$email" < email.txt
done < input_file
Ken Pizzini
documented on: 1999.09.24 Fri 11:11:31
> Does unix have an EOF char, like DOS does? How can I emulate a ^D by
> program?
The eof character is set in the tty drivers under UNIX; if you want to send an EOF on a pipe you'd have to close the pipe.

> The reason I'm asking is that I'm feeding both the plot command
> and plot data to gnuplot (by command splot "-"). And I found out
> that I need to press ^D to get out of data entry mode and get back
> to command mode, to issue following commands.
I think it's easier to write the plot data to a temporary file, and give the name of the file as argument to splot.

> I'm using the following perl command to simulate the ^D, and
> apparently gnuplot doesn't interpret a plain ^D char.
>
> print GP "\n\004\n";
>
> Is there any way to do that? Thanks!
There is a way to send EOFs from one process to another by using pseudoterminals (pty/tty pairs). I recommend reading one of Stevens's books (Advanced Programming in the Unix Environment or Unix Network Programming) for the details. Anyway, the telnet daemon uses the trick!
Peter
When writing to a pipe, you send EOF to the reading process by closing the pipe.
However, if your intent was to continue writing more stuff to the pipe, you won't be able to. Perhaps you should be using a program that's designed to have data piped to it, rather than using one that expects to deal with an interactive terminal; I'll bet the gnuplot package includes such a thing.
If not, you'll probably have to use a pty, like the other responder mentioned. Perl probably has a package to automate this.
Barry Margolin
documented on: 2000.11.27 Mon 16:46:15
> > read all my variables < my.input.file
> >
> > If you need to read several lines, use exec:
> >
> > exec 3<&0             # save old stdin to unused file descriptor 3
> > exec < my.input.file  # make my.input.file stdin
> > read line1
> > read line2
> > read line3
> > exec <&3              # get back original stdin
> > exec 3<&-             # and close fd 3.
>
> Thanx, that will work.
>
> > > There is redirection to duplicate input or output to some file
> > > descriptor (10<inpfn , 11>outpfn), but I do have no clue whatsoever what
> > > these file descriptors are used for and how.
> >
> > 0 is stdin, 1 is stdout, and 2 is stderr. The rest are yours to
> > play with as you need.
>
> That's what I thought, but what can you do with them? None of the bash
> commands uses a file descriptor as argument. In the example you give,
> you use them as a temp var to store the standard I/O files. Is there
> anything else?
You can do something like:
exec 10< datafile
and then later
read var1 var2 var3 ... <&10
will read the next line from descriptor 10, which is connected to the datafile.
You could also use ksh instead of bash; its "read" command has a -u option to specify a one-digit descriptor number.
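The steps above can be sketched end-to-end (the temp-file name is just for illustration):

```shell
printf 'one\ntwo\n' > /tmp/fd10demo.$$
exec 10< /tmp/fd10demo.$$   # open the data file on descriptor 10
read line1 <&10             # each read consumes the next line from fd 10
read line2 <&10
exec 10<&-                  # close fd 10
echo "$line1 $line2"        # prints: one two
rm -f /tmp/fd10demo.$$
```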
Barry Margolin
documented on: 2000.12.20 Wed 12:56:54
Newsgroups: comp.unix.shell
> I have a row of characters separated by a '+' ... like so:
>
> bob+smith+william ghol+steve
>
> How can I split the columns and assign each of them to variables,
> into this
>
> var1=bob
> var2=smith
> var3="william ghol"
> var4=steve
This will get you part of the way there
#!/bin/sh
x='bob+smith+william ghol+steve'
OLD_IFS=$IFS   # save internal field separator
IFS="+"        # set it to '+'
set -- $x      # make the result positional parameters
IFS=$OLD_IFS   # restore IFS
var1=$1
var2=$2
var3=$3
var4=$4
echo var1=$var1
echo var2=$var2
echo var3=$var3
echo var4=$var4
output:
var1=bob
var2=smith
var3=william ghol
var4=steve
Use the 'read' command to do this; it splits the input into variables based on the current contents of $IFS. You can change $IFS to '+' for only the duration of the 'read' command by putting the assignment at the beginning of the line; then it will do everything for you at once:
$ cat file
bob+smith+william ghol+steve
$ IFS=+ read var1 var2 var3 var4 <file
$ set |grep '^var.='
var1=bob
var2=smith
var3='william ghol'
var4=steve
$
Note, be careful where the read is executed:
echo 'a1 b2 c3' | IFS=' ' read var1 var2 var3 var4
$ set |grep '^var.='
— nothing is set, because the read is executed in a subshell
echo 'a1 b2 c3' > $tf
IFS=' ' read var1 var2 var3 var4 < $tf
$ set |grep '^var.='
var1=a1
var2=b2
var3=c3
var4=
— get it this time!
# reset vars 1st
$ jot -s ' ' 3 | tee $tf
1 2 3
$ IFS=' ' read var1 var2 var3 var4 < $tf
$ set |grep '^var.='
var1=1
var2=2
var3=3
var4=
export var1 var2 var3
echo 'a1 b2 c3' | IFS=' ' read var1 var2 var3 var4
$ set |grep '^var.='
var1=1
var2=2
— no, even export won't help here!
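A workaround sketch: a here-document redirection keeps read in the current shell (unlike a pipeline), so the variables survive.

```shell
# read runs in the current shell here, not in a pipeline's subshell.
IFS=' ' read var1 var2 var3 <<EOF
a1 b2 c3
EOF
echo "$var1/$var2/$var3"    # prints: a1/b2/c3
```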
q="bob+smith+william ghol+steve"
n=0
IFS="+"
for v in $q
do
    n=`expr $n + 1`
    if expr "$v" : '.* .*' > /dev/null
    then
        z="var$n=\"$v\""
    else
        z="var$n=$v"
    fi
    eval "$z"
done
The above script is purely Bourne shell. If you are using a modern shell, there are more efficient ways of doing it.
For example, in bash or ksh, I'd put the values into an array:
q="bob+smith+william ghol+steve"
oldIFS=$IFS
IFS="+"
var=($q)
## var=(${q// /\ })   # to escape spaces (use \\ on command line)
IFS="$oldIFS"
$ echo ${var[2]}
william ghol
Chris F.A. Johnson
>I need to do the following
>
>for x in $(cat filename)
>do
>    do anything here with the single lines of the file
>done
>
>This works well if there are no spaces in the lines.
>
>How can I set the IFS variable to separate the input only in case of
>newlines?
>
>but this starts a new shell and I lose the value of the variables I set in
>the loop when I come back (or is there a way to give them back?)
I have found the solution using goggle at http://www.linux.ie/pipermail/ilug/1999-December/010019.html
Andreas
Beware, the IFS variable is dangerous and should be removed from all vendor distributions. (Garfinkle & Spafford, 1993)
Why do you need to re-set it and is there another way you can solve your problem ?
Many programs will start behaving strangely if it's not what they (you) expect it to be . . . (BASH being one of them).
# possible solution . . .
IFSTEMP=${IFS}
IFS=^M
# put as little code as possible in here . . .
# IFS=${IFSTEMP}
documented on: 2000.09.04 Mon 14:39:21
The '\xa0' in ISO-8859-1 is a "No-break Space":
<NS> /xa0 NO-BREAK SPACE
$ printf 'a\x41G\xa0M'
aAG M
$ touch `printf 'a\x41G\xa0M'`
$ diron
-rw-rw-r--  1 tong  tong  0 Sep  8 23:48 aAG?M
Although it lists differently in the ls listing, it shows up nicely on the bash command line when pressing <TAB> to complete the filename.
It is, however, not suitable as a substitute for the <space> character, because it does not display correctly on either Unix or Windows. Under Windows, it is shown as a' (one character).
> Has anyone written shell scripts that can be translated into non-English > languages? I am only interested in translation for prompts and
Use external files which contain the messages, prompts, etc. I wrote a small example script for you. It uses a file where the messages are stored by number. A record looks like this: number=text. At the beginning of the script, all messages are read into a one-dimensional array and the index numbers are stored in constants. Change the line MsgFile="English.dat" when you want to use other files with messages in other languages.
Below are the two necessary files, 1st the script and the 2nd is the English.dat file.
#!/usr/bin/ksh
#
# The script
#

# Message constants
msg_info="1"
msg_prompt="2"
msg_success="3"
msg_failure="4"
msg_end="5"

# The message file
MsgFile="English.dat"

# Use grep to ignore comment- and empty lines
grep -Ev '^#|^[ ]*$' ${MsgFile} | while read MsgLine
do
    Messages[${MsgLine%%=*}]="${MsgLine##*=}"
done

# The part of the script which actually uses the messages
echo "${Messages[${msg_info}]}"
printf "${Messages[${msg_prompt}]}" ; read Value
case "${Value}" in
    y|Y) echo "${Messages[${msg_success}]}" ;;
    *)   echo "${Messages[${msg_failure}]}" ;;
esac
echo "${Messages[${msg_end}]}"
#
# English.dat
#
1=This is a script with multi-language support
2=Don't you think it's a nice script (Y/N) ? :
3=I thought so !
4=That's too bad !
5=This is the end of the script - bye.
> I need to write a program that will basically hit the web site and
> return a value to a log... Thought of using a call to lynx, but wasn't
> sure how to determine success or failure
Use $?.
while sleep 30 ; do
    lynx -head -dump http://www.yoursite.com
    if [ $? -ne 0 ] ; then
        logger "Web site is down."
    fi
done &
J. S. Jensen
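The $? mechanism this relies on can be sketched in isolation: it holds the exit status of the most recent command, where 0 means success.

```shell
true
echo "after true:  $?"      # prints: after true:  0
false
echo "after false: $?"      # prints: after false: 1
```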
Newsgroups: comp.unix.shell
> > > What exactly does export mean? I mean, is it like a kind of declaration
> > > or something?
> > >
> > > What would the impact be if the order of setting and exporting
> > > is different?
> > >
> > > Let me explain with examples:
> > >
> > > # case 1
> > > export PS1
> > > PS1='$ '
> > >
> > > # case 2
> > > PS1='$ '
> > > export PS1
> > >
> > > # case 3
> > > PS1='$ '
> > > export PS1
> > > PS1='# '
> > >
> > > What is the system status for the above 3 cases? Thanks
> >
> > They all end up being the same in the end. An export means to make
> > the variable available to all child processes. Without the export,
> > the child process won't inherit the parent's nonexported variables.
> > Since you are all in the one process setting the PS1 and exporting
> > it, all 3 will export the variable to the children. The export also
> > only needs to be executed once, just like typeset -x. So for your
> > third example, where you change the value of PS1 after the export,
> > the variable is STILL exported so the new value will be used by the
> > children called after that point.
>
> So I thought. I never thought otherwise until I used bash 2.03
> on Linux Debian.
>
> bash-2.03$ set | grep PS1
> PS1='\s-\v\$ '
> bash-2.03$ env | grep PS1
> bash-2.03$ export PS1
> bash-2.03$ env | grep PS1
> PS1=\s-\v\$
Why are you using 'env' with bash? env is a csh-ism (actually usually a separate command, e.g. /usr/bin/env). Use export.
> so far so good...
>
> bash-2.03$ sh -c 'echo $PS1'
>
> bash-2.03$
>
> and, now, why didn't the sub-process 'sh' get the variable? Does it
> mean bash for Debian is different? I tested on Solaris and it worked
> as I expected (PS1 got printed).
I tested it on RH5.2 running bash-1.14.7 (eeek!) and got the same result as above. I then tried the following on a SunOS 5.7 and got this:
bash$ ls -l $(cat /etc/shells) $(command -v ksh) $(command -v bash)
-r-xr-xr-x  1 bin  bin   151976 Oct  6  1998 /bin/csh*
-r-xr-xr-x  3 bin  root   91668 Oct  6  1998 /bin/sh*
-r-xr-xr-x  2 bin  root  257444 Sep  1  1998 /sbin/sh*
-r-xr-xr-x  1 bin  bin   151976 Oct  6  1998 /usr/bin/csh*
-r-xr-xr-x  2 bin  bin   192764 Oct  6  1998 /usr/bin/ksh*
-r-xr-xr-x  3 bin  root   91668 Oct  6  1998 /usr/bin/sh*
-rwxr-xr-x  1 gnu  gnu   606724 Dec 18  1996 /usr/local/bin/bash*
Note that sh _doesn't appear_ to be ksh (diff reports them as being different too: well it would really, since it compares file size first). I *think* that sh is an original version of Bourne.
[me@machine me]$ sh -c 'echo $PS1'
[me@machine me]$ ksh -c 'echo $PS1'
[me@machine me]$ bash -c 'echo $PS1'
[\u@\h \W]$
Moral of the story? You should *never* rely on PS1 being set in a non-interactive shell session. If you did the following, however, then it would work:
[me@machine me]$ ksh -c 'PS1="hello >" ; echo $PS1'
hello >
[me@machine me]$ sh -c 'PS1="hello >" ; echo $PS1'
hello >
[me@machine me]$ bash -c 'PS1="hello >" ; echo $PS1'
hello >
The PS1 variable is a special case though, as it is automatically exported by the shell, e.g. the above is equivalent to:
sh -c 'export PS1="hello >" ; echo $PS1'
For a child shell to inherit its parent's variables, that variable must be exported, e.g.
[me@machine me]$ PSTEST="hello >"
[me@machine me]$ bash -c 'echo $PSTEST'
[me@machine me]$ export PSTEST="hello >"
[me@machine me]$ bash -c 'echo $PSTEST'
hello >
so the above becomes:
[me@machine me]$ export PS1="$(echo $PS1)"
[me@machine me]$ ksh -c 'echo $PS1'
[\u@\h \W]$
[me@machine me]$ sh -c 'echo $PS1'
[\u@\h \W]$
[me@machine me]$ bash -c 'echo $PS1'
[\u@\h \W]$
So, PS1 is only exported automatically to other _bash_ shells, not to ksh, or sh. Since bash aims to emulate sh as closely as possible when it is run as sh, that's why you don't see your new prompt.
I hope this clears it up for you. :-)
Cheers,
Dave.
|>>> What exactly does export mean? I mean, is it like a kind of declaration
|>>> or something?
|>>>
|>>> What would the impact be if the order of setting and exporting
|>>> is different?
In recent shell implementations, no difference. In very old Bourne shell implementations, "export" copies, so subsequent modifications to the variable don't go into the environment.
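The modern behavior can be sketched in three lines: export marks the name, so a later assignment still reaches children.

```shell
export V=first
V=second              # assignment after the export
sh -c 'echo "$V"'     # in a modern shell, prints: second
```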
|> bash-2.03$ set | grep PS1
|> PS1='\s-\v\$ '
|> bash-2.03$ env | grep PS1
|> bash-2.03$ export PS1
|> bash-2.03$ env | grep PS1
|> PS1=\s-\v\$
|
| Why are you using 'env' with bash? env is a csh-ism (actually usually a
| separate command, e.g. /usr/bin/env). Use export.
He's using "set" to display local variables, and "env" to display environment variables. I'd use "printenv PS1", but there's nothing wrong with the way he's doing it. "export" wouldn't be useful.
| Note that sh _doesn't appear_ to be ksh (diff reportes them as being | different too: well it would really, since it comapares file size | first). I *think* that sh is an original version of Bourne.
Diff, ls etc. are not always good ways to identify shell implementation. When they're not identical files, the next step is "strings".
| ... The PS1 variable is a special case though, as it is automatically | exported by the shell, ...
PS1 is not automatically exported, from bash or ksh.
$ unset PS1
$ PS1='CMD> '
CMD> printenv PS1
CMD>
Donn Cave
>and, now, why the sub-process 'sh' didn't get the variable? Does it >mean bash for debian is different?
It means that sh is resetting PS1 on start-up. Try testing with a variable which has no special meaning to the shell.
Ken Pizzini
documented on: 2000.06.06 Tue 10:56:58
> ps -fu $1 | awk '{print $2}'
>
> PID
> 1034
> 4290
>
> What's an easy way to eliminate the heading-string (PID)?
The format of an awk command is:

pattern { action }

True = nonzero, false = 0. NR = record (line) number (begins at 1).
So:
ps -fu $1 | awk 'NR > 1 {print $2}' > user_proc
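The same NR filter can be sketched on canned input, without depending on ps:

```shell
# NR > 1 selects every record after the header line.
printf 'PID\n1034\n4290\n' | awk 'NR > 1 {print $1}'
```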
documented on: 22:24:17
Newsgroups: comp.unix.shell
> typeset -i i=0
> while [ "$i" -lt 10 ]; do
>     <command>
>     let i=i+1
> done
>
> "((i=i+1))" or "let i+=1" are alternative syntaxes for the 4th line that may
> also work in bash.
in bash you do it like this
#!/bin/bash
i=0
while [ "$i" -lt 10 ]
do
    <command block>
    i=`expr $i + 1`
done
joeri
integer i
for ((i=0; i<10; i++))
do
    ...
done
You need version 2.04. And you will need to:
alias integer='typeset -i'
Dan Mercer
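A portable middle-ground sketch: POSIX $(( )) arithmetic works in sh, ksh, and bash, and avoids forking expr on every iteration.

```shell
i=0
while [ "$i" -lt 3 ]; do
    echo "i=$i"
    i=$((i + 1))    # built-in arithmetic, no expr subprocess
done
```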
documented on: 2001.03.19 Mon 00:28:33
host:~/ndl>echo a > a
host:~/ndl>echo a > b
host:~/ndl>cat a b >a
cat: a: input file is output file
host:~/ndl>cat b >>a
host:~/ndl>cat a
a
a
documented on: 02-06-99 19:57:54