There was a post on linuxquestions.org:
Newsgroup posting software
http://www.linuxquestions.org/questions/linux-general-1/newsgroup-posting-software-yenc-and-uuencode-612354/#post3020065
that asked for GUI software to post a CD image. The concluding reply attributed this to a lack of "huge demand for posting software". This really got me thinking…
> Well, it seems there isn't a huge demand for posting software, your only option may be the CLI app newspost, here are some examples. . .
Yeah, that's true. But to me, the reason is not merely a lack of huge demand, but the fundamental difference between the *niX and Window$ philosophies.
Window$' philosophy is everything GUI. For example, you can find hundreds of renaming tools, from mp3-specific renaming tools to general-purpose ones. So if you need to batch-rename something, you first need to think, "Oh boy, of the hundred-plus renaming tools that I have, which one can help me this time?" For the hard-to-satisfy requests, you may have to dig into the help file of each specific renaming tool, only to finally realize that none of them can help you this time.
For me, under *niX, the *single* CLI command 'rename' has never failed me, no matter how bizarre the renaming request might be.
$ rename
Usage: rename [-v] [-n] [-f] perlexpr [filenames]
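For instance, suppose the (hypothetical) request is to lowercase every .MP3 extension. A Perl expression handles it, and -n previews what would be renamed before you commit:

$ rename -n 's/\.MP3$/.mp3/' *.MP3
$ rename 's/\.MP3$/.mp3/' *.MP3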
OK, enough off-topic talk and back to news-posting software. Again, to me the CLI makes more sense. Using the very same example the OP uses — posting a cd/dvd image — a simple command,
newspost -q -y -n alt.binaries.test -s 'big image' the.package.iso.*.rar
will post the hundred-plus rar files for you automatically. This is *much more* straightforward than launching a GUI and then clicking, clicking, clicking, clicking…
Moreover, you can use at or cron to schedule the posting, e.g., while you are sleeping. This is trivial with CLI tools, but you will be at the mercy of the GUI tools to give you that feature.
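For example, reusing the same (hypothetical) command from above, a one-shot posting scheduled for 3am with at:

$ echo "newspost -q -y -n alt.binaries.test -s 'big image' the.package.iso.*.rar" | at 3am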
The CLI tools have greater flexibility than any GUI tool could possibly have, yet they demand the smallest memory footprint. The GUI counterpart will normally be ten times bigger, and more than ten times less flexible.
Still, all Window$ GUI tools proudly boast that they are standalone, all-in-one products that can solve all your needs. E.g., if you have installed three news-posting GUI tools, each of them will have its own NNTP protocol handling, news posting, par checking and repairing, and even NZB parsing and generating capabilities, instead of relying on common tools to handle the common tasks. I rarely see Window$ tools that build on other tools, because normal Window$ users never expect that to happen and will freak out on seeing a tool that depends on more than three other tools.
The *niX tools, however, are well known for utilizing existing tools as much as possible. It is very common to see a Debian package that depends on more than ten other packages. The CLI tools are fundamental building blocks for achieving versatile goals. It is very common in this note collection for 3 to 5 (and even more) CLI tools to work together to achieve one goal. (One very common Window$-oriented question was, "how come grep doesn't have the capability to recursively descend into subdirectories and search within a specific type of file?" The answer is, "welcome to the *niX philosophy — a program will only do one thing but do it well".)
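For instance, that recursive-grep request is just find, xargs and grep working together (searching C files for 'main' here is purely an illustrative assumption):

$ find . -name '*.c' -print0 | xargs -0 grep -n 'main'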
Let me explain with a vivid example. Each *niX CLI tool is one power tool in your tool collection. You use them to make whatever you'd like to make, be it a chair, desk, shelf, etc. Now, say there is a Window$ software company that sells a standalone, all-in-one GUI tool, the chair-maker. The chair-maker is really convenient for you: all you need to do is throw in the wood, nails and paint. It will then spit out a chair for you. Easy!!!
But wait, what about the sizes and styles? Now the trouble begins: all kinds of chair-maker producers now produce all kinds of chair-makers that have slightly different controls over how to custom-build chairs of different sizes and styles. Some advanced ones will boast that their products are capable of building stools as well (since chairs and stools are really not that much different). Some really advanced ones will boast that they can build all kinds of chairs, stools, and even bar-stools. The problem is, they need panels and panels of control buttons in order to make "all kinds" of them that you want. Now everybody is happy. But is it really so? What if you want to do something a little bit different? And can such a fancy chair-maker really produce every style of chair that exists in the world? Imagine how complicated its control panels and buttons would have to be just to produce some truly exotic, one-of-a-kind chair.
A normal Window$ user will normally end up with several chair-makers in their tool collection to make various chairs. Further, this is only about chairs; they also need makers for desks, shelves, etc. as well. I.e., for each thing they make, they end up having several "easy" but actually fancy, expensive and cumbersome tools. Now, looking back at the "simple" *niX power tools in your tool collection: which philosophy makes more sense? Which tool set is harder to learn to use in the long run? Which tools are you more likely to forget how to use after a long period of time? Before I entered the *niX world, I was impeded by the common saying "*niX does not have as many tools as Window$ has". Now I just laugh at such stupid propaganda.
documented on: 2008-01-12
http://www.linuxlots.com/~dunne/unix-philosophy.html
While the rest of the world points and clicks in a scary little world of icons, all alike, we in the world of Unix get to use a good old-fashioned CLI, or Command Line Interface. One reason why the command line has remained so pervasive in Unix environments is that the implementation, the Unix shell in its various incarnations, is actually pretty damn good. It allows the user to use the tools provided to build new tools. This, by any other name, is programming. And programming is the essential activity of computing. Without it, a computer, however expensive the materials of which it is made, is no more than an expensive heap of junk. At all levels beyond the bare transistors, it is programs that make it what it is.
The unfortunate legions of office workers today saddled with Windows are obliged to worship their computer as an all-knowing god that can do no wrong, that is always finding fault with them; and consequently develop a fierce hatred for it. This is inevitable. One cannot effectively use any tool without some understanding of its workings.
Almost as soon as one begins to use Unix, one is programming the shell. The first pipeline one builds,
ls -l | less
for instance, is a small shell program in itself. Shell programming proper begins when such combinations of commands are put in a file where they can be run repeatedly. Unix makes no distinction between executable files of one stripe or another. A text file with execute permission containing our little pipeline above is no different to it, in principle, than GNU Chess. This is a great advantage, in that it allows us to "cut our coat to suit our cloth", so to speak, in choosing the most appropriate programming tool for the task in hand, secure in the knowledge that whatever we choose to build in, our finished product will be treated by the system as just another program.
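As a minimal sketch of that point (the script name ll is my own assumption), put the pipeline into a file, mark it executable, and the system runs it like any other program:

$ printf '#!/bin/sh\nls -l | less\n' > ll
$ chmod +x ll
$ ./ll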
"There are many people who use UNIX or Linux who IMHO do not understand UNIX. UNIX is not just an operating system, it is a way of doing things, and the shell plays a key role by providing the glue that makes it work. The UNIX methodology relies heavily on reuse of a set of tools rather than on building monolithic applications. Even perl programmers often miss the point, writing the heart and soul of the application as perl script without making use of the UNIX toolkit." | ||
-- David Korn |
"This is the Unix philosophy. Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface." | ||
-- Doug McIlroy |
Most computer application programs can be thought of as software tools. If the only tool you have is a hammer, everything looks like a nail. The guy who writes letters in his spreadsheet program is a good example of this. Unix programs too are software tools. And Unix is a toolbox stuffed full of these tools. The more tools you have, the more you can do.
Two concepts in particular stand out that make the "toolbox" much more useful. The first is the idea of the filter.
The concept of a filter is a key idea for anyone who wishes to use Unix effectively, but especially for the programmer. It is one that migrants from other operating systems may find new and unusual.
So what is a filter? At the most basic level, a filter is a program that accepts input, transforms it, and outputs the transformed data. The idea of the filter is closely associated with several ideas that are central features of Unix: standard input and output, input/output redirection, and pipes.
Standard input and output refer to default places from which a program will take input and to which it will write output respectively. The standard input (STDIN) for a program running interactively at the command line is the keyboard; the standard output (STDOUT), the terminal screen.
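A quick way to see this: run a filter such as tr with no arguments beyond its translation sets, and it reads the keyboard and writes the screen (press Ctrl-D to end input):

$ tr 'a-z' 'A-Z'
hello there
HELLO THERE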
With input/output redirection, a program can take input or send output someplace other than standard input or output — to a file, for instance. Redirection of STDIN is accomplished using the < symbol, redirection of STDOUT by >.
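For instance (the file names here are illustrative), sort can take its standard input from one file and send its standard output to another:

$ sort < unsorted.txt > sorted.txt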
The second idea is the pipe. The pipe (|) is a junction that allows us to connect the standard output of one program with the standard input of another. A moment's thought should make the usefulness of this when combined with filters quite obvious. We can build quite complex programs, on the command line or in a shell script, simply by stringing filters together.
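A trivial illustration: connect ls, grep and wc to count how many entries in the current directory are themselves directories:

$ ls -l | grep '^d' | wc -l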
The combination of filters and pipes is very powerful, because it allows you a) to break down tasks and b) to pick the best tool for tackling each task. Many jobs that would have to be handled in a programming language (Perl, for example) in another computing environment, can be done under Unix by stringing together a few simple filters on the command line. Even when a programming language must be used for a particularly complicated filter, you are still saving a lot of development effort through doing as much as possible using existing tools.
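A classic illustration (the input file name is an assumption): a word-frequency counter built entirely from standard filters, a job that might otherwise demand a Perl script:

$ tr -cs 'A-Za-z' '\n' < report.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head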
I could go on, but the Unix Philosophy has been expounded in detail already by some of its most notable proponents. What I've written here should hopefully provide a concise summary. For further reading, you could do worse than start with two books by some Bell Labs luminaries: The Unix Programming Environment and Software Tools. I should also mention Mike Gancarz's The Unix Philosophy. This provides a useful foil to the other two books: shorter on examples and code, longer on the ideas behind them, and very much worth reading.
Copyright (c) 1995-2007 Paul Dunne.
documented on: 2008-01-12
http://www.linuxjournal.com/article/2877
September 1st, 1995 by Belinda Frazier
From the book "The Unix Philosophy"
Author: Mike Gancarz
Publisher: Digital Press
ISBN: 1-55558-123-4
The main tenets (each of which has sub-tenets) of the Unix philosophy are as follows:
1. Small is beautiful.
2. Make each program do one thing well.
3. Build a prototype as soon as possible.
4. Choose portability over efficiency.
5. Store data in flat text files.
6. Use software leverage to your advantage.
7. Use shell scripts to increase leverage and portability.
8. Avoid captive user interfaces.
9. Make every program a filter.
The author introduces each tenet with a simple, real-world example (or "case study"), then further explains why the tenet is important by including non-technical computer-world examples.
Tenet 1. Small is beautiful. The book offers an example of how Volkswagen ran an ad campaign with the phrase "small is beautiful" in the US to promote the VW bug, but the idea was generally ignored in the US until the price of oil went up and Americans learned the advantages of small cars. The author likens these nouveau small-car appreciators to the programmers at AT&T Bell Labs who discovered that small programs were likewise easier to handle, maintain, and adapt than large programs.
In a non-Unix environment, a program to copy one file to another might include, as in an example given in the book, twelve steps which do more than perform the file copy. The twelve steps perform extra tasks, some of which are considered "safety features" by some. The steps might include checking whether the source file exists, whether the output file is empty, and prompting users to see if they know what they're doing (for example, "Are you really really sure you want to do this, and does your mother know you're doing this?"), etc. Just one step of the sequence might be the actual copy command. A Unix program (or command) would include only the one copy-command step. Other small programs would each do the other eleven steps and could be used together if the Unix user wanted those extra steps. Although the author purposefully steers away from giving Unix examples until near the end of the book, I would have liked to see several Unix commands strung together to accomplish all the tasks described by the twelve steps.
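A rough sketch of what that might look like (the file names and the particular checks are my own assumptions), stringing small tools together in the shell:

$ [ -f src.dat ] || echo 'src.dat does not exist'
$ [ -s dst.dat ] && echo 'warning: dst.dat is not empty'
$ cp -i src.dat dst.dat

Here test (the [ command) supplies the existence and non-empty checks, and cp -i supplies the "are you sure" prompt before overwriting; cp itself just copies.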
Tenet 4. Choose portability over efficiency. The example given here is the Atari 2600, the first successful home video game console. Most of the code for the game cartridges was very efficient but nonportable. With the advent of new hardware (the "5200"), the code had to be rewritten to run on the 5200, which took time and money. The author proposes that Atari would have been the largest supplier of software in the world if its code had been portable.
There is a three-page analogy likening the "use software leverage to your advantage" tenet to selling Tupperware. Who would have realized a multilevel marketing scheme is a good way to write software?
A sub-tenet of the leverage tenet is to allow other people to use your code to leverage their own work. Many programmers hoard their source code. The author states that "Unix owes much of its success to the fact that its developers saw no particular need to retain strong control of its source code." Unix source code was originally fairly inexpensive compared to the cost of developing a new operating system, and companies started choosing Unix as the platform on which to build their software. Companies who chose Unix spent their effort and money on developing their applications, rather than on maintaining and developing an operating system.
documented on: 2008-01-12