[conspire] vim vs sed awk and grep + interactive vs automation [was Re: Slice of life]

Tony Godshall tony at of.net
Sat Sep 19 17:33:42 PDT 2009


Thanks to Rick and Carl for that very cogent discussion.

...
>> On an unrelated note, I also liked your mention of the advantage of
>> scripting over crappy GUI tools - what an exquisite example to use
>> when someone asks "Why should I bother to learn this crap anyways when
>> I can just use a GUI?".  The more I use the CLI, the more comfortable
>> I get with it and the more productive it makes me.
>
> Sadly, this is a case where even my beloved vim can be The Wrong Thing.
>
> o  When I found myself making a change across two zonefiles, I resisted
>   getting really good with sed, awk, and grep.
>
> o  When I found myself making a change across a dozen zonefiles, I _still_
>   resisted getting really good with sed, awk, and grep.
>
> o  But when I suddenly found myself having to make a change across 200
>   zonefiles, _then_ I decided it was time to get really good with sed,
>   awk, and grep.  ;->
>
>
> Manual editing seems like a great idea until you start adding zeroes to
> the number of files to edit.  And, even if you edit all 200 files
> flawlessly, how about the backout procedure?  And how do you get other
> people to review the main and backout procedures in advance?
>
> A backout procedure on 200 files consisting of however many lines of sed
> and awk runs in milliseconds, whereas doing the same fix using "vi *" or
> whatever takes ages -- time you don't want to spend when the production
> DNS is broken.
>
> Automated editing / searching / filtering is not just utterly cool but
> also necessary to quality and timeliness of work.

Amen to that.  My personal favorite for this kind of stuff is perl:

A better sed than sed:

  perl -i -pe 's{}{}'

Faster, and no leaning-toothpick hell, since you can pick delimiters other than /.

A better awk than awk:

  perl -ne 'if(m{regexp}){print "blah blah"}'

Again faster, and in both cases you get better and more consistent regexps.

I still use grep/egrep but quickly switch to perl if I need to do
anything with the results.

That said, I don't generally program in perl except for rolling a
one-liner into a /usr/local/bin script for convenience: generally just
adding "use strict" and breaking it into lines for readability.[1]

One thing you haven't mentioned, but which is very useful: when you're
editing with vim or another interactive tool and realize afterward that
you want to automate, use version control (cvs, git, whatever) diff.
The diff tells you exactly what you changed, which you can then roll
into a patch or a series of very scriptable s{}{} substitutions.  This
is of course not a comment for Rick and Carl but rather for those
others on the list who are learning from this discussion.

Best Regards

Tony



[1] There is a "rational subset"[2] of perl that's actually quite
readable.

[2] We actually coined this term about C++ when we were developing
warehouse management systems in Napa and had seen some examples of
code using advanced features of C++, apparently written by programmers
just showing off how cool they were.
