Is anyone using awk?

I was studying and awk came up.

I spent about an hour on it and I can see some useful commands that extend past what cut can do. But really, when it comes down to printf() format statements, is anyone using awk scripts for this?

Or is everyone just using their familiar scripting language? I'd reach for Python for the problems being presented as useful for awk.

ikidd,

Just had to use it today to turn a key file into a single string with escaped line breaks:

awk 'NF {sub(/\r/, ""); printf "%s\\n", $0}' id_rsa

tanakian,

awk can often be found in my scripts.

willybe,

I used awk to migrate users from one system to another. I created template scripts for setting up a user in the new system, dumped the data from the old system, then used awk to process the dump and generate the per-user scripts. That was a fun project.
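
Not the actual scripts, but a minimal sketch of that pattern, assuming the old system's dump is colon-separated like /etc/passwd (name:pw:uid:gid:comment:home:shell):

awk -F: '{ printf "useradd -m -u %s -c \"%s\" %s\n", $3, $5, $1 }' old_system.dump > create_users.sh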

ninekeysdown,

Every day. I've got a lot of stuff that uses it. Granted, most of it was created a decade ago, but with minimal maintenance it works great. The most helpful script parses megacli output so I can get a heads-up on drive failures and rebuilds, among other things.

Legisign,

cut is actually next to useless, because it cannot understand that multiple spaces can still be a single separator in most text files in /etc. You have to use AWK.
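
A quick illustration with made-up input: cut -d' ' treats every single space as a delimiter, so runs of padding spaces become empty fields, while awk collapses whitespace by default:

printf 'root     x  0\n' | cut -d' ' -f2     # prints an empty field
printf 'root     x  0\n' | awk '{print $2}'  # prints "x"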

WindowsEnjoyer,

The best use case of awk is that you can avoid using grep for picking the Nth word on a specific line. I tend to ask GPT-4 to write the one-liner for me. Works great.
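
For example (the file name and positions are made up), printing the third word of line 7:

awk 'NR == 7 { print $3 }' somefile.txt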

AMillionNames,

awk is supposed to be simpler. If it isn’t, just use your favorite scripting language. It comes from a period of time when a lot of the scripting languages weren’t as easy to use or readily available.

jxk,

Awk has the advantage over Perl/Python/etc. that it’s standardized by POSIX. Therefore you can rely on it on all operating systems. It’s pretty much the only advanced scripting language available that is POSIX – the alternative would be some heavy shell scripting or almost-unreadable sed.

d3Xt3r,

"Therefore you can rely on it on all operating systems."

… all except that one OS which we don’t like to talk about but annoyingly remains the most popular consumer OS. :P

yetAnotherUser,

Android?

WaLLy3K, (edited)

awk is pretty damn solid. When I was completely rewriting the gravity.sh script from Pi-hole about six years back, it was easily the fastest for parsing and uniquely sorting content from files with a couple million lines. It made things much more usable on Raspberry Pi Zero hardware, since changing to another language like Python was out of the question.
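
Not the actual gravity.sh code, but the classic awk idiom for that kind of job, deduplicating millions of lines from several lists in a single pass (pipe through sort afterwards if ordered output is needed):

awk '!seen[$0]++' list1.txt list2.txt > combined_unique.txt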

OpenStars,

awk is awesome! I love it, and I do not regret learning how to use it.

That said, my workflow invariably shifts from starting with awk to do something simple with a tiny one-liner, to then doing it with Perl or Python, and sometimes even creating a file to make the by-now multi-line script more easily readable.

I do not recommend starting with awk if you do not already know other languages such as Python.

In short, let your intuition guide you.

olafurp, (edited)

I think it's pretty niche, but it's a great tool for parsing/converting data into a format that is more easily digested by another program.

Think, for example, of a report from an '80s system that spits out many tab-separated values in a different format depending on some code. These tables are all separated by two blank lines and their order is randomised. To top it off, you then need to pipe it all to a different program that only accepts a specific format.

You could do it in Python with separate parse, process, and stringify steps, but if you know awk you can do all of them at the same time with less code.

Sure, in the age of REST the Python approach is better but awk is a very powerful tool for the “I have a specific output but need a specific input” problem.
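
A minimal sketch of that style with invented input: blank-line-separated blocks of tab-separated values, re-emitted as CSV for the downstream program:

awk 'BEGIN { RS = ""; FS = "\n" } { for (i = 1; i <= NF; i++) { gsub(/\t/, ",", $i); print $i } }' report.txt > cleaned.csv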

visika,

I used awk in a physics computer simulations course I had at university. It's a nice tool to know how to use.

Decker108,

I think I’ve used it once in 15 years or so. It’s typically easier to go with bash or Python.

neidu2,

Every day. Piping stdout to a combination of awk and sed makes shell operations a lot easier. A lot of my earlier Perl hacks have now been replaced by a combination of awk, sed, and xargs.
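
A throwaway example of that kind of pipeline (the paths and threshold are invented, xargs -r is a GNU/BSD extension, and it assumes filenames without spaces): find log files over 10 MB and compress them:

du -k /var/log/*.log | awk '$1 > 10240 { print $2 }' | xargs -r gzip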

corsicanguppy,

I use awk on the daily. It has a wider and more consistent install base than perl.
