Is anyone using awk?

I was studying, and awk came up.

I spent about an hour on it and can see some useful commands that go beyond what “cut” can do. But when it comes to printf() format statements, is anyone really using awk scripts for this?
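
(The kind of thing I mean looks roughly like this; the columns and file name here are made up:)

awk '{ printf "%-15s %8.2f\n", $1, $3 }' report.txt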

Or is everyone just using whatever scripting language they’re familiar with? I’d reach for Python for the problems being presented as good fits for awk.

jxk,

Awk has the advantage over Perl/Python/etc. that it’s standardized by POSIX. Therefore you can rely on it on all operating systems. It’s pretty much the only advanced scripting language specified by POSIX – the alternatives would be some heavy shell scripting or almost-unreadable sed.
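
For example, summing up a column is a one-liner that behaves the same under any POSIX awk, where pure shell would need a read loop and arithmetic expansion (file name made up):

awk '{ sum += $3 } END { print sum }' sizes.txt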

d3Xt3r,

Therefore you can rely on it on all operating systems.

… all except that one OS which we don’t like to talk about but annoyingly remains the most popular consumer OS. :P

yetAnotherUser,

Android?

Ramin_HAL9001, (edited )

I used to use the command line, Bash, Awk, Sed, Cut, Grep, and Find (often piped to one another) quite often. As I recall, the few times I used Awk were usually for collating lines from logs or CSV files.

But then I switched to using Emacs as my editor, and it gathers together the functionality of all of those tools into one, nice, neat little bundle of APIs that you can easily program in the Emacs Lisp programming language, either as code or by recording keystrokes as a “macro.”

Now I hardly use shell pipelines at all anymore. Mostly I run a process, buffer its output, and edit it interactively. I first edit by hand, then record a macro once I know what I want to do, then apply the macro to every line of the buffer. After that, I might save the buffer to a file, or maybe stream it to another process, recapturing its output. This technique is much more interactive, with the ability to undo mistakes, so it is easier to manipulate data this way than with Awk and shell pipelines.

netwren,

This is fascinating to me. Do you have any links or suggestions for this workflow to learn more?

Ramin_HAL9001, (edited )

This is fascinating to me. Do you have any links or suggestions for this workflow to learn more?

I am glad you asked, because I actually wrote a series of blog posts on the topic of how Emacs replaced my old Tmux+Bash CLI-based workflow. The link there is to the introductory article; in the “contents” section there are links to each of the 4 articles in the series. The “Shell Basics” article (titled “Emacs as a Shell”) might be of particular interest to you.

If you have any specific questions, or if you have recommendations for something you think you would like to learn from one of my blog posts, please let me know. I would like to write a few more entries in this blog series.

pelya, (edited )

Grep is fiiiiine.

sed is okay but a little nasty; when your sed script is longer than one search-replace command you gotta ask yourself what you’re really doing (yes, sed is a full-featured Turing-complete programming language, if you go far enough into the man page).

When I see awk in any stackoverflow recipe, I just say ‘fuck it’ and rewrite the whole thing in Python. Python is included in the minimal system image in Debian, same as awk, but is way less esoteric, and you can do python -c 'import os, sys; commands;' for a one-liner console script.
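
For example, something like awk '{ print $3 }' becomes roughly this (Python 3 syntax, file name made up, and it assumes every line actually has at least three fields):

python -c 'import sys; [print(line.split()[2]) for line in sys.stdin]' < input.txt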

And if you want to talk about portability, try writing scripts for Android 4.4 ash shell. There’s no [ ] command. You do switch/case to compare strings.
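
The case-based compare looks something like this (variable and values made up):

case "$mode" in
  start) echo "starting" ;;
  stop) echo "stopping" ;;
  *) echo "unknown mode: $mode" ;;
esac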

netwren,

Have you tried ripgrep?

pelya,

No, and I don’t think I will learn another tool for something that I can already do using grep/sed/find commands, which I know by heart.

netwren,

That’s fair

WalrusByte,

Sometimes I copy and paste an awk command from online, but I can never remember how to write it myself

cybersandwich,

I’m convinced no one actually knows how to write awk. It’s all copy and pasted from the web.

caseyweederman,

I am very, very slowly chiseling it into my long-term memory. I feel like Rincewind.

upt,

No, but I heavily use perl still… I feel like you can’t really call yourself a Linux person without knowing perl and python both. Knowing awk can’t hurt though.

sping,

Really? I’ve disliked Perl for 3 decades on Unix and Linux, and I’ve never felt held back by not knowing or using it. I don’t remember the last time I saw a Perl script, let alone needed to understand one.

eestileib,

Perl kinda killed awk and sed.

Then python kinda killed perl.

AnUnusualRelic,

I used awk until perl, then there was no going back.

bizdelnick, (edited )

Yes, but for a very specific case. I used to write highly portable scripts that could be executed in different environments (various Linux distros, including minimal containers, FreeBSD and even Solaris 10). I couldn’t use bash, perl, python or even gawk. Only POSIX shell (I always tested my scripts with dash and ksh93, and for Solaris 10 compatibility with its jsh), portable awk (tested with original-awk, gawk and mawk) and portable sed (better forget it if you need to support Solaris).

Before that I didn’t understand why I should need awk if I know perl. And awk really sucks. Once I had to replace a perl one-liner with an awk script of ~30 lines for portability.

P.S. I never use awk just for print $1 as many do. It’s overkill.

bionicjoey,

P.S. I never use awk just for print $1 as many do. It’s overkill.

cut is better for this use-case IMO. Awk is good for when cut won’t cut it.

Lydia_K,

I use awk all the time, nothing too fancy, but when you need to pull out elements of text it’s usually way easier than using cut.

awk '{ print $3 }' will pull the third field based on your FS variable (the field separator, default is whitespace).

awk '{ print $NF }' gets you the last field, and awk '{ print $(NF-1) }' gets you the second-to-last, and so on.
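
And if the fields aren’t whitespace-separated you can set the separator with -F, for example awk -F: '{ print $1 }' /etc/passwd prints just the usernames.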

Basic usage but so fast and easy for so many everyday command line things.

FigMcLargeHuge,

You can also add to the output. I use it frequently to pull a list of files, etc., from another file, and then do something like generate another script from that output. This is a weak example, but one I can think of off the top of my head. Not firing up my work laptop to search for better examples until after the holidays. LOL.

awk '{ print "ls -l " $1 }'

And then send that to a file that I can then execute.
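
So the whole round trip ends up looking something like this (file names made up):

awk '{ print "ls -l " $1 }' filelist.txt > list_them.sh
sh list_them.sh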

ikidd,

Just had to use it today to turn a key file into a single string with line breaks:

awk 'NF { sub(/\r/, ""); printf "%s\\n", $0; }' id_rsa

tanakian,

awk often can be found in my scripts.

willybe,

I used awk to migrate users from one system to another. I created template scripts for setting up the user in the new system, I dumped the data from the old system, then used awk to process the dump and create scripts for each user in the new system. That was a fun project.
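
A stripped-down sketch of that kind of pipeline; the dump format, field positions and the new system’s command are all made up here:

awk -F: '{ print "newsys-adduser --login " $1 " --shell " $7 }' old_system.dump > create_users.sh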

ninekeysdown,

Every day. I’ve got a lot of stuff that uses it. Granted, most of it was created a decade ago, but with minimal maintenance it works great. The most helpful script is one that parses megacli output so I can get a heads-up on drive failures and rebuilds, among other things.
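
Roughly this kind of thing, from memory; the exact MegaCli flags and field labels vary between versions, so treat it as a sketch:

MegaCli64 -PDList -aALL | awk -F': *' '/Slot Number/ { slot = $2 } /Media Error Count/ && $2 > 0 { print "slot " slot " reports " $2 " media errors" }'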

Legisign,

cut is actually next to useless, because it cannot treat a run of multiple spaces as a single separator, which is how most text files in /etc are laid out. You have to use AWK.
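
For example, with a column-aligned file like /etc/fstab (assuming it’s padded with spaces rather than tabs):

cut -d' ' -f2 /etc/fstab      # splits on every single space, so you mostly get empty fields
awk '{ print $2 }' /etc/fstab # collapses runs of whitespace and prints the mount points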

WindowsEnjoyer,

The best use-case for AWK is that you can skip grep entirely when picking the Nth word of a specific line. I tend to ask GPT-4 to write the one-liner for me. Works super great.
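
For example, grabbing the second word of the third line (file name made up):

awk 'NR == 3 { print $2 }' config.txt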

AMillionNames,

awk is supposed to be simpler. If it isn’t, just use your favorite scripting language. It comes from a time when a lot of scripting languages weren’t as easy to use or as readily available.
