Thursday, May 14, 2020

Useful Linux Commands

Whether you want to try running Linux on your desktop or just need to connect to a Linux server, these commands will come in handy.

Another handy item - a website where you can type/paste in a unix command and it will explain each chunk of it to you:  http://explainshell.com

Beginning Commands

  • man - displays the manual for a command - reading the manual for any of these commands is a great idea.  Also, most commands support the `--help` switch to make them tell you about their options.
  • cal - displays a calendar for the current month.
    • Variations:
      • `cal 9 1971` - displays the calendar for Sept 1971.
      • `cal 9 1752` - this is a fun one...can you explain why it looks different?
  • cat - concatenate - types out a file (or files) to standard output.  You can redirect that output into a file or into another command.  (See the quick example right after this list, and the Chaining Commands section below, for more on that.)
  • cd - change directory...like on DOS/Windows.
    • Variations:
      • `cd` with no args takes you to your home directory (called ~ for short).
      • `cd -` takes you to the previous directory.  Repeated use of this will toggle between two directories.
      • `cd /` takes you to the root (top level) directory.
      • `cd ..` takes you to the parent of the directory you were in.
  • clear - clears the terminal screen.
  • cp - copy - copies files
    • Variations:
      • `cp -r somedir otherdir` - recursively copies the whole directory tree under somedir into otherdir.
  • date - shows the current date and time.  (This can be a good way to see what time zone the machine uses.)
  • df - shows disk free space.
    • Variations:
      • `df -h` gives human-readable values like 32G, 16M, etc.
  • du - shows disk usage.  The -h flag works here too.
  • echo - displays a message on screen - this is more useful in shell scripts.
  • env - lists your environment variables.
  • file - tells what kind of file this is.  Try it with a few JPGs, MP3s, source code files, Word docs, etc.
  • head - shows the first <n> lines of a file.
  • less - similar to the 'more' command in DOS/Windows; makes lots of output go by one screen at a time.  It allows you to go back and to search for a string - read the man page.  (The 'more' command exists in Linux too...but 'less' is way better!)
  • locate - another "find file" utility.
  • mkdir - make directory - like on DOS/Windows.
    • Variation:
      • `mkdir -p` will make a deeply nested directory even if the intermediate directories don't exist yet...it creates all of them in one go.
  • mv - move a file/directory.  Also used for renaming.
  • pwd - print working directory.  Most Linux systems are set up so that the working directory is shown in your prompt, but in case yours isn't, there's pwd.
  • rm - remove a file/directory.
    • Variations:
      • `rm -r` - recursive delete - USE CAUTION
      • `rm -f` - force delete (don't ask "are you sure" for write-protected files) - USE LOTS OF CAUTION
      • `rm -rf` - SERIOUSLY DUDE WATCH OUT
  • sleep - delay for a specified number of seconds.  This is nice sometimes in shell scripts or repeated commands.
  • sort - sorts the contents of a text file and displays the results on screen.
  • sudo - superuser do - run a command as the superuser.
  • The <Tab> key is like magic when you're typing a command...after typing a letter or two, <Tab> autocompletes a filename, a command, etc.
    • Variations:
      • a<Tab><Tab> - prints a list of all commands starting with 'a'.
      • <Tab><Tab> - prints a list of all available commands (autocomplete without typing anything first).
  • tac - like cat, but backwards.  (Honestly, I've never used this, but it's simple and fun.)
  • tail - shows the last <n> lines of a file.
    • Variations:
      • Watch a file as some other process is writing to it using the "follow" flag:  `tail -f logfile.txt`
  • touch - sets a file's timestamp to now.  If the file doesn't exist, it creates it (with zero length).
  • top - shows a constantly updating list of running processes.  It has a few hotkeys for customizing the list - press the '?' key to see a list of them.
  • traceroute - shows the network path to a remote host.  It's not often useful, but it's fun.
  • uniq - prints out a file's lines with adjacent duplicate lines collapsed down to one.  (Sort the file first if you want to catch all the duplicates.)
  • wc - word count - counts the lines, words and characters in a file.
  • whatis - displays a one-line summary of the specified command.
  • whereis - finds the files belonging to a command: its binary, its source, and its man pages.  (So it's more of a "find command" utility than a "find file" utility.)
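
Before moving on, here's a quick taste of the redirection idea mentioned under cat above.  (The filenames here are made up - substitute your own.)

`cat part1.txt part2.txt > combined.txt` - glues the two files together and writes the result into a new file called combined.txt.

`sort combined.txt > sorted.txt` - sorts that combined file and writes the sorted version into sorted.txt.

Using >> instead of > appends to the end of the target file instead of overwriting it.  The Chaining Commands section below shows how to send output into another command instead of a file.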

Intermediate Commands

  • chown - change owner of a file or directory.  You have to be the owner of the file/dir (or the superuser) to do this.
    • Variations:
      • `chown someguy:somegroup somefile` - sets the owner and the group on the file all at once.  (You might also see a dot used as the separator; the colon is the standard form.)
  • chgrp - change group - changes the group for a file/dir.
  • Ctrl+C - kill the current process.
  • Ctrl+D - sends "end of input"; at an empty shell prompt this logs you out immediately (can be surprising if hit by accident).
  • Ctrl+L - redraw the screen (sometimes useful if the screen gets messed up)
  • Ctrl+Z - suspend (pause) the current process.  You can then let it keep running in the background with `bg`, or bring it back with `fg`.
  • fg - bring a suspended or backgrounded process back to the foreground.  You can supply a job number (like `%1`, as listed by the `jobs` command) to tell it which one to bring back.  (There's a little walkthrough of this right after this list.)
  • diff - finds differences between two files.  There are a lot of options...type `diff --help` to see them.
  • find - searches for files.
    • Search for a specific file:  `find . -name somefile.txt`  The '.' in that command means to start searching at the current directory.
    • Search for files with a name pattern:  `find . -name '*.java'` (You need to put the wildcard string in quotes to keep the system from expanding it...you want the find command to see it as a star, not a long list of filenames.)
    • Search for files older than one week: `find . -mtime +7`
    • There are a lot of other ways find can search for files...read the man page to learn about them.
  • grep - searches contents of files.
    • Variations:
      • `grep -i` - make the search case insensitive.
      • `grep -r` - recurse through subdirectories.
      • `grep -l` - only list the names of matching files; don't show the matching content itself.
  • history - shows a numbered list of the commands you have run recently (today, and probably before then too).
    • Associated commands:
      • `!1234` - run command number 1234 from the history list.
      • `!abcd` - run the last command that started with 'abcd'.
      • `!!` - repeat the previous command.  (Like pressing the up-arrow and then Enter.)
      • Ctrl+R - reverse search - as you type, it will find the most recent command that contains what you've typed as a substring.  When (if) you see it display the command you want, press Enter to run it.  (Or left-arrow to pull it up so you can edit it first before running.)
  • kill - stops a process from running.  (Find the Process ID with ps or top.)
    • Variation:
      • `killall abcd` - stops all processes named abcd.
  • ps - process status - lists running processes.  It has many possible switches to affect what it shows and how - read the man page to see a lot of examples.
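
To see how Ctrl+Z, jobs and fg fit together, here's a tiny walkthrough using sleep as a stand-in for some long-running command:

  • `sleep 600` - starts a (deliberately boring) long-running command.
  • Press Ctrl+Z - suspends it and gives you your prompt back.
  • `jobs` - lists your suspended and backgrounded jobs, each with a job number.
  • `fg %1` - brings job number 1 back to the foreground.  (Or `bg %1` lets it keep running in the background while you do other things.)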

Advanced Commands

  • chmod - change mode - apply permissions to a file/directory
    • Possible permissions are read, write, and execute.  (Execute permission for a directory means you can cd into it.)
    • Permissions can apply at the level of the item's owner, a user group, or all users on the system.
    • Notation type 1: "ugo"
      • u, g and o stand for user, group and other  ("user" meaning the user that owns the file).
      • r, w and x stand for read, write and execute.
      • + means to turn a permission on; - means to turn it off.  (These add to or take away from whatever permissions are already set; they don't replace the whole set.)
      • Examples:
        • `chmod u+rw,g+r,o+x somefile` => gives the owning user read and write permission on somefile, gives other users in the same group read permission, and gives everyone else execute permission
        • `chmod ug+rw,o+r somefile` => gives the owning user and others in the same group read and write permission, and gives everyone else read permission
        • `chmod ugo+rw somefile` => gives everybody read and write permission on the file
        • `chmod o-w somefile` => shuts off write permission for others
    • Notation type 2: octal digits
      • In this notation style, each set of read/write/execute permissions is envisioned as a set of binary digits...1 for "permission granted", 0 for "permission denied".
      • Then the sets of three binary digits are converted into single octal (base 8) digits.  (They're the same in base 10, if that's easier to think about.)
        • Examples:
          • read only => r = 1, w = 0, x = 0 => binary 100 => octal 4
          • read and write => r = 1, w = 1, x = 0 => binary 110 => octal 6
          • read, write and execute => r = 1, w = 1, x = 1 => binary 111 => octal 7
      • Then the owning user, the group, and everyone else each get their own digit describing their permission level.
        • Examples:
          • `chmod 755 somefile` => owning user can read/write/execute, others in the same group can read and execute, and everyone else can read and execute
          • `chmod 660 somefile` => owning user can read/write, others in the same group can read/write, everyone else can't do anything with this file
      • This seems totally bonkers!  But if you look at how `ls -l` displays file permissions, it will start to make sense pretty quickly.  (There's a quick sketch of that right after this list.)
  • screen - very useful if you are connected remotely to a Linux box to run some lengthy process.  It creates a special session for you where you can run your long process, and if your connection to the remote host dies, you can re-connect when it's possible and pull that same screen session back up...and there's your long process, still running like nothing happened.  Using screen can be a little complicated, but there are a number of web pages and YouTube videos that can tell you all about it.
  • vi - visual editor - the most wonderful and horrible text editor in the world.  It can be very powerful and very painful, sometimes in quick succession.  Because it is the only text editor that is pretty much guaranteed to be installed on every Unix system everywhere, you should try to get familiar with it (at least to the point that you can exit out of it without saving anything).  There are a number of web pages and YouTube tutorials.
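
As promised in the chmod notes above, here's a made-up `ls -l` line and how it maps back to the octal notation:

-rwxr-xr-x  1 someguy somegroup  1234 May 14 09:00 somefile

Ignoring the leading dash, the permissions read rwx for the owning user, r-x for the group, and r-x for everyone else - that's 111, 101 and 101 in binary, or 7, 5 and 5 - so `chmod 755 somefile` would set exactly these permissions.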

Slick Tricks

Chaining Commands

Using the pipe operator (the vertical line '|'...it's probably near your Enter key), you can redirect ("pipe") the output of a command and make it become the input to another command.  A few examples should explain it:

  • How many files are in this directory?  Run `ls -1 | wc -l`.
    • `ls -1` lists the files (only their names), one per line.
    • `wc -l` counts the number of lines in whatever it's given - in this case, the list of files from `ls -1`.
  • I have a big list of email addresses, pretty scrambled up, with a lot of repeats.  Can I get a simple list with each address only listed once, in ABC order?  Yes...run `sort emails.txt | uniq`.
    • `sort emails.txt` puts the list in ABC order, and `uniq` strips out the duplicates.
    • In this case, just running `sort -u emails.txt` would do the trick.  There's more than one way to do almost anything!
  • What commands have I run recently that involved piping one command into another?  Run `history | grep '|'`.  (The second pipe has to be in single quotes, or else the shell will think you're trying to pipe grep's output into yet another command!)
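  • One more, assuming your sort command understands the -h flag (the GNU version on most Linux systems does): which subdirectories are hogging the most disk space?  Run `du -h | sort -h | tail -5`.
    • `du -h` lists each subdirectory's size in human-readable form, `sort -h` knows how to sort those human-readable sizes, and `tail -5` keeps the five biggest lines.  (The very last line will be the total for the current directory; the lines above it are the biggest subdirectories.)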

Repeated Commands, part 1

You can make the command shell run a rudimentary for loop, to run some command a bunch of times, against the output of some other command.  Again, some examples will explain it.  I find it easiest to type these in a little bit at a time, hitting Enter in the places where you see a semicolon...it won't close the command and run it until you give the `done` command.

Notice that giving it a command `enclosed in backticks` makes it take the output of that command as the big bunch of stuff to iterate over.  (The backtick character is most likely very near the Esc key, and probably has the tilde (~) character on it too.)  You might notice that I've been enclosing sample commands in backticks this whole time...that's because they won't mess things up if you accidentally bring them along with the command as you copy/paste into a terminal to try something out.
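
Here's a tiny standalone example of that, before the bigger ones below.  Typing

echo Today is `date`

makes the shell run the date command first and paste its output into the line, so echo prints something like "Today is Thu May 14 10:15:00 CDT 2020".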

OK, on with the examples!

Example 1: I have a bunch of unsorted text files (lists of data), and I want to grab all the files that mention my project in their filenames, sort them, and copy them to another directory.  I'll run this:

for datafile in `find . -name '*XYZProject*'` ; do sort $datafile > /home/otherdir/$datafile ; done

You can probably tell that datafile is a variable...each time through the loop, it will be the name of a file whose name contains the string "XYZProject".
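
One wrinkle worth knowing about: find prints paths like ./subdir/whatever.txt, so if any of the matching files live in subdirectories, the command above would try to write to /home/otherdir/./subdir/whatever.txt and fail unless those subdirectories already exist under the destination.  If that's a concern, wrapping the variable in a basename command keeps just the filename part (the /home/otherdir destination is still just an example, of course):

for datafile in `find . -name '*XYZProject*'` ; do sort $datafile > /home/otherdir/`basename $datafile` ; done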

Example 2: Here's a long one.  I have a directory called 'xsl' with a bunch of .xsl files inside, and some of them are referred to in various other xml files throughout my project's whole directory tree.  I'd like to see those uses.  But a bunch of the files in the tree are subversion bookkeeping files, and I don't want to see those (since they would be duplicates), and I also don't want the system to tell me about any binary files (.class, .jar, etc.) that happen to contain the xsl filenames.

for xslfile in `ls -1 xsl`; do echo ------------------; echo Searching for $xslfile ...; grep -ri $xslfile * | grep -v svn | grep -v Binary; done

  • `ls -1 xsl` gets us the filenames of all the files in the xsl directory.  We will run the stuff between `do` and `done` once for each filename.
  • The echo commands make our output easier to read.
  • `grep -ri` searches recursively and case-insensitively for any file containing the name of the current xslfile.
  • `grep -v` inverts the match...it only lets through lines that do NOT contain the search string (here, it filters out the subversion bookkeeping files and grep's "Binary file matches" messages).

This will produce a bunch of output like this:

------------------

Searching for veeblefetzer.xsl ...

JingleBells.abc:              <from expression="doXSLTransformForDoc('xsl/veeblefetzer.xsl', $BellVariable)"/>

ABC/123/something/Model.xml:              <from expression="doXSLTransformForDoc('xsl/veeblefetzer.xsl', $AirplaneVariable)"/>

------------------

Searching for jujube.xsl ...

JingleBells.abc:              <from expression="doXSLTransformForDoc('xsl/jujube.xsl', $JingleVariable)"/>

ABC/123/something/Model.xml:              <from expression="doXSLTransformForDoc('xsl/jujube.xsl', $ShipVariable)"/>

------------------

Searching for fishsticks.xsl ...

JingleBells.abc:              <from expression="doXSLTransformForDoc('xsl/fishsticks.xsl', $SleighVariable, 'OhWhatFun', $OhWhatFun)"/>

ABC/123/something/Model.xml:              <from expression="doXSLTransformForDoc('xsl/fishsticks.xsl', $CarVariable, 'MakeAndModel', $MakeAndModel)"/>

------------------

...etc.

Repeated Commands, part 2

You can also make a while loop.

Example 1: Watching a directory as a separate ftp process is writing files there.

`while true; do ls -l; sleep 5; clear; done`

Example 2: Watching free disk space as something is writing to the disk.

`while true; do df -h; sleep 5; clear; done`
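
Another variation along the same lines - watching how fast lines are being added to some (hypothetical) log file that another process is writing:

`while true; do wc -l transfer.log; sleep 5; done`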

Extra power for the find command

The find command has a parameter called -exec that will make it execute a command on everything it finds.  You can have multiple -exec parameters, too.

Example 1: Find text files older than two weeks, and delete them.

`find -mtime +14 -name '*.txt' -exec rm {} \;`

Example 2:  Hmmm, that last example seems a little risky.  Let's make it ask us first before it does the delete, for each individual file.  This uses the -ok parameter instead of the -exec parameter.

`find -mtime +14 -name '*.txt' -ok rm {} \;`

Example 3: I have a bunch of jar files, and some of them are pretty old.  What's in them?

`find -mtime +365 -name '*.jar' -exec jar tf {} \;`

Example 4:  Wow, that was a bunch of stuff flying by my face really fast.  I couldn't tell where one jar ended and the next one began.  Let's break them apart a bit.

`find -mtime +365 -name '*.jar' -exec echo "---------------------" \; -exec jar tf {} \;`

Example 5:  That was better, but I still don't know which jars all this stuff came from.  Let's make it put the name of each jar under the line break.  We could do that with another -exec echo, but the -print parameter will do it too.

`find -mtime +365 -name '*.jar' -exec echo "---------------------" \; -print -exec jar tf {} \;`

Example 6:  Good stuff, but it zoomed by so fast I couldn't pick anything out.  Let's put this into a file so I can actually read it (and maybe send it to someone else).

`find -mtime +365 -name '*.jar' -exec echo "---------------------" \; -print -exec jar tf {} \; > /tmp/oldjars.txt`
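
One more -exec trick: if you end the -exec with `{} +` instead of `{} \;`, find hands a whole batch of filenames to a single run of the command instead of running it once per file.  For example, this gives one combined (and nicely aligned) listing of all the old jars:

`find -mtime +365 -name '*.jar' -exec ls -lh {} +`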


Get a single field in a line of text

The cut command is helpful for picking out just one part of a line of output, such as a file listing or a line from a data file.

Example 1: Find the third field in a comma-delimited string from a data file, from a line containing the word "marshmallow".

`grep marshmallow file.txt | cut -d ',' -f 3`

The grep picks out the line you want.  The -d ',' means to consider the comma character as the field delimiter.  The -f 3 means to take the third field.
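
For instance, if file.txt contained a (made-up) line like

marshmallow,graham cracker,chocolate,campfire

then the command above would print "chocolate".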

Example 2: Like example 1, but you want to find the last field in the string, and you don't know how many fields there will be.

`grep marshmallow file.txt | rev | cut -d',' -f 1 | rev`

The grep still picks out the line you want.  The rev command reverses the entire string, after which it's easy to have cut take the first field!  And then you use rev one more time to flip the result forward again.
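
Using that same made-up line from example 1: the first rev flips it into "erifpmac,etalocohc,rekcarc maharg,wollamhsram", the cut keeps the first comma-delimited field, "erifpmac", and the final rev flips that back into "campfire".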

Example 3: You want to get the "percentage full" field from the output of the df command, just for the /srv directory.

`df | grep "/srv" | cut -d'%' -f 1 | rev | cut -d' ' -f 1 | rev`

df gives an output like this:

Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/mapper/osvg-root_lv
                      10190136   4403816   5786320  44% /
tmpfs                  8162648        12   8162636   1% /dev/shm
/dev/sda1               499656     70528    429128  15% /boot
/dev/mapper/osvg-home_lv
                       8191416    593788   7597628   8% /home
/dev/mapper/osvg-srv_lv
                     227034492 214650456  12384036  95% /srv
/dev/mapper/osvg-var_lv
                      56635708  22726496  33909212  41% /var

The grep picks out just the line you want, with "/srv" in it.  (Including the slash keeps it from also grabbing the line before it, the one with the long device name...tricky.)  Then, cutting on the percent sign makes the "95" be the last thing on the line.  Then it's just like example 2 to get the last space-delimited field.
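
If you want to hold onto that number instead of just reading it off the screen - say, inside a shell script - the backtick trick from earlier captures it into a variable.  (The variable name here is just an example.)

srvfull=`df | grep "/srv" | cut -d'%' -f 1 | rev | cut -d' ' -f 1 | rev`

echo "/srv is $srvfull percent full"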

Thursday, November 24, 2016

Things to be thankful for in 2016

Today is Thanksgiving in the USA, and it's traditional to think about what you are thankful for.  Here are some things that are coming to mind for me...

Writers

  • Joe Posnanski - the best sports writer in the world.  KC Royals fans love him maybe more than anyone else, but trust me, if you like sports at all, you'll love his stuff.
  • Yael Abouhalka - a local writer who speaks truth to power.
  • Tim Urban - author of the Wait But Why blog, source of many deep dive posts on many fascinating topics.
  • Orson Scott Card - a novelist whose sci-fi books dig deep into many topics.  I've loved introducing both of my daughters to his work, specifically the Enderverse.
  • Randall Munroe - author of the xkcd web comic, the "What If?" series, and the "Thing Explainer" book.  Such simple drawings, such great subject matter.
  • Ellis Morning - an indie author who is great about bringing together unlikely subjects for an entertaining story.  She's also very open about her writing process and (at least for now) responds personally if you write to her.

Musicians

Podcasts

People

  • Ruth Harder - pastor of Rainbow Mennonite Church, she always has something timely and challenging to say.  She is the right person to lead our church in our mission to advocate for greater acceptance of and outreach to marginalized people, which looks to be ever more important in these times.  Also, her husband Jesse Graber is one amazing artist and musician.
  • My teammates at work - I'm truly blessed to work with such a talented, friendly and diverse group.  If they're reading, you guys, you mean so much to me.  Thanks for making our team so great and making me a part of it.
  • The many sides of my family - we're pretty spread out over the country, and we don't see eye to eye on everything, but we do a pretty good job of remembering that we are family and we love each other no matter what.
  • My lovely wife - you're the best match for me and you do such a great job of keeping this production together.  I love you!
  • My kids - you two keep me laughing and always make me proud.  I love you!
  • Barack Obama - really gonna miss this guy.

Wednesday, November 16, 2016

"You Are Not So Smart" podcast - making better moral arguments

I've been listening to a number of podcasts for a while, and one of my favorite recent additions is called "You Are Not So Smart".  I'm still finding out what it's about a little more with each episode, but the most recent one I listened to was especially relevant to the current times and issues.

From their summary of the episode:  "In this divisive and polarized era how do you bridge the political divide between left and right? How do you persuade the people on the other side to see things your way?  New research by sociologist Robb Willer and psychologist Matthew Feinberg suggests that the answer is in learning how to cross something they call the empathy gap."

Check it out:  https://youarenotsosmart.com/2016/11/04/yanss-088-how-to-bridge-the-political-divide-with-better-moral-arguments/