Fun with NFSv4 and mount paths

Here’s one that bit me, and chewed off far too many hours.  The only advantage to the process is that I learned some incidental things about FreeIPA, or “RHEL Identity Management”.

Background: CentOS 7.4 originally, and as vanilla as I could possibly keep things.  I’d started setting up IPA earlier this year, and it all was “working” until I decided to make some “improvements”, namely getting home directories automounted.  I started out using /users as the mount point/parent for home directories on the server, but read some advice to use a different directory on the server to actually hold the home directories so that the file server could also automount the users’ homes to the same location – more consistent.

I set this up, moving the directories, and modifying the IPA map to reflect this.  Though it didn’t happen immediately, this turned out to break things, and so for the benefit of anyone else who trips over this:

The messages indicating that the home directory couldn’t be mounted weren’t pointing to an IPA issue:

mount.nfs4: mounting home1.us.kmpeterson.com:/users/kmp failed, reason given by server: No such file or directory

(actually, the most apparent symptom: “can’t mount home directory” on login)

And:

Dec 28 16:27:05 client1.us.kmpeterson.com oddjob-mkhomedir[18428]: error creating /home/kmp: No such file or directory

Essentially, there wasn’t a problem with the map.  By the time I got to this point, I’d reviewed the exports file entry over and over:

/users *(rw,sec=krb5p:krb5i:krb5,fsid=0)

That’s pretty clear.  And mount (on the NFS server) appears to reflect this:

/dev/mapper/volgroup1-home on /users type xfs (rw,relatime,seclabel,attr2,inode64,noquota)
auto.home on /home type autofs (rw,relatime,fd=19,pgrp=1189,timeout=300,minproto=5,maxproto=5,indirect,pipe_ino=31126)
/dev/mapper/volgroup1-home on /home/kmp type xfs (rw,relatime,seclabel,attr2,inode64,noquota)

And the one I really thought was germane:

lookup_read_map getautomntent_r lookup_read_map: lookup(sss): getautomntent_r: No such file or directory

What’s the answer, then?

The first message misled me.  “No such file or directory” suggested that NFS wasn’t exporting what I thought it was.  The real problem was that the client wasn’t trying to mount what I thought it was.

The export had been changed to export /users rather than /home.  My previously working map, from when I was using /home on the server, had this option string:

-fstype=nfs4,rw,sec=krb5p:krb5i:krb5,soft,rsize=8192,wsize=8192 home01.us.kmpeterson.com:/home/&

Changing home to users:

-fstype=nfs4,rw,sec=krb5p:krb5i:krb5,soft,rsize=8192,wsize=8192 home01.us.kmpeterson.com:/users/&

Wasn’t the right answer.  The right answer is:

An NFSv4 client mounting a filesystem exported with fsid=0 doesn’t interpret the “server path” the way I expected (at least in my CentOS 7 environment): the fsid=0 export becomes the root of the server’s NFSv4 pseudo-filesystem, and the path the client asks for is resolved relative to that root rather than against the server’s real directory tree.  This means the export must be that root, and the mount key supplies the path under it – the map entry must end with the string :/&, so the exported /home on the server plus the key construct the target path.

Or, what worked is this:

/etc/exports:

/home *(rw,sec=krb5p:krb5i:krb5,fsid=0)

auto.home option string:

-fstype=nfs4,rw,sec=krb5p:krb5i:krb5,soft,rsize=8192,wsize=8192 home01.us.kmpeterson.com:/&

This mounts home01.us.kmpeterson.com:/home/kmp to /home/kmp on the NFSv4 client.
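
If you want to sanity-check what the server actually presents under that NFSv4 pseudo-root before touching the automount map, a manual mount from a client shows it directly.  This is just a sketch – the hostname, security flavor, and temporary mount point simply mirror the examples above:

# on a client: mount the pseudo-root itself and look around
mount -t nfs4 -o sec=krb5p home01.us.kmpeterson.com:/ /mnt
ls /mnt          # should list kmp and the other home directories
umount /mnt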

Clear as… well, it’s been a long day.

RTFM (Read The Fine Manpage) (logrotate)

Here’s another quick one.

I’ve been fighting this one for years, and never had the additional hour to figure out what was really happening.

Why are my logfiles rotating recursively? As in:

dovecot.error.log-20210101-20210201-20210301-20210401-20210501-20210601-20210701-20210801

I just never figured it out. I know I changed stuff around and was eventually able to get it to stop. But I never ran it down all the way to the end.

That’s a problem these days. So many things you don’t understand, and don’t have the time to fix the right way.

The answer turns out to be simple (and is now highlighted in my local copy of the manpage):

Please use wildcards with caution. If you specify *, logrotate will rotate all files, including previously rotated ones.

logrotate(8) (Red Hat Linux)

This means you, trying to shortcut things by having logrotate process all dovecot.* files.
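
To make that concrete, here’s the shape of the mistake and of the fix.  The paths and options are illustrative, not my actual dovecot configuration:

# Problematic: the wildcard also matches already-rotated files such as
# dovecot.error.log-20210101, so each run rotates them again and the
# date suffixes pile up.
/var/log/dovecot.* {
    monthly
    dateext
}

# Safer: name the live logs explicitly.
/var/log/dovecot.log /var/log/dovecot.error.log {
    monthly
    dateext
    missingok
}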


Note Taking Tools

At the office, we’ve been having a discussion about note-taking tools.

I was a fan of the original Google Notebook.  Networked note taking application!  Pretty cool in 2006.  I never used it heavily, but there was a specific use for it that I came up with, mostly snippets and excerpts of stuff from work, and for that it was useful.

Then, they killed it.  I don’t remember when exactly, but the Wikipedia entry indicates that it was in 2011, though I didn’t wait around to see exactly how it all ended.  And, yes, they reversed themselves, and there’s Google Keep now.

I never kept good notes about things, never really kept a diary.  I do have a ton (probably that order of magnitude, literally) of old paper records and stuff (anyone interested in a System/370 “yellow card”?), but I’ve experienced an enormous change in my field that’s made keeping notes utterly essential to keep my sanity and not lose precious time attempting to wring data out of tired neurons.

We’ve been going back and forth, as I said, so I thought I’d put down the two options I’m using.  (Why two: for things like notes and email, I really want two completely different systems and two completely different clients – one for work-related material, which is in some nominal way “company property”, and one for my own technical memory and development.)

When Google announced Notebook’s termination, I did some amount of research, and ended up adopting Evernote for my personal development/memory needs.  The cool feature that was most attractive was the ability to snap an image of a whiteboard into the app, and get the contents transcribed and made searchable.

What I’ve grown to like about Evernote:

  • yes, that image thing works really well
  • I’ve never gone near the storage limits (upload limit, actually – see below).
  • Folders, stacks of folders, and tags
  • Sharing notes/notebooks with friends
  • Excellent support via email
  • I really love the Skitch app (which I think is now macOS-only); I use it 2-3 times per week to add content to Evernote, but also to share screenshots via a public URL.

But there are inherent negatives that have developed:

  • They are clearly pushing towards teams, and it seems lots of the development effort is in that direction – and that doesn’t match my use case.
  • It’s gotten expensive.  The price has gone from $45/year to $75/year, and I understand those who think it isn’t a good value for them.  A sort of middle-tier plan was also dropped earlier this year, which caused more unhappiness.  Limits are set on uploads/month rather than total data stored (and are pretty low for the free plan).
  • I really don’t understand the sharing thing as well as I’d like: it seems opaque to me at times.  My spouse doesn’t have a paid account, and sharing with me doesn’t work the way she’d expect.

The last time I started a new position, I decided that I wanted to keep a “Work” notebook – partly again due to IP issues, but also since we were required to track hours on tasks/customers, and I wanted to be able to easily share that separate from any personal data.  So, the then-new kid was Microsoft OneNote, and I started working with that.

Remember, the use is rather different, but here’s what I’ve found in comparison to Evernote:

  • The image transcription/search functionality is there, and seems to work well, too.
  • Because I’m visually-oriented and want consistent presentation, I make use of OneNote’s named styles a lot.  And the ability to copy a style (the format “paint” function) keeps me from serious jumbling of fonts, sizes, and formats.
  • The arrangement of “Notebooks”, “Folders” and “Pages” seems somewhat more usable for me.
  • Because of how I use OneNote, I haven’t run into issues with space utilization.  However, Microsoft sells storage by capacity, and I believe the 1TB of storage that comes with a personal Microsoft Office subscription is where the notes are stored.  Combined with the other benefits, that brings into question the value of the Evernote (paid) service.

I’m using both, and there are features about both that I like.  But I haven’t yet (for my use cases) found one feature that would make me choose if I had to.  That doesn’t seem like a good thing for Evernote.

Setting up Puppet Again

This is how things often go: something gets done, and for whatever reason I don’t write up the notes very well.

(Yes, there’s the larger issue of why I actually am duplicating any effort at all, but it’s a long story.  Later.)

I’ve been working on setting up several Puppet installations over several different networks, and I’d actually gotten a simple set of tasks working.  So, I said there’d be no problems in setting up a new network.  After all, my notes said:

[image: my_notebook-1.png]

Well, now, that shouldn’t be a problem.  Look at 2).  I’m sure I would have added notes later if that wasn’t true.

Two hours later, the page looked a bit different:

[image: My_Notebook-2]

Two additions:

First, my reminder to myself that misleading documentation is worse than none.

Second: a needed firewall update, the autosign.conf file wasn’t there, another old config, and the kicker: Puppet wouldn’t start – just like when I worked this the first time in March, and that took some time to figure out, again.

It turns out that Puppet Server needs access to temporary space that’s generally allocated under /opt/puppetlabs (I think).  The problem is that our security baseline mounts every filesystem that doesn’t hold system-supplied binaries – so places like /data, which is where we put configuration and payloads for services, /tmp, and so on – with the noexec option, meaning no file on those filesystems can be executed.  Puppet Server’s default temporary location apparently needs to write and then execute something there, which breaks startup.
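
If you’re in a similar environment and want to see which mounts actually carry that flag, findmnt on CentOS 7 will list them:

# show every mounted filesystem whose options include noexec
findmnt -O noexec -o TARGET,OPTIONS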

My recollection of this is that it’s only an issue during initialization, so I decided the cleanest place to allocate the space was in /boot.
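
For what it’s worth, the setting I’d look at for this kind of noexec failure is the JVM’s temporary directory, which Puppet Server picks up from JAVA_ARGS in /etc/sysconfig/puppetserver on CentOS 7.  Treat the following as a sketch: the directory name just illustrates parking the space under /boot as described above, and the heap options are whatever your file already contains:

# /etc/sysconfig/puppetserver
# -Djava.io.tmpdir points Puppet Server's temporary files at a filesystem
# that isn't mounted noexec (create the directory first; the name is illustrative)
JAVA_ARGS="-Xms2g -Xmx2g -Djava.io.tmpdir=/boot/puppetserver-tmp"

Restart the puppetserver service after the change.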

(Meta-apology: I should have documented better how exactly I came to this conclusion, but what makes this meta is that since I had to go back unexpectedly and re-hash this, I jumped to the conclusion – not recording what led me to it.  But now that I’ve set up this category for just that type of thing, I’ll help someone worthy  – probably me, in 2019 – with the next thing I run into.)

Ouch!

So, here’s hoping that Google &co. are running down these pages…

I realized a while ago that I’ve run into too many intractable issues – except that I keep thinking, “this can’t just be me”.  And there are legions of people out there with blogs, and Google finds answers on them to… well, sometimes those very specific intractable problems.

Thanks to all you out there.  Now: because I’m still working on relatively meaty problems, it’s time to give back.

I’m setting up the Ouch category to highlight things that bit me.  I’m going to try to put out there a post when I run into something that I couldn’t find an answer to but had to figure out myself.

Here’s a stupid one – stupid only because the answer seems reasonable now, but I had to figure it out twice: setting up a new Puppet server (next).


A Lack of Presence

There was once this thing called “presence” – I miss it.

I realized not too long ago that I’ve been using technologies like text and instant messaging for, well, decades.  When I started, you had to be logged in to the mainframe.  Then it was IM. (iChat anyone?  MSN Messenger? Jabber?)  Now, it’s on our phones.

The thing that bugs me is that what I always saw as the biggest drawback – “too bad I’m not logged in all the time” – got solved.  Be careful what you wish for!

Three tips for new Mac Users

I got a call last week from a neighbor.  “Stop by when you get a chance,” he said.  “I have something to show you.”

Turns out that he’s just switched.  Has a shiny new iMac on his desk, with a not-so-old PC off to the side. Congratulations!

I had this funny feeling that I should have an opinion for him about what he should do next.  He was fairly prepared – he’s a smart guy.  I kept thinking I should say, “okay, 1-2-3, you should consider doing these things,” but I didn’t have a list at hand.  That I didn’t have one started gnawing at me.

So, I put this together.  Not a top 50 list, or top 25 list, or even the top 10… how about just three things to start with, because they might be the most useful for anyone who wants to actually start to understand their Mac.

Set up a username and password for your Mac.  It’s a very good idea to keep your Mac locked – to configure it so that it prompts for your username and password when it boots up or you wake it up.  I hear you saying your computer is unlikely to be stolen, and that makes sense, but I have another point: not leaving your computer totally open is a good habit.  When you log in to your computer, you’re reminding yourself that the information you store there is valuable.  You need to provide this information anyway when you install or upgrade software; if you use it every day, it’s easier to remember.

But the more practical reason is that when you log in to your Mac, you’re not only starting your “session” but also unlocking the Mac Keychain.  The Keychain can store passwords for you (explicitly – using Safari, for example) and also stores other security-related data, such as WiFi passwords and security certificates.  When you log in every day with your username and password, you can take advantage of the Keychain without any further effort.  And by making a strong password for your login, you enable the Keychain to store data securely.  In fact, if you want to remember only one strong password, use it to log in (and for your keychain); that’s the “key” to unlock any others.

Eliminate Distractions: Organize your Dock. A new Mac comes with a shiny array of attractive icons in your Dock when you log in.  I suggest you get rid of those you won’t be using often enough to have them sitting there waving at you, and then make it easy to find any of your other applications:

You can eliminate icons in your dock by dragging them to your desktop – there’s a puff-of-smoke effect and a “whoosh” sound that accompanies this.  That’s how to subtract, and that’s half the job.  The other half is having a quick way to get to the apps you don’t use as often, by leaving the “Launchpad” icon in your dock or dragging your Applications folder to the dock.  What you’ve just done is divide your apps into two tiers – those you use every day (or often enough to justify “one click” access), and the rest, which you want to be able to find relatively easily.

Set Activity Monitor to run at startup.  There’s a good way to keep an eye on what your Mac is doing.  It’s called Activity Monitor; find it in the Utilities folder (which is itself in your Applications folder).  It gives you a list of all of the applications running on your Mac (choose “All Processes” from the menu next to “Filter”, then click on the “% CPU” column to sort by the most active process).  The bottom half of the window shows counters and graphs of your disk, memory, and network utilization.  Consider having this app start when you log in: right-click (or control-click) its icon in the dock, and from Options choose “Open at Login”.  That way, it’s always hanging out so that you can switch to it and answer questions like “why is everything so slow?” or “is that thing actually doing anything?”  This can be very useful.

Do you see the underlying idea to all of these tips?  If you implement all of these, you learn a bit about how Mac security works, how to navigate around, and what your Mac is actually “working on”.  In other words, if you set a password, configure your dock, and habitually have activity monitor running, you’ll get a bit more visibility into your new computer.  Just having yourself set up this way will help you see a bit better how the gears go around, and might help you get a conceptual step forward towards figuring out how to do more – and have more fun.

Bonus Round:

Check out MacWorld’s How To’s and Videos.

And consider learning more by using these apps:
  • Broaden your browser use: try Google Chrome.  More functional, updated frequently (so, arguably more secure), and I personally find the bookmarks easier to use.
  • Get good secure storage for your passwords: try Password Wallet.  It’s a tool to remember passwords for you.  It doesn’t actually “integrate” with your browser like most tools, which makes it a bit more secure.
  • Store those little pieces of information so you can find them easily wherever you are: try Evernote.  File things like images, web pages, screen shots, and other notes.  You can search for them, and it runs on a website, with apps for almost every current device.
  • File your mail messages really quickly and efficiently: try MsgFiler.  Two to six (in my case) keystrokes to file mail messages into the folder of your choice.  Makes it just as easy to save for archive as to delete your mail.

(These are all unsolicited recommendations … this is just my opinion, I’m not being compensated for mentioning these.)

Hope these all help!  Have fun!

Spring Cleaning

Spring always seems to take forever to reach New England.  This is nothing new; I’m in agreement with a traditional mode of thinking here that there really isn’t a “Spring”, there is Winter and Summer, and in between “Mud Season”.

But there are cobwebs to be cleared, always.  Like those in the computer.

While I’m not speaking literally, consider the degradation of the state of our computers, or “bit rot”.  It represents a real phenomenon; the concept of entropy turns out to apply to ordered data in our storage systems and to our own organizational structures.

I’ve always followed the advice that I give to others: upgrade your software early and often.  I’m usually first out of the block for new OSs for my client machines (not servers – but that’s another story).  That’s rather extreme – I don’t suggest this for most people – but falling a few versions behind is a significant problem.

I get surprised at unintended consequences, though: upgrading software feels like spring cleaning, but it’s not.  In fact, software upgrades generally preserve as much of the previous state as possible.  That’s prudent – you make fewer decisions about deleting things, and you decrease risk.  But just because you moved in new furniture doesn’t mean you don’t have cobwebs that pop up.

You’re thinking that there’s a practical story here, and you’re right: whenever I take a machine and “clean it”, it ends up running faster.  Yes, more speedy.  Why?

It’s about the ordering of data: if you back a computer up and then restore the backup – especially if you reinstall the OS and the applications from the original media, and then use your backup only to restore your data – you’re moving everything back onto the computer in a state where all the old stuff you never use anymore is gone, and you’re writing from the beginning of your disk, one block right after another.  Which means that when you go to read the data, it’s all effectively faster to access because it’s been reorganized.

This used to be called “defragmentation”.  Most OSs do this automatically now, or don’t actually need what the defrag applications did, but what they do need is an external process (you) to make decisions about what is no longer needed, and then have the computer clean off the disk and rebuild.  There’s nothing like doing the living room by moving all the stuff out and cleaning everything before moving it back in – you’ll probably find stuff you don’t want to move anymore, but the real value is just getting to the bare floor and going from there.

I used to think that the slowing of computers was primarily related to new software functionality, new web applications, and more intensive use.  I still believe these are factors.  However, I’m constantly surprised by how much of an impact a full cleaning cycle has on my supposedly well-maintained machines, and take stock of how good it is to simplify the environment from time to time.

Speaking of the (physical) living room…

Asterisk, Firewall, and Hearing Voices

From the “While I’m thinking about it” department, with help from the “maybe this will be useful for someone else” division.

I’d added an Analog Telephone Adaptor (ATA) to my home Asterisk system a while ago, but hadn’t really gotten a chance to work on it.  About six months ago, I realized that it was misconfigured: when I received calls (which was 95% of the use, as it turned out) audio was fine, but I couldn’t initiate calls with it and get audio.

I realized this because I hadn’t been using Asterisk as my sole voice service, and so cursory testing of the ATA didn’t expose the problem – calls to it on my own network worked fine, so I assumed initiating calls would, too.

I thought about all the Fun with NAT that we always have to deal with, and assumed that was the issue.  As is so often the case, I ran off in that direction with no solution.  I looked at the configurations, and everything there looked okay.  I turned directmedia off, so that Asterisk would remain in the media stream, but still no audio.  On the other hand, this ruled out NAT issues, didn’t it?  And I’d never had NAT issues with Asterisk, even though I never actually modified the firewall on that host to open ports for RTP.

My Polycom IP550 (always) worked.  Great.  And the ATA didn’t.  Looking at the network traffic, I saw that call setup looked pretty similar between the two devices… but then I noticed something interesting: the ATA would get “destination unreachable/host administratively prohibited” replies from Asterisk for its RTP traffic.

I dug deeper, and noticed that the first RTP packets when using the IP550 went from the Asterisk server to the IP550, while with the ATA the audio started with the ATA – but there were those unreachable rejects, and no audio at all in the other direction.
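
For what it’s worth, seeing this doesn’t take anything fancy – a capture along these lines on the Asterisk host shows both the RTP attempts and the ICMP rejects (the interface name is an assumption, and the port range should match your rtp.conf):

# watch RTP in the default 10000-20000 range plus any ICMP rejections
tcpdump -ni eth0 'udp portrange 10000-20000 or icmp'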

Stop reading here if this starts to make sense to you – once I put this all together, I started to come up with the answer.

I never had to open the RTP ports on the Asterisk server, because it turns out the IP550 was configured with progressinband=yes.  (There was a note in my config that this was suggested by Polycom).  Because I hadn’t really looked into it (and it was only suggested by Polycom), I never thought too much about it.  But it’s the key, when running Asterisk on a host with a firewall: rather than having to adjust the firewall to manually open ports for RTP, progressinband has Asterisk generate audio to indicate call progress back to the terminal, which means that the Asterisk server “talks first” (starts sending from the negotiated port using RTP).  This means that Asterisk effectively opens the port for RTP bidirectional traffic, and thus obviates the need for the firewall to do so.
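
For reference, the option lives in the device’s peer definition in sip.conf.  A minimal sketch – the peer name and the rest of the entry are made up; progressinband is the only point:

; /etc/asterisk/sip.conf – illustrative peer entry
[ata-line]
type=friend
host=dynamic
context=internal
progressinband=yes    ; Asterisk generates in-band call progress audio,
                      ; so it starts sending RTP first on the negotiated port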

It had nothing to do with NAT, but was still a firewall issue.  I run firewalls on internal hosts – I think it’s just good practice.  And I have to be honest: I figured all this out because when I did open ports for RTP, the ATA started working.  That was what I needed to know to figure out the rest of it.
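
The blunt version of that fix is just opening the RTP range on the Asterisk host’s firewall.  With firewalld on CentOS 7, for example, that’s roughly the following – 10000-20000/udp is the stock range from rtp.conf, so adjust to match yours:

# allow inbound RTP to the Asterisk host
firewall-cmd --permanent --add-port=10000-20000/udp
firewall-cmd --reload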

I haven’t yet figured out the drawbacks, if any, to progressinband, but in this case I’m happy to be through this particular issue.  This is one of those occasions where a lot of work – many blind alleys – led, eventually, to a fix that’s brilliantly obvious once you know it.  That’s how life is in technology, though.

Word for the day: “micromort”

Thanks to Bruce Schneier for the heads up on this:

micromort, n.  a numerical score for an event based on a probability of death of 1 in 1,000,000 (1 × 10⁻⁶, or 0.0001%).  Unit abbreviation μmt.  Examples: hang-gliding=8μmt, horse-riding=0.5μmt; 100 miles of travel in a car=0.5μmt.

As a blogger with the nom de plume of “Stubborn Mule” put it, “shopping for coffee you would not ask for 0.00025 tons (unless you were naturally irritating), you would ask for 250 grams.” The ability to communicate risks in an accurate but understandable manner is undercut by large denominators expressing very small risks.  Multiplying that fraction to get a micromort makes things more perceptible.

References:

  1. Note that these probabilities are based on findings in the U.K., which may not be equivalent in other regions.
  2. The earliest citation that I could locate was R. A. Howard, “On making life or death decisions” in “Societal Risk Assessment: How Safe is Safe Enough?” (1980, ISBN 0306405547), referenced from Wikipedia.
  3. David Spiegelhalter et al. have a wonderful site on the topic of Understanding Uncertainty, with some excellent tools.  Spiegelhalter gave a talk at the LSE’s Department of Economics in 2010 with a very good overview.