Micro Utilities

Recently I created a tiny site called tool.haus. It’s a repository for all the small utilities I write from now on.

On that note, I thought about the API that these little utilities should expose.

Most bash scripts or utilities use command line arguments to define behavior. This is great for humans but can be a pain when calling scripts from other scripts.

My current strategy is to implement the same API in all scripts:

  • JSON -> STDIN
  • Debug output -> STDERR
  • JSON -> STDOUT

All options, flags and input data are encompassed in the input JSON. Likewise, all output is exposed as JSON. Human-readable output is sent to STDERR.
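
As a sketch of what this looks like in practice (the “shout” name, the text/result fields, and the use of jq are my own illustration, not an actual tool.haus utility):

#!/bin/bash
# shout: reads {"text": "..."} on STDIN, emits {"result": "..."} on STDOUT,
# and keeps all human-readable chatter on STDERR. Assumes jq is installed.
input=$(cat)                                         # options and data arrive as JSON on STDIN
text=$(printf '%s' "$input" | jq -r '.text')         # pull out the one field we care about
echo "debug: shouting '$text'" >&2                   # debug output -> STDERR
jq -cn --arg result "${text^^}" '{result: $result}'  # JSON -> STDOUT

Calling it from another script is then just a pipe:

$ echo '{"text": "hello"}' | ./shout 2>/dev/null
{"result":"HELLO"}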

We’ll see if this utility convention works out!

Silly Security 2: Chroot Jail

I am not a security professional or expert. I’m an average guy just learning and trying stuff out.

Picking up from the last post, we have a couple of things set up.

  1. Entry user “somebody” with limited capabilities (no file system write access, etc)
  2. The user we actually want to use: “intendeduser”
  3. Remote ssh is allowed for “somebody”, but only local ssh is enabled for “intendeduser”

Least Privilege

Why give “somebody” more access than required? The user’s only purpose is to serve as a staging area for access to the intended user. As of now, “somebody” can scan through most of the file system and explore other users’ home directories.

What is the minimum set of capabilities “somebody” requires?

SSH.

Chroot Jail

The plan is to force “somebody” into a modified jail environment using chroot. Chroot is an operation that changes the apparent root directory for a process and its children.

My goal is to create a jail environment that allows nothing but SSH.
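
The operation itself is a one-liner:

$ sudo chroot /home/jail /bin/bash    # /home/jail is a path I’m making up for illustration

Inside that shell, /home/jail appears as /, so nothing outside it is reachable by path.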

Implementation

I followed the steps from: http://allanfeid.com/content/creating-chroot-jail-ssh-access

But made these changes (rough commands are sketched after the list):

  • Only copied over the bash and ssh binaries
  • Bind-mounted /dev/tty and /dev/urandom into the jail
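
Roughly, those two changes look like this (a sketch, still assuming the made-up /home/jail path):

$ sudo mkdir -p /home/jail/bin /home/jail/dev
$ sudo cp /bin/bash /usr/bin/ssh /home/jail/bin/
$ sudo touch /home/jail/dev/tty /home/jail/dev/urandom
$ sudo mount --bind /dev/tty /home/jail/dev/tty
$ sudo mount --bind /dev/urandom /home/jail/dev/urandom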

The final step was crucial. I struggled for almost an hour trying to debug ssh. The key was to copy over strace and use it to slowly determine and satisfy the missing dependencies.

I discovered several missing libraries through strace, but the most difficult step was diagnosing the need for /dev/tty.
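
The workflow there was roughly the following (strace copied into the jail first; /home/jail is still my made-up path):

$ sudo chroot /home/jail /bin/bash
$ strace ssh intendeduser@localhost 2>&1 | grep 'No such file'

Each ENOENT names a missing library, config file or device node; running ldd on a binary outside the jail is a quick way to list the shared libraries that need copying in.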

When all else seemed to be working, ssh would exit with:

Could not create directory '/nonexistent/.ssh'.
The authenticity of host 'localhost (127.0.0.1)' can't be established.

The program exited without prompting for a password. A smarter man would’ve known that /dev/tty is required for the prompt.

Don’t make the same mistakes I did.

Silly Security

No one gives a shit about my servers. They host a few funny sites and services mostly for personal consumption.

But recently I made the mistake of reading the details of the security update offered for my Mac.

So I set off to secure my shit once and for all!

High Level Strategy:

  1. No more password authentication on my front-facing servers; ssh key auth only, from a single control server.
  2. Design a hardened login scheme for the central control server.

Central Server Login Scheme:

Normal Setup

  • Create “intendeduser”, my user for personal, everyday activity.
  • Add “intendeduser” to sudoers
  • Install fail2ban, which mitigates brute force attacks to some extent.
  • Disable root login (commands for these steps are sketched below)
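
A rough sketch of those steps on Ubuntu (the sudo group and package name are my assumptions; adjust for your distro):

$ sudo adduser intendeduser
$ sudo usermod -aG sudo intendeduser       # Ubuntu’s sudo group, standing in for a sudoers edit
$ sudo apt-get install fail2ban
$ # disable root login: set "PermitRootLogin no" in /etc/ssh/sshd_config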

Tiered Users

I made a duplicate of the nobody user, somebody:

$ useradd somebody -d /nonexistent -s /bin/sh

The somebody user would have no privileges, and would simply serve as the first stage in a 2-stage auth scheme. Remotely, you could ssh in as “somebody”, but as no other user. A second ssh intendeduser@localhost would then be required to complete the login as the intended user.
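
So the login flow is two hops (“server” here stands in for the real hostname):

$ ssh somebody@server
$ ssh intendeduser@localhost    # run from inside the first session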

This was accomplished by adding the following to my /etc/ssh/sshd_config file:

AllowUsers somebody intendeduser@localhost

This allows “somebody” to ssh in from anywhere, but only allows us to ssh in as “intendeduser” locally.
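
After any sshd_config change it’s worth validating and reloading before closing the working session (the restart command is my guess at the usual one for Ubuntu of that era):

$ sudo sshd -t               # syntax-check the config first
$ sudo service ssh restart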

Note: I suspect “localhost” might not actually guarantee local-only access. I’ll investigate more later.

Issues

The “somebody” user still has a significant amount of access. He/she can browse and read the majority of the files on the machine, and guessing the “intendeduser” username would be pretty trivial.

In my next post, I’ll discuss a solution using a chroot jail!

Possible Improvements:

Quick list of things that came to mind

  • Two-factor authentication using a PAM (Pluggable Authentication Module)
  • Switch the ssh port (I skipped this purely for convenience)
  • Limited login time before the “somebody” user is automatically logged out (sketched below)
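
For that last item, one cheap approximation (an idle timeout rather than a hard session limit) is the shell’s TMOUT variable, assuming the staging user’s login shell is bash; the file path here is my invention:

# e.g. in /etc/profile.d/somebody-timeout.sh (guarded per-user in practice)
readonly TMOUT=300    # log the shell out after 5 minutes of inactivity
export TMOUT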

How to upgrade about halfway to Ubuntu 14.04

In light of the Bash vulnerability, a.k.a. “Shellshock”, I decided to upgrade some of my servers to 14.04. I started with nextcode. I opened a terminal on my local machine, ssh’d in, and ran do-release-upgrade. Unfortunately, I forgot I had class at 2:30pm.

The thing about Ubuntu release upgrades is that they often prompt you for manual confirmation: “do you want to replace bla.conf? yN?”. I wanted to be able to respond to these prompts while on campus. This would have been trivial if I had originally ssh’d in and opened a tmux session, which I didn’t.

No worries. I’ll suspend and background the upgrade process, open a new connection in tmux and let that shell take ownership of the process. Ez-Pz. So I do this, up to the final glorious step where the new shell would inherit the fragile process.
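
For the record, those first steps go roughly like this (a sketch; 1234 below stands in for the real PID):

$ # Ctrl-Z suspends do-release-upgrade, then:
$ bg
$ disown
$ pgrep -f do-release-upgrade    # note the PID (stand-in: 1234)
$ tmux new -s upgrade            # and from the new tmux shell: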

$ reptyr 1234

$ command not found

Hmm… I’ll just install it.

$ sudo apt-get install reptyr

$ [EXPLOSION]

Oops, dpkg is in the middle of a distro upgrade. Uhh, wait. If I just kill the process, I’m sure I’ll be able to rerun do-release-upgrade and pick up where I left off, right? (Young and naive Michael, so full of hope in this world.)

$ kill -9 ….

$ sudo do-release-upgrade

(No new releases found).

Rest in peace. Looks like I’ve successfully completed a partial upgrade to 14.04. After running sudo dpkg --configure -a, which seemed to finish installing the 14.04 packages, nextcode is behaving quite strangely. Word on the street is a clean install might be scheduled.

Page Tables

After ripping through the 6.828 lab that I so greatly feared (in 4 hours, too), I feel like I understand page tables. In fact, I could become a page table. I would be accepted into their ranks, translating linear addresses and handling permissions.

I think 6.828 is growing on me.

Comparisons

Today we hosted a NextCode event: a head-to-head coding competition. Two people competed on the spot in the TFL, close to the TV wall where we could use the projector. We displayed each competitor’s code on monitors (mine and Kevin’s) for the audience to appreciate. Overall, it was great fun and attracted quite a crowd at times. However, the focus of this post is on the last round, when an impressive coder and his friends joined the fray.

We picked an annoying problem (shortest path in a maze) and let them at it. It was clear that one coder had extensive IOI and USACO experience. He whipped out Dijkstra’s in a few minutes but got a WA (wrong answer). Kevin jumped in to race him and we had a proper fight. Both produced code like a fountain, but couldn’t get the elusive “pass”. As this was happening, one of the inexperienced friends wrote a one-liner which randomly guessed an answer in the range 1-20. The focus was still heavily on the two coding giants banging away on their keyboards.

Kevin transitioned to Vim and the suspense became tangible. The guest coder was still getting WA, and after copy-pasting his elegant code, so did Kevin. Then suddenly, on his 60th submission, the one-line hero got a “pass”. WHAT. Everyone went crazy. The atmosphere loosened up; everyone was smiling and laughing again. More than anything, I think people were just happy the seriousness was gone.

The guest coder was clearly miffed. He began by blaming our test cases, then moved to interrogating Kevin. IOI, IMO, IPhO? Did you do TopCoder? Oh, CodeChef? The classic “size-up”. I remember this feeling all too well from my freshman year. Young and insecure, we come into MIT constantly making comparisons. This kid was clearly brilliant, but desperately wanted to demonstrate it. Interview questions, math puzzles, coding challenges; we constantly look to prove ourselves in such small ways. When I say “we”, I really mean me. The “me” from freshman year and even the current “me”. Events like screwing up an ACM round or not getting a brain-teaser used to eat into my self-esteem.

I’m better now though. I lose and I let go. Not always, but far more often now. It’s just a matter of perspective (and ego reduction), but it requires constant and deliberate effort.

Permanent, Temporary and Masked

How does web forwarding work? According to Gandi, my search-engine ranking right now is going to be terrible.

There are three types of web forwarding (a quick way to tell them apart is sketched after the list):

  • Direct (Permanent) – the forwarded domain name stays visible – good search-engine ranking
  • Direct (Temporary) – same story – bad search-engine ranking
  • Masked – hides the URL that’s actually being reached – very poor search-engine ranking
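
If I understand Gandi’s terms right, “permanent” and “temporary” map to HTTP 301 and 302 redirects, which is easy to check:

$ curl -sI http://example.com | head -1    # example.com stands in for the real domain

A 301 Moved Permanently here means permanent forwarding, a 302 Found means temporary, and masked forwarding typically returns a 200 with a framed page instead.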

However, I think for now a very poor search-engine ranking is more of a blessing.