Posted by & filed under Linux, Node.js.

Normally when I host my Node.js-based applications, I’ll SSH into my server, open up a screen or tmux session, run node ./server.js, detach, and call it a day. Of course, if you’re reading this article, you’re fully aware that this is a horrible solution and are looking for an alternative.

One thing that's going to change between the hacky method of hosting and this new method is that it won't be YOU executing the process, but the SYSTEM. So, we'll be taking a few extra steps here and there to enforce that concept.

Process Management

For starters, we don't want to execute Node directly (even once we set up our service). Instead, we want to run it behind some sort of service that will keep the process alive in the unfortunate event of a crash. There are many tools for this, such as monit, PM2, or nodemon, but the one I'm most familiar with is called forever. Feel free to use an alternative if you'd like.

First, we’ll want to install forever on the server. Run the following command to take care of that:

sudo npm install -g forever

Once you’ve got forever installed, it’s a good idea to have it throw pid files and log files somewhere. I threw a directory into /var/run for just this purpose (although I’m not sure if this is technically the best place for such a thing):

sudo mkdir /var/run/forever
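Once the directory exists, you can point forever at it with its -p flag. As a quick sanity check, something like the following should work (the application path here is just an example, matching the layout described in the next section):

sudo forever -p /var/run/forever start /var/www/example.org/server.js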

Application Location

If you're used to storing your Node.js projects in your home directory (like I was…), you need to stop! Instead, store them somewhere that makes more sense for the server as a whole. The directory /var is pretty good for this, and if your application serves up HTML, throwing it in /var/www is probably a good idea.

I host a lot of applications and websites on my server, so I put my sites in directories like /var/www/example.org.

Run Time Configuration

There are a few changes you may want to make to your application to make it server-friendly. One thing I always find myself needing to do is pass a port number that I want my process to listen on. My server has a few IP addresses (network interfaces) and sometimes I’ll also need to pass in which interface I want to bind to.

A common solution for this is to pass along command line arguments. A different approach that I've been liking lately is to set environment variables (environment variables are to CLI arguments what named parameters are to positional function arguments).

A quick note on Node.js server listening conventions: whether you're using the built-in http module or a framework like Express, the convention is to call a .listen() method on the main http object you're working with. The first argument is a port number and the second is a hostname. If you don't provide a hostname, or pass in null, it defaults to listening on all interfaces (i.e. '0.0.0.0'). If you pass in the string 'localhost' or '127.0.0.1', the port can only be accessed from the local machine. If you pass in the IP address of one of your interfaces, the server will only listen on that interface.

Here’s an example of how you might implement both of these methods in your scripts:

Command Line Arguments

./server.js 9000 "localhost"

Code:

#!/usr/bin/env node

var app = require('express')();

// First CLI argument is the port, second is the hostname; a null
// hostname means listen on all interfaces. ('interface' is a reserved
// word in strict mode, so iface is the safer variable name.)
var port = parseInt(process.argv[2], 10) || 80;
var iface = process.argv[3] || null;

app.listen(port, iface);

Environment Variables

SERVER_PORT=9000 SERVER_IFACE="localhost" ./server.js

Code:

#!/usr/bin/env node

var app = require('express')();

// Pull configuration from the environment; a null hostname means
// listen on all interfaces.
var port = parseInt(process.env.SERVER_PORT, 10) || 80;
var iface = process.env.SERVER_IFACE || null;

app.listen(port, iface);

Debian Service

Now for the fun part! First, create yourself an empty init script, substituting the word SERVICE for the name you want to use for the service:

sudo touch /etc/init.d/SERVICE
sudo chmod a+x /etc/init.d/SERVICE
sudo update-rc.d SERVICE defaults

Once that’s done, paste the following simple service template into the file, swapping out SERVICE to whatever you’d like to use:

#!/bin/sh

export PATH=$PATH:/usr/local/bin
export NODE_PATH=$NODE_PATH:/usr/local/lib/node_modules
export SERVER_PORT=80
export SERVER_IFACE='0.0.0.0'

case "$1" in
  start)
  exec forever --sourceDir=/var/www/SERVICE -p /var/run/forever start server.js
  ;;

  stop)
  exec forever stop --sourceDir=/var/www/SERVICE server.js
  ;;
esac

exit 0

This script went with the environment variable method of configuration. Since we're dealing with a shell script, I put the variables at the top of the script instead of on the same line as the command, for readability. Of course, if you adopted the command line argument method, omit the two export lines and add your arguments to the end of your command.
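For reference, if you go the argument route, the start clause might look something like this (a sketch, assuming server.js reads its port and hostname from argv as in the earlier example):

  start)
  exec forever --sourceDir=/var/www/SERVICE -p /var/run/forever start server.js 80 '0.0.0.0'
  ;;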

If you’d like to start (or even stop) the service, you can run the same old commands that you’re likely used to:

sudo service SERVICE start
sudo service SERVICE stop

Consider the Following

Now that you've got everything set up, your service should be able to survive a reboot of your machine! Go ahead and run sudo init 6 right now just to be sure. Just kidding.

If you ever want a list of your currently running applications, run sudo forever list. Read up on the forever documentation to see what else you can do (hint: log reading).

That Debian service script we wrote is a bit lacking! If you check out the contents of /etc/init.d/skeleton, you can get an idea of what a more robust script looks like.

Posted by & filed under Personal.

The posts to this blog may have subsided, but my exuberance (and GitHub commit activity) has not.

Working on Book #2

My next book is titled A Consumer-Centric Approach to RESTful API Design, and unlike my first book it will be self-published. I'm a lot more excited about this book than the previous one, Backbone.js Application Development. With the first book, the publisher approached me with a topic, a table of contents, and a title, and asked me to do the rest. Backbone.js, despite being a technology I used at the YC startup I co-founded and at my day job, just wasn't something I was passionate about. This next book on API design is something I've been really excited about for a while.

I feel much more confident in my ability to market this book, as well as in the quality of the content. My technical reviewers are well known in the industry and are helping me create an awesome book aimed at the typical web developer who is interested in building an easy-to-consume API. The target audience is web developers with at least a year's experience.

Finch App

A bunch of friends and I competed in Ann Arbor Startup Weekend 2014. The project we built is called Finch, and while we're still solidifying what the app will do, it's essentially an event image aggregator. We've got an iOS app, a (mobile-friendly) website, some websockets, and an API. It's still under heavy development, but we were able to pull a lot off in one weekend.

My role in the project has been a bit different from what I've done in the past. I've been a programmer for over 8 years now, but for this project I took on more of a Product Manager role: a liaison between teams, along with some mentoring and architectural decision making. I must admit, doing people-oriented work has been a lot of fun. At my day job, I've been slowly transitioning from hardcore programming to people interaction by becoming a Developer Advocate, and I can see this being a pivotal point in my career path.

Left my Day Job

I actually put in my notice over a month ago and have been doing a lot of work on the side during this sabbatical. Of course I've been working on the book as well as Finch. I've also been working a lot with some technologies I've wanted to get more proficient with, such as PostgreSQL and MongoDB. I've used MongoDB on previous projects, but I've been wanting to get more experience. And I keep hearing good things about PostgreSQL compared to MySQL, so it's been fun to learn another SQL dialect.

Moving to San Francisco

And finally, the pièce de résistance: I'm moving to San Francisco, CA. The reasons for this are plentiful: SF is the mecca of our industry; Ann Arbor, MI is currently 17 degrees and buried under a foot of snow; there are considerably more opportunities out west; and I'm currently a mere 2 hours from my hometown.

The part that's been fun explaining to my family and friends is that I don't have a job lined up. I've got plenty of savings though, and a buddy of mine accepted a position at a company which provides temporary housing, so we'll have a month to hunt for a real apartment and I'll be living rent-free during that time. There are actually going to be five of us in total making the move from A2 to SF (turns out none of us like the snow that much).

Posted by & filed under APIs.

Some friends and I are working on a project called CodePlanet.IO. It’ll be a high-quality tutorials website, and we plan on eventually releasing screencasts of full-stack development using various web-related technologies.

Our first big post is an article of mine on the Principles of good RESTful API Design. You might not have realized it, but I’ve been working as a Developer Advocate / API Architect for the last couple months at my current employer.

Check the site out and be sure to keep an eye out for our upcoming screencasts!

Discuss on Reddit or Hacker News.

Posted by & filed under Reviews.

I recently read a copy of Debian 7: System Administration Best Practices, written by Rich Pinkall Pollei and published by Packt. Full disclosure: I’ve published a book through Packt, and they sent me a free copy of the book to do this review.

Relevant background: Debian has been my preferred distribution for a few years, with apt-get being the package management system I’m most familiar with. My website, as well as several others I control, are hosted on a Debian 7 Linode VPS, and I perform all maintenance via SSH. Even my current development machine is running Linux Mint, which is Debian under the hood.

Overview

This book is an ambitious attempt to cover many facets of Debian Linux system administration within the confines of 100 pages. There is a lot of material to cover, and 100 pages falls a bit short (on several occasions the author mentions a touched-upon topic as being outside the scope of the book). The ideal audience is a narrow set of intermediate users, perhaps those who have used Linux for at least one year but no more than two. While some of the information caters to beginners, such as the overview of Linux at the beginning and the coverage of different filesystems, much of the content requires intermediate knowledge of Linux.

In the later sections of the book, the focus shifts slightly to the LAMP stack, with Apache, PHP, and some commonly used tools for headless web server administration. I'll be honest: when I picked up the book I assumed it was about server administration, but with earlier sections covering desktop-focused topics such as full-disk encryption and window managers, the book covers the full gamut of Debian environments.

The Good

The first chapter, Debian Basics for Administrators, is a short history and overview of Linux and how Debian fits within the Linux ecosystem. While this chapter doesn’t contain any information which would warrant the for Administrators part of the title, it is good information for anyone running a Linux-based Operating System.

I found the section on Filesystems much appreciated. While the default FS selection presented during installation is usually fine, as a beginner it's easy to get caught up wondering what the differences are.

Considering the brevity of the book, there is a decent amount of information provided on the concept of disk encryption, a topic many books on this subject would have left out. In light of recent government surveillance revelations, this topic is quite worthy of being covered by more books.

The book overall is a quick read, one which the reader can get through in a dedicated afternoon (I read my copy over a few hours yesterday). Plenty of the topics are high-level and don't necessarily get the reader's hands too dirty.

The Bad

The Package Management section covers the basics of apt-get and aptitude, dselect and dpkg, and mentions a tool called Alien for installing non-Debian packages. There's even a sub-section on performing manual builds of software. While it's nice to cover these tools, I've personally never had to use Alien, aptitude, dselect, or dpkg; apt-get accounts for 95% of the package management I do, with building from source being the last 5%.

Considering the audience, I would like to have seen the author guide the reader through an actual build process they may encounter in the wild. For example:

sudo apt-get install build-essential
wget http://nodejs.org/dist/v0.10.24/node-v0.10.24.tar.gz
tar -zxvf node-v0.10.24.tar.gz
cd node-v0.10.24
./configure
make
sudo make install

Many topics are mentioned which require intermediate knowledge of Linux, but a quick explanation could have made them digestible by beginners. For example, on page 24, the author talks about the swap partition and how it is used for paging memory to disk. What does paging memory mean? Rewording this to say something like "If the computer runs out of RAM, of which it might have about 8GB, it can temporarily store data on the slower hard drive, which may be 1TB," makes the audience much larger.

On page 43, the author mentions config files having changed between the package maintainer's version and the sysadmin's version, and how there are tools for showing diffs and choosing which version to go with. I would like to have seen more emphasis on this section, as it is the biggest cause of a nuked Debian installation (for me personally, that is).

Also on page 43, the author mentions how PHP can change and how "re-coding web pages" may need to occur. While much of the software installed via apt-get, like phpMyAdmin, has automatic dependency checking, scripts installed by a user will be unaware of said version change and can break. A clearer distinction here would have been beneficial for many a beginner.

On page 53 the author talks about services and how to control and configure them. As an example, he covers Apache 2 at some length. However, he covers Apache-2-only commands such as a2ensite, a2dissite, and apache2ctl. While it is nice that the author chose a service which exemplifies the many different configuration options Debian services have (config includes, -enabled vs -available), I really wish he had covered the built-in service command, which can be applied to all services (e.g. sudo service mysql restart). The service command also could have been covered in the System Management chapter when talking about init scripts.

The topic of Linux clusters is mentioned a few times throughout the book, but it feels artificial, and after reading the book I am no closer to understanding how to build my own Linux cluster.

On page 43, the reader is told that they can read email sent to the root mail account to get information about upgrade notes. It would have been great if the author had covered how to do this (e.g. sudo mail).

Also on page 43, in the After the Upgrade section, a nice tip for the reader would be to reboot their machine if it is a development machine. I've often found myself rebooting a Linux laptop weeks after running a few upgrades, only to find X Windows not starting, and wishing I had rebooted sooner while the changes were still fresh in my mind.

On page 41, the author tells the reader to read the Debian release notes, but doesn’t mention where to find them.

None of the sys-rc-conf, sysv-rc-conf, or bum commands mentioned on page 60 exist on my Linux Mint laptop or my headless Debian 7 server. I looked into it some, and one must first run sudo apt-get install sysv-rc-conf to get those programs. The same goes for the parted command mentioned on page 67. Whenever covering a command which doesn't exist in the base installation, installation of said program should be mentioned beforehand.

The System Management > Filesystem section on page 66 might have been better merged into the Filesystem Layout chapter.

On page 80 the author mentions editing the /etc/sudoers file to give users the ability to run sudo. However, the preferred "safe" method for editing this file is the visudo command (which is even mentioned within the file itself). This command provides syntax checking, file locking, and other niceties.
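For example, a minimal session might look like this (the username and privilege line are hypothetical, but the line format is standard sudoers syntax):

sudo visudo
# then, inside the editor, grant a user full sudo rights with a line like:
# alice ALL=(ALL:ALL) ALL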

The Ugly, AKA, Errata and Nit-Picking

Page 10: “leading-edge” should be “bleeding edge”.

Page 42: apt-get dist-upgrade and aptitude full-upgrade could have been highlighted as code entirely, but that’s a matter of opinion considering the context.

Page 52: The image of text would have been better served by using text. The config parent/children relationship seems off due to how config files are loaded.

Page 53: There are two spaces instead of one in “These are the files  that are part of”.

Page 53: Could have mentioned that normal files in sites-enabled, which aren’t symlink’d to sites-available, still load as normal.

Page 63: The text says the interfaces file was generated automatically, but I was under the impression Debian only auto-generates DHCP interfaces (the one displayed is static).

Page 67: The note that EXT4 is 2-20 times faster than EXT3 for FSCK would be good to know in the earlier Filesystem chapter when selecting a Filesystem.

Page 68: The paragraph at the bottom beginning with “Note” would have been a good candidate for the bracketed “note” paragraph style.

Page 68 & 69: The notes about gparted and live systems seem redundant with each other.

Page 71: The tangent about a NAS device introduces many new concepts and may leave the reader confused.

Page 74: The phrase “Straight servers” is unfamiliar to me; perhaps “Headless servers” would have been better?

Page 74: The claim that European users prefer KDE and Americans prefer Gnome should have a citation.

Page 74: The term “home sites” should be “websites”.

Page 75: The gdm3setup command needs to be entirely highlighted as code.

Page 75: The note at the bottom of the page should be clarified by stating Linus Torvalds is the creator of Linux.

Page 79: The tip at the bottom of the page should say root account password, not root account login. It's still possible to become the root user; the account just doesn't have a password which can be used for login (e.g. run sudo su and you're root).

Page 96: The Installing Webmin file content should be bold, and the second line is missing the trailing backslash.

Page 98: The comment on using Webmin to make changes, then manually checking the file to make sure the configuration is legit, leaves the reader wondering why they would want to use Webmin at all.

Posted by & filed under Game Dev.

I just spent the last two nights building a version of Conway’s Game of Life using JavaScript and the HTML5 Canvas tag. You can check out the game here:

The game is really simple. You progress through different levels by getting a specified goal to turn on. This goal is always a 1×1 grid location and is highlighted in blue. You're allowed to change the area of the level highlighted in pink. Changing that area is as simple as clicking a grid location to toggle its state between on and off.

Game of Life Screenshot

Conway's Game of Life is a rather simple simulation. It is a state machine, played out on a theoretically limitless 2D grid (mine is just 64×64), and falls into a category of systems called "Cellular Automata". You can read more about it on Wikipedia; however, the four simple rules are copied here for your convenience:

  1. Any live cell with fewer than two live neighbours dies, as if caused by under-population.
  2. Any live cell with two or three live neighbours lives on to the next generation.
  3. Any live cell with more than three live neighbours dies, as if by overcrowding.
  4. Any dead cell with exactly three live neighbours becomes a live cell, as if by reproduction.

Each frame of the playing animation represents a single generation.
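For the curious, here's a minimal sketch of how a single generation step can be computed. This isn't the game's actual source, just an illustration of the four rules applied to a 64×64 boolean grid:

function nextGeneration(grid) {
  var size = grid.length;
  var next = [];
  for (var y = 0; y < size; y++) {
    next[y] = [];
    for (var x = 0; x < size; x++) {
      // Count the live cells among the (up to) eight neighbors.
      var neighbors = 0;
      for (var dy = -1; dy <= 1; dy++) {
        for (var dx = -1; dx <= 1; dx++) {
          if (dx === 0 && dy === 0) continue;
          var ny = y + dy, nx = x + dx;
          if (ny >= 0 && ny < size && nx >= 0 && nx < size && grid[ny][nx]) {
            neighbors++;
          }
        }
      }
      // Rules 1-3: a live cell survives only with two or three neighbors.
      // Rule 4: a dead cell with exactly three neighbors comes alive.
      next[y][x] = grid[y][x]
        ? (neighbors === 2 || neighbors === 3)
        : neighbors === 3;
    }
  }
  return next;
}

Each call advances the board by one generation, i.e. one frame of the animation.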

Posted by & filed under Web Server.

If you throw some SSL onto your NGINX-hosted website (as you've likely noticed, thomashunter.name is now doing this), you may notice a few hard-to-diagnose issues. Many PHP scripts look for the presence of a certain server variable, namely $_SERVER['HTTPS'], to determine whether they're behind an SSL connection.

To fix this, you need to add the following line to your server block:

fastcgi_param HTTPS On;
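For context, here's roughly where that line lives in a typical PHP-over-FastCGI server block (the server name, certificate lines, and socket path below are placeholders; yours will differ):

server {
    listen 443 ssl;
    server_name example.org;
    # ssl_certificate and ssl_certificate_key directives go here

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param HTTPS On;
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }
}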

Interestingly, it is quite hard to find documentation on this topic, and I have no idea why. I’m not sure if the HTTPS server variable is that common, but I do know that Apache always provides it, and many PHP scripts rely on it. Honestly, it isn’t a bad idea to manually set this to Off if you know that your website isn’t behind SSL, as I’ve seen some code do silly things.

It's also worth examining the crazy logic some common PHP systems use to check whether the current site is secure; the checks all rely on the presence of this parameter, and most importantly, every single system does it differently.
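As a rough composite of the sort of checks you'll encounter (illustrative only; these three variations are not lifted from any one project):

<?php
// Three hypothetical variations on the same check; note that they disagree
// when HTTPS is absent, set to 'off', or oddly cased:
$secure1 = isset($_SERVER['HTTPS']);                                  // presence only, so 'Off' counts!
$secure2 = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';  // value-aware
$secure3 = isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) === 'on'; // strict 'on' match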

Posted by & filed under Linux.

Chromium is the entirely free version of Google Chrome. What makes it entirely free? Well, it doesn't include license-restricted code, such as the PDF viewer. If you're like me, you're a stickler for installing software using your distribution's package manager, and prefer doing so over installing packages outside of it. And honestly, I can't think of a single other reason I keep using Chromium instead of Chrome. But I digress!

To get the PDF viewer working in Chromium, so that you can click a PDF link and view it in your browser instead of having to download it, just do the following.

First, download the Chrome .deb package: https://www.google.com/intl/en/chrome/browser/

Extract the file opt/google/chrome/libpdf.so from the package, and save it to /usr/lib/chromium-browser.
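One way to do the extraction from the command line (the .deb filename below is a guess; use whatever Google's site hands you):

dpkg-deb -x google-chrome-stable_current_amd64.deb chrome-extracted
sudo cp chrome-extracted/opt/google/chrome/libpdf.so /usr/lib/chromium-browser/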

Once you’ve done that, restart the browser (close all windows), and then attempt to view a PDF file.

Chromium Viewing a PDF

You can also visit chrome://plugins/ to confirm that the plugin is listed.

Chromium PDF Plugin

Posted by & filed under Linux.

Everyone knows that script kiddies are constantly bombarding servers with login requests, attempting to get access to an account which you might have secured with a stupid password. I was curious to find out which accounts they were attempting to log in as, and more importantly, whether any of these were actual accounts I knew of.

I couldn’t find anything on the internets, but I was able to cobble together the following (overly) complex command:

sudo cat /var/log/auth.log | grep -oEi "Invalid user ([a-zA-Z0-9]+)" | colrm 1 13 | sort | uniq -c | sort -h

If you’d like an explanation, check out the command breakdown on Explain Shell.
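In short, each stage of the pipeline does the following:

# grep -oEi "Invalid user ([a-zA-Z0-9]+)"  ->  keep only the matching fragment of each line
# colrm 1 13                               ->  strip the first 13 columns ("Invalid user ")
# sort | uniq -c                           ->  group identical names and count them
# sort -h                                  ->  order by count, lowest first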

Here are some of the more popular accounts people attempt to log in as:

 30 ftpuser
 33 astrid
 33 autumn
 33 bailey
 36 avalon
 36 testuser
 39 git
 42 bezhan
 42 test
 45 admin
 45 asuka
 45 auction
 45 bar
 45 bella
 48 bbs
 54 bandit
 57 bind
 57 oracle
 63 nagios
 69 au
 78 ben
 87 ftp
 93 bill
864 ftptest

If you know of a better way to format this command (I have a feeling the length can be cut in half) leave a comment!

Posted by & filed under Reviews.

Today, my friend Daniel Elliott and I assembled our O2 Headphone Amplifier kit, which we ordered from Head 'n' HiFi (although, if you don't want to assemble it yourself, you can buy it preassembled on Amazon). It took us about six hours to build the whole thing; however, if you've soldered before you could easily have it completed in three hours (this was my first time soldering to a PCB).

While building this thing, we struggled in a few areas and wished we had known some things in advance. In this article I'll outline them.

Part Identification

The most tedious and error-prone part of the process was identifying which parts were which. The Bill of Materials tells you which generic parts go to which locations on the PCB; however, there wasn't an easy way to tell which of the parts we received corresponded to which generic parts. Unfortunately, if you order the kit from anywhere other than Head 'n' HiFi, the parts will likely be different, and this list will be useless.

  • R1, R2: Red Yellow Black Black Brown
  • R10, R11, R15, R18: Brown Silver Black Black Brown
  • R6, R12, R13: Brown Red Red Black Yellow
  • R3, R7, R19, R23: Red Brown Yellow Black Brown
  • R9: Orange Orange Black Red Brown
  • R14, R20: Brown Black Black Red Brown
  • R16, R22: Brown Brown Black Green Brown
  • R17, R21: Brown Brown Black Brown Brown
  • R4, R5, R8, R24: Red Brown Black Gold Brown
  • R25: Brown Yellow Black Green Brown
  • C10, C15, C17, C18: Blue Cubes
  • C16, C21: 223, 312, apparently it doesn’t matter
  • C11, C12, C19, C20: BC 221
  • C13, C14: Square White Things
  • C1: 105Z (single)
  • C6, C7: 105Z (pair)
  • C2, C3, C4, C5: 381GB
  • C8, C9: B1129
  • D1, D2, D5, D6: 1N5818
  • U6: 7912ACT
  • U5: 7812ACT
  • Q1: 1C25AA (smallest one)
  • Q2: 1D33AA (black top one)

Solder Order

As a general rule, start by soldering smaller components first, then slowly add bigger ones. We did resistors, then small capacitors, IC risers, diodes, etc. If you do the big parts first, they may get in your way later.

Part Orientation

Whenever you're dealing with a resistor, the direction you solder it to the board doesn't matter. With the tiny capacitors, it doesn't matter either (this I didn't realize). The cylindrical capacitors and diodes need to face a certain direction (the PCB has hints everywhere). The transistors need to go in a certain way, and on the PCB you'll see a thick line that the back of each transistor needs to line up with. The IC risers have a cutout which aligns with the board, and the circle on each IC chip corresponds to the cutout.

Transistor Oxidation

This was a real pain. The four transistors (U5, U6, Q1, Q2) had very oxidized leads when they arrived. If you look closely, they almost look like a dull white aluminum color. Soldering them is an absolute pain and takes a long time. Unless, that is, you scratch the heck out of the surface. Take some sandpaper if you've got it, otherwise just drag the edge of a blade against the leads, and you'll see them become really shiny. Once you do that, they will be a LOT easier to work with.

Battery Connectors

Before you solder the battery connectors in, you should attach a 9V battery to each of them first. By doing so, you can be sure that the leads will keep the battery flush against the board. My friend did this and I did not. At first I had two leads pointing in different directions, and I had to re-solder one. By the time I was done, both sets of my battery connectors protruded slightly from the board, while my friend's sit flush against it.

The Little Bag

When the parts arrive, you'll find a smaller bag inside containing some resistors and riser connectors. Don't open this bag; they are spare parts. Just throw it off to the side. If you do open the bag and mix it with your parts, it'll be harder to tell what goes where.

Power Precautions

Don't remove the batteries while the power is switched on. This is in the instructions; it may damage the circuit.

Don't use a generic power source for the power jack. This thing is weird; it needs an AC -> AC power adapter. I didn't even know such a thing existed! They are normally AC -> DC. Here's a link to an inexpensive one that the designer recommends: Power Supply on Amazon or Power Supply on Mouser.

Pictures

Here’s a bunch of pictures of the board throughout the process, because hey, following pictures is much easier than diagrams ;)

Part Bags

Some Resistors

More Resistors

Some Solderings

Some Capacitors

Side View

Mostly Complete

Posted by & filed under Security.

TradeBit.com is an online marketplace for selling digital goods. Back when I was highly active selling applications and music on the Envato network, I would occasionally list items for sale on TradeBit which Envato deemed not up to their standards. Overall I made less than $60 throughout the lifetime of my TradeBit account.

Since large websites seem to be hacked every few weeks, with user accounts being leaked left and right, it seemed like a good idea to go through and delete any online accounts which I no longer used. While attempting to delete my TradeBit account, I was unable to find any automated process for doing so on the website, so I went ahead and contacted customer support to ask them to do it for me.

The reply I received was a bit of a shocker:

TradeBit Email Conversation

The problem with TradeBit requiring the last three characters of a password to cancel an account is at least threefold. I mentioned them in the email, but I’ll reproduce them here.

The first, albeit smallest, problem with providing them with part of my password is that I'm sending this password via email, which is an unencrypted communication medium. Imagine an email not as a letter in a sealed envelope, but as a postcard with the message written on the outside. Every set of hands this message passes through is fully capable of reading the email in its entirety (and if you've been keeping up with recent news, you'll know that every email IS read and stored by a third-party agency).

Another issue with sending TradeBit the last three characters of a password is that a human is reading this password and performing some action with it. If this were some automated system it wouldn't be so bad, but who knows what this person could be doing with said information.

The biggest issue is as follows: TradeBit is storing user passwords using one of the following insecure methods:

  1. TradeBit may be storing the password in a perfectly valid, irreversible manner, along with a separate hash of just the last three characters; customer service then hashes the provided three characters and compares the result. This is most likely not the case; it would be a lot of work with very little benefit. It also wouldn't be as secure as possible, since the hashes of all users' last three characters could be brute-forced quite easily, leaking information about users' passwords.
  2. TradeBit may be storing user passwords using a reversible form of encryption, such as AES. This would allow them to decrypt a password and compare the provided characters with the known ones. Again, the odds of this are pretty low, and even if this is what's being done, a hacker need only get access to the database of user passwords as well as the AES key to have a decrypted version of all user credentials. Even if the key is stored within application code and the credentials in the database, both systems could be compromised and the key and encrypted passwords taken.
  3. What is most likely happening is that TradeBit is simply storing plaintext versions of user passwords in their database.

TradeBit COULD redeem themselves by performing two actions. The first would be to create an automated process for allowing accounts to be deleted. The second would be to switch to a decent password hashing scheme, such as bcrypt in PHP (TradeBit uses PHP; inspect the headers). This move would be invisible to their users, as they could run a process to hash the existing passwords overnight. Since the deletion system would be automated, TradeBit would check that the user is currently authenticated before performing the deletion, and wouldn't require an awkward out-of-band email containing a partial password.
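For what it's worth, here's a minimal sketch of what that switch buys you, using PHP's built-in password hashing API (the function names are the real API, available as of PHP 5.5; the surrounding flow is hypothetical):

<?php
// At signup (or during a one-off migration, while the old passwords are still readable):
$plaintextPassword = 'correct horse battery staple'; // hypothetical input
$hash = password_hash($plaintextPassword, PASSWORD_BCRYPT);
// Only $hash is stored; the plaintext is never persisted.

// At login:
if (password_verify('correct horse battery staple', $hash)) {
    echo "Credentials check out; no human ever needs to see the password.";
}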

Protect Yourself

What you can do to protect yourself from the shortcomings of online services like TradeBit is to anonymize your account information as best as possible before requesting that your account be removed. If a service allows you to change your username or email address, go ahead and change them to something which can't be traced to other accounts of yours BEFORE requesting the deletion. Also, make sure you are not using a password which you use anywhere else (if you are, change that as well before making the request). This advice is useful when deleting your account from ANY service. You'd be surprised how many services don't actually delete user accounts, but simply add a flag to the database saying the account is inactive, keeping a copy of your email address and password locked away forever.

If you take these precautions and a web service holding your credentials is hacked, hackers won't be able to use that information to log in to other services you use. These processes are automated, so once your information is leaked, the chances are extremely high that someone will find other services which your credentials work with.