I had a situation where an employee created a new EC2 instance with his own keypair and was out the next day. I needed access to the instance immediately, which posed a problem. Here is how I gained SSH access via the AWS web console: I detached the EBS volume, mounted it on another EC2 instance I did have access to, added my SSH public key to ~ec2-user/.ssh/authorized_keys, then reattached it to the original instance. Amazing the ideas that strike you in an emergency.

As long as you have full AWS console access and some light unix chops, it should be fairly straightforward:

  1. Go to the Amazon EC2 control panel and click “Volumes” (under Elastic Block Store). Use the attachment information to find the EBS volume attached to the old EC2 instance.
  2. Detach it and attach it to an EC2 instance you have SSH access to, via this web console (you will have to stop the old instance first; a root volume can’t be detached while it’s running). Make a note of the device path, probably /dev/sda1. You will have to reconnect it at this path later, and AWS doesn’t always guess correctly. When you attach it to your other EC2 instance it will probably come up as /dev/sdf or similar, since /dev/sda1 is taken by that instance’s root drive. You can see this in the EBS Volumes table under “Attachment information”, as something like (<instance name>):/dev/sdf. If you’d rather script this, see the AWS CLI sketch after this list.
  3. SSH to your good instance and run sudo mkdir /mnt/oldvolume and then sudo mount /dev/sdf /mnt/oldvolume (or whatever device path the attachment information panel showed). Your files should now be available under /mnt/oldvolume; see the mount sketch after this list.
  4. Add your SSH public key to /mnt/oldvolume/home/ec2-user/.ssh/authorized_keys (a permissions-safe sketch follows this list).
  5. Unmount the volume with sudo umount -d /dev/sdf, then follow the steps above to reattach it to the old instance at its original device path. You should now be able to log in as ec2-user on your old box (cleanup sketch below).
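
If you’d rather script step 2 than click through the console, the AWS CLI can do the same detach/attach dance. A sketch, with made-up volume and instance IDs you’d replace with your own:

    # Detach the volume from the locked-out instance (placeholder ID)
    aws ec2 detach-volume --volume-id vol-0123456789abcdef0

    # Block until the volume shows as "available"
    aws ec2 wait volume-available --volume-ids vol-0123456789abcdef0

    # Attach it to the instance you can reach, as /dev/sdf
    aws ec2 attach-volume --volume-id vol-0123456789abcdef0 \
        --instance-id i-0aaaaaaaaaaaaaaaa --device /dev/sdf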
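
One wrinkle with step 3: on many instances the kernel renames attached devices, so a volume attached as /dev/sdf often shows up as /dev/xvdf, and the old root filesystem may live on a partition like /dev/xvdf1. A minimal sketch; check lsblk rather than trusting the names here:

    # See what the new volume actually came up as on this instance
    lsblk

    # Create a mount point and mount the old root filesystem
    sudo mkdir /mnt/oldvolume
    sudo mount /dev/xvdf1 /mnt/oldvolume    # or /dev/sdf, /dev/xvdf, etc.

    # Sanity check: this should list the old instance's files
    ls /mnt/oldvolume/home/ec2-user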
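
For step 4, keep in mind that sshd silently ignores an authorized_keys file with loose permissions or the wrong owner, and that ec2-user’s numeric uid on your rescue instance may not match the one recorded on the old volume. A sketch, assuming your public key sits in ~/.ssh/id_rsa.pub on the good instance:

    # Append your public key to the old instance's authorized_keys
    cat ~/.ssh/id_rsa.pub | \
        sudo tee -a /mnt/oldvolume/home/ec2-user/.ssh/authorized_keys

    # Tighten permissions and match the owner recorded on the old volume
    sudo chmod 600 /mnt/oldvolume/home/ec2-user/.ssh/authorized_keys
    sudo chown --reference=/mnt/oldvolume/home/ec2-user \
        /mnt/oldvolume/home/ec2-user/.ssh/authorized_keys

The --reference trick copies the owner from the home directory on the old volume itself, which sidesteps any uid mismatch between the two systems.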
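
Finally, step 5 end to end, again with placeholder IDs, and assuming /dev/sda1 was the original root path you noted in step 2:

    # On the good instance: unmount the old volume
    sudo umount -d /dev/xvdf1

    # From a machine with the AWS CLI: move the volume back
    aws ec2 detach-volume --volume-id vol-0123456789abcdef0
    aws ec2 wait volume-available --volume-ids vol-0123456789abcdef0
    aws ec2 attach-volume --volume-id vol-0123456789abcdef0 \
        --instance-id i-0bbbbbbbbbbbbbbbb --device /dev/sda1

    # Start the old instance back up and try your key
    aws ec2 start-instances --instance-ids i-0bbbbbbbbbbbbbbbb
    ssh ec2-user@<old instance ip>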