Dork In Public

We'll make a site on the public internet and fork the best of our content there from our laptop. We'll consider different kinds of sites and the various tasks of making and keeping them.

Remember, if you don't pay for it, it is not yours.

# Domains

You can register a domain name for your public site. If you already have a domain name, you can probably make a subdomain for your wiki site. It need not be on the same computer. That's what's cool about DNS.

I register my domain names with a commercial registrar. There are many domain name vendors, and most offer so many up-sells that it is hard to find what you want. Be patient. Expect to pay 10 or 15 dollars a year.

I serve DNS for my registered names from a CentOS box. For every domain I've registered, and every subdomain of those, I provide DNS a public IP address, which I get from my connectivity provider.

Wiki can be run in a virtual hosting mode we call a wiki farm. You can direct any number of domains to a farm and they will all be served as separate sites. I use a DNS wildcard, which gives me unlimited subdomains. For this class I have opened up such a wildcard.
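As a sketch of what farm mode looks like on the server side, the node.js wiki server published on npm can be started like this; flags vary between versions, so check `wiki --help` on yours:

```shell
# install the federated wiki server (node.js) globally
npm install -g wiki

# start it in farm mode so each incoming hostname gets its own site
wiki --farm --port 3000
```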

    $ cd /var/named/chroot/var/named/external/
    $ grep dork
    dork    A
    *.dork  A
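Spelled out, those two records say that one host answers both for the name and for anything under it. In BIND zone-file form the fragment would look roughly like this (the IP address is a documentation placeholder, not my real one):

```
; hypothetical fragment of the zone file grepped above
dork      IN  A   203.0.113.10   ; the farm's public address
*.dork    IN  A   203.0.113.10   ; wildcard: every subdomain resolves here
```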

# Servers

I run several wiki servers on my home LAN. A few of these I port-forward from the internet to an aging laptop or Mac mini upon which I have installed wiki. Even a weak laptop battery will carry you through many storm-related power outages. I run my cable-modem in bridging mode and then use a more easily configured gateway-router to my LAN. Ask a friend for help or read the whole manual.

We've had good luck running wiki from the Digital Ocean hosting service. Nick Niemeir has made this easy by scripting the whole install. See Deploying a Wiki.

You may have a friend who is running a wiki farm and would be happy to host your content as a virtual site within their farm. I am such a friend. Typically these sorts of loans don't come with much customer support. Be careful and make your own backups.

# Claims

We protect our wiki sites from abuse by establishing a single author for each one. The first author to log in to a site will, in effect, claim that site. No second author will be allowed to log in.

A site that has not yet been claimed will suggest that you "Claim with your Email". You must create a Mozilla Persona account, log in to that account, and then use that account to claim your wiki.

Earlier versions of wiki used OpenID to claim sites. OpenID providers are vanishing fast. Both OpenID and Persona store some login info in the .wiki/state folder. Erase this information to unclaim a site.
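A minimal sketch of unclaiming, assuming a farm-less install that keeps its data in ~/.wiki (set WIKI_DIR if yours lives elsewhere). It moves the state folder aside rather than destroying it, so the claim can be restored:

```shell
#!/bin/sh
# Sketch: unclaim a wiki by setting aside its login state.
# The ~/.wiki location is an assumption for a default install.
WIKI_DIR="${WIKI_DIR:-$HOME/.wiki}"
BACKUP="$WIKI_DIR/state-unclaimed-$(date +%Y%m%d%H%M%S)"
if [ -d "$WIKI_DIR/state" ]; then
  mv "$WIKI_DIR/state" "$BACKUP"   # moving it aside unclaims the site
  echo "state moved to $BACKUP"
else
  echo "no state folder at $WIKI_DIR/state -- site already unclaimed?"
fi
```

Restarting the server after this should present the "Claim with your Email" prompt again.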

# Origin

Wiki reads from many sites but writes only to the origin: the site where each browser tab begins, and where its javascript comes from.

You will want to log in to the origin before you start writing. If you own many sites you will find you use many tabs and log in on all of them.

If you write on a site you don't own then your changes will be stored in the browser's local storage and will be there when you return. See Local Changes.

If you forget to log in, or were logged out without noticing, you can log in to a site you own and 'fork' your local changes to your site.

If you lose connectivity to your site then your edits will be saved locally until you can reconnect and 'fork' your local changes to your site.

# Backup

You will want to make backups of the sites you own or host for others. Don't confuse the journal with a backup strategy. The journal tells others how the pages you decide to keep got the way they are. If you lose pages, the journals go with them.

If you back up files on your laptop then the files in .wiki should get backed up too. Check to be sure. I use Apple's Time Machine to back up hundreds of wiki sites on an hourly basis.
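One way to check is to confirm the page data is where your backup tool will see it. This sketch assumes a default farm-less install that keeps its pages in ~/.wiki/pages:

```shell
#!/bin/sh
# Sketch: verify the wiki data folder exists before trusting a backup.
# The ~/.wiki location is an assumption for a default install.
WIKI_DIR="${WIKI_DIR:-$HOME/.wiki}"
if [ -d "$WIKI_DIR/pages" ]; then
  count=$(ls "$WIKI_DIR/pages" | wc -l)
  echo "backing up $count pages from $WIKI_DIR/pages"
else
  echo "no pages at $WIKI_DIR/pages -- point your backup at the right place"
fi
```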

If you have a site hosted in a farm you can back that up by exporting the whole site as a JSON file. You can use curl or just Save As from the browser.

curl >backup
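With the site filled in, the command looks something like this. I believe current wiki servers publish a whole-site export at system/export.json, but check your own site's route before relying on it; the site name here is hypothetical:

```shell
# export the whole site as one json file
curl http://yoursite.example.com/system/export.json > backup.json
```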

If you host sites for friends you want to keep then you should make backups for them too. The rsync command will do this efficiently over ssh.

rsync -avz backups
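Filled in with a hypothetical host and paths, a nightly pull of a friend's site data might look like this (the .wiki data directory is an assumption for a default install):

```shell
# -a preserve permissions and times, -v verbose, -z compress over the wire
rsync -avz friend@example.com:.wiki/ ~/backups/friend-wiki/
```

Rsync only transfers what changed since the last run, which is what makes it efficient for hourly or nightly schedules.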

If you write pages that people like then they might just last forever without any additional work on your part. Good pages will take care of themselves in a Darwinian sense but these 'selfish pages' aren't doing this for you.