Setting Up dgamelaunch and WebTiles

OBSOLETE – Last edited by advil apr 2020 [current for Ubuntu 18.04, but known not to work as-is on current Ubuntu as of 2024]

This page corresponds to a set of scripts in the official dgamelaunch-config repository that build and test a container following the core steps illustrated here. See https://github.com/crawl/dgamelaunch-config/blob/master/utils/testing-container/README.md, https://github.com/crawl/dgamelaunch-config/blob/master/utils/build-testing-container.sh, etc. The scripts are kept up to date, but this page is not. You can use the scripts in the same fashion as this tutorial, by running through them manually on the target server (without docker). First go through https://github.com/crawl/dgamelaunch-config/blob/master/utils/testing-container/Dockerfile, and then use the steps from https://github.com/crawl/dgamelaunch-config/blob/master/utils/provision-chroot.sh to set up the chroot. (To be clear: that docker container implementation is designed just for testing, and docker isn't recommended with this configuration – but it replicates in a container the exact steps you would use for setting up a public dgl server.)

This document described, prior to 2021, how to set up an official Crawl server with support for webtiles and console, automatic updates, Sequell/scoring integration, and public ttyrecs. If you just want to run a webtiles server for friends and coworkers, the process is much simpler: see webserver/README in the Crawl source.

For debugging issues related to this setup or a running server, see maintaining dgamelaunch and webtiles.

0. Create the users

Create the users crawl and crawl-dev, with their associated user groups. The former will be used for running crawl, and the latter for administering it.

sudo adduser crawl
sudo adduser crawl-dev

Write down the uid and gid for these users; you will need them later:

id crawl
id crawl-dev
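The ids can also be captured with a small helper so they are easy to replay inside the chroot later; a sketch using only the POSIX `id` utility (the helper name is mine):

```shell
# Sketch: print uid/gid pairs for a list of users, so the numbers can be
# replicated inside the chroot in step 2. Uses only the POSIX id utility.
record_ids() {
  for u in "$@"; do
    printf '%s uid=%s gid=%s\n' "$u" "$(id -u "$u")" "$(id -g "$u")"
  done
}
# e.g. record_ids crawl crawl-dev > /home/crawl-dev/uid-gid.txt
```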

These groups and permissions will save a lot of headaches:

sudo usermod -G root -a crawl
sudo usermod -G root -a crawl-dev
sudo usermod -G www-data -a crawl
sudo usermod -G www-data -a crawl-dev
sudo usermod -G crawl -a root
sudo usermod -G crawl -a www-data
sudo usermod -G crawl-dev -a root
sudo usermod -G crawl-dev -a www-data

1. Set up chroot

Set up a chroot in /home/crawl/DGL/. On Debian-like systems you can do:

sudo debootstrap stable /home/crawl/DGL/

Ubuntu users should change 'stable' to match their build env (e.g. 'bionic' for ubuntu 18.04). If you are running this within a docker container, this step will fail unless the container was started with --privileged.

On Ubuntu 18 and later, the chroot will not yet be able to use apt. Use the following steps to finish setting up the chroot (as recommended by https://wiki.ubuntu.com/DebootstrapChroot):

sudo cp /etc/resolv.conf /home/crawl/DGL/etc/resolv.conf
sudo cp /etc/apt/sources.list /home/crawl/DGL/etc/apt/

If you are using a different distribution in your chroot than the outer distribution (not really covered by this guide) you will then need to edit sources.list to change the distribution name; see the ubuntu debootstrap guide linked above for further details.
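The release-name edit can be rehearsed on a scratch copy first; a hedged sketch (the release names bionic and focal are only examples):

```shell
# Sketch: rewrite the release name in a copy of sources.list. The names
# (bionic -> focal) are examples; substitute your own host/chroot releases.
f=$(mktemp)
echo 'deb http://archive.ubuntu.com/ubuntu bionic main universe' > "$f"
sed -i 's/bionic/focal/g' "$f"
cat "$f"
```

On a real server you would run the same sed against /home/crawl/DGL/etc/apt/sources.list as root.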

2. Install prerequisites

Install some prerequisites into the chroot. Besides the base system that debootstrap installed, you need the libraries for running crawl, bzip2 for compressing save backups, the sqlite3 binary for interfacing with the user and version databases, locales, terminal definitions, and a minimal install of python (because character codecs are loaded at runtime). For Debian* systems:

sudo chroot /home/crawl/DGL/

This enters the environment in which crawl will run. Then, as root inside the chroot, install packages:

apt update && apt upgrade
apt install bzip2 python-minimal ncurses-term locales-all sqlite3 libpcre3 liblua5.1-0 locales autoconf build-essential lsof bison libncursesw5-dev libsqlite3-dev flex sudo libbot-basicbot-perl

Xenial note: If you can't find locales-all or libbot-basicbot-perl, you probably need to enable universe packages with add-apt-repository universe && apt update. If you don't have add-apt-repository, you'll need to install software-properties-common, or edit /etc/apt/sources.list manually to enable universe packages.

Note from araganzar: On Ubuntu 14.04 I was unable to find locales-all even after checking that universe is in sources.list, and I was also unable to install add-apt-repository even after adding software-properties-common; upgrading to 16.04 resolved these issues. You will also want to install vim and git (via apt-get install) if you don't have them. For building crawl, you also need libxtst-dev and libpng++-dev.

Once all required packages are installed, generate locales and users:

locale-gen en_US.UTF-8
dpkg-reconfigure locales
adduser crawl
adduser crawl-dev

note from geekosaur

On recent Debian and derivatives with flex 4.x, you want the flex-old package instead of flex. You will get errors about undefined yylex during configure otherwise.

Inside the chroot, change the uid and gid of crawl and crawl-dev to match the ids you recorded in step 0:

usermod -u <NEW_UID> <username>
groupmod -g <NEW_GID> <groupname>

Exit the chroot.

exit

* Ubuntu is similar: ~# chroot /home/crawl/DGL apt-get …

Note:

  • If you're interested in reducing bandwidth at the expense of a little bit of rebuild time, you can install advancecomp and pngcrush on the host system. The makefile will automatically detect those and use them to optimise/recompress the tilesheets.
  • If you're building things from a very clean image, you may also have needed some basic prerequisites in the outer OS, e.g. apt install sudo git build-essential autoconf automake bison libncursesw5-dev flex liblua5.1-0-dev libsqlite3-dev libz-dev pkg-config python3 python3-pip python3-yaml ccache libpng-dev sqlite3 libpcre3 libpcre3-dev apache2, possibly others. The in-repository CI setup code is usually a good guide to the minimal package set here. (TODO: expand)
  • pip3 install tornado. (This will install tornado 6, which is supported, but in some circumstances you may wish to install earlier versions.)
  • For modern versions of tornado, you will have to duplicate the library into the chroot as well. If you are sticking with python-minimal, the quickest way to do this on Ubuntu 18.04 (following these instructions) is: cp -R /usr/local/lib/python3.6/dist-packages/tornado/ /home/crawl/DGL/usr/local/lib/python3.6/dist-packages/
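The package copy can be generalized so the python version isn't hardcoded; a sketch (the helper name is mine, and on a real server it must run as root):

```shell
# Sketch: duplicate a host python package into the chroot so the chroot's
# python can import it. Locates the package via python3 itself instead of
# hardcoding /usr/local/lib/python3.6. Run as root on a real server.
copy_pkg_into_chroot() {
  chroot_dir=$1 module=$2
  pkg_dir=$(python3 -c "import $module, os; print(os.path.dirname($module.__file__))")
  mkdir -p "$chroot_dir${pkg_dir%/*}"
  cp -R "$pkg_dir" "$chroot_dir${pkg_dir%/*}/"
}
# e.g. copy_pkg_into_chroot /home/crawl/DGL tornado
```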

3. Mount /proc and /dev/pts

Various programs running in the chroot will need /proc and /dev/pts, so mount those. It's probably best to add them to your system's fstab so they're mounted on boot:

sudo vim /etc/fstab

You can either mount the filesystems directly:

# Option 1: direct mount
proc      /home/crawl/DGL/proc       proc    defaults        0       0
devpts    /home/crawl/DGL/dev/pts    devpts  defaults        0       0

or use a bind mount to duplicate the host directories into the chroot:

# Option 2: bind mount
/proc       /home/crawl/DGL/proc     none    bind            0       0
/dev/pts    /home/crawl/DGL/dev/pts  none    bind            0       0

Either way, these lines should come after the existing entries for your host system's /proc and /dev/pts. Now you should be able to mount them with just the name.

mount /home/crawl/DGL/proc
mount /home/crawl/DGL/dev/pts

NOTE: Root can trivially escape a chroot using /proc/1/cwd !

You'll also need to make sure your 'crawl' user can write to /dev/ptmx inside the chroot:

chmod 666 /home/crawl/DGL/dev/ptmx
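After a reboot it is worth confirming the mounts actually came up; a sketch using mountpoint from util-linux (the helper name is mine):

```shell
# Sketch: report whether a chroot mount is actually in place.
check_mount() {
  if mountpoint -q "$1"; then echo "ok: $1"; else echo "MISSING: $1"; fi
}
# e.g.:
#   check_mount /home/crawl/DGL/proc
#   check_mount /home/crawl/DGL/dev/pts
#   stat -c '%a' /home/crawl/DGL/dev/ptmx   # expect 666
```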

Docker note: you may want to accomplish the mounts in this section via an entrypoint script rather than /etc/fstab (which isn't even used for e.g. Ubuntu images). The following is a complete script which can be used as an entrypoint or run manually when starting the container, to minimally start webtiles and dgl; obviously the init.d commands will not actually work until later in these instructions.

#!/bin/bash
mount --bind /proc/ /home/crawl/DGL/proc/
mount --bind /dev/pts/ /home/crawl/DGL/dev/pts/
/etc/init.d/ssh start
/etc/init.d/webtiles start

4. Create crawl-dev dirs

Make the directories /home/crawl-dev/logs and /home/crawl-dev/run, owned by and writable by crawl-dev. Unless noted otherwise, subsequent commands should be run by user crawl-dev.

su crawl-dev
mkdir /home/crawl-dev/logs
mkdir /home/crawl-dev/run

5. Download dgamelaunch

Check out the master branch of dgamelaunch and dgamelaunch-config as the crawl-dev user. You might also want a copy of sizzell if you intend to announce milestones and games on IRC.

# TODO: publish dgamelaunch and sizzell branches

cd
git clone https://github.com/crawl/dgamelaunch.git
git clone https://github.com/crawl/dgamelaunch-config.git
git clone https://github.com/crawl/sizzell.git

historical note: the crawl org version of dgamelaunch-config is not what most servers have used in the past: see https://github.com/neilmoore/dgamelaunch-config/. However, this guide is most recently verified (by advil) to work with this newly official version.

6. Build dgamelaunch

Build dgamelaunch:

cd dgamelaunch
./autogen.sh --enable-debugfile --enable-sqlite --enable-shmem
make

Copy the binary into /usr/local/sbin/ on your main system, and the ee and virus binaries into /bin on the chroot:

sudo make install
sudo cp ee virus /home/crawl/DGL/bin

7. Configure sudo access

sudo visudo

Give user crawl-dev permission to run dgl binary with sudo. We'll also need permissions for a few additional scripts, as well as webtiles.

crawl-dev ALL=(root) \
  /home/crawl-dev/dgamelaunch-config/bin/dgl, \
  /home/crawl/DGL/sbin/install-trunk.sh, \
  /home/crawl/DGL/sbin/install-stable.sh, \
  /etc/init.d/webtiles, \
  /home/crawl/DGL/sbin/remove-trunks.sh

If you want to use certain automated scripts, you will need to change the first line to allow this without a password:

crawl-dev ALL=(root) NOPASSWD: \

Security Note:

  • If crawl-dev has sudo privileges on a script that they have permission to edit, then they can edit the script to run any command as root.

You may also add permissions for your apache user (www-data on Debian) to execute the build scripts without a password. This is necessary for the /rebuild/ cgi script.

www-data  ALL=(crawl-dev) NOPASSWD: \
  /home/crawl-dev/dgamelaunch-config/bin/dgl update-trunk, \
  /home/crawl-dev/dgamelaunch-config/bin/dgl update-stable *

8. Configure dgamelaunch-config

Look over the various configuration files. You may wish to change directory names etc.

cd ../dgamelaunch-config
view dgl-manage.conf crawl-git.conf dgamelaunch.conf config.py
vim config.py
  • Edit ip addresses, ssl certs, server names, which game modes you want to support, and run on port 8080 for non-ssl. Docker note (possibly mac-specific): You may need to bind address to 0.0.0.0 rather than 127.0.0.1 in order to successfully expose the port to the docker host.
  • Make sure ssl options is set to None if you are not using ssl, or webtiles will not start; this is the default in the dgamelaunch-config repository. It is strongly recommended that you use SSL on a production server, but setting this up fully is beyond the scope of this guide.
  • Setting up SSL requires your certificate file, any CA bundle (or the certificates appended together as a .pem), and the server's private key.
  • Create the directory var/www/crawl_ssl inside the chroot, store the files there, and point the ssl options at /var/www/crawl_ssl/<file>.
vim dgamelaunch.conf
vim dgl-manage.conf
  • Edit DGL_SERVER and uid to match crawl
  • Modify contents of /home/crawl-dev/dgamelaunch-config/chroot/data as appropriate for your server and versions. This involves copying and manually editing files as appropriate. (TODO: expand on this)

Exit crawl-dev:

exit

9. Create directories

Make the needed directories. There are a bunch, and I've probably forgotten some, but at the very least you need the following under the chroot /home/crawl/DGL/.

cd /home/crawl/DGL
sudo mkdir crawl-master
sudo mkdir crawl-master/webserver/
sudo mkdir crawl-master/webserver/run/
sudo mkdir crawl-master/webserver/sockets/
sudo mkdir crawl-master/webserver/templates/
sudo mkdir dgldir
sudo mkdir dgldir/data/
sudo mkdir dgldir/dumps/
sudo mkdir dgldir/morgue/
sudo mkdir dgldir/rcfiles/
sudo mkdir dgldir/ttyrec/
sudo mkdir dgldir/data/menus/
sudo mkdir dgldir/inprogress/

You will also need the following set of directories for each crawl version you will support:

sudo mkdir dgldir/inprogress/crawl-git-sprint/
sudo mkdir dgldir/inprogress/crawl-git-tut/
sudo mkdir dgldir/inprogress/crawl-git-zotdef/
sudo mkdir dgldir/inprogress/crawl-git/
sudo mkdir dgldir/rcfiles/crawl-git/
sudo mkdir dgldir/data/crawl-git-settings/

For example:

sudo mkdir dgldir/inprogress/crawl-24-sprint/
sudo mkdir dgldir/inprogress/crawl-24-tut/
sudo mkdir dgldir/inprogress/crawl-24-zotdef/
sudo mkdir dgldir/inprogress/crawl-24/
sudo mkdir dgldir/rcfiles/crawl-0.24/
sudo mkdir dgldir/data/crawl-0.24-settings/
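The per-version directories can also be created in one loop; a sketch (the helper name is mine; the exact directory names must match your dgamelaunch.conf, and on a real server this should run with sudo):

```shell
# Sketch: create all per-version dgldir subdirectories for one version.
# The list follows the examples above; check it against dgamelaunch.conf.
make_version_dirs() {
  root=$1 ver=$2
  for d in "inprogress/crawl-$ver" "inprogress/crawl-$ver-sprint" \
           "inprogress/crawl-$ver-tut" "inprogress/crawl-$ver-zotdef" \
           "rcfiles/crawl-$ver" "data/crawl-$ver-settings"; do
    mkdir -p "$root/dgldir/$d"
  done
}
# e.g. make_version_dirs /home/crawl/DGL git
```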

They should all be writable by user 'crawl', and the morgues, ttyrecs, and rcfiles should probably be readable by your web server.

DO NOT chown the whole chroot: various programs and directories inside the chroot need to be owned by root or other users.

sudo chown -R crawl:crawl crawl-master
sudo chown -R crawl:crawl dgldir

Also create the file dgamelaunch directly under the chroot; it will be used as a shared memory key. Contents don't matter.

sudo touch /home/crawl/DGL/dgamelaunch

If you want debug messages, also create the file dgldebug.log and make it writable for crawl. (Be warned: the file has a reputation of not being incredibly useful.)

sudo touch /home/crawl/DGL/dgldebug.log
sudo chown crawl:crawl /home/crawl/DGL/dgldebug.log

Finally, ensure that user crawl has write access to /var/mail so that console messages can be sent.

9.5 Create directories Pt.2

Create the crawl versions database and the save and data directories for trunk:

sudo /home/crawl-dev/dgamelaunch-config/bin/dgl create-versions-db
sudo /home/crawl-dev/dgamelaunch-config/bin/dgl create-crawl-gamedir

Copy the resulting crawl-git directory for each other version you will support:

cd /home/crawl/DGL/crawl-master
sudo cp -a ./crawl-git ./crawl-0.24

10. Publish dgamelaunch configs

Publish the configs into the chroot (and the dgamelaunch config into /etc).

sudo /home/crawl-dev/dgamelaunch-config/bin/dgl publish --confirm

11. Install crawl

Try installing your crawl versions (run as user crawl-dev):

/home/crawl-dev/dgamelaunch-config/bin/dgl update-trunk
/home/crawl-dev/dgamelaunch-config/bin/dgl update-stable 0.24

Notes:

  • At the very least you must run update-trunk, because that is responsible for installing the webtiles server.
  • You probably want to set up a cronjob to run this once a day. If so, you will need to ensure your host has locales set correctly for webtiles to run.

Webtiles requires a UTF-8 locale:

sudo update-locale LANG=en_US.UTF-8

I add this line to the beginning of the cron script, so I have a log of all updates:

exec >> /home/crawl-dev/logs/update.log 2>&1
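If you do set up the daily cronjob, a hypothetical crawl-dev crontab entry might look like this (the time of day is arbitrary; the dgl path is the one used throughout this guide):

```
# m h dom mon dow  command
30 4 * * * /home/crawl-dev/dgamelaunch-config/bin/dgl update-trunk
```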

Exit the crawl-dev user:

exit

12. Create symlinks

Set up symlinks in /var/www/ for people to access morgues, ttyrecs, etc. As root (or someone with access to that directory):

Create a folder named crawl

mkdir /var/www/crawl

In /var/www/crawl, create symlinks

cd /var/www/crawl
sudo ln -s /home/crawl/DGL/dgldir/morgue/
sudo ln -s /home/crawl/DGL/dgldir/rcfiles/
sudo ln -s /home/crawl/DGL/dgldir/ttyrec/

12.5 Apache configuration

Some additions for your Apache config. Note that auth-save-downloader.pl and trigger-rebuild.pl are installed into /usr/lib/cgi-bin by the dgl publish command. You'll need to enable mod_rewrite if it is not already enabled.

On Debian:

sudo a2enmod rewrite
sudo service apache2 reload

Note from araganzar: Apache 2.4 and later has different access rules (http://httpd.apache.org/docs/2.4/upgrading.html#access). You no longer use "Order allow,deny" and "Allow from all"; these should be replaced with "Require all granted" (or "Require all denied" if the second directive is "Deny from all").

~Apache Configuration in full~

<VirtualHost *:80>
  ServerName crawl.yoursite.org
  DocumentRoot /var/www/crawl
  RewriteEngine on

  ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
  <Directory "/usr/lib/cgi-bin">
    AllowOverride None
    Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
    Order allow,deny
    Allow from all
  </Directory>
  
  # Make an alias /saves/ that passes HTTP authentication information and
  # the save file name to auth-save-downloader.
  RewriteCond %{HTTP:Authorization} ^(.+)
  RewriteRule ^/saves/(.*)$ /cgi-bin/auth-save-downloader.pl?file=$1 [PT,E=HTTP_AUTHORIZATION:%1]
  RewriteRule ^/saves/(.*)$ /cgi-bin/auth-save-downloader.pl?file=$1 [PT]

  # Make an alias /rebuild/ that passes HTTP authentication information to
  # the rebuild trigger.
  RewriteCond %{HTTP:Authorization} ^(.+)
  RewriteRule ^/rebuild(/(.*))?$ /cgi-bin/trigger-rebuild.pl?v=$2 [PT,E=HTTP_AUTHORIZATION:%1]
  RewriteRule ^/rebuild(/(.*))?$ /cgi-bin/trigger-rebuild.pl?v=$2 [PT]

  RewriteCond %{REQUEST_URI} ^/ttyrec/([^/]*)/(.*\.ttyrec)
  RewriteCond /var/www/%{REQUEST_FILENAME} !-f
  RewriteRule ^/ttyrec/([^/]*)/(.*\.ttyrec)$ /ttyrec/$1/$2.bz2
  RewriteRule ^/crawl - [L]
  RewriteRule ^/crawl/morgue - [L]
  RewriteRule ^/crawl/rcfiles - [L]
  RewriteRule ^/crawl/ttyrec - [L]
  RewriteRule ^/crawl/meta - [L]

  # Turn off compression for /rebuild so we can see compile messages in real time.
  SetEnvIfNoCase Request_URI ^/rebuild(/.*)?$ no-gzip dont-vary

  RewriteRule ^/(.*) http://crawl.yoursite.org:8080/$1
</VirtualHost>

<VirtualHost *:80>
  ServerName www.yoursite.org
  Redirect / http://crawl.yoursite.org:8080
</VirtualHost>
Listen 8081
<VirtualHost *:8081>
  ServerAdmin admin@yoursite.org

  DocumentRoot /var/www/crawl
  ServerName yoursite.org
  ServerAlias crawl.yoursite.org www.yoursite.org

  SSLEngine on

  SSLCertificateFile /var/www/crawl_ssl/.crt
  SSLCertificateKeyFile /var/www/crawl_ssl/server.key
  SSLCertificateChainFile /var/www/crawl_ssl/.ca-bundle

  <FilesMatch "\.(cgi|shtml|phtml|php)$">
    SSLOptions +StdEnvVars
  </FilesMatch>

  <Directory /usr/lib/cgi-bin>
    SSLOptions +StdEnvVars
  </Directory>

  BrowserMatch "MSIE [2-6]" \
    nokeepalive ssl-unclean-shutdown \
    downgrade-1.0 force-response-1.0

  # MSIE 7 and newer should be able to use keepalive
  BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown

</VirtualHost>

Notes:

  • To access ssl, the url will show the port, e.g. https://crawl.yoursite.org:8081.
  • When configuring config.py, make sure ssl port is set to whichever port your Apache ssl virtual host uses.
  • If Apache is already serving ssl, use a port other than 443, because 443 will already be bound.

Restart apache

sudo service apache2 reload

12.6 Start webtiles

Start Webtiles as crawl-dev via:

sudo /etc/init.d/webtiles start

docker note: at this point, to test out webtiles you will need to connect to the relevant port (probably 80 or 8080) on your container. The best practice for doing that (which is required on a mac) is to expose the relevant port to the host with the command line argument like -p 8080:8080. If you didn't do this when you first started the container, commit your current state, stop the container, and restart it with this command. (Don't forget that you will need to rebuild your chroot mounts when restarting the container.)

Troubleshooting Notes:

If nothing happens and the server doesn't start up, disable daemonizing in the chroot copy of `config.py` and run the webserver directly from `/home/crawl/DGL/crawl-master/webserver/`; this will allow you to see the crash. Keep in mind that if you run `dgl publish` during this sort of troubleshooting, it'll wipe out config.py in this directory.

Important note: the following troubleshooting note is deprecated and you shouldn't be using a version of tornado < 5. However, I'll keep it here for a little while just in case:

If you get:

-Tornado error:
File "/usr/local/lib/python2.7/dist-packages/tornado-3.0.1-py2.7.egg/tornado/httpserver.py", line 153, in __init__
TypeError: __init__() got an unexpected keyword argument 'connection_timeout'

You need to install edlothiol's patch for Tornado. As crawl-dev, from /home/crawl-dev:

su crawl-dev
cd

git clone git://github.com/flodiebold/tornado.git --branch http-timeouts-2.4 /home/crawl-dev/tornado
cd /home/crawl-dev/tornado
python setup.py build

NOTE from johnstein:

/etc/init.d/webtiles sets PYTHONPATH for this, so you shouldn't need to install this, but I wasn't able to make it work and needed:
python setup.py install

NOTE from ZiBuDo:

Troubleshoot with sudo tail -f /home/crawl/DGL/crawl-master/webserver/run/webtiles.log
If latin1 encoding error: edit the final file in the trace and change 'latin1' to 'UTF-8'.
If ascii encoding error: edit the files in the trace to have "from encodings import ascii" at the top of the file.
If it can't bind the websocket: make sure it's trying port 8080 and that crawl has proper permissions on the folders you created.

NOTE from espais:

I have had an issue with dgamelaunch/WebTiles on Ubuntu 16.04 where the mounting procedure and permissions get trounced upon rebooting. To resolve this I added the following to /etc/rc.local (before exit 0)
# unmount and remount pts/proc
umount /home/crawl/DGL/dev/pts
umount /home/crawl/DGL/proc
mount /home/crawl/DGL/dev/pts
mount /home/crawl/DGL/proc

# fix ptmx permissions 
chmod 666 /home/crawl/DGL/dev/ptmx

# restart webtiles -- webtiles restart doesn't seem to work very well
/etc/init.d/webtiles stop   
/etc/init.d/webtiles start

exit 0 # normally already in /etc/rc.local

NOTE from floraline:

I had an issue where I had enabled the built-in SSL options in WebTiles, and server.py would start to consume 100% CPU time for several hours at a time. This was caused by clients, usually botnets doing port scans and other things, disconnecting in the middle of the SSL handshake. Tornado 2.4.1 does not handle broken SSL handshakes and will enter into a state where it uses 100% CPU and won't stop on its own. I fixed this by making the following change in tornado/iostream.py, routine _do_ssl_handshake:
  except socket.error, err:
-     if err.args[0] in (errno.ECONNABORTED, errno.ECONNRESET):
+     if err.args[0] in (errno.ECONNABORTED, errno.ECONNRESET) or err.args[0] == errno.EBADF:
          return self.close()

13. Other notes

I'm sure there's more… launching the inotify watcher, crontabs for compressing ttyrecs, cleaning out old trunks, making logfiles and milestones available over the web, setting up an ssh user, set up a dgl-status script in cgi-bin, forwarding port 80 requests to 8080 for webtiles, etc.

13.1 user database

To create the user database, either run webtiles, or do:

sqlite3 /home/crawl/DGL/dgldir/dgamelaunch.db
sqlite> CREATE TABLE dglusers (id integer primary key, username text, email text, env text, password text, flags integer);
sqlite> .quit
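The same table can be created non-interactively; a sketch wrapping the sqlite3 CLI (the helper name is mine; the schema is the one shown above):

```shell
# Sketch: create the dglusers table non-interactively with the sqlite3 CLI.
# Same schema as the interactive session above.
create_user_db() {
  sqlite3 "$1" 'CREATE TABLE IF NOT EXISTS dglusers (
    id integer primary key, username text, email text,
    env text, password text, flags integer);'
}
# e.g. create_user_db /home/crawl/DGL/dgldir/dgamelaunch.db
```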

13.2 ttyrec compression

For ttyrec compression (these files are huge, even when compressed), make sure you have the program lsof on the host, and run the following script from a cron job (via sudo, as crawl-dev):

#! /bin/bash
exec >>/home/crawl-dev/logs/compress-ttyrecs.log 2>&1
/home/crawl-dev/dgamelaunch-config/bin/dgl compress-ttyrecs

13.3 clean old trunk versions

For cleaning out old trunk versions, run the following script from a cron job (as crawl-dev):

#! /bin/bash
exec >>/home/crawl-dev/logs/clean-trunks.log 2>&1

DGL=/home/crawl-dev/dgamelaunch-config/bin/dgl

# tail -n +6 to skip the header and, more importantly, the most recent
# trunk version.
readarray -t to_del < <(
  $DGL remove-trunks -v | tail -n +6 | awk '$6==0 { sub(".*g","",$4); print $4 }'
)

if (( ${#to_del[@]} )); then
  echo -n "Cleaning trunks at "
  date;
  $DGL remove-trunks "${to_del[@]}"
  echo done.
  echo
fi
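To see what the awk stage above does, here is the hash extraction applied to a single made-up input line (the real remove-trunks -v output format may differ; this only illustrates the field positions the script assumes, with field 4 holding the version string and field 6 the use count):

```shell
# Made-up input line: field 4 ends in -g<hash>, field 6 is a zero use count.
# sub(".*g","",$4) greedily strips everything up to the last "g", leaving
# the bare commit hash.
echo 'a b c 0.32-a0-57-gdeadbeef e 0' |
  awk '$6==0 { sub(".*g","",$4); print $4 }'
```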

13.4 dgamelaunch ssh user

Install sshd, if not already installed:

sudo apt install openssh-server

Add an ssh user. (The chpasswd line sets the password to 'crawler'; if you want an empty password you will have to set that elsewhere. Using an ssh key is recommended.)

sudo useradd crawler --shell /usr/local/sbin/dgamelaunch
echo crawler:crawler | sudo chpasswd

Note that your ssh user should have /usr/local/sbin/dgamelaunch as their shell, and should have forwarding (particularly TCP forwarding) disabled. You could use the following at the end of your /etc/ssh/sshd_config (note that “Match” affects all options until the next “Match” or the end of the file).

Match User crawler
 AllowAgentForwarding no
 AllowTcpForwarding no
 X11Forwarding no
 # only if you want to allow passwords:
 PasswordAuthentication yes

If it's not already running, start the sshd server:

/etc/init.d/ssh start

13.5 dgl-status

Set up this dgl-status script in your cgi-bin with 755 permissions and root ownership (for debian/ubuntu, this is located at /usr/lib/cgi-bin). This script creates an easy-to-parse list of all the players.

#! /bin/sh

echo Content-type: text/plain
echo
umask 0022
exec /usr/local/sbin/dgamelaunch -s

Send the URL to Wensley (on Libera IRC, #crawl-dev, !tell Wensley the-url/cgi-bin/dgl-status)

13.6 inotify

Launch the inotify watcher which is necessary to populate the “current location” field in DGL.

sudo /home/crawl-dev/dgamelaunch-config/bin/dgl crawl-inotify-dglwhere

Notes:

  • The watcher has a dependency on the Linux/Inotify2.pm Perl module. This is available in Debian as package liblinux-inotify2-perl.
  • If you add new inprogress directories (for new game versions) you'll need to find and kill that daemon then restart it.

13.7 stats caching

If you want your milestones, logfiles, and scores to be cacheable for stats, you will need to publish those files. Create symlinks for each file and put them in the appropriate folders in /var/www. Suggested directory structure:

  /var/www/crawl/meta/git/
  /var/www/crawl/meta/0.13/
  /var/www/crawl/meta/0.23/
  etc
  

with symlinks to each milestone, logfile, and score file (you will need to individually link each score file for each sprint map).

  ls -lt /var/www/crawl/meta/git/    
  meta/git:
  total 8
  lrwxrwxrwx 1 root root 52 Jan  3 18:31 logfile -> /home/crawl/DGL/crawl-master/crawl-git/saves/logfile
  lrwxrwxrwx 1 root root 59 Jan  3 18:31 logfile-sprint -> /home/crawl/DGL/crawl-master/crawl-git/saves/logfile-sprint
  lrwxrwxrwx 1 root root 59 Jan  3 18:31 logfile-zotdef -> /home/crawl/DGL/crawl-master/crawl-git/saves/logfile-zotdef
  lrwxrwxrwx 1 root root 55 Jan  3 18:31 milestones -> /home/crawl/DGL/crawl-master/crawl-git/saves/milestones
  lrwxrwxrwx 1 root root 62 Jan  3 18:31 milestones-sprint -> /home/crawl/DGL/crawl-master/crawl-git/saves/milestones-sprint
  lrwxrwxrwx 1 root root 62 Jan  3 18:31 milestones-zotdef -> /home/crawl/DGL/crawl-master/crawl-git/saves/milestones-zotdef
  lrwxrwxrwx 1 root root 51 Jan  3 18:31 scores -> /home/crawl/DGL/crawl-master/crawl-git/saves/scores
  lrwxrwxrwx 1 root root 58 Jan  3 18:31 scores-sprint -> /home/crawl/DGL/crawl-master/crawl-git/saves/scores-sprint
  lrwxrwxrwx 1 root root 58 Jan  3 18:31 scores-zotdef -> /home/crawl/DGL/crawl-master/crawl-git/saves/scores-zotdef

It's recommended to also publish your rcfiles, morgue files, and ttyrec files.

  ls -lt /var/www/crawl
  total 12
  drwxr-xr-x 2 root root 4096 Jan  5 00:55 keys
  -rw-r--r-- 1 root root  348 Jan  4 19:31 default.asp
  drwxr-xr-x 4 root root 4096 Jan  3 18:33 meta
  lrwxrwxrwx 1 root root   30 Jan  2 01:33 ttyrec -> /home/crawl/DGL/dgldir/ttyrec/
  lrwxrwxrwx 1 root root   31 Jan  2 01:33 rcfiles -> /home/crawl/DGL/dgldir/rcfiles/
  lrwxrwxrwx 1 root root   30 Jan  2 01:31 morgue -> /home/crawl/DGL/dgldir/morgue/

sudo mkdir /var/www/crawl/meta/git/
sudo mkdir /var/www/crawl/meta/0.24/

Do this in /var/www/crawl/meta/git:

sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/logfile
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/logfile-sprint
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/logfile-zotdef
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/milestones
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/milestones-sprint
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/milestones-zotdef
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/scores
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/scores-sprint
sudo ln -s /home/crawl/DGL/crawl-master/crawl-git/saves/scores-zotdef
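The per-version links can also be made in one loop; a sketch (the helper name is mine; the file list follows the listing above, and on a real server it should run as root from the target meta directory):

```shell
# Sketch: symlink every logfile/milestone/score file for one version into
# the current directory, e.g. run from /var/www/crawl/meta/git.
link_meta() {
  saves=$1   # e.g. /home/crawl/DGL/crawl-master/crawl-git/saves
  for f in logfile logfile-sprint logfile-zotdef \
           milestones milestones-sprint milestones-zotdef \
           scores scores-sprint scores-zotdef; do
    ln -sf "$saves/$f" .
  done
}
```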

13.8 redirect port 80

To make Apache automatically redirect webtiles requests from port 80 to port 8080 (where webtiles runs by default) on Ubuntu, create the following config file. (This was already noted in the full Apache config above.)

  sudo vi /etc/apache2/conf.d/mysite
  
  <VirtualHost *:80>
  ServerName crawl.mysite.org
  DocumentRoot /var/www/
  RewriteEngine on
  RewriteRule ^/crawl - [L]
  RewriteRule ^/crawl/morgue - [L]
  RewriteRule ^/crawl/rcfiles - [L]
  RewriteRule ^/crawl/ttyrec - [L]
  RewriteRule ^/crawl/meta - [L]
  RewriteRule ^/(.*) http://crawl.mysite.org:8080/$1
  </VirtualHost>

Edit DocumentRoot and the RewriteRules for the morgues, rcfiles, ttyrecs, and milestones/logs/scores as required.

then restart apache

sudo service apache2 reload

13.9 admins

To add admins to the server (allows access to backup saves and wizmode!):

sudo /home/crawl-dev/dgamelaunch-config/bin/dgl admin add <name>

14. Security notes

Security note: As long as crawl-dev has write permissions to the dgl script and its subdirectories, they can easily use their sudo access to run arbitrary commands. If they can edit and publish the dgamelaunch or webtiles config, they can trick the server into running arbitrary commands as arbitrary users.

If you want to give users the ability to manage crawl without giving them complete root access, move the whole /home/crawl-dev/dgamelaunch-config directory to a system-wide location like /usr/local/lib, make it writable only by root, and update the sudoers entry and the various cron jobs and init scripts. The crawl-dev user then won't be able to change the server configuration, but will at least be able to use the various management commands (resetting user passwords, etc.).

dcss/server/setting_up_dgamelaunch_and_webtiles.txt · Last modified: 2024-06-10 21:04 by advil
 