Saturday, September 22, 2012

Dante and Waffle

Chad Versace's XDC2012 Waffle presentation included a demo of Dante running with a Waffle backend, with dynamic switching between the X11 (GLX) and Wayland window systems via an environment variable.

I definitely intend to research and hopefully even contribute to Waffle in the future.

Chad quickly wrote 95% (or more) of the Waffle backend. I provided some guidance and a small patch to stubbed-out X11 input. Together, we managed to pull this together just under the wire! Chad, you're awesome! :-)

Friday, August 17, 2012

Prebuilt Android APKs

I have quickly uploaded a Dante-debug.apk along with some basic info.

Debug in the APK context means the package is not signed with a release key, so you need to enable installation of such packages in the Android Settings menu. The code is compiled with optimizations, but do not expect anything above single-digit frames per second at present.

Optimization is something I'm working towards, but the fact remains that many of the current generation devices simply do not have the GPU horsepower to run the game at an acceptable (greater than or equal to 30 fps) rate. Memory is another major killer; you need as much as possible available.

Currently the Dante icon will launch the Doom 3 game to the main menu. However, at least on my device it's not possible to launch the single-player campaign (aka game/mars_city1) due to memory requirements. Again, this is something that will be improved in the future.

You need to create the directory and copy your Doom 3 *.pk4 files from your CDs or DVD into /sdcard/Android/data/. You must also copy pak000_gl2progs.pk4 into the same directory, otherwise the game will not run! This file contains the GLSL shaders required by Dante.
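As a sketch, that staging step might look like the following (the stage_dante_data function name and the paths are illustrative, not part of Dante; adjust them for your device):

```shell
# stage_dante_data SRC DEST: copy the retail *.pk4 files and the GLSL
# shader pack into Dante's data directory. Paths are examples only.
stage_dante_data() {
    src="$1"; dest="$2"
    mkdir -p "$dest" || return 1
    cp "$src"/*.pk4 "$dest"/ || return 1
    # pak000_gl2progs.pk4 contains the GLSL shaders; Dante will not
    # start without it, so warn loudly if it is missing.
    if [ ! -f "$dest/pak000_gl2progs.pk4" ]; then
        echo "warning: pak000_gl2progs.pk4 missing from $dest" >&2
    fi
}

# Example: stage_dante_data ~/doom3/base /sdcard/Android/data
```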

The APK package is not guaranteed to run on your specific device. You're welcome to file an issue about it on GitHub, but please check for duplicates first. It's also not my fault if the APK destroys your device, kills your kitten, or causes the Mayan Apocalypse.

I'll write up some detailed instructions for compiling and installing Dante on Android later this week or early next week for those who wish to do this or don't trust installing my APK package.

Happy hacking!

Sunday, August 12, 2012

Dante with Android

With much thanks to Krzysztof 'kkszysiu' Klinikowski we now have mostly working support for Android devices in Dante. That is to say, the code is there, the build scripts are there, but it's still not an easy process for the user to build their own Dante.apk package for their phone/tablet.

I am considering writing a detailed how-to guide on setting up the Android SDK and NDK, creating the standalone toolchain, and running the build. It's not really difficult, it's just tedious to execute all of the commands required for the Android build.

I have shell scripts checked into Git which do most (but not all) of the work; however, you still need to know how to use them, how to modify them to point at your NDK and standalone toolchain, and finally how to transfer the Doom 3 media assets onto your device. Right now the process is neither easy nor streamlined.

We are not yet at a point where this is user friendly! You're welcome to visit us in the #dante IRC channel if you really want to build your own packages, get involved in the development, or just see what's happening in real time. Of course, this is a free-time project, so you may have to idle a bit; the channel is not always active.

  • Performance is currently bad. I am aware of this and it's something we're working on improving in various ways. It's unlikely to happen overnight.
  • Touchscreen input is also bad, but I expect this will improve fairly quickly. Adding good on-screen controls for moving and looking around is, I think, a much more difficult problem. Perhaps it will be solved with a wearable device (such as a phone or mini-tablet), a Bluetooth wireless controller, and the Oculus Rift. I'll have to wait for my unassembled prototype in November to find out...
  • Audio support currently does not exist, though progress is being made in this area.
You can see the bugs and planned enhancements/features on Dante's GitHub Issue Tracker.

Happy Hacking!

Thursday, August 2, 2012

Don't Panic! Doom 3 Repository Rename

I have decided to rename my Doom 3 repository (aka "idTech4 ES2.0", aka "Oliver's Doom", aka "omcfadde's Doom", ...) to an easier to remember name.

Going forward, the repository will be named "Dante" (based on Dante's Inferno and its fitting description of the Doom 3 game.) Dante journeyed through Hell, guided by the Roman poet Virgil.

Hopefully this will make it much easier to uniquely and easily identify my repository. Unfortunately, those of you who already have clones of the idtech4 repository from GitHub will need to adjust the remote origin to point to the new address. I apologize for any inconvenience.
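Adjusting the remote is a one-liner; here is a sketch (the update_origin helper name is mine, and the URL below is a placeholder, so substitute the real Dante address from GitHub):

```shell
# update_origin URL: point the current clone's origin at a new address,
# then print the remotes so you can verify the change took effect.
update_origin() {
    git remote set-url origin "$1" && git remote -v
}

# Example: update_origin https://github.com/USER/dante.git
```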

Don't worry, the project is continuing as usual; the only change is the repository name. :-)

Wednesday, August 1, 2012

Doom 3: Blinn-Phong vs Phong

Without going into the technical and implementation details of each shading algorithm, I would like to request comments as to which you prefer visually.

Unmodified Doom 3 uses the Blinn-Phong shading model, which is an approximation of the more computationally expensive Phong shading model. Both images use a fixed exponent of 16.0f; however, it's possible to adjust the exponent based on material type in both shading models. Currently, this is not done.

A higher exponent produces a tight specular highlight (for example chrome metal) while a lower exponent will produce a broader highlight (for example matte paint or skin tones.)

Please note that some tuning of the exponent may be required for the Phong model due to differences in the lighting calculation.
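For reference, the two models differ only in which vector is raised to the exponent (these are the standard textbook definitions, not Dante-specific code):

```latex
\text{Blinn-Phong:}\quad I_s = (\mathbf{N}\cdot\mathbf{H})^{n},\qquad
\mathbf{H} = \frac{\mathbf{L}+\mathbf{V}}{\lVert\mathbf{L}+\mathbf{V}\rVert}

\text{Phong:}\quad I_s = (\mathbf{R}\cdot\mathbf{V})^{n},\qquad
\mathbf{R} = 2(\mathbf{N}\cdot\mathbf{L})\mathbf{N}-\mathbf{L}
```

where N is the surface normal, L the direction to the light, V the direction to the viewer, and H the half-vector. A common rule of thumb is that Blinn-Phong needs roughly four times the Phong exponent to produce a similarly sized highlight, which is why Phong at 4.0f looks closer to Blinn-Phong at 16.0f than Phong at 16.0f does.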

Doom 3: Blinn-Phong (16.0f)
Doom 3: Phong (16.0f)
Doom 3: Phong (4.0f)

Tuesday, July 24, 2012

idTech 4 (aka Doom 3) ES2.0

I've pushed my Doom 3 ES2.0 branch onto GitHub, including the GLSL shaders. These shaders could also be used as a template for writing new ARB programs for the minimum assets required by iodoom3. The shaders are licensed as GPLv3, and Doom 3 is licensed as GPLv3 plus additional terms as defined by id Software.

Things are starting to liven up over on #iodoom3 (Freenode) and we'll hopefully start seeing some cool things very soon. :-)

I've put a PayPal donate button on my blog; many people don't realize the time and effort that goes into these hobby projects. If this project, or other projects of mine, have been useful to you, please do make a donation (even if it's a "gold coin donation" of 1-2€ or 1-2$.)

I'll stop shamelessly pimping myself out now and get ready for my day job.

Happy hacking!

Friday, May 11, 2012

Doom 3 for Nokia N900

I'm sick with a cold, but since I ran out of movies to watch I thought it best to try to distract myself by seeing whether the N900 could run my Doom 3 branch. It turns out the answer is "yes, but very slowly."

I suspect this is due to the completely non-optimized code, a poor choice of texture formats (it should be using RGB565), and the use of highp everywhere in the shaders.

Nevertheless, it does run despite the currently poor performance and a few rendering bugs...

Saturday, April 28, 2012

Doom 3 and ES2

I spent some time hacking on Doom 3 adding support for EGL and OpenGL ES2.0; obviously there is still a significant amount of work to be completed before this looks even remotely like the game. Some of you might make out the text console in this image, rendered with the incorrect shader. :-) Phoronix, go nuts... :-)

(Updated again 2012-05-07: Doom 3 is nothing without stencil shadow volumes!)

(Updated 2012-05-07: I discovered and fixed a little bug in the modelViewProjection matrix...)

Wednesday, April 25, 2012

Google Calendar Bug-Fixes

I have discovered some bugs in my previous entry regarding Google Calendar synchronization. Firstly, there is a bug in the locking mechanism causing it to sometimes hang while running fetchmail. The following are the corrected .procmailrc entries:

# === Google Calendar (from Microsoft Exchange; iCalendar) ===
:0 c:
* H ?? ^Content-Type: multipart/(alternative|mixed)
* B ?? ^Content-Type: text/calendar
 :0 Wac: Mail/multipart.lock
 | munpack -q -t -C ~/Mail/.munpack 2> /dev/null
 :0 Wac: Mail/multipart.lock
 | /home/oliver/bin/
 :0 Wac: Mail/multipart.lock
 | rm -f ~/Mail/.munpack/*

I discovered the second bug when synchronizing a particular iCalendar file containing several BEGIN:VEVENT blocks. Events are expected to have a UID field. I do not know whether this should be unique within the scope of your entire calendar, but it must be unique within the scope of the BEGIN:VCALENDAR block (i.e. the entire iCalendar file.)

Exchange, in all its infinite stupidity, decides to assign the same UID to every VEVENT, and as soon as you attempt to upload it to Google Calendar via WebDav you will receive a cryptic error message:
500 Server error code: ........ (Pseudo-random characters)

You'll receive an even less helpful error message when trying the "Import Calendar" feature of the web interface:
Failed to import events: Could not upload your events because you do not have sufficient access on the target calendar.

What is the solution? Well, funnily enough, a Unique Identifier (UID) should actually be unique! Changing those UIDs to unique values and uploading the file again works perfectly! (Anything works; I used echo $RANDOM | md5sum | cut -d ' ' -f 1, i.e. the MD5 hash of a pseudo-random number.)
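Before uploading, you can check whether a file still contains this Exchange quirk; here is a small sketch (dup_uids is just an illustrative helper name, not part of any tool):

```shell
# dup_uids FILE: print every UID line that appears more than once in
# an iCalendar file; no output means all UIDs are unique.
dup_uids() {
    grep '^UID' "$1" | sort | uniq -d
}
```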

I am only aware of one such formatted calendar entry in my dataset, so I adjusted it manually. However, I'd feel safer using sed or awk to replace each UID with a real unique identifier. I'll post such a modified script in the second part, whenever I feel like playing with awk again! You could always beat me to it... Happy hacking!
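Until then, here is a rough sketch of what such a script might look like (fix_uids is a hypothetical helper; it reuses the same md5sum trick described above, mixing in the line number so repeated UIDs always diverge):

```shell
# fix_uids IN OUT: copy an iCalendar file, replacing every UID line
# with a fresh pseudo-random identifier (md5 of awk's rand() plus the
# line number).
fix_uids() {
    awk 'BEGIN { srand() }
         /^UID/ {
             cmd = "echo " rand() NR " | md5sum | cut -d\" \" -f1"
             cmd | getline uid; close(cmd)
             print "UID:" uid; next
         }
         { print }' "$1" > "$2"
}
```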

Sunday, April 22, 2012

Doom 3 GLSL

I recently implemented a GLSL renderer backend for Doom 3. Yes, a couple of backends already exist (e.g. raynorpat's); unfortunately these did not run successfully on my hardware and had serious rendering and pixel errors.

These images are from the first implementation of my backend, where I had accidentally called normalize() on a vector which was already almost normalized. The result is pixel imperfection compared to the standard ARB2 backend, plus the cost of a pointless normalization in the fragment shader.

You can also see the importance of running a comparison or image-diff program when implementing a new backend. Can you see the differences between the first two images immediately, with the naked eye? I couldn't.

Finally, here is the backend running the hellhole level. The black regions are areas that would be rendered by the (currently unimplemented in GLSL) heatHaze shader. Not bad for an i965 GPU.

Just for the laughs, here is what happens when Doom 3 decides to try LSD; or fails to pass initialized texture coordinates from the vertex program to the fragment program in the ARB2 backend.

Friday, April 20, 2012

From fetchmail/procmail to Google Calendar

I spent many hours scouring Google before finally figuring out how to implement this easily. Calendar synchronization is very important to me; I want to be able to look at my phone, my tablet, or my browser and see my entire schedule (otherwise I'd never show up anywhere.)

:0 Wc: Mail/multipart.lock
* H ?? ^Content-Type: multipart/(alternative|mixed)
* B ?? ^Content-Type: text/calendar
        :0 Wac:
        | munpack -q -t -C ~/Mail/.munpack 2> /dev/null
        :0 Wac:
        | /home/oliver/bin/
        :0 Wc:
        | rm -f ~/Mail/.munpack/*
This beautiful undocumented recipe is actually quite simple, but perhaps needs some explanation. From man procmailrc:

A line starting with ':' marks the beginning of a recipe.
It has the following format:

       :0 [flags] [ : [locallockfile] ]

       zero or more conditions (one per line)
       exactly one action line

W means wait until the program finishes and ignore any failure messages. a means the preceding recipe must have successfully completed before procmail will execute this recipe. Finally, c means carbon-copy (this recipe does not handle delivery of the mail, so it must proceed further through the chain of recipes.)

It may be more appropriate to use f (this pipe is a filter) instead. I am new to procmail configuration; c works for me.

First, match messages containing a Content-Type header of either multipart/alternative or multipart/mixed; second, check the message body for a Content-Type of text/calendar (which is the data we want.) If both conditions are true, unpack the MIME multipart message into separate files in a temporary directory with munpack, run the script which does the magic, and finally clean up after ourselves.


#!/bin/bash
for i in ~/Mail/.munpack/*; do
        if [ -n "$(grep 'BEGIN:VCALENDAR' ${i} 2> /dev/null)" ]; then
                expect -dc \
                "spawn /usr/bin/cadaver -p ${http_proxy/http:\/\/} \
                ${CALENDAR_ID}/events ; \
                expect \"dav:\" ; \
                send \"put ${i}\\r\" ; \
                expect \"dav:\" ; \
                send \"bye\\r\""
        fi
done
Newlines have been inserted into the script so that it doesn't break layouts on other pages; you will need to fix those if copying this script.

I use expect and cadaver to upload the iCalendar file extracted by munpack to Google Calendar via their WebDav interface. Finding the CALENDAR_ID is a little confusing. You can find it from the "Settings" option when you log in to Google Calendar via a browser. You must use the public Calendar Address, not the Private Address, but you do not need to share the calendar publicly; Google will request your login details via WebDav. This is easiest to configure in .netrc. chmod 600 the file for some additional safety.
login john.doe
password example123
The login should not include the domain part, otherwise it will fail. Happy hacking!

Fun with UTF-8

I finally decided to configure my computers for UTF-8, mostly due to the frustration of attempting to determine which Finnish character should have been displayed (and the probable annoyance of Finns having their names mangled by my mail client.)
  • /etc/locale.gen is useful for generating only the locales you actually need. I selected English (US and UK), Finnish, and Swedish as UTF-8, ISO-8859-1, and ISO-8859-15 (where appropriate.)
  • .bashrc: export LANG=en_US.UTF-8
  • Setup .Xdefaults:
XTerm*Background: Black
XTerm*Foreground: White
XTerm*UTF8: 2
XTerm*utf8Fonts: 2
Actually, the UTF-8 options are not required due to the way I launch terminals from my window manager, but they do no harm. A black terminal background is mandatory!

The fun part was figuring out why my xterm, specified in .ratpoisonrc as bind c exec xterm -u8, did not launch a UTF-8 terminal. I then realized that Ratpoison was launching xterm from a bare environment where the LANG variable was never set; quickly changing the binding to bind c exec xterm +lc -u8 (switch off automatic selection of encoding and respect my -u8 option) resolved the final issue.

Now I can happily read the ä's, ö's, and Å's from a language I do not understand (excluding a few keywords and phrases.) Cool!

Monday, January 30, 2012


I suspect that I'll be receiving quite a few more hits here given the recent retweets. Unfortunately I haven't had much time to update my blog... In fact, I may not have any time in the future.

On a more positive note, I am much more active on Twitter, so you can follow me there (@omcfadde)

Of course the topics we're discussing, SOPA, ACTA, etc are definitely not pleasant or positive. Let's show them we mean business and that we will not accept these agreements!


Updates (possibly) coming in the future.
