So we had a 10-hour hackathon at our Facebook Tel-Aviv office, and the idea was to hack on iOS stuff. Apple had released beta versions of watchOS 2 that let you run native apps on the watch, and we thought it could be fun to port Doom over to it! (not to mention the fact that JOHN CARMACK is a colleague OMG!!)
Having only a few hours, we figured we’d better start with the simplest version of Doom we could find. A quick search turned up nDoom, which seemed very interesting. This thing runs on graphing calculators; it should run on a watch…. mmm… right??
Xcode -> File -> New Project… a few hours and quite a few code fixes later, the code compiled! OK, apart from the fact that it insta-crashed, we were at least getting somewhere. After fixing the first few crashes, we chose to believe that the number of places where the code crashes must be finite, and that going head-to-head with it would hopefully take less than the few hours we had left.
For the next few hours, we had to quickly understand what makes this Doom code tick, integrate its run loop with ours, figure out how to output an image from it, come up with controls that fit on a watch, tune the performance, and fix mysterious bugs. Eventually, this is what we ended up with:
• As you can’t overlay UI elements on the watch, we used interface groups to create a 3×3 grid of buttons. The lower buttons are toggles for walking around, and the upper ones are used for shooting and opening doors.
• We found Doom’s buffer that holds the actual pixel data for each frame. Every time there’s a new frame to display, we use CoreGraphics to turn the buffer into a UIImage. This initially gave us grayscale images, but after figuring out Doom’s color palettes and applying them, we got color! We then set this image as the background image of the topmost container.
• We used the UI thread to dispatch iterations of Doom’s run loop, and wired button taps to post events to Doom as if they were coming from a keyboard.
• By far the most intensive task was drawing images to the screen. Trying to draw them too fast resulted in annoying unresponsiveness. Tweaking the UIImage’s properties and only updating the image when something had changed allowed us to squeeze out some more juice.
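To give a flavor of the control scheme, the 3×3 button grid boils down to a simple index-to-action mapping. This is just an illustrative sketch in C (the names and layout here are ours, not the project’s actual code):

```c
/* Hypothetical actions for a 3x3 on-screen grid, indices 0..8 in
 * row-major order from the top-left. The upper row shoots and opens
 * doors; the lower rows toggle movement. Names are illustrative. */
typedef enum {
    ACT_FIRE, ACT_USE, ACT_NONE,
    ACT_TURN_LEFT, ACT_FORWARD, ACT_TURN_RIGHT,
    ACT_STRAFE_LEFT, ACT_BACKWARD, ACT_STRAFE_RIGHT
} action_t;

static action_t action_for_button(int index) {
    static const action_t grid[9] = {
        ACT_FIRE, ACT_USE, ACT_NONE,
        ACT_TURN_LEFT, ACT_FORWARD, ACT_TURN_RIGHT,
        ACT_STRAFE_LEFT, ACT_BACKWARD, ACT_STRAFE_RIGHT
    };
    return (index >= 0 && index < 9) ? grid[index] : ACT_NONE;
}
```

Each WatchKit button handler would just call this with its grid position and translate the resulting action into an input event for Doom.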
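Feeding input works because Doom consumes everything through an event queue: building an event struct and posting it makes a button tap indistinguishable from a key press. The sketch below mimics the shape of classic Doom’s `event_t`/`D_PostEvent` interface, but the types and queue here are simplified stand-ins, not the actual source:

```c
/* Simplified stand-ins for Doom's event machinery. */
typedef enum { ev_keydown, ev_keyup } evtype_t;
typedef struct { evtype_t type; int data1; /* key code */ } event_t;

#define MAXEVENTS 64
static event_t eventqueue[MAXEVENTS];
static int eventhead = 0;

/* Mimics D_PostEvent: append the event for the game loop to consume. */
static void D_PostEvent(const event_t *ev) {
    eventqueue[eventhead] = *ev;
    eventhead = (eventhead + 1) % MAXEVENTS;
}

/* Called from a button handler: a press posts keydown, a release
 * (or toggle-off) posts keyup, just as a keyboard driver would. */
static void post_key(int doom_key, int pressed) {
    event_t ev = { pressed ? ev_keydown : ev_keyup, doom_key };
    D_PostEvent(&ev);
}
```

With this in place, the walk toggles simply post a keydown when switched on and the matching keyup when switched off.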
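The “only update if anything has changed” trick from the last bullet can be as simple as keeping a copy of the last frame pushed to the screen and comparing against it, so the expensive CoreGraphics round-trip is skipped when the screen would look identical. A minimal sketch, assuming Doom’s standard 320×200 frame size (illustrative, not the project’s code):

```c
#include <stdint.h>
#include <string.h>

#define FRAME_BYTES (320 * 200)  /* classic Doom frame, 8-bit indexed */
static uint8_t last_frame[FRAME_BYTES];

/* Returns 1 if this frame differs from the last one drawn (and records
 * it as the new baseline), 0 if the redraw can safely be skipped. */
static int frame_needs_redraw(const uint8_t *frame) {
    if (memcmp(last_frame, frame, FRAME_BYTES) == 0)
        return 0;
    memcpy(last_frame, frame, FRAME_BYTES);
    return 1;
}
```

A full `memcmp` of 64,000 bytes is still far cheaper than building and pushing a new UIImage, which is why this pays off on the watch.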
And the result can be seen in the video! (disclaimer: this isn’t production code, it was a kick-ass hackathon project, so we won’t be making the source code available; it’s just an example.)
Well, now that we’re done with this, the coffee machine with its color touch screen just stares at us, waiting… for the next hackathon!