flipCode - Tech File - Henry Robinson
Henry Robinson

E-Mail: hr109@york.ac.uk

   06/21/2001, Tech File Update

After a long hiatus, I return. Let's get straight to some interesting bits and pieces. Those of you who are interested in where I've been for the past year can read to the bottom; those who aren't, can't.

What Have I Been Doing With My Time?

I know you don't care about the detailed answer to this question, so I'll keep it technical. I spent last summer working on a project in my department at the University of York. I was funded by Microsoft, but other people involved were funded by an external company, to whom we were to deliver the final project. We were building a 'virtual agent' - an artificially intelligent avatar that would answer questions on a chosen subject domain. Essentially, the user asks questions (by speech and/or keyboard input) in natural English, and the avatar responds with all the knowledge it has, altering its answer to take the current context into account.

Limiting the domain is very important, as you can appreciate. Merely building a system that coped with questions on our chosen domain - the workings of a combustion engine - was a large enough task, since the myriad different ways a human can phrase a question yield a combinatorial explosion in the question -> answer mapping. This was also the reason we couldn't simply keep a table of possible questions and their answers - the space requirements got too large too quickly. Instead, we stored a representation of what the avatar knew, and created a system that would analyse a question written in English and map it onto that knowledge representation.

This was all done using a logic programming language called Prolog, which is highly suited to AI. Prolog works by maintaining a database of known facts and rules, and deduces the proveability (my own word :) of a given fact from the known facts and the deduction rules mandated by the logic system it adheres to. For instance, if we had a known set of facts:

    male(joe).
    parent(joe, fred).
    father(X, Y) :- male(X), parent(X, Y).   % read as: male(X) AND parent(X, Y)
                                             % IMPLIES father(X, Y)
we can run Prolog queries like father(joe, fred), and have Prolog return true: it searches the facts and rules it knows, chaining deductions together until it finds a derivation of the query. If it can derive father(joe, fred), it knows that father(joe, fred) is true.

Further, we can have Prolog find a variable satisfying a query. For instance, father(X, fred) would return X = joe.

This is a powerful mechanism, because it automates the ability to 'deduce' things. Once we tell a system everything we know, it is able to deduce further information by understanding the logical relationships between the facts it holds. This also had the pleasing effect that the system began to know things we didn't expect it to, and test sessions with the system became more and more interesting as the program began to teach us stuff about combustion engines, rather than the other way around.

Performing the Natural Language analysis of the input was not trivial - again Prolog was used to break the sentence up into object, subject, verb etc. The language understood was generated by a grammar, formalising the structure of sentences. Once the input was parsed by this grammar, we had the building blocks required to perform some semantic analysis, and thereby understand exactly what the user was asking. We could then map this understanding onto our knowledge base, and retrieve an English sentence which we then spoke to the user.

(As you may be able to tell, I wasn't heavily involved in the AI side, so I'm not authoritative on exactly how that was performed, and unable to get really deep into implementation details.)

So if I wasn't a core-AI man, what was I doing?

I was on the presentation side. We had to produce a 3D simulation involving our avatar (a professor model called Alfred) and a combustion engine. I was originally pigeon-holed into working on the camera planning - keeping Alfred and his target in shot, without confusing the viewer, as he wandered around the model. However, as is often the case in projects I imagine, it became apparent that no-one was really building the actual application, the engine. We had a guy working on facial animation and lip-sync, a guy working on dynamic gesturing, somebody doing skeletal animation, and somebody else working on path planning (who also gave me the experience of what it's like to work with someone who's much, much brighter than you are. Hi Peter :). Nobody was actually taking responsibility for putting it all together. Enter our hero...

So essentially I wrote an engine. I could have knocked up a hacky little GL renderer where everybody issued their own commands and the engine pretty much played the role of GLUT, but I didn't want to do that - I wanted to build something useful. So I spent a little time playing with ideas, and came up with my micro-kernel approach (which I guarantee is not new, but I hadn't built something like that before). Essentially, the engine is a skeleton kernel, which loads required functionality from modules, or SubSystem objects (everything in the engine is an object). SubSystems are loaded from DLLs at runtime.

At this point I feel I should share a handy little tip: how do you dynamically instantiate classes at run-time from DLLs? (For those who don't appreciate why this would be a problem, consider how a linker could possibly locate the correct code for a class which is not linked at compile-time.) The answer lies in the solution for non-object-based procedures - have a generic function pointer that is assigned an implementation by GetProcAddress. We do a similar thing with classes - our equivalent of the function pointer is a pointer to a virtual base class. (In fact, virtual-ity is not strictly necessary, but 99 times out of 100 you'll use it.) Unfortunately, we have no direct equivalent of GetProcAddress for objects - since objects are instances of a type, each one has to be allocated as we use it. We need a GetNewInstanceAddress procedure that gives us a new instance of the class every time we call it. This will take the place of our new operator for DLL-based classes.

So how to do this? It's very simple indeed. A GetNewInstanceAddress procedure could look like this:

BaseClass *GetNewInstanceAddress()
{
    return new ImplementationOfBaseClass;
}
So to instantiate our class we just GetProcAddress the GetNewInstanceAddress function, and call it.

Why does this work, and why can't we just do new BaseClass in our code? The reason is that at link time, if the linker can't find the implementation of BaseClass, it'll complain at you. (More accurately, it'll complain when it can't find the implementation of class methods that you try to call). If you mark those methods as virtual, the linker realises it has to think on its feet at run-time, and will dynamically resolve virtual calls to the actual implementation in a derived class. If we need to get at child-specific information in ImplementationOfBaseClass we can simply cast the returned pointer to the correct type.

The problem with this is that you also have to write routines to free up the memory once the class is finished with, since the calling application can't simply delete the BaseClass* - the EXE and the DLL may be linked against different C runtimes with separate heaps, so memory allocated by the DLL's new must be freed by the DLL's delete (and if BaseClass has no virtual destructor, deleting through the base pointer is trouble anyway). However, this is dead easy to do, since we can just write

void DeleteInstance(BaseClass *b)
{
    delete b;
}
and call DeleteInstance wherever we would normally call delete.

This is all really simple when you think about it, but it can make things very elegant indeed.

Now The Techie Stuff is Finished

So I've been at university for nearly two years now, and am really getting a kick out of it. I *really* like being in an environment where knowledge for its own sake is considered a Good Thing. I have learnt an incredible amount, and am still learning more. I'm flying out to Barcelona for TechEd Europe 2001 in the first week of July, which will kick ass (since we're there at Microsoft's expense, they like to show us students a good time - cue the corporate American Express accounts, and a lot of beer... :). After that I've got a couple of projects to complete for my own interest, but Microsoft pay me to do them. I'm going to be looking at a game-centric distributed object model and runtime environment, and generally messing around with some bits and pieces. I just got hold of a free Compaq iPAQ, which is a fantastic machine (200MHz and a frame buffer pointer - old school heaven :) so I'll be playing with that. I also want to tart up my raytracer, and do some more tech-file writing. I plan (although I'm loath to make definite commitments given my track record) to do some introductory-level stuff on some cool techniques - e.g. an A* intro, a simple raytracer, basic toon rendering, that sort of thing. It would be good if some interested parties would write me and let me know what sort of stuff they'd like to see. Consider this an apology for leaving my tech-file to gather dust for so long :)



  • This document may not be reproduced in any way without explicit permission from the author and flipCode. All Rights Reserved. The views expressed in this document are the views of the author and NOT necessarily of anyone else associated with flipCode.
