
Meet Telltale Core Engineer Bruce Wilcox

posted by TelltaleGames

Last year I featured Telltale core engineer Bruce Wilcox on the blog when he won the 2010 Loebner Award because he successfully tricked a judge into believing his chatbot was actually a human. Chatbot season is back in full force as this year's Loebner Awards approach, and Bruce is back at it with another entry. I've asked him to write a bit about himself to give you guys a look at what he does at Telltale and beyond, and he graciously accepted. Read on to find out more about Bruce!

Note: The video above, while awesome, is not something Bruce has worked on himself, but it is related to the coming Loebner awards. Bruce mentions this video in his blog, below. So take it away, Bruce!

I’ve been asked to write this blog about myself, so here it is. 

I am a long-time AI programmer, both in the real world (e.g., autonomous aircraft) and in the game world (among other things, I wrote the first successful Go program). You’d think there wouldn’t be a lot of call for an AI specialist in a company like Telltale. Their games are adventure stories. You point and click, and the system gives you a visual result. It’s not like the characters get to act autonomously. If they did, they’d always be trying to kill you. Not quite the experience Telltale sells.

Read the rest of Bruce's tale after the jump!

In fact, Telltale advertised for a core engineer. At TTG engineers come in three flavors. Core engineers work on “the Tool” in C++, the combination authoring and playback system that runs all Telltale games on all platforms. Content engineers program specific games, using the Tool and Lua. Web engineers handle the website and servers. TTG games don’t interact with servers during gameplay, but have always been downloadable.


TTG placed their standard core job ad, which for years had a phrase about “natural language skills” in it. No one before me had ever replied to that, and TTG just kept it in by rote. But there was a good reason they wanted natural language skills. Once upon a time, in a videogame far far away in time, adventure games were controlled by primitive text interfaces: “Walk north.” “Pick up screwdriver.” Sure, they’ve moved on to mouse and graphics, but TTG firmly believes the old parser interface will make a comeback. Not as text, but as voice. Or maybe as mental control. And it won’t be the microscopic vocabulary and grammar of old. It will be large-vocabulary sentences: write what you want. Hence, natural language skills.

Meanwhile, having moved to the North Bay above San Francisco, I was looking for a video game job nearby, at a company that might make interesting use of my skills. And as it happens, in the prior two years I had been working in natural language processing, designing and building a new chatbot technology. In fact, last year (6 months into my job at TTG), I won the Loebner prize by fooling a judge into thinking my chatbot was a human being. So I joined TTG after a 1-hour interview and no programming test.

So what does that mean I do at TTG? Actually, I’m a strange hybrid: a core engineer who is also a content programmer. I work on scripting, dialog systems, Lua interfaces, etc., in the Tool. But I also have my own content project, an outside-the-box project to generate fairy tales. When I started, it seemed designers had long wanted a product that would act out users’ stories. Here’s where natural language comes in. The user might say “The king picked the rose” and the system would animate a king, have him move to a rose, pick it, and now he has a rose. Then the user might say “The king gives the rose to the princess” … It’s not necessarily a game, but an experience of some kind. What the actual game design becomes remains to be seen. First, the underlying technology has to be created (that’s my job).

Imagine the product as Scribblenauts with verbs and more words. Then start creating limitations. OK, we handle thousands of verbs, but not ALL verbs. I mean, who really needs a lot of medical verbs I’ve never heard of? And objects: well, what will graphics limitations do to the 50,000 nouns we recognize, and what scripting needs do they have? Beats me.

But I wrote a prototype example of a system using my chatbot technology. I restricted things to simple sentences of subject, verb, object. For the subject, you were limited to one of the game characters. So, in my prototype I typed in “Dragon marry princess”. Since the system required “motives”, it picked one, and in text said: “The dragon hands a glass of liquid to the princess. She drinks it and falls madly in love (it was a love potion). The princess marries the dragon.” Not only did it require motives, but I was enthralled with the notion of twisted fairy tales, so the system tried to find a perverse interpretation of your request. If you’d said “king throw rock”, you might accidentally strike and harm your own queen.

Anyway, the prototype was fun, so then they said: now let’s build it using the Tool, make it graphical. And here I am in the midst of working on that with my spare cycles. If I’m lucky, it will actually ship someday, late next year at the earliest. But even if it doesn’t, it is extending core technology to handle natural language for some future product. And it makes me more sympathetic to content programmers, so I use my own experiences to improve theirs.
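To make the prototype's behavior concrete, here is a minimal Python sketch of the restricted subject-verb-object flow Bruce describes: the subject must be a known character, and the system supplies a "motive" before acting out the request. All names, rules, and the motive template are illustrative assumptions, not Telltale's actual implementation.

```python
# Hypothetical sketch of the SVO fairy-tale prototype described above.
# Everything here (character list, motive text, conjugation table) is
# invented for illustration.

CHARACTERS = {"dragon", "king", "princess", "queen"}

# Naive third-person conjugation; default is verb + "s".
THIRD_PERSON = {"marry": "marries"}

# Invented "motive" templates keyed by verb.
MOTIVES = {
    "marry": "The {s} hands a glass of liquid to the {o}. "
             "The {o} drinks it and falls madly in love (it was a love potion). ",
}

def tell(sentence):
    """Parse a bare 'subject verb object' request and act it out in text."""
    words = sentence.lower().split()
    if len(words) != 3:
        return "Please use a simple 'subject verb object' sentence."
    s, v, o = words
    if s not in CHARACTERS:
        return f"'{s}' is not one of the story's characters."
    motive = MOTIVES.get(v, "")  # the system requires a motive; pick one if known
    verb = THIRD_PERSON.get(v, v + "s")
    return (motive + f"The {{s}} {verb} the {{o}}.").format(s=s, o=o)

print(tell("dragon marry princess"))
```

Running `tell("dragon marry princess")` narrates the love-potion motive and then the marriage, mirroring the text-only prototype before any graphics were involved.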

Meanwhile, I have a new chatbot and new chatbot technology for this year’s upcoming Loebner in October (hence the request for my blog entry). My old technology and chatbot were owned by Avatar Reality, and since they wouldn’t open-source the engine, I decided to build a new technology and bot from scratch. That makes for a tight schedule and lowers the odds of winning this year. Rosette, my bot, did qualify first of the four entries in the qualifiers, so she can’t finish worse than fourth place. :)

So what does making a chatbot entail? You can find a bunch of papers I’ve written for Gamasutra in my Wikipedia entry. But the basics are:

1. Conceptualize a consistent personality for the bot. My wife, Sue, is a writer and has designed the backstory of Rosette. Who her family is, where she lives, what she does, what she likes, etc.

2. Convert that into ChatScript, deciding how to phrase a pattern that approximates the meaning of a user input without being so specific that it misses whenever the user writes something not quite matching. For example, to catch input like: “Do you have any sisters?”
the script might be:
?: SIBLING (<< you [any ~own] [brother sister sibling] >> ) No. I was an only child. I think that made me a bit of a tomboy. I do have Jenn, who is my godmother's daughter. We grew up together.
The above script line responds to questions (?:) where the words inside << >> can be found in any order, and whenever you see [ ] that means match one of them. ~own is a concept, which stands for any verb that implies owning. And ordinary words usually match both canonical and noncanonical forms, so “brother” matches “brother” or “brothers”. So the pattern could match: “do you have a brother?” or “do you own any siblings?” or “have you any brothers?”. When it matches, in this case it outputs text, starting with “No…”. But it could be script to do anything.
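The pattern semantics just described can be sketched in a few lines of Python. This is NOT the real ChatScript engine; it is a toy: `<< >>` becomes "every term present in any order", a bracketed list becomes a set of alternatives, the `~own` concept is faked with a small word set, and canonical-form matching is faked by stripping a plural "s".

```python
# Toy re-implementation of the pattern semantics described above.
# The ~own word list and the stemmer are simplifying assumptions.

OWN = {"have", "own", "possess", "got"}  # stands in for the ~own concept

def stem(word):
    """Crude canonical form: strip punctuation and a plural 's',
    so 'brothers' matches 'brother'."""
    w = word.lower().strip("?!.,")
    return w[:-1] if w.endswith("s") and len(w) > 3 else w

def matches(pattern, sentence):
    """pattern: list of terms; a term is a word or a set of alternatives.
    Every term must appear somewhere in the sentence (any order),
    like << >> with [ ] alternatives."""
    words = {stem(w) for w in sentence.split()}
    for term in pattern:
        alternatives = term if isinstance(term, (set, frozenset)) else {term}
        if not (words & {stem(a) for a in alternatives}):
            return False
    return True

# Rough analogue of: ( << you [any ~own] [brother sister sibling] >> )
SIBLING = ["you", {"any"} | OWN, {"brother", "sister", "sibling"}]

print(matches(SIBLING, "Do you have any sisters"))  # True
print(matches(SIBLING, "Have you any brothers?"))   # True
print(matches(SIBLING, "Do you like dogs"))         # False
```

The real engine does far more (concepts are hierarchical, matching respects canonical forms properly, and a match can trigger arbitrary script), but this shows why one pattern covers many phrasings.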

For example, if you ask what is the largest country, the system would have a pattern to “understand” your meaning, and then script that would scan a list of database facts it has about nations and their area, ordering them, and picking the largest. Or the five largest if that’s what you asked.
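The fact-scanning step for a question like "what is the largest country" can be sketched as below. The area figures are illustrative round numbers, and the plain dictionary stands in for the bot's fact database; the real system stores such data as ChatScript facts.

```python
# Hypothetical sketch of answering "largest country" style questions
# by scanning stored facts. Areas are approximate, for illustration.

AREAS_KM2 = {
    "Russia": 17_098_000,
    "Canada": 9_985_000,
    "United States": 9_834_000,
    "China": 9_597_000,
    "Brazil": 8_516_000,
    "Australia": 7_692_000,
}

def largest_countries(n=1):
    """Return the n largest countries by area, biggest first."""
    ranked = sorted(AREAS_KM2, key=AREAS_KM2.get, reverse=True)
    return ranked[:n]

print(largest_countries())   # ['Russia']
print(largest_countries(5))  # the five largest, if that's what you asked
```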

3. Extend the ontology of the world. An ontology is a relationship hierarchy. A dog is a mammal is a being. To shoot is to damage_with_projectile is to damage is to affect_health. Each level of the tree is a concept (a list of things that do what it means). But a concept can also be freestanding and not strictly hierarchical. For example, concept: ~aids_words (AIDS deficiency illness immune) describes words often associated with AIDS, but they are not equal items. It’s an affiliation instead of a hierarchy.
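Both concept styles above can be modeled in a few lines of Python: a strict is-a chain you walk upward, and a flat affiliation set you simply test membership in. The contents are the post's own examples; the code shape is an assumption for illustration.

```python
# Toy model of the two concept styles: an is-a hierarchy and a
# freestanding "affiliation" concept like ~aids_words.

IS_A = {
    "dog": "mammal",
    "mammal": "being",
    "shoot": "damage_with_projectile",
    "damage_with_projectile": "damage",
    "damage": "affect_health",
}

# Freestanding concept: associated words, not an is-a chain.
AIDS_WORDS = {"AIDS", "deficiency", "illness", "immune"}

def is_a(word, concept):
    """Walk up the hierarchy to test membership."""
    while word is not None:
        if word == concept:
            return True
        word = IS_A.get(word)
    return False

print(is_a("dog", "being"))            # True
print(is_a("shoot", "affect_health"))  # True
print("immune" in AIDS_WORDS)          # True: affiliation, not hierarchy
```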

4. Work on other natural language issues, like parsing and part-of-speech tagging. Patterns may well involve knowing exactly what role each word plays in a sentence. “Are you like a dog” is quite different from “Do you like dogs”, and a vague pattern might get the wrong meaning: “like” is the main verb in the latter but a preposition in the former.
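The "like" ambiguity above can be illustrated with a deliberately oversimplified two-rule tagger. A real part-of-speech tagger is vastly more sophisticated; this sketch only handles yes/no questions of the two shapes quoted, and the rule set is an assumption made for the example.

```python
# Toy illustration of why patterns need part-of-speech information:
# in a yes/no question, a leading "do" auxiliary makes "like" the main
# verb, while a leading form of "be" makes it a preposition.
# This is NOT a real POS tagger.

AUX_DO = {"do", "does", "did"}
BE_FORMS = {"am", "are", "is", "was", "were"}

def tag_like(sentence):
    """Return the guessed part of speech of 'like' in a yes/no question."""
    words = [w.lower().strip("?") for w in sentence.split()]
    if "like" not in words:
        return "absent"
    if words[0] in AUX_DO:
        return "verb"          # "Do you like dogs?"
    if words[0] in BE_FORMS:
        return "preposition"   # "Are you like a dog?"
    return "unknown"

print(tag_like("Are you like a dog"))  # preposition
print(tag_like("Do you like dogs"))    # verb
```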
Anyway, that’s me and TTG.
Meanwhile, there is a video (above) of two chatbots talking to each other that has gone viral and is making the rounds. It’s funny, short, and well worth watching. The two bots are instances of “Cleverbot”, one of the contenders last year that didn’t qualify this year. The speech and avatars were done at Cornell; they just linked to the Cleverbot website.
