The truth is that almost everyone in the developed world has had some experience with artificial intelligence, or AI, and that's especially true of gamers. In fact, according to S.R.K. Branavan of MIT's Computer Science and Artificial Intelligence Laboratory, games are a natural test bed for AI.
Games are used as a test bed for artificial-intelligence techniques precisely because of their complexity. An action you take in the game doesn't have a predetermined outcome, because the game or your opponent can react unpredictably to what you do. So you need a technique that can handle very complex scenarios that respond in potentially random ways.
It turns out that some very smart people at MIT have come up with a machine-learning system that can read the user manual for Civilization and improve its own play. That might not sound impressive, but it actually is. The system starts with a list of actions it can take and no knowledge of the game or of the language the manual is written in. Its first actions are therefore completely random, but each action causes a word to appear on screen, and the system then searches for that word both in the game and in the manual. Big deal, right? That's just indexing, a standard information-retrieval task. Here's the kicker: the system was able to "hypothesize" associations between words and actions and use them to get better results in the game!
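To make that loop concrete, here is a toy sketch of the idea, not the MIT system itself: an agent starts with a fixed action list and no knowledge, observes the word the game shows after each action, looks that word up in a "manual," and gradually builds word-action associations that bias its later choices. All of the names here (`ACTIONS`, `MANUAL`, `ManualReader`) are illustrative assumptions, and the learning rule is a deliberately simple credit count rather than the Monte Carlo framework the researchers used.

```python
import random

# Toy action list and a stand-in "manual": sentences pairing game
# vocabulary with useful actions. Purely illustrative data.
ACTIONS = ["build_city", "attack", "irrigate"]
MANUAL = [
    "settlers should build_city on grassland",
    "phalanx units attack enemy cities",
    "workers irrigate desert tiles",
]

def manual_actions_for(word):
    """Indexing step: find which actions co-occur with `word` in the manual."""
    return [a for a in ACTIONS
            for sentence in MANUAL
            if word in sentence and a in sentence]

class ManualReader:
    def __init__(self):
        # association[word][action] -> credit learned so far
        self.association = {}

    def choose(self, word):
        """Pick the best-credited action for this word; with no
        associations yet, the choice is completely random."""
        scores = self.association.get(word)
        if not scores:
            return random.choice(ACTIONS)
        return max(scores, key=scores.get)

    def update(self, word, action, reward):
        """Credit the (word, action) pair, with a bonus when the manual
        also links the word to that action -- the "hypothesis" step."""
        bonus = 1.0 if action in manual_actions_for(word) else 0.0
        scores = self.association.setdefault(word, {a: 0.0 for a in ACTIONS})
        scores[action] += reward + bonus

# One "blind" step: the first choice is random, then gets credited.
agent = ManualReader()
first = agent.choose("grassland")
agent.update("grassland", first, reward=1.0 if first == "build_city" else 0.0)
```

After enough of these update steps, `choose` stops being random and starts favoring the action the manual (and the game's rewards) associate with each on-screen word, which is the behavior described above in miniature.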
This ability to associate information with actions produced astounding results. Hit the break to find out just how much.
By the end of the test, the computer was performing at roughly 80 percent of the level of a human who had read the manual and played the game. This also had a profound impact on the victory rate, which jumped from 46 percent to 79 percent. Eugene Charniak, University Professor of Computer Science at Brown University, sums it up pretty well:
If you'd asked me beforehand if I thought we could do this yet, I'd have said no. You are building something where you have very little information about the domain, but you get clues from the domain itself.