## Archive for the ‘topics in artificial intelligence’ Category

## Biomorphs and artificial intelligence

I hadn’t heard of Biomorphs until I started wandering through The Magic Machine: A Handbook of Computer Sorcery. I’m brushing up on math and coding now so I can do some new projects this year, and this book is a fun way to do it.

z = z^3 + c

z = sin(z) + z^2 + c

Biomorphs were discovered by Clifford Pickover at the IBM Research Center. Biomorphs are like the Mandelbrot set in that you iterate a simple function over the complex plane. The algorithm is in the two example Java files below. Several functions besides these two are known to produce biomorphs in the region between -20 and 20 on the complex plane: { z^z + z^6 + c; sin(z) + e^z + c; z^5 + c; z^z + z^5 + c; . . . } You should be able to figure out how to create them by altering the two examples in the download file.
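As a rough sketch ( not the code from the download ), the biomorph iteration for one pixel might look like the Java below. The iteration cap, the escape bound of 10, and the constant c = 0.5 are assumptions for illustration; Pickover’s trick is to plot the pixel whenever either component of z stays small, which is what gives biomorphs their cell-like look.

```java
// Sketch of a biomorph test for one pixel, iterating z = z^3 + c.
// z starts at the pixel's point on the complex plane; c is a fixed constant.
public class Biomorph {
    static boolean isBiomorphPixel(double zr, double zi,
                                   double cr, double ci, int maxIter) {
        for (int n = 0; n < maxIter; n++) {
            double r2 = zr * zr, i2 = zi * zi;
            // z^3 + c, expanded into real and imaginary parts
            double nr = zr * (r2 - 3 * i2) + cr;
            double ni = zi * (3 * r2 - i2) + ci;
            zr = nr; zi = ni;
            // stop once the orbit has clearly escaped
            if (Math.abs(zr) > 10 || Math.abs(zi) > 10) break;
        }
        // plot the pixel if either component stayed small
        return Math.abs(zr) < 10 || Math.abs(zi) < 10;
    }

    public static void main(String[] args) {
        // scan a small grid with an assumed constant c = 0.5 + 0i
        for (double y = 1.0; y >= -1.0; y -= 0.25) {
            StringBuilder row = new StringBuilder();
            for (double x = -1.5; x <= 1.5; x += 0.0625) {
                row.append(isBiomorphPixel(x, y, 0.5, 0.0, 10) ? '*' : ' ');
            }
            System.out.println(row);
        }
    }
}
```

Swapping the two lines that compute `nr` and `ni` for another function from the list above gives the other biomorphs.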

Some people breed biomorphs and let them evolve as an artificial life form.

Source Code

biomorphs.tar.gz ( 2 Java source files for the biomorphs above )

More information:

Fractal Geometry

Dr. Clifford Pickover home page

Mad Teddy’s Fractals #2 Biomorphs

## Fractals and artificial intelligence

Fractals are a fascinating toy; one can easily spend an afternoon lost in the Mandelbrot or Julia sets. Mathematicians were aware of fractal-like curves as early as the 1700s, but it wasn’t until we had computers to do the calculations that we really discovered fractals.

Benoit B. Mandelbrot, doing research at IBM, was revisiting Gaston Julia’s work on iterated functions (1917) when he discovered the Mandelbrot set. Fractals come from simple equations that are computed recursively; these simple equations create complex shapes.

The Mandelbrot function is z = z^2 + c, where z and c are complex numbers. z starts at zero, and c is the pixel’s position on the complex plane ( x + yi ). You iterate this function to obtain the Mandelbrot fractal. Black is for the points that do not escape to infinity; the other colors represent how many loops it takes to escape.
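A minimal Java sketch of that escape-time loop ( the escape radius of 2 and the iteration cap are the conventional choices, not taken from the download ):

```java
// Escape-time count for one point c on the complex plane.
public class Mandelbrot {
    // Returns how many iterations z = z^2 + c takes to escape |z| > 2,
    // or maxIter if it never does (those points are drawn black).
    static int escapeCount(double cr, double ci, int maxIter) {
        double zr = 0.0, zi = 0.0;                   // z starts at zero
        for (int n = 0; n < maxIter; n++) {
            if (zr * zr + zi * zi > 4.0) return n;   // escaped: |z| > 2
            double nr = zr * zr - zi * zi + cr;      // real part of z^2 + c
            double ni = 2.0 * zr * zi + ci;          // imaginary part
            zr = nr; zi = ni;
        }
        return maxIter;                              // still bounded
    }

    public static void main(String[] args) {
        System.out.println(escapeCount(0.0, 0.0, 100));  // inside the set
        System.out.println(escapeCount(2.0, 2.0, 100));  // escapes quickly
    }
}
```

Mapping `escapeCount` to a color for every pixel in a grid is all a Mandelbrot renderer does.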

Fractals have found some use in artificial intelligence. In the world of computer games, fractals create plant life, clouds, mountains and other scenery that would not otherwise be possible in such detail. Parkinson’s patients can be diagnosed by their gait; in 2004 a sensor was developed that measures a patient’s gait and analyzes it using fractals. In 2002 fractals were put to use to help predict natural disasters and better model hurricanes. More recently fractal patterns have been found in the solar wind, and it is hoped this information will allow us to better predict solar storms.

Fractals have been found in Jackson Pollock’s paintings and are being used to try to tell genuine paintings from fakes. They are also being used in image compression. A more fun way to play with fractals is to use them to predict the stock and commodity markets.

Fractals ( Mandelbrot and Julia in Java – source code )

More information:

Fractal Geometry

Fractals

Math on Display, Science News Online

Genius: Benoit Mandelbrot

3D Mandelbrot images

Papers:

The Fractal Geometry of Nature, Mandelbrot ( pdf/ps )

## Graphical models of knowledge representation

Reasoning using graphical models is rapidly gaining popularity. They are not just used in reasoning; they are also becoming useful in computer vision, where they are used to classify objects.

Graphical models include: constraint networks, Markov random fields, belief networks, Bayes networks and influence diagrams.

Traditionally, algorithms for these networks have been either inference-based or search-based. While inference algorithms are not practical, time-wise, for most problems outside the classroom, some of the search algorithms perform quite well. Combinations of the two are being used to solve knowledge representation problems not otherwise practical with inference-based algorithms alone. Often the algorithms are tailored to each specific problem.

The advantages of this method are that all possible outcomes are known and declared in the graphical model. Also, the models are easily understood by humans, and the techniques used are well understood.

Nodes on the graph represent facts or states and may be known or unknown, hidden or visible.

Connections between nodes may also be hidden or visible; they may be probabilities or equations, and they represent relationships between the data.

The graphical representation allows us to decompose one big problem into smaller, simpler-to-solve problems.
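To make that decomposition concrete, here is a minimal Java sketch of a hypothetical three-node chain network A → B → C with made-up probabilities. Because the joint factors as P(A, B, C) = P(A) P(B|A) P(C|B), we can compute marginals node by node without ever building the full joint table:

```java
// A tiny belief-network sketch (hypothetical numbers): a chain A -> B -> C.
public class ChainNetwork {
    static double pA = 0.3;                    // P(A = true)
    static double[] pBgivenA = {0.2, 0.9};     // P(B=true | A=false), P(B=true | A=true)
    static double[] pCgivenB = {0.1, 0.8};     // P(C=true | B=false), P(C=true | B=true)

    // marginal P(B = true): sum A out of the factored joint
    static double pB() {
        return (1 - pA) * pBgivenA[0] + pA * pBgivenA[1];
    }

    // marginal P(C = true): only needs P(B), not the full joint over A, B, C
    static double pC() {
        double pb = pB();
        return (1 - pb) * pCgivenB[0] + pb * pCgivenB[1];
    }

    public static void main(String[] args) {
        System.out.printf("P(B)=%.3f  P(C)=%.3f%n", pB(), pC());
    }
}
```

With n binary nodes in a chain, this works in O(n) small sums; the flat joint table would need 2^n entries.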

More information:

Graphical Model Algorithms at UC Irvine has a large list of software and example problems using graphical models in knowledge reasoning.

There are some PowerPoint and PDF tutorials you can download from Microsoft

Video lecture on Graphical models

A list of links to several software tools for doing graphical models

Graphical knowledge representation for human detection ( pdf )

Mean Field Theory for Graphical Models

Graphical Models for Discovering Knowledge (excellent beginner’s paper )

See also:

Bayesian logic

Knowledge representation and predicate calculus

Hidden Markov models

## Lua scripting language

I’ve been hearing more and more about Lua and today finally had enough free time to download, install and play with it a bit.

I downloaded the source files and they compiled and installed painlessly on the Mac; just follow the directions in the INSTALL file. There is a folder full of test applications. Open a terminal and type `lua hello.lua` at a command prompt to be sure you’ve got everything up and running.

It is an odd looking language. Hello World looks like this:

```lua
-- the first program in every language
io.write("Hello world, from ",_VERSION,"!\n")
```

The bisection method for solving non-linear equations looks like this:

```lua
-- bisection method for solving non-linear equations

delta = 1e-6	-- tolerance

function bisect(f,a,b,fa,fb)
  local c = (a+b)/2
  io.write(n," c=",c," a=",a," b=",b,"\n")
  if c == a or c == b or math.abs(a-b) < delta then return c, b-a end
  n = n+1
  local fc = f(c)
  if fa*fc < 0 then
    return bisect(f,a,c,fa,fc)
  else
    return bisect(f,c,b,fc,fb)
  end
end

-- find root of f in the interval [a,b]. needs f(a)*f(b)<0
function solve(f,a,b)
  n = 0
  local z,e = bisect(f,a,b,f(a),f(b))
  io.write(string.format("after %d steps, root is %.17g with error %.1e, f=%.1e\n",
                         n,z,e,f(z)))
end

-- our function
function f(x)
  return x*x*x - x - 1
end

-- find zero in [1,2]
solve(f,1,2)
```

( These and several other example programs are included in the Lua download. )

You can keep your programs in text format and run them as scripts ( lua program.lua ), or you can compile them into binary form using the luac compiler ( luac -o hello hello.lua ).

Lua is dynamically typed, it can call and use C functions, and it has its own coroutines so you can run routines cooperatively ( these are not OS threads ).

The keyword list for Lua is surprisingly small ( and, break, do, else, elseif, end, false, for, function, if, in, local, nil, not, or, repeat, return, then, true, until, while ), and it is a case-sensitive language. Everything else is defined in Lua’s library functions, which include all the things you’d expect.

I found projects including Apache modules so you can use Lua for web scripts, Palm versions so you can write Palm programs with Lua, and more than a few games. Lua is widely used in game scripting and interactive AI scripting.

See also:

Cognitive Code develops software personal assistant using Lua

More information:

Lua Reference manuals

O’Reilly article, ‘Introducing Lua’

Introduction to Lua Programming

GameDev, ‘An Introduction to Lua’

## Data Mining

Most data mining is done using private corporate databases, online government databases, or web bots, spiders and scrapers. RSS has made data mining the web trivial with Perl. I’m told it is trivial with PHP also; I’m still experimenting with that.

Currently, with the help of computers, most fields of science, the government and businesses are collecting data faster than they can comb through it. Some agencies have what would be hundreds of years’ worth of data if it had to be parsed by humans. So we need artificially intelligent data mining to sort the data, develop useful, informative rules about it, and/or put it into formats useful to us humans.

This task of artificial intelligence is often put under the category of ‘machine learning’. Sometimes a set of rules is used. The rules may be created by an expert ( domain knowledge ) or discovered through machine learning using statistics.

Problems with this type of machine learning include coming up with an insane number of rules that are far too specific ( overfitting ) and using example data that skews the learning. Other questions include: when do you decide you have the best set of rules? At what point is your algorithm good enough? Do you want all possible outcomes, or only specific ones? For example, do you need 40 categories of healthy plant, or only descriptions and diseases for the unhealthy ones?

Data mining has four main styles of sorting through data. Classification: classes are given in advance and future data is sorted into one of them. Association: associations between data items are sought. Clustering: data is sorted into clusters, usually using various traits as vectors. Prediction: some specific piece of information, usually numeric, is output.
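As one concrete illustration of the classification style, here is a minimal 1-nearest-neighbor sketch in Java; the training points and labels are made up for the example:

```java
import java.util.List;

// Minimal 1-nearest-neighbor classifier: a new point gets the label of
// whichever training example is closest to it in feature space.
public class NearestNeighbor {
    record Example(double[] features, String label) {}

    static String classify(List<Example> train, double[] x) {
        Example best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Example e : train) {
            double d = 0;
            for (int i = 0; i < x.length; i++) {
                double diff = e.features()[i] - x[i];
                d += diff * diff;               // squared Euclidean distance
            }
            if (d < bestDist) { bestDist = d; best = e; }
        }
        return best.label();                    // label of the closest example
    }

    public static void main(String[] args) {
        // two made-up training examples: leaf-color and spot-count features
        List<Example> train = List.of(
            new Example(new double[]{1.0, 1.0}, "healthy"),
            new Example(new double[]{8.0, 9.0}, "diseased"));
        System.out.println(classify(train, new double[]{2.0, 1.5}));  // healthy
    }
}
```

Real classifiers are more elaborate, but the shape of the problem is the same: labeled examples in, a rule for labeling new data out.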

Data for data mining is often stored in ARFF ( Attribute-Relation File Format ).
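For reference, an ARFF file is plain text: a header declaring the relation and its attributes, followed by the data rows. A tiny hypothetical example:

```
@relation weather
@attribute outlook {sunny, overcast, rainy}
@attribute temperature numeric
@attribute play {yes, no}
@data
sunny,85,no
overcast,83,yes
```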

More information:

Applications of Machine Learning and Rule Induction

Machine Learning ( Theory )

UT ML Group: Text Data Mining ( several papers here )

UCI Machine Learning Repository has over 160 data sets for you to use to test and develop your AI.

See also:

Electronic cop solves crimes

Finding new diseases for known cures