Tuesday, January 20, 2009

Multi-threaded programs

I always used to think multi-threaded programming was difficult and a bit intimidating! ^_^

Today I wrote a simple program that runs in parallel on two threads. It did not do everything I had in mind, but it did what I told it to.
It is not so difficult to understand what multi-threading is and to write a simple example program.
Yes, the topic is still big, because a full threading system has to do more work and cover more ideas. But for me, or for a given task, we only need one piece of it. So there is no need to be afraid any more.

Fear comes from knowing nothing about a subject. Just sit down, concentrate on reading books and searching Google, and you will find it is not so difficult to do it like the other guys.

Here is a very nice tutorial about pthreads.

PS: Under Linux with a C++ compiler, you cannot pass a function declared as void* function() to the pthread library. Maybe that works in C with a C compiler, but for C++, g++ will give you an error about the function's parameters. I solved it by declaring the function as void* function(void* p);
Here p is not used, but you do need it for your program to compile.

Monday, January 12, 2009

Chat with friends by mobile phone for free

We can chat with our friends on the Internet and pay only the network fee. Now that we can access the Internet from a mobile phone, why shouldn't we chat with friends on the phone over the Internet? Then we would not pay for every second of call time, only the wireless data fee.
Google has now developed an open-source system, Android. Maybe we can try this: build a piece of software like MSN, QQ, or OICQ, so we can chat with friends and see each other on video.

Wednesday, January 7, 2009

Solve the problem: stack overflow

I am still a poor guy at computer programming.
Today when I ran a deeply recursive function, my program crashed! It told me: stack overflow!

When I used Valgrind to check for memory leaks, it looked as if no more memory could be added to the stack.
I had thought my computer's memory was so big that it could not crash like this. But it happens because, when a program runs, it is given a fixed amount of stack space. If it needs more than that, the stack overflows.
To solve this problem:
1. Make the stack size bigger.
2. Change the method so the program needs less stack space.

I chose the second method. Then I found these solutions:
2.1 Change the recursion to iteration.
2.2 Change the recursion to tail recursion.
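To show what 2.1 means, here is a sketch of a 4-connected region grow, first recursive and then with an explicit stack on the heap. The grid layout (1 = unvisited foreground, 2 = labelled) and function names are invented for illustration.

```cpp
#include <stack>
#include <vector>
#include <utility>

// Recursive version: every call pushes a full stack frame, so a large
// region can overflow the call stack.
void grow_recursive(std::vector<std::vector<int> >& img, int x, int y)
{
    if (x < 0 || y < 0 || x >= (int)img.size() || y >= (int)img[0].size())
        return;
    if (img[x][y] != 1) return;
    img[x][y] = 2;
    grow_recursive(img, x + 1, y);
    grow_recursive(img, x - 1, y);
    grow_recursive(img, x, y + 1);
    grow_recursive(img, x, y - 1);
}

// Iterative version: an explicit std::stack on the heap replaces the
// call stack, so the depth is limited only by available memory.
void grow_iterative(std::vector<std::vector<int> >& img, int x, int y)
{
    std::stack<std::pair<int, int> > todo;
    todo.push(std::make_pair(x, y));
    while (!todo.empty()) {
        std::pair<int, int> p = todo.top();
        todo.pop();
        int i = p.first, j = p.second;
        if (i < 0 || j < 0 || i >= (int)img.size() || j >= (int)img[0].size())
            continue;
        if (img[i][j] != 1) continue;
        img[i][j] = 2;
        todo.push(std::make_pair(i + 1, j));
        todo.push(std::make_pair(i - 1, j));
        todo.push(std::make_pair(i, j + 1));
        todo.push(std::make_pair(i, j - 1));
    }
}
```

Both versions label the same region; only where the pending work is stored changes.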

Since I use tree recursion, one function calls itself many times inside its loop. For every call, the stack saves the return path to the parent call as well as the local variables.

I found my program would be difficult to change into iteration or tail recursion (it would go wrong, because I cannot simply return from every call).

So the only remaining choice was to eliminate the local variables. I split the work into more functions and passed input parameters instead of using local variables.

Finally it works, and it even became faster. It helps me with many tasks, like region growing, labelling connected components, and morphological algorithms.


Friday, January 2, 2009

Where is the giant's shoulder?

Samuel Taylor Coleridge, in The Friend (1828), wrote:
"The dwarf sees farther than the giant, when he has the giant's shoulder to mount on."
That means: one develops future intellectual pursuits by understanding the research and works created by the notable thinkers of the past.
If you want to see further: first, you should find the giant; second, you should find the giant's shoulder; third, you should climb onto that shoulder.
For a dwarf, maybe it is not so easy to do all this. But it is a good way to see further than anyone before.
As for me, a dwarf, I am just starting to think about this. Work hard, and climb onto the giant's shoulder.

Classifiers: Bayes, PCA, ANN, SVM

It may be difficult for me to write down everything I learned from the book about classifiers, but I think it will be useful to take notes here.
These classifiers need to be trained before use.

For Bayes: we need to know the class-conditional probability density function f(feature | class), and we can get it in two ways:
1. Assume the density is a normal distribution, and use the training data to estimate its mean and variance.
2. Estimate the class-conditional density empirically from many experiments (training).
Once we have the density, we can compute the posterior probability f(class | feature) with Bayes' rule (see Computer Vision: A Modern Approach for details).
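A sketch of way 1 in one dimension with two classes; the function names, priors, and numbers are invented for illustration.

```cpp
#include <cmath>

// 1-D Gaussian density f(feature | class) with given mean and variance,
// estimated from training data (way 1).
double gaussian(double x, double mean, double var)
{
    return std::exp(-(x - mean) * (x - mean) / (2.0 * var))
         / std::sqrt(2.0 * M_PI * var);
}

// Posterior comparison by Bayes' rule for two classes with priors p0, p1.
// The shared denominator f(feature) cancels, so we only compare the
// numerators f(feature | class) * f(class). Returns the winning class.
int bayes_classify(double x,
                   double m0, double v0, double p0,
                   double m1, double v1, double p1)
{
    double post0 = gaussian(x, m0, v0) * p0;
    double post1 = gaussian(x, m1, v1) * p1;
    return post1 > post0 ? 1 : 0;
}
```

With equal priors this reduces to picking the class whose Gaussian gives the feature the higher likelihood.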

For PCA: PCA (principal component analysis) is in fact not a classifier but a tool to derive a new, better feature vector from the original one. The new components capture the most variance in the data, which means little useless information remains in the new feature vector.
We get the new feature vector by: first, computing the eigenvectors of the covariance matrix of the original feature vectors; second, projecting the original features onto the directions of those eigenvectors.
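A sketch of these two steps for 2-D features, where the top eigenvector of the 2x2 covariance matrix can be written in closed form. The function name is invented; real-sized data needs a proper eigen solver library.

```cpp
#include <cmath>
#include <vector>

// First principal direction of 2-D points: the unit eigenvector of the
// 2x2 covariance matrix belonging to the larger eigenvalue.
void principal_axis(const std::vector<double>& xs,
                    const std::vector<double>& ys,
                    double& ex, double& ey)
{
    int n = (int)xs.size();
    double mx = 0, my = 0;
    for (int i = 0; i < n; ++i) { mx += xs[i]; my += ys[i]; }
    mx /= n; my /= n;

    // covariance matrix [[a, b], [b, c]]
    double a = 0, b = 0, c = 0;
    for (int i = 0; i < n; ++i) {
        a += (xs[i] - mx) * (xs[i] - mx);
        b += (xs[i] - mx) * (ys[i] - my);
        c += (ys[i] - my) * (ys[i] - my);
    }
    a /= n; b /= n; c /= n;

    // larger eigenvalue of a symmetric 2x2 matrix, in closed form
    double lambda = 0.5 * (a + c + std::sqrt((a - c) * (a - c) + 4 * b * b));
    if (std::fabs(b) > 1e-12) { ex = b;            ey = lambda - a; }
    else if (a >= c)          { ex = 1;            ey = 0; }
    else                      { ex = 0;            ey = 1; }
    double len = std::sqrt(ex * ex + ey * ey);
    ex /= len; ey /= len;
}
```

The second step, projection, is then just a dot product: the new feature of a point (x, y) is (x - mx) * ex + (y - my) * ey.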

For ANN: an ANN (artificial neural network) is a method that iteratively updates its parameters to make the error between the real output and the ideal output smaller. We can use stochastic gradient descent to minimize the error and backpropagation to compute the derivatives.

For SVM: an SVM (support vector machine) is a classifier that uses the training data to find a hyperplane separating the classes, such that the minimum distance from the hyperplane to each class is the same (the margin is maximized). Why is it called SVM? Because not all the sample data affect the hyperplane's parameters; only some points determine them, and those points are the support vectors.