Sunday, January 30, 2005

Context Free Grammar

While studying context-free grammars I thought of something that may help someone in building a grammar or testing their own while developing one. A grammar has terminals and non-terminals. We start with a starting non-terminal (S) and then apply any of the grammar rules, any number of times, to finally produce a string containing terminals only. E.g. S->A, A->aAb, A->ab gives all non-empty strings of a's followed by an equal number of b's.

Now the thing is, we can look at the non-terminal here (A in this case) as a representative of the property of the string we want to produce (some a's followed by an equal number of b's). So what can we do while maintaining this property? Add an a before the string and a b after it.

Now one could propose the grammar S->A with A->Ab and A->aA, but it fails on the point mentioned above (any of the grammar rules, any number of times): if I apply A->Ab 3 times and A->aA 4 times, the counts don't match. Since the rules can be applied in any order and any number of times, each rule on its own must preserve the property.

We can reason this way because in a context-free grammar, except for the null production (A->e, where e is the empty string), every rule application can only grow the unfinished string. It is a good visualization technique too; it looks like the loop invariance I learned in the formal program development course...
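To make the invariant idea concrete, here is a small sketch (my own toy code, not from any textbook) that randomly derives strings from the good grammar and from the broken one, then checks the a^n b^n property. I added A->ab to the broken grammar as a terminating rule, since without one its derivations never finish.

```python
import random

# Good grammar: every rule preserves "equal a's then equal b's".
GOOD = {"S": [("A",)], "A": [("a", "A", "b"), ("a", "b")]}
# Broken grammar from the post: A->Ab and A->aA can be applied a
# different number of times each, so the counts need not match.
# (A->ab added here just so derivations can terminate.)
BROKEN = {"S": [("A",)], "A": [("A", "b"), ("a", "A"), ("a", "b")]}

def derive(grammar, max_steps=20):
    """Randomly apply rules starting from S; return the terminal
    string, or None if the derivation didn't finish in time."""
    string = ["S"]
    for _ in range(max_steps):
        nts = [i for i, sym in enumerate(string) if sym in grammar]
        if not nts:
            break
        i = random.choice(nts)
        string[i:i+1] = list(random.choice(grammar[string[i]]))
    if any(sym in grammar for sym in string):
        return None
    return "".join(string)

def invariant(s):
    """Check that s is a^n b^n for some n >= 1."""
    n = len(s) // 2
    return n >= 1 and s == "a" * n + "b" * n
```

Every finished derivation from GOOD satisfies the invariant; BROKEN quickly produces counterexamples like "aab".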

Friday, January 28, 2005

Played CounterStrike for the whole day; at least I smoked less today because of that. Now I can say that I play okay. At least I can kill one or two people in a round.

Then "meri aatma jag gayi" (my soul woke up) and I had to do my BTech Project work. The optimization algorithm was a bit easier as given in CLR; at least I could get a hang of what was going on. The name of the algo was simplex. I thought it was named for some simplicity it has :-), but a simplex is actually a sort of n-dimensional figure; the region of all the points that satisfy all the constraints forms a polygon in 2-D.
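To illustrate that geometric picture (this is my own toy example, not the project's actual model): in 2-D the feasible polygon's vertices are intersections of pairs of constraint lines, and the optimum sits at one of those vertices, which is the fact simplex exploits. A brute-force sketch:

```python
from itertools import combinations

# A made-up LP: maximize 3x + 2y subject to
#   x + y <= 4,  x <= 3,  y <= 3,  x >= 0,  y >= 0.
# Each constraint is stored as (a, b, c) meaning a*x + b*y <= c.
constraints = [(1, 1, 4), (1, 0, 3), (0, 1, 3), (-1, 0, 0), (0, -1, 0)]

def solve_2d_lp(constraints, obj):
    """Enumerate intersections of constraint pairs (candidate vertices
    of the feasible polygon) and return the feasible one maximizing
    the objective, as (value, x, y)."""
    cx, cy = obj
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-9:          # parallel lines: no vertex here
            continue
        x = (c1 * b2 - c2 * b1) / det   # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
            value = cx * x + cy * y
            if best is None or value > best[0]:
                best = (value, x, y)
    return best
```

For this instance the optimum is 11 at the vertex (3, 1), where x + y <= 4 meets x <= 3. Real simplex walks between vertices instead of enumerating them all, which is what makes it practical.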

Basically the problem is Combinatorial Auctioning. Instead of auctioning individual items, the auctioneer now shows multiple items, and each bidder submits a single price for the subset of items they want. Note that they don't price the items individually but price the set of items they want. Now there are two problems to it. One is: given that a bidder has some amount of money, she/he wants to invest it in the "best" manner. The other, computationally hard, task is to determine who the winners are, given the bids.

Now there are two approaches to the problem: 1. Constraint Programming 2. Linear Programming (LP).
After much deliberation I have to follow the second one. Actually a variant of the second one called Mixed Integer Programming (MIP).

Though I have studied MIP a little, how to model our problem as a MIP is still foggy.
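Even though the MIP modelling is foggy to me, the winner determination problem itself is easy to state: pick a set of bids with no item in common that maximizes total revenue. A brute-force sketch with made-up bids (a real instance would use a MIP solver, since this enumeration blows up exponentially):

```python
from itertools import combinations

# Hypothetical bids: each is (set of items wanted, price for the set).
bids = [
    ({"A", "B"}, 10),
    ({"B", "C"}, 8),
    ({"C"}, 5),
    ({"A"}, 4),
]

def winner_determination(bids):
    """Try every subset of bids, keep only item-disjoint selections,
    and return (best revenue, indices of winning bids).
    The MIP view: maximize sum p_j * x_j subject to each item lying
    in at most one accepted bid, with x_j in {0, 1}."""
    best_value, best_sel = 0, ()
    for r in range(1, len(bids) + 1):
        for sel in combinations(range(len(bids)), r):
            items, ok = set(), True
            for j in sel:
                if bids[j][0] & items:   # item already sold: conflict
                    ok = False
                    break
                items |= bids[j][0]
            if ok:
                value = sum(bids[j][1] for j in sel)
                if value > best_value:
                    best_value, best_sel = value, sel
    return best_value, best_sel
```

Here the winning selection is bids 0 and 2 ({A,B} for 10 plus {C} for 5, total 15), beating the alternative of bids 1 and 3 (total 12).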
Adios .....

Tuesday, January 18, 2005

Industrial Revolution and Computer Architecture

Integration/Centralization/Concentration was one of the key features of the Industrial Revolution (IR). By this term I mean not only changes in methods of production, distribution, or the economy in general, but the whole change that took place in the world view during those times. Alvin Toffler termed it the Second Wave because it was not a single phenomenon or a single independent variable on which the others depended.

Nation states, factories, and corporate organizations show a high degree of concentration of activities. Earlier, geography was divided amongst kingdoms, which were more or less a power mesh (mess) of kings, nobles, and allegiances of landlords. The control of the powerful diminished with distance from the power source. The Second Wave gave birth to the nation state, where control was strictly defined: barbed-wired or precisely defined geographical borders, uniform laws across a single nation, an integrated economy. A behemoth represented by a single entity: a monarch/president/prime minister.

Factories were huge places where all the raw material was "concentrated" for production and then distributed thereafter. The power structure of the corporation was clearly defined, with a central head/group and clear-cut hierarchical roles.

We can find a few analogies in Computer Architecture as well. What we have as a computer is a three-stage device: Input/Processing/Output. For processing we have a microprocessor, or a Central Processing Unit, where all the inputs are concentrated to be dealt with.

Although we are now talking of distributed computing, where the processing load is distributed, that distribution is different from what I mean. It works at a higher level of information handling, where a high-level query or an HTTP request may be distributed. Underneath, it seems more of the same: lots of CPUs spread around.

The other devices are "peripherals", the boundaries of this central scheme: menial subjects of the state, to be kept at the "periphery".

What I want to say is that we have to move to a different type of processing at the most basic level. We have to change our strictly defined view of computation as number crunching or binary jugglery, and secondly, perhaps find computation in different physical processes.

We can have a host of devices, each a separate entity in itself with a minimal, small processing unit, and through this decentralized, fragmented network a convergence/meaning should emerge.

I know the last part of this blog is highly vague and absurd, but these are just my thoughts, not a whitepaper..............

Monday, January 17, 2005

Starting

Just started blogging here; do look at some of my posts at thequark.blog.com