Wednesday, 28 October 2015

On Decisions and Learning (Part II)

In the previous post, we saw how space and time constraints can affect decision making. In this part, we extend the model to one of practical interest. Such a model should be robust across variations in space and time. In effect, we abstract away space and time and replace them with agents (players) who affect our decision process. To illustrate with an example, take the case of a boy in, say, Brazil and a similar-aged boy in India. It is more probable that the Brazilian boy latches on to football, while the Indian boy would choose cricket with higher probability.


The agents here are a representation of culture and tradition, and hence depend on space. There can be agents representing time as well. For instance, the decision to take an umbrella before leaving home depends on whether it is the rainy season or not. Rain is the agent here and hence represents time. Thus agents are the physical representation of the (real) time-space parameters. Another aspect of agents is that certain agents contribute to the decision with more weight than others. Further, decisions are taken in accordance with logic, past experience, and predictions; each of these has a correspondence with time: the present, the past, and the future.

The process of making the decision can be summed up as \( f : (A_1, A_2, \cdots, A_k) \rightarrow \mathcal{D} \). Here \(A_1, \ldots, A_k\) are the agents to be considered while taking the decision, and the set \(\mathcal{D} = \{d_1, d_2, \cdots, d_n\}\) contains the decision choices. This model lacks the past learning that has been accumulated over time. To explain learning in decision making, take the example of a man going out while it is raining (agent = rain). The man does not take the umbrella and gets wet (decision: no umbrella). The next time he goes out in the rain, he is reminded of the fact that he got wet (which incurs a cost). It is thus possible to assign a cost to each decision taken over an agent set. Assume that the decision maker has learned the costs of decisions taken over a set of agents up to time \(T\). Now the system is opened, and the learned costs are considered while taking the decision.
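A minimal sketch of this cost-learning model: keep a table of learned costs per (agent tuple, decision) pair, and let \(f\) pick the choice with the lowest accumulated cost. The agent names, decision labels, and cost values below are illustrative assumptions, not part of the model as stated.

```python
# Sketch of f : (A_1, ..., A_k) -> D with learned costs.
# Agents, decisions, and cost values here are illustrative assumptions.

class Decider:
    def __init__(self, choices):
        self.choices = choices   # the set D = {d_1, ..., d_n}
        self.cost = {}           # learned cost per (agent tuple, decision)

    def decide(self, agents):
        # Pick the choice with the lowest learned cost for this agent tuple;
        # unseen (agents, decision) pairs default to zero cost.
        return min(self.choices, key=lambda d: self.cost.get((agents, d), 0.0))

    def learn(self, agents, decision, cost):
        # Accumulate the cost observed after taking `decision` under `agents`.
        key = (agents, decision)
        self.cost[key] = self.cost.get(key, 0.0) + cost

# The umbrella example: the agent is Rain.
dm = Decider(["umbrella", "no umbrella"])
agents = ("rain",)
dm.learn(agents, "no umbrella", cost=1.0)  # he went out, got wet
print(dm.decide(agents))                   # prints "umbrella"
```

Before time \(T\), only `learn` is called; once the system is opened, `decide` consults the accumulated costs.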

Wednesday, 21 October 2015

On Decisions and Learning (Part I)

An inquisitive mind always seeks the answer to this question: is the thing that I decide right? Is the trained mind in me educated enough to reach the "right" decision?

Decision making is an ever-present process in the world around us. Decisions may be taken on important as well as less important matters. The rightness of a decision is always debatable and often has a subjective element. So how does one take a decision, what elements are included in the process, and how does the decision affect others? These three questions have a strong grounding in statistics. Let us parse each of them and look at it in detail.

How does one take a decision?


A decision is a selection of a particular choice from a given set of available choices. A decision may be prompted by a situation, which depends on both space and time. So, deriving cues from the situation, any decision should in particular abide by time and space. Let us see a little example. Two identically aged people live in a city and a village. Both fell ill (at almost the same time) and were taken to a doctor. The doctor in the city gave medicines in the form of tablets, while the doctor in the village asked the man to collect some herbs from the garden and take them. One can tell that space is the prominent factor in this example: each doctor's decision complies with the locally available medicines. On closer analysis, one may also expect that the domain of the doctor plays a significant role in the decision. For this discussion we can set that possibility aside.

Now consider another example where time is the decision driver. A boy injured his knee while playing in the evening. Seeing the bleeding, his parents took him to the local clinic for first aid. Imagine the same situation happening at night. At that time, the parents have no access to the clinic, so they prefer to give whatever first aid is available at home and later take him to the doctor if required. Time (and to an extent, space) played the crucial role in this example.

Let us try to introduce some modelling. Consider a situation and \(n\) possible choices, of which a particular choice has to be picked. If one derives ideas from the examples presented, the decision \(D(S, t)\) is a function of the location and the time. Functions of space and time are, in general, difficult to model. So we sidestep them by introducing a probability distribution over the choices. The simplification seems good, but did we handle reality efficiently? Not really. What we did was look into the decisions taken in the past and fit a distribution over those choices. From the point of view of an analyzer, this seems good, but the model appears to blind the decision maker's view. One way to handle this is to bring back some dependence on space-time. The model should be robust for the analyzer and useful for the decision maker.

I leave this at a point for the reader to ponder a little:

Image Courtesy : olivergearing.com
Most of us act more or less according to this importance vs. urgency model (the logical planner!). We have a set of priorities, and decision scheduling is driven by urgency, that is, by the time requirement. This seems to explain why certain events require us to wait in patience.
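The scheduling rule above can be sketched in a few lines: order tasks by urgency first, with importance breaking ties. The task names and the numeric scales are illustrative assumptions.

```python
# Sketch of the importance-vs-urgency scheduler: urgency drives the
# order, importance breaks ties. Tasks and scales are illustrative.
tasks = [
    ("write report", {"importance": 2, "urgency": 1}),
    ("call plumber", {"importance": 1, "urgency": 3}),
    ("plan holiday", {"importance": 3, "urgency": 0}),
]

# Sort descending: most urgent first. Important but non-urgent tasks
# sink to the end -- these are the events that "wait in patience".
schedule = sorted(tasks,
                  key=lambda t: (t[1]["urgency"], t[1]["importance"]),
                  reverse=True)
for name, _ in schedule:
    print(name)
```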