Today the semester-long EA defense is finally over; here are a few quick notes.
1. Being really interested in a direction is very different from actually diving into it to research and study. Sometimes we are content with having a bit of insight in some area, but once you truly commit to a direction, it is a whole new battlefield: you are the rookie, and everyone around you is a veteran. That is when you find out whether it is true love.
2. I am exactly such an example. I had been wanting to do stochastic geometry for a long time, and finally got the chance to try it; the above is how it honestly felt. Still, for a while I threw myself in completely, and that drive to make something happen moved even me. And when writing the report I found that several proofs were only plausible-looking, and I had to complete them one by one myself, which also counts as research training.
3. In the end, I still like this direction very much. Given the chance, I will go draw more strange things, and I need to keep working hard. My advisor also said that now that I have gotten a first glimpse of the field, I should keep at it: pick a problem or a proof to work on, rather than waiting for someone to tell me whether and how it can be proved; otherwise wouldn't I just become a DM?
4. For the defense, use fewer slides. The teacher said one slide per minute; any faster and the audience gets unhappy...
5. From now on, when reading a paper: a rough first pass for the main idea, a skim to grasp the approach, and then (if really studying the paper) a careful pass to verify the computations!!!
6. We didn't take a group photo in the end. I feel I still need a result of my own to be satisfied. A small goal: write a paper before the end of 3A.
That's all for now. Keep going!
Tuesday, December 13, 2016
Sunday, December 4, 2016
Erlang and Jackson Networks
This is a note reviewing MAP554, along with some points about networks.
M/M/1, M/M/∞, birth and death
These are the basic models of queueing theory. M/M/1 has a single server and a geometric invariant measure,
$$\pi(n) = p^n (1 - p), \qquad p = \frac{\lambda}{\mu}.$$
M/M/$\infty$ has infinitely many servers and a Poisson invariant measure,
$$\pi(n) = e^{-p} \frac{p^n}{n!}, \qquad p = \frac{\lambda}{\mu},$$
where $\lambda$ is the arrival rate and $\mu$ the service rate. More general birth-and-death chains can be handled by changing the service power.
Erlang network:
This is just an application of the truncation technique: if we already have a network with a reversible invariant measure, we can generate a new one by rescaling the rates on part of the state space. That is,
$$\tilde q(x,y) = C\, q(x,y), \quad \forall x \in A,\ y \in S \setminus A, \qquad \tilde q(x,y) = q(x,y) \text{ otherwise.}$$
Then the new invariant measure becomes
$$\tilde\pi(x) = K\pi(x),\ \forall x \in A, \qquad \tilde\pi(y) = K C\,\pi(y),\ \forall y \in S \setminus A, \qquad K = \frac{1}{\pi(A) + C\,\pi(S \setminus A)}.$$
The application: taking $C = 0$, the network is confined to the part $A$. For example, in the routing network with restriction $R$, we can first solve the case without restriction to get $\pi$, which is just several independent M/M/1 queues, then restrict and normalize:
$$\tilde\pi(x) = K\pi(x),\ \forall x \in R, \qquad K = \frac{1}{\sum_{x \in R} \pi(x)}.$$
Jackson network
A more general network model is the following. Each station $i$ has external arrival rate $\lambda_i$ and handles service at rate $\phi_i(n_i)\mu_i$. Here $\phi_i(n_i)$ can be seen as the power of the server: in the M/M/1 case it is always $1$, and in the M/M/$\infty$ case $\phi_i(n_i) = n_i$. The difference is that after each service at station $i$, the customer moves to station $j$ with probability $r_{ij}$. The key is to find effective rates $\tilde\lambda_i$ satisfying the traffic equations
$$\tilde\lambda_i = \lambda_i + \sum_j \tilde\lambda_j r_{ji};$$
then the stations look independent and the invariant measure is
$$\pi(n) = \prod_i \frac{\tilde p_i^{\,n_i}}{\prod_{m=1}^{n_i} \phi_i(m)}, \qquad \tilde p_i = \frac{\tilde\lambda_i}{\mu_i}$$
(up to normalization).
Friday, December 2, 2016
Lévy characterization, martingale representation and change of probability
I am preparing for the final, so I am writing some notes for the mathematical finance course.
Levy characterization for Brownian motion:
If $\phi_s^T \phi_s = \mathrm{Id}$, then
$$B_t = \int_0^t \phi_s\, dW_s$$
is a standard Brownian motion.
This theorem is very useful: it expresses the fact that Brownian motion keeps its properties under an orthogonal (conformal) transformation.
Representation of martingale:
This theorem has different versions. The most general one says that an $\mathcal{F}_t$-adapted local martingale $M_t$ (with respect to the Brownian filtration) can be written as
$$M_t = \mathbb{E}[M_0] + \int_0^t H_s\, dW_s,$$
where $H \in H^2_{loc}$.
This is a mathematical version of the perfect replication theorem. The proof proceeds through the cases $L^2$ martingale $\to$ $L^1$ martingale $\to$ local martingale. It has many applications in stochastic calculus.
Change of probability:
First we define $Z_T = \exp\left( \int_0^T \phi_s\, dW_s - \frac{1}{2} \int_0^T \phi_s^2\, ds \right)$. In general it is only a local martingale, and if it satisfies $\mathbb{E}[Z_T] = 1$, we can define a change of probability
$$\frac{dQ}{dP} = Z_T;$$
then under the new probability $Q$, the process
$$\tilde B_t = B_t - \int_0^t \phi_s\, ds$$
is a Brownian motion.
We remark that when $\phi$ is deterministic there is no problem: in this case $Z_T$ is well defined with expectation $1$. Otherwise the expectation is less clear, but Novikov's theorem says that if $\mathbb{E}\left[\exp\left( \frac{1}{2} \int_0^T \phi_s^2\, ds \right)\right] < \infty$, then all the conditions are satisfied.
The change of probability can simplify formulas and has important applications in Monte-Carlo algorithms.