[Olsr-dev] Thoughts about an ETX-FF-Metric plugin

Henning Rogge (spam-protected)
Thu Jun 19 14:35:12 CEST 2008

On Thursday, 19 June 2008 at 14:18:44, Markus Kittenberger wrote:
> imo it's a bad idea to start routing over a link again 15 seconds after it
> was disastrous, even when the link is disastrous every 30 seconds, just
> because we have already forgotten this...
That's right... especially for networks with mobile nodes.

> if a link is lossy only once for some seconds, my approach would let it
> recover quite fast to nearly the LQ it had before.

> ...

The example looks fine. :)

> > hmm... maybe we could use the "hello-timeout" value here...
> ack
> > The algorithm should work for small networks too... and in small networks
> > you have only a few packets per second.
> it will work in small networks, if the base for calculating the LQ values
> which go into the exponential window is >= the hello timeout,
> and if the algorithm for the first averaging (on interval doubling) checks
> how much data it has before deciding which fields it averages...

Let's see... we have two tuning parameters for links: the Hello-Interval 
(which should be related to the mobility of the node and its neighbors) and 
the Hello-Timeout (something like the "maximum memory" of a link).

Do you think we could use these two parameters to get reasonable values for 
your model?
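As a rough illustration of that idea (this is only a sketch under my own assumptions, not olsrd code): with time slots of one hello interval each, the hello timeout would bound how many slots of history the averaging window keeps.

```c
/*
 * Illustrative only: derive the size of the averaging window from the
 * two link tuning parameters discussed above. Assumes one time slot
 * per hello interval and uses the hello timeout as the "maximum
 * memory" of the link; both names and the mapping are assumptions.
 */
static unsigned window_slots(float hello_interval, float hello_timeout)
{
    unsigned n = (unsigned)(hello_timeout / hello_interval);

    /* keep at least one slot, even in very small/slow networks */
    return n > 0 ? n : 1;
}
```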

I will see if I can implement this algorithm during the weekend, or maybe next 
week.

> Markus
> #
> for the event of a restarting neighbouring olsrd, detectable through
> (unexpectedly) new/low sequence numbers, the link sensing could handle
> this by deleting the history (at least the parts containing 100% loss
> due to the reboot of the device)
If it was just a reboot, we might simply "jump" to the new sequence number; no 
need to kill the history.

If it was a long downtime, the link will have timed out and we will start with 
a fresh history.
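The "jump" case above could be detected with a wrap-around-safe sequence-number comparison. A minimal sketch, assuming a hypothetical per-link state struct (the names are illustrative, not olsrd's own):

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical per-link state; names are illustrative only. */
struct link_history {
    uint16_t last_seqno;  /* last HELLO sequence number seen on this link */
    bool     valid;       /* have we seen any sequence number yet? */
};

/*
 * Record an incoming sequence number and report whether it looks like a
 * neighbor restart (an unexpectedly low/old number). In that case we
 * just resynchronize to the new number and keep the loss history, as
 * suggested above for the plain-reboot case.
 */
static bool record_seqno(struct link_history *h, uint16_t seqno)
{
    bool restarted = false;

    if (h->valid) {
        /* wrap-around-safe 16-bit distance from the last number */
        uint16_t diff = (uint16_t)(seqno - h->last_seqno);
        if (diff > 32768u)
            restarted = true;  /* large backwards jump: likely a reboot */
    }
    h->last_seqno = seqno;
    h->valid = true;
    return restarted;
}
```

The long-downtime case needs no extra code here: the link times out and the history starts fresh anyway.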

> ##
> how deep the LQ sinks, and partly how fast it recovers, depends on the
> parameters of the function which calculates the LQ from the x (in my
> example 4) worst time slots (wLQ)
> and the other time slots (oLQ)
> LQ = (wLQ + oLQ) / 2
> could be parametrized to
> LQ = (wweight * wLQ) + ((1 - wweight) * oLQ)
> wweight stands for "worst slot weight", and reasonable values would be
> between 0.4 and 0.7 i guess
I'm very interested in this idea because it's independent of the link-quality 
metric we use. Even with better metrics we could still use this algorithm 
to "smooth" the incoming data.
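The parametrized formula quoted above is a one-liner; a sketch, assuming the history has already been split into the average of the x worst slots (wLQ) and the average of the remaining slots (oLQ):

```c
/*
 * Sketch of the proposed smoothing from the quoted mail. "wweight" is
 * the worst-slot weight; values between 0.4 and 0.7 were suggested.
 * How wLQ and oLQ are computed from the time slots is left out here.
 */
static float smoothed_lq(float w_lq, float o_lq, float wweight)
{
    return wweight * w_lq + (1.0f - wweight) * o_lq;
}
```

With wweight = 0.5 this reduces to the original LQ = (wLQ + oLQ) / 2; raising wweight makes the metric punish bad slots harder and recover more slowly.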


Diplom Informatiker Henning Rogge
Forschungsgesellschaft für
Angewandte Naturwissenschaften e. V. (FGAN) 
Neuenahrer Str. 20, 53343 Wachtberg, Germany
Tel.: 0049 (0)228 9435-961
Fax: 0049 (0)228 9435-685
E-Mail: (spam-protected)
Web: www.fgan.de
Sitz der Gesellschaft: Bonn
Registergericht: Amtsgericht Bonn VR 2530
Vorstand: Dr. rer. nat. Ralf Dornhaus (Vors.), Prof. Dr. Joachim Ender 
