Re: [qpimd-users] Unable to route multicast video streams using qpimd.


From: Everton Marques
Subject: Re: [qpimd-users] Unable to route multicast video streams using qpimd.
Date: Mon, 26 Oct 2009 11:44:25 -0200

Hi,

You did not mention whether you got a source-specific IGMPv3 join for the
channel (S,G)=(192.168.4.60,239.255.255.250). Please note that qpimd cannot
program the multicast forwarding cache for non-source-specific groups.
Usually the key step is to instruct the receiver application to join the
source-specific channel (S,G), as sketched below.
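
For reference, here is a minimal sketch (not part of qpimd; the UDP port and
the interface choice are placeholders) of how a Linux receiver application
can join the source-specific channel with the standard
IP_ADD_SOURCE_MEMBERSHIP socket option, which makes the kernel emit an
IGMPv3 source-specific report:

/* ssm_recv.c - minimal SSM receiver sketch; error handling kept short. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    /* Bind to the UDP port the stream is sent to (1234 is a placeholder). */
    struct sockaddr_in local;
    memset(&local, 0, sizeof(local));
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(1234);
    bind(sock, (struct sockaddr *)&local, sizeof(local));

    /* Source-specific join: group 239.255.255.250, source 192.168.4.60. */
    struct ip_mreq_source mreq;
    memset(&mreq, 0, sizeof(mreq));
    inet_pton(AF_INET, "239.255.255.250", &mreq.imr_multiaddr);
    inet_pton(AF_INET, "192.168.4.60", &mreq.imr_sourceaddr);
    mreq.imr_interface.s_addr = htonl(INADDR_ANY); /* or the receiver's own address */

    if (setsockopt(sock, IPPROTO_IP, IP_ADD_SOURCE_MEMBERSHIP,
                   &mreq, sizeof(mreq)) < 0) {
        perror("IP_ADD_SOURCE_MEMBERSHIP");
        return 1;
    }

    char buf[2048];
    ssize_t n = recv(sock, buf, sizeof(buf), 0); /* first packet of the stream */
    printf("received %zd bytes\n", n);
    return 0;
}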

Regarding the config, the basic rules are:
1) Enable "ip pim ssm" everywhere (on every interface that should pass
multicast traffic).
2) Enable both "ip pim ssm" and "ip igmp" on interfaces attached to the
receivers (IGMPv3 hosts).

An even simpler rule to remember is to enable both commands on every
interface; they should not cause any harm.

Hence, if your multicast receiver is attached to Node 2 on ra_ap0, I think
you will need at least the following config:

!
! Node 1
!
interface ra_ap0
 ip pim ssm
interface ra_sta0
 ip pim ssm

!
! Node 2
!
interface ra_ap0
 ip pim ssm
 ip igmp
interface ra_sta0
 ip pim ssm
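
Once the receiver has issued the (S,G) join, "show ip igmp groups" on Node 2
(the same command you used below) should list 239.255.255.250 on ra_ap0; if
it does not, the IGMPv3 report is not reaching qpimd on that interface.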

Hope this helps,
Everton

On Mon, Oct 26, 2009 at 4:42 AM, Yoda geek <address@hidden> wrote:
> Hi Everton & fellow qpimd users,
>
> We're trying to stream multicast video traffic between a Tversity server and
> a multicast client separated by 2 nodes (node1 and node2). Each node is
> running quagga suite (version 0.99.15) along with qpimd (version 0.158)
> running on top of Linux 2.6.26.
> Node 1 has 3 network interfaces - eth0, ra_ap0 and ra_sta0
> Node 2 has 2 network interfaces - ra_sta0 and ra_ap0
> The Tversity server talks to interface ra_ap0 on Node 1 and the multicast
> client talks to interface ra_ap0 on Node 2
> Nodes 1 and 2 talk with each other over their ra_sta0 interfaces
>
> Below is a graphical depiction :
>
> Tversity server --> ra_ap0 [Node 1] ra_sta0 --> ra_sta0 [Node 2] ra_ap0 --> Video Client
>
>
> Node 1 pimd.conf file
> ==================
> !
> ! Zebra configuration saved from vty
> ! 2009/08/01 20:26:06
> !
> hostname node1
> password zebra
> enable password zebra
> log stdout
> !
> interface eth0
> !
> interface eth1
> !
> interface lo
> !
> interface ra_ap0
> ip pim ssm
> ip igmp
> ip igmp query-interval 125
> ip igmp query-max-response-time-dsec 100
> ip igmp join 239.255.255.250 192.168.4.60
> !
> interface ra_sta0
> ip igmp
> ip igmp query-interval 125
> ip igmp query-max-response-time-dsec 100
> !
> !
> ip multicast-routing
> !
> line vty
> !
>
> Node 2 pimd.conf configuration file
> ============================
> !
> ! Zebra configuration saved from vty
> ! 2009/08/02 21:54:14
> !
> hostname node2
> password zebra
> enable password zebra
> log stdout
> !
> interface eth0
> !
> interface eth1
> !
> interface lo
> !
> interface ra_ap0
> ip igmp
> ip igmp query-interval 125
> ip igmp query-max-response-time-dsec 100
> ip igmp join 239.255.255.250 192.168.4.60
> !
> interface ra_sta0
> ip igmp
> ip igmp query-interval 125
> ip igmp query-max-response-time-dsec 100
> !
> !
> ip multicast-routing
> !
> line vty
> !
>
> From the above configuration you can see that interface ra_ap0 on node 1 is
> configured to be the multicast source side (ip pim ssm).
> We do see some multicast join requests in Wireshark from both the server and
> the client, but no data flows. Initially we started qpimd without
> the "ip igmp join ..." entry on either the client-side node or the
> server-side node. Looking at node 1 through "show ip igmp groups" we didn't
> see the group membership for "239.255.255.250", while this group membership
> was observed on node 2. I added this join on both nodes to force them to
> join this multicast group - however without success.
>
> Just to give you some background: when both the client and the server are
> talking to the same node - say node 2, on the same interface ra_ap0, without
> qpimd running - multicast video gets served flawlessly from the Tversity
> server to the client through the node.
> But with the 2-node setup we aren't able to see the video streams go through
> to the client.
>
> Could you please review the above configuration for errors, or suggest how
> to resolve this issue? Any help would be greatly appreciated.
>
> Thanks,
>
>



