Monday, May 4, 2009

Summary of the Internet2 Meeting in Arlington, Virginia, April 27-29, 2009

Internet2 http://www.internet2.edu/ is a not-for-profit high-speed networking organization. Its members include more than 200 U.S. universities, 70 corporations, and 45 government agencies, laboratories, and other institutions of higher learning. Internet2 maintains relationships with over 50 international partner organizations, such as TERENA in Europe. Internet2 has working groups that focus on network performance and on middleware shared across educational institutions for application development. For example, InCommon focuses on user authentication in federated environments, Shibboleth on web single sign-on and federations, Grouper on a group-management toolkit, and perfSONAR on performance monitoring.

Internet2 members meet twice a year. The spring 2009 meeting took place in Arlington, Virginia, just outside Washington, D.C., and gathered about 640 participants from 280 organizations; 96 of the participants were from corporate members. Here is a short video from the Internet2 reception on Monday, April 27: http://www.youtube.com/watch?v=C13uAKg7omQ.

Polycom has been a member of Internet2 for years, and has contributed equipment and sponsored events. Six HDX systems were used in Arlington to connect remote participants, for example from Kenya and Ecuador. I have been involved in Internet2 since 2007 and have presented at several meetings. At the event this week, my two presentations addressed telepresence. The first one was on Tuesday, where I was part of a large panel of vendors in the telepresence industry http://events.internet2.edu/2009/spring-mm/agenda.cfm?go=session&id=10000509&event=909. I shot a short video during the preparation of the telepresence panel http://www.youtube.com/watch?v=NvYwF-HqdWo.

One thing I do not like about vendor panels is that folks tend to jump into product pitches and competitive fighting. In telepresence panels there is also a tendency to define telepresence in a way that matches the vendor's own products and excludes everything else. Instead, I focused on the broader definition of telepresence and the different levels of interoperability that we should look at as an industry. In my view, telepresence is an experience (as if you are in the same room with the people on the other side) that can be achieved with different screen sizes, codecs, and audio technologies. For example, people using Polycom RPX may consider Cisco CTS not immersive enough, while a properly positioned single-screen HDX system with the right background can provide a more immersive experience than a three-screen system on a multipoint call. All speakers seemed to agree that the remote control is not part of the telepresence experience. Some insisted that the cameras have to be fixed, but I do not agree with that. If a three-screen system is connected to another three-screen system, the cameras have one angle. If you connect the same three-screen system to a two-screen system, changing the camera angle could deliver a better experience for the remote (two-screen) site. So in my view, moving cameras are fine as long as the movement happens automatically and the user is not involved in the process, as the sketch below illustrates.
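To make the camera-angle point concrete, here is a minimal sketch in Python of the kind of automatic adjustment I have in mind. The preset values and function names are hypothetical, not any vendor's actual API:

    # Hypothetical sketch: pick a camera preset from the far site's screen count.
    # Preset values are illustrative, not any vendor's actual API.

    PRESETS = {
        3: {"pan_deg": 0.0,  "note": "straight-on framing for a 3-screen far site"},
        2: {"pan_deg": 7.5,  "note": "wider angle so two screens cover the room"},
        1: {"pan_deg": 15.0, "note": "widest angle; one screen shows the whole table"},
    }

    def select_camera_preset(remote_screen_count):
        """Return the preset for the far site's screen count, defaulting to 1 screen."""
        return PRESETS.get(remote_screen_count, PRESETS[1])

    # The adjustment happens when the call connects, with no user involvement:
    preset = select_camera_preset(remote_screen_count=2)
    print("Applying pan %.1f degrees: %s" % (preset["pan_deg"], preset["note"]))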

Signaling-level interoperability is important because the market includes systems that use H.323, SIP, and proprietary protocols, but using the same signaling protocol does not by itself deliver interoperability. There is no standard for transmitting spatial information, e.g., where the screens are located, which audio channel is on which side, and what the camera angle is. While video interoperability is easier due to the wide adoption of the H.264 standard, audio interoperability is still a problem. There are several competing wideband and full-band speech and audio codecs that are incompatible, so systems from different vendors today negotiate down to a low-quality common denominator that does not support a telepresence experience (the sketch after this paragraph illustrates the effect). I got a lot of positive feedback after the panel; Internet2 attendees are much more interested in a balanced analysis of the interoperability issues than in products. Presentation slides are posted on the session page. The session was streamed, and the recording should be available for viewing in a few days.
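A minimal sketch of that codec fallback, assuming each side lists its audio codecs in preference order. The codec lists are illustrative; real SDP offer/answer negotiation carries far more detail:

    # Simplified sketch of offer/answer codec selection. The lists below are
    # illustrative: each vendor leads with its own high-quality codec.

    VENDOR_A_AUDIO = ["Siren22", "G.722.1", "G.722", "G.711"]  # full-band first
    VENDOR_B_AUDIO = ["AAC-LD", "G.722", "G.711"]              # different full-band codec

    def negotiate(offer, answer):
        """Return the first codec in the offer that the answerer also supports."""
        for codec in offer:
            if codec in answer:
                return codec
        raise ValueError("no common codec")

    # The proprietary full-band codecs do not match, so the call falls back
    # to the common denominator and the audio quality drops accordingly:
    print(negotiate(VENDOR_A_AUDIO, VENDOR_B_AUDIO))  # prints "G.722"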

My second telepresence presentation was a joint session with John Chapman from Georgetown University http://events.internet2.edu/2009/spring-mm/sessionDetails.cfm?session=10000467&event=909. John described the history of Georgetown's remote campus in Qatar and the attempts to connect it back to the main campus in Washington, D.C. via video conferencing and collaboration tools. He then described the decision process that led to the selection and installation of two Polycom Real Presence Experience (RPX) systems.

My presentation provided an overview of the existing telepresence options from Polycom (different sizes of the RPX 400 series, the RPX 200 series, and the TPX system) that can meet the requirements for immersive interaction of up to 28 people per site. I focused on the technologies used in creating the telepresence experience: monitors, cameras, microphones, speakers, and furniture. Then I talked about the new functionality in the recently released TPX/RPX V2.0 and about the differences between the Video Network Operations Center (VNOC) service and the newly announced Assisted Operations Service (AOS). Using video clips proved very effective and made the presentation interactive. Presentation slides will be available for viewing at the session web page.

Now a couple of highlights from the meeting …

The IETF chair Russ Housley talked about successful protocols http://events.internet2.edu/speakers/speakers.php?go=people&id=2546, which seems to be a recurring theme at IETF, as you can see in my summary of the last IETF meeting. Russ focused on the main challenges for the Internet: increasing demand for bandwidth, the need to reduce power consumption in network elements, creating protocols that run well on battery-powered mobile devices, support for new applications (like video streaming and real-time video communication), and, finally, the near-empty pool of IPv4 addresses and the urgent need to migrate to IPv6. Russ Housley also called for more academic researchers to become involved in IETF. This only reinforces my observation that IETF has been taken over by vendors, and researchers are now in the minority; see my summary of the last (74th) IETF Meeting here http://videonetworker.blogspot.com/2009/04/summary-of-74th-ietf-meeting-in-san.html.

I know Ken Klingenstein from previous Internet2 and TERENA meetings. His primary focus is federated identities, and he presented on a successful implementation of federation at the national level in Switzerland: http://events.internet2.edu/2009/spring-mm/agenda.cfm?go=session&id=10000483&event=909. I talked to him during the break. The InCommon group http://www.incommonfederation.org/about.cfm wanted to create a mechanism for user authentication in federated environments. They looked at Kerberos, SIP Digest Authentication, etc., but none fit the federated environment, so InCommon developed a mechanism that replicates web HTTP authentication. For example, when the user agent sends an INVITE, the SIP server challenges it with a message that points to an authentication server recognized by this SIP server. The user agent connects over HTTP to the authentication service (which can be anything, e.g., Kerberos, NTLM, or Digest), gets authenticated, and then sends its authenticated information (name, organization, location, email address, phone number, etc., combined in a SAML assertion) to the destination. What is still needed is a standard mechanism to send the SAML assertion to the destination, either in a SIP message or out-of-band (through another protocol); the sketch below walks through the flow. In Switzerland, SWITCH created an ID card with that information, and the destination user agent displays this ID card to the user, who decides whether and how to respond, e.g., accept the call. This authentication mechanism is very important for video endpoints that connect to a federation. Endpoints today support Digest authentication in pure SIP environments or NTLM in Microsoft environments, while H.235 is not widely implemented in H.323 environments. As stated above, a universal method for authentication is required in federated environments.
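Here is a rough Python simulation of that call flow as I understand it. All function names, field names, and addresses are made up for illustration; the standard way to carry the assertion to the destination is exactly the missing piece:

    # Rough simulation of the federated-authentication flow described above.
    # Names and fields are illustrative, not a real protocol specification.

    def sip_invite(credentials=None):
        """Stub SIP server: challenge unauthenticated INVITEs."""
        if credentials is None:
            return {"status": 401, "auth_service": "https://idp.example.edu/login"}
        return {"status": 200, "caller_info": credentials}

    def http_authenticate(auth_url, username):
        """Stub auth service: Kerberos, NTLM, or Digest could sit behind this URL."""
        return {  # a SAML assertion, reduced here to a dictionary
            "name": username,
            "organization": "Example University",
            "email": username + "@example.edu",
            "issuer": auth_url,
        }

    # 1. INVITE without credentials triggers a challenge pointing to an auth service.
    challenge = sip_invite()
    # 2. The user agent authenticates over HTTP; the backend is invisible to it.
    assertion = http_authenticate(challenge["auth_service"], "alice")
    # 3. The INVITE is re-sent carrying the assertion (in-band or out-of-band).
    answer = sip_invite(credentials=assertion)
    print(answer["caller_info"])  # the callee sees an "ID card" and decides to accept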

During the general session on Wednesday morning, there was also a demo of a psychiatrist using a single-screen 'telepresence' system from Cisco to connect to a veteran and discuss possible mental problems. On the one hand, I am glad that Cisco is using its large marketing budget to popularize telepresence; this helps grow the video market as a whole. On the other hand, the whole demo implied that only Cisco provides this technology, and I addressed the issue in my presentation on Wednesday afternoon. The HD 1080p technology used in the demo is now available from Polycom and other vendors. The presentation introducing the demo referred to hundreds of installed video systems but failed to mention that the interoperability between Cisco telepresence and other video systems is so bad that it cannot be used for tele-psychiatry or any other application demanding high video quality. The demo itself was not scripted very well: the veteran did not seem to have any problems, and the psychiatrist did not seem to know how to use the system. The camera at the remote location was looking at a room corner and did not provide any telepresence experience (it looked like a typical video conferencing setup).

I attended a meeting of the Emerging National Regional Education Networks (NREN) group. The One Laptop Per Child (OLPC) program has distributed millions of laptops to children in developing countries http://laptop.org/. These laptops do not have much memory (256 MB), the CPU is not very fast (433 MHz), and their only network interface is Wi-Fi.

Ohio University decided to test what kind of video can be enabled on these laptops, so that children can participate in virtual classes. The laptops can decode H.263 video (not H.264), so the team installed the VLC media player and used it to decode video streamed over the IP network in H.263 format. An MCU transcodes the H.264 video into H.263, and the streaming protocol between the MCU and the laptop is the Real Time Streaming Protocol (RTSP) http://www.ietf.org/rfc/rfc2326.txt; a minimal sketch of the receiving side follows below. Here is how it looks http://www.youtube.com/watch?v=vjh14l-60Pc. To allow feedback (questions from the children to the presenters), they use the Pidgin chat client http://www.pidgin.im/ that talks to different chat services: AIM, Google Talk, Yahoo!, MSN, etc. Children can watch the streaming video and switch to the Pidgin application to send questions over chat.
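On the laptop side, receiving the class essentially boils down to pointing VLC at the MCU's stream. A minimal sketch, assuming VLC is installed and using a made-up stream address:

    # Minimal sketch: receive the class stream on the laptop. Assumes VLC is
    # installed; the RTSP address below is hypothetical.
    import subprocess

    STREAM_URL = "rtsp://mcu.example.edu:554/virtual-class"  # made-up address

    # Launch VLC to receive and decode the H.263 stream. The heavy
    # H.264-to-H.263 transcoding stays on the MCU, since the 433 MHz
    # laptop CPU has little headroom beyond decoding H.263.
    subprocess.run(["vlc", STREAM_URL])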

In summary, the Internet2 meeting in Arlington was very well organized and attended. It provided great opportunities to discuss how education, government, and healthcare institutions use video to improve their services.

1 comment:

  1. To address the comment from 'studio di registrazione': I would love to provide more information and share my thoughts on any issue within the blog's scope. I just need to know the specific area you are interested in.

    I will continue posting articles on a regular basis and cover key issues in the industry.
