The Spring 2010 Internet2 Conference was superb! I have witnessed the increase in quality and diversity of Internet2 conferences over the years, and hope that I have contributed to these changes, too. The fact is that Internet2 events are larger and more diverse: they now include not only educational and research institutions but also government and health organizations, along with growing international attendance. Polycom's participation in these events has also increased over time. In addition to the numerous presentations I have given (links are below in the 'Speaking Engagements' section), we have done amazing demos, including the TPX three-screen telepresence system that we built at the Fall 2009 Internet2 Conference as part of the telepresence interoperability effort. The spring event last week gathered 700 participants and was another excellent opportunity to experience collaboration tools with video capabilities, including the Polycom CMA Desktop and PVX soft clients, while Polycom HDX equipment was used in many sessions to connect remote participants from all around the world.
But nothing can compare to the astonishing video and audio quality used to connect LIVE both the former Homestake gold mine near Lead, South Dakota and the office of the Governor of South Dakota to the conference hotel, the Marriott Crystal Gateway in Arlington, Virginia. All attendees gathered in the big ballroom for the general session "Science Discovery and Advanced Networking 1.5 Miles Below the Earth's Surface", which focused on the plans to convert the Homestake mine in South Dakota into a Deep Underground Science and Engineering Lab (DUSEL), where physicists, biologists, and geologists could research fundamental questions about matter, energy, life, and the Earth.
It looks like almost every kind of scientific research would benefit from the underground lab: geologists want to study the rocks and figure out why there is no more gold in the mine, while physicists want to study neutrinos and dark matter, and hide from the cosmic radiation that interferes with a lot of their experiments. Whatever they end up doing in this lab, it will generate a lot of data that has to be transported to research institutes around the world over a very fast network. And since getting in and out of the mine is not easy, advanced voice and video communication is needed for scientists underground to stay in touch with their peers on the surface. The general session gave a preview of what Polycom audio-video technology can do in the tough mine environment characterized by dust, water, and wide temperature variation.
The mine itself is up to 8,000 feet (2,438 meters) deep, which makes it the deepest in North America, but most of the work today is done at 4,850 feet (1,478 meters) underground, and that is exactly where the Polycom HDX 8000 system was installed. Optical fiber goes to the surface and connects to South Dakota's Research, Education and Economic Development (REED) network, which supports two 10 gigabit per second waves and links the state's six public universities. REED also connects with the Great Plains regional research and education network at Kansas City, which peers with Internet2. Internet2 links with the Mid-Atlantic regional network, which had a 1 gigabit per second link to the conference site in Arlington. Pretty much the same network, except for the underground part, was used to connect the second remote participant in the session: South Dakota Governor Michael Rounds. The original plan to have him in the mine was scrapped because of safety concerns, and another Polycom HDX 8000 system connected the governor's office to Arlington.
I have seen many demos of Polycom technology over good networks. The Polycom corporate IP network is designed for audio and video and provides very good quality. But nothing I have seen compares to the perfect network used during the general session last week. Not a single packet was lost, and the delay was just not there, so the interaction among on-site and remote participants was flawless. The HDX 8000 systems ran at High Definition 1080p video quality and full-band (22 kHz) audio quality over connections of 6 megabits per second. The audience could see, hear, and almost smell the thick air in the deep mine, and the pristine quality delivered a fully immersive experience that made everyone in Arlington feel 'in the mine'. It felt surreal to be so close and so far away at the same time. 700 conference attendees joined me in that experience.
It is impossible to capture the immersive experience during the session but I will try to at least give my blog readers some feeling of the event.
I took a picture of the Governor of South Dakota Michael Rounds speaking about the creation of an underground science lab in the Homestake mine. I also shot a short video of this part of the session.
When Kevin Lesko, DUSEL Principal Investigator, spoke from the Homestake mine, I took a still picture and shot a short video of him, too.
The Q&A part of the session used a split screen to allow conference attendees to see both the Governor and the team underground at the same time, and engage in live discussion. Here is a picture and a video clip from that part of the session.
The interaction in the Q&A session was spectacular. The Governor and the team in the Homestake mine answered numerous questions from the audience, and the exchange across such distances felt completely natural. At the conclusion of the session, Internet2 President and CEO Doug Van Houweling thanked all contributors to the session. He thanked Polycom for providing the video equipment for this incredible discussion that highlighted both the advances of audio-video technology and the enormous capabilities of the Internet2 network.
Throughout the 75-minute session, the audio and video quality was impressive. Several attendees came to me after the session to share their surprise and excitement about the immersive experience. Most of them wanted to know how to make their own video conferencing systems deliver similar quality, which of course led to discussions about the recent advances in audio and video technology, including compression, cameras, microphones, and networking.
I am sure several of my blog followers attended the session "Science Discovery and Advanced Networking 1.5 Miles Below the Earth's Surface", and I would love to get their comments.
Monday, May 3, 2010
Sunday, October 25, 2009
PART 8: ‘TELEPRESENCE INTEROPERABILITY IS HERE!’
The results from the telepresence interoperability demo were discussed on October 7 in the session “Telepresence Interoperability is Here!” http://events.internet2.edu/2009/fall-mm/agenda.cfm?go=session&id=10000758&event=980. Bob Dixon used visual and sound effects (including love songs and Hollywood-style explosions) to explain interoperability to people who are less involved in the topic. His presentation inspired me to write about telepresence interoperability for a less technical, more general audience. (I hope that my series of blog posts achieved that.) Bob highlighted that this was not only the first multi-vendor telepresence interoperability but also the first time systems on Internet2, the commodity Internet, and Polycom's, Tandberg's, and IBM's networks successfully connected.
Gabe connected through an HDX video endpoint to RSS 2000 and played back some key parts of the recording from the interoperability demos on October 6 (http://www.flickr.com/photos/20518315@N00/4015164486/). I was actually pleasantly surprised how much information the RSS 2000 captured during the demos. I later found out that Robbie had created a special layout using the MLA application on RMX2000, and this layout allowed us to see multiple sites in the recording.
Robbie (over video from Ohio State) commented that connecting the telepresence systems was the easier part while modifying the layouts turned out to be more difficult. He was initially surprised when RMX/MLA automatically associated video rooms 451, 452, and 453 at Ohio State into a telepresence system but then used this automation mechanism throughout the interoperability tests.
Jim talked about the need to improve usability.
Gabe talked about monitoring the resources on RMX 2000 during the tests and reported that it never used more than 50% of the resource.
I talked mainly about the challenges to telepresence interoperability (as described in Part 2) and about the need to port some of the unique video functions developed in H.323 into SIP, which is the protocol used in Unified Communications.
Bill (over video from IBM) explained that his team has been testing video interoperability for a year. The results are used for deployment decisions within IBM but also for external communication. IBM is interested in more interoperability among vendors.
During the Q&A session, John Chapman spontaneously joined the panel to answer questions about the demo call to Doha and about the modifications of their telepresence rooms to make them feel more like classrooms.
The Q&A session ran over time, and a number of attendees stayed afterwards to talk with the panelists.
There was a consensus in the room that the telepresence interoperability demo was successful and very impressive. This success proves that standards and interoperability are alive and can connect systems from different vendors running on different networks. The series of tests was also a great teamwork experience in which experts from several independent, sometimes competing, organizations collaborated toward a common goal.
Back to beginning of the article ... http://videonetworker.blogspot.com/2009/10/telepresence-interoperability.html
Labels:
Internet2,
interoperability,
telepresence
Friday, October 23, 2009
PART 7: TELEPRESENCE INTEROPERABILITY DEMO
The demo on October 6 was the first immersive telepresence demo at Internet2. Note that Cisco showed their CTS 1000 telepresence system at the previous Internet2 conference; however, that system has only one screen and feels more like an HD video conferencing system than an immersive telepresence system. Also, the Cisco demo was on stage and far away from viewers, while the TPX demo was available for everyone at the conference to experience.
The following multi-codec systems participated in the telepresence interoperability demo:
- Polycom TPX HD 306 three-screen system in the Chula Vista Room, Hyatt Regency Hotel
- Polycom TPX HD 306 three-screen system located in Andover, Massachusetts
- LifeSize Room 100 three-screen system located at OARnet in Columbus, Ohio
- Polycom RPX 200 at iFormata in Dayton, Ohio
- Polycom RPX 400 at IBM Research in Armonk, NY
- Tandberg T3 three-screen system located in Lisbon, Portugal (the afternoon demos were too late for Rui, and Bill connected a T3 system in New York instead)
The systems were connected either to the Polycom RMX 2000 located at Ohio State University in Columbus, Ohio, or to the Tandberg Telepresence Server (TTPS) at IBM Research in Yorktown Heights, NY.
As for the setup in Chula Vista, TPX comes with 6 chairs, and there were an additional 30 chairs arranged in several rows behind the system. There was enough space for people to stand in the back of the room. (http://www.flickr.com/photos/20518315@N00/4014401487/)
I can only share my experience sitting in the TPX system in the Chula Vista Room. I am sure other participants in the demo have experienced it a little differently. I was tweeting on the step-by-step progress throughout the demos.
The final test plan included both continuous presence and voice switching scenarios. Voice switching is a mechanism widely used in video conferencing: the conference server detects who is speaking, waits 2-3 seconds to make sure it is not just noise or a brief comment, and then starts distributing video from that site to all other sites. The twist, when telepresence systems are involved, is that not just one but all 2, 3, or 4 screens belonging to the 'speaking' site must be distributed to all other sites. Voice-switched tests worked very well; sites appeared as expected.
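The hold-off behavior described above can be sketched in a few lines. This is a minimal illustration, not the actual RMX implementation; the class name, the 2.5-second hold-off value, and the audio-level interface are all my own assumptions.

```python
import time

HOLD_OFF_SECONDS = 2.5  # wait before declaring a new active speaker


class VoiceSwitcher:
    """Toy model of voice switching: track which site is loudest and
    switch the distributed source only after the same site has been
    loudest for HOLD_OFF_SECONDS (filtering out noise and brief comments)."""

    def __init__(self):
        self.active_site = None      # site whose screens are being distributed
        self.candidate = None        # site currently loudest
        self.candidate_since = None  # when the candidate became loudest

    def on_audio_levels(self, levels, now=None):
        """levels: dict mapping site name -> audio energy.
        Returns the site whose screens (all of them) should be distributed."""
        now = time.monotonic() if now is None else now
        loudest = max(levels, key=levels.get)
        if loudest != self.candidate:
            # new loudest site: restart the hold-off timer
            self.candidate, self.candidate_since = loudest, now
        elif (self.candidate != self.active_site
              and now - self.candidate_since >= HOLD_OFF_SECONDS):
            self.active_site = self.candidate  # hold-off elapsed: switch
        return self.active_site
```

A brief interjection from another site resets that site's timer but never reaches the hold-off threshold, so the active site stays unchanged, which is exactly the filtering the post describes.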
Continuous presence, another technology used in video conferencing, allows the conference server to build customized screen layouts for each site. The layout can be manipulated by management applications; for example, RMX Manager and MLA manipulate the layouts in RMX 2000. (http://www.flickr.com/photos/20518315@N00/4014401683/)
TPX performed flawlessly. On October 5, most calls were at 2 Mbps per screen due to some bottlenecks when crossing networks. This issue was later resolved, and on October 6 TPX connected at 4 Mbps per screen (12 Mbps total). TPX was using the new Polycom EagleEye 1080 HD cameras that support 1080p @30fps and 720p @60fps. We used 720p @60fps, which provides additional motion smoothness.
About quality: The quality of multipoint telepresence calls on RMX 2000 was excellent. A video recorded in Chula Vista is posted at http://www.youtube.com/watch?v=XpfNmJtAtVg. In a few test cases, we connected the TPX systems directly to TTPS, and the quality decreased noticeably.
About reliability: In addition to the failure during the first test (described in Part 5), TTPS failed during the morning demo on October 6 (I was tweeting throughout the demo and have the exact time documented here http://twitter.com/StefanKara/status/4633989195). RMX 2000 performed flawlessly.
About layouts: Since TTPS is advertised as a customized solution for multipoint telepresence, I expected it to handle telepresence layouts exceptionally well. Throughout the demos, Robbie Nobel used the MLA application to control RMX 2000 while Bill Rippon controlled TTPS. In summary, RMX 2000 handled telepresence layouts better than TTPS. The video http://www.youtube.com/watch?v=XpfNmJtAtVg shows a layout created by RMX 2000; the T3 system is connected to RMX through TTPS. In comparison, when the telepresence systems were connected directly to TTPS, even the best layout was a patchwork covering a small portion of the TPX screens. (http://www.flickr.com/photos/20518315@N00/4014401367/) I understand that, due to the built-in automation in TTPS, the user has limited ability to influence the layouts. While MLA includes layout automation, it does allow the user to modify layouts and select the best layout for the conference.
About capacity: TTPS is a 16-port box and each codec takes a port, so it can connect at most five 3-screen systems or four 4-screen systems. Bill therefore could not connect all available systems on TTPS; the server simply ran out of ports. In comparison, RMX 2000 had 160 resources and each HD connection took 4 resources, so RMX 2000 could connect a maximum of 40 HD codecs, i.e., thirteen 3-screen systems or ten 4-screen systems. RMX therefore never ran out of capacity during the demo.
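The capacity arithmetic above is easy to verify. This small sketch just restates the figures given in the post (16 TTPS ports at one per codec; 160 RMX resources at 4 per HD connection):

```python
# Capacity figures as stated in the post.
ttps_ports = 16            # TTPS: one port per codec
rmx_resources = 160        # RMX 2000 resource pool
per_hd_codec = 4           # resources consumed by one HD connection

rmx_codecs = rmx_resources // per_hd_codec  # 40 HD codecs

for screens in (3, 4):
    # each multi-screen system uses one codec (and thus one port) per screen
    print(f"{screens}-screen systems -> "
          f"TTPS: {ttps_ports // screens}, RMX: {rmx_codecs // screens}")
```

Running it reproduces the post's numbers: five vs. thirteen 3-screen systems, and four vs. ten 4-screen systems.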
The morning and lunch interoperability demos were recorded on a Polycom RSS 2000 recorder at IP address 198.109.240.221.
We ran three interoperability demos during the morning, lunch, and afternoon conference breaks. In addition, we managed to squeeze in two additional demos that highlighted topics relevant to Internet2 and the education community. In the first one, we connected the TPX in Chula Vista to the Polycom RPX 218 system at Georgetown University in Doha, Qatar on the Arabian Peninsula, and had a very invigorating discussion about the way Georgetown uses telepresence technology for teaching and learning. John Chapman from Georgetown University and Ardoth Hassler from the National Science Foundation joined us in the Chula Vista room. If you are interested in that topic, check out the joint Georgetown-Polycom presentation at the Spring '09 Internet2 conference http://events.internet2.edu/2009/spring-mm/sessionDetails.cfm?session=10000467&event=909. The discussion later turned to using telepresence technology for grant proposal review panels.
Another interesting demo was meeting Scott Stevens from Juniper Networks over telepresence and discussing with him how Juniper’s policy management engine interacts with Polycom video infrastructure to provide high-quality of experience for telepresence.
Throughout all interoperability and other demos, the Internet2 network performed flawlessly; we did not notice any packet loss, and jitter was very low.
Stay tuned for Part 8 with summary of the test and demo results … http://videonetworker.blogspot.com/2009/10/part-8-telepresence-interoperability-is.html
Labels:
demo,
Internet2,
interoperability,
telepresence
PART 6: TELEPRESENCE INTEROPERABILITY LOGISTICS
Bringing a TPX to San Antonio required a lot of preparation. We had to find a room in the conference hotel Hyatt Regency that had enough space for the system and for additional chairs for conference attendees to see the demo.
Another important consideration for the room selection was proximity to the loading dock. The TPX comes in 7 large crates, and we did not want to move them all over the hotel. The truck also had to fit Hyatt's loading dock.
It was critical to have the IP network on site up and running before the TPX system could be tested. Usually a lot of the work and cost is related to bringing a high-speed network connection to the telepresence system. This was not an issue at the Internet2 conference, since I2 brings 10 Gbps to each conference site. We needed only about 12 Mbps (roughly 0.1% of that) for TPX.
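The headroom figure above is simple to check: 12 Mbps of a 10 Gbps link works out to about a tenth of a percent.

```python
link_mbps = 10_000   # Internet2 brings 10 Gbps to the conference site
tpx_mbps = 12        # TPX: three screens at ~4 Mbps each

share = tpx_mbps / link_mbps
print(f"TPX uses {share:.2%} of the link")  # prints "TPX uses 0.12% of the link"
```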
Timing was critical too. The Polycom installation team had to do the installation over the weekend so that everything would work on Monday morning. The room that we identified was Chula Vista on the lobby level. It was close to the loading dock and had enough space. The only issue was that the room was booked for another event on Wednesday, so TPX had to be dismantled on Tuesday, right after the last interoperability demo finished at 4:30pm.
Stay tuned for Part 7 about the telepresence interoperability demo at the Internet2 Conference on October 6, 2009 … http://videonetworker.blogspot.com/2009/10/part-7-telepresence-interoperability.html
Labels:
Internet2,
interoperability,
logistics,
telepresence,
TPX
Monday, October 19, 2009
PART 1: WHY TELEPRESENCE INTEROPERABILITY?
On October 6, 2009, Bob Dixon from OARnet moderated a successful telepresence interoperability demonstration at the Fall Internet2 meeting in San Antonio, Texas. It included systems from Polycom, LifeSize, and Tandberg, and the short version of the story is in the joint press release http://finance.yahoo.com/news/Polycom-Internet2-OARnet-iw-1109370064.html?x=0&.v=1. While the memories from this event are still very fresh, I would like to spend some time reflecting on the long journey that led to this success.
First of all, why is telepresence interoperability so important?
The video industry is built on interoperability among systems from different vendors, and customers enjoy the ability to mix and match elements from Polycom, Tandberg, LifeSize, RadVision and other vendors in their video networks. As a result, video networks today rarely have equipment from only one vendor. It was therefore natural for the video community to strive for interoperability among multi-screen/multi-codec telepresence systems.
Most experts and visionaries in our industry subscribe to the idea that visual communication will become as pervasive as telephony is today, and it has been widely recognized that the success of the good old Public Switched Telephone Network (PSTN) is based on vendors adhering to standards. Lack of interoperability, on the other hand, leads to inefficient network implementations of media gateways that transcode (translate) the digital audio and video information from one format to another, thus increasing delay and decreasing quality. While gateways exist in voice networks, e.g. between the PSTN and Voice over IP networks, their impact on delay and quality is far smaller than the impact of video gateways. Therefore, interoperability of video systems, telepresence and otherwise, is even more important than interoperability of voice systems.
The International Multimedia Teleconferencing Consortium (IMTC) has traditionally driven interoperability based on the H.323 protocol. At the IMTC meeting in November '08 http://www.imtc.org/imwp/download.asp?ContentID=14027, the issue came up in three of the sessions, and there were heated discussions about how to tackle telepresence interoperability. The conclusion was that IMTC had expertise in signaling protocols (H.323) but not in the issues around multi-codec systems.
In February’09, fellow blogger John Bartlett wrote on NoJitter about the need for interoperability to enable business-to-business (B2B) telepresence and I replied on Video Networker http://videonetworker.blogspot.com/2009/03/business-to-business-telepresence.html, basically saying that proprietary mechanisms used in some telepresence systems create obstacles to interoperability.
In April '09, Bob Dixon from Ohio State and OARnet invited all telepresence vendors to the session 'Telepresence Perspectives and Interoperability' at the Spring Internet2 conference http://events.internet2.edu/2009/spring-mm/agenda.cfm?go=session&id=10000509&event=909. He chaired the session and, in conclusion, challenged all participating vendors to demonstrate interoperability of generally available products at the next Internet2 event. All vendors but HP were present. Initially, everyone agreed that this was a great idea. Using Internet2 to connect all systems would allow vendors to test without buying each other's expensive telepresence systems. Bandwidth would not be an issue, since Internet2 has so much of it. And since the interoperability would be driven by an independent third party, i.e. Bob Dixon, there would be no competitive fighting.
In June’09, I participated in the session ‘Interoperability: Separating Myth from Reality’ at the meeting of the Interactive Multimedia & Collaborative Communications Alliance (IMCCA) during InfoComm in Orlando, Florida http://www.infocommshow.org/infocomm2009/public/Content.aspx?ID=984&sortMenu=105005, and telepresence interoperability was on top of the agenda.
During InfoComm, Tandberg demonstrated connection between their T3 telepresence system and Polycom RPX telepresence system through the Tandberg Telepresence Server. The problem with such demos is always that you do not how much of it is real and how much is what we call ‘smoke and mirrors’. For those not familiar with this term, ‘smoke and mirrors’ refers to demos that are put together by modifying products and using extra wires, duct tape, glue and other high tech tools just to make it work for the duration of the demo. The main question I had around this demo was why a separate product like the Tandberg Telepresence Server was necessary? Couldn’t we just use a standard MCU with some additional layout control to achieve the same or even better results? To answer these questions, we needed an independent interoperability test. Ohio State, OARnet, and Internet2 would be the perfect vehicle for such test; they are independent and have a great reputation in the industry.
Stay tuned for Part 2 about the challenges to telepresence interoperability … http://videonetworker.blogspot.com/2009/10/part-2-telepresence-interoperability.html
First of all, why is telepresence interoperability so important?
The video industry is built on interoperability among systems from different vendors, and customers enjoy the ability to mix and match elements from Polycom, Tandberg, LifeSize, RadVision and other vendors in their video networks. As a result, video networks today rarely have equipment from only one vendor. It was therefore natural for the video community to strive for interoperability among multi-screen/multi-codec telepresence systems.
Most experts and visionaries in our industry subscribe to the idea that visual communication will become as pervasive as telephony is today, and it has been widely recognized that the success of the good old Public Switched Telephone Network (PSTN) is based on vendors adhering to standards. Lack of interoperability, on the other hand, leads to inefficient network implementations with media gateways that transcode (translate) the digital audio and video information from one format to another, thus increasing delay and decreasing quality. While gateways exist in voice networks, e.g. between the PSTN and Voice over IP networks, their impact on delay and quality is far smaller than the impact of video gateways. Therefore, interoperability of video systems – telepresence and others – is even more important than interoperability of voice systems.
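The delay penalty of a transcoding gateway is easy to see with a back-of-envelope calculation. The figures below are illustrative assumptions, not measurements of any particular product, but the structure of the math holds: a gateway must fully decode and re-encode the media, so every gateway in the path adds roughly another decode/encode cycle plus buffering.

```python
# Back-of-envelope comparison of one-way video delay with and without
# transcoding gateways in the media path. All numbers are illustrative
# assumptions, not measurements of any real product.

ENCODE_MS = 60    # sender-side capture + encode
NETWORK_MS = 40   # one-way network latency
DECODE_MS = 60    # receiver-side decode + render

# A transcoding gateway fully decodes and re-encodes the stream,
# adding roughly another decode + encode cycle plus jitter buffering.
GATEWAY_TRANSCODE_MS = 120
GATEWAY_BUFFER_MS = 30

def end_to_end_delay(transcoding_hops: int = 0) -> int:
    """Total one-way delay in milliseconds for a path that traverses
    the given number of transcoding gateways."""
    base = ENCODE_MS + NETWORK_MS + DECODE_MS
    return base + transcoding_hops * (GATEWAY_TRANSCODE_MS + GATEWAY_BUFFER_MS)

print(end_to_end_delay(0))  # direct, standards-based call: 160 ms
print(end_to_end_delay(1))  # one gateway in the path: 310 ms
```

With these assumed figures, a single transcoding hop nearly doubles the one-way delay, which is why standards-based interoperability (avoiding gateways altogether) matters so much for interactive video.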
The International Multimedia Teleconferencing Consortium (IMTC) has traditionally driven interoperability based on the H.323 protocol. At the IMTC meeting in November’08 http://www.imtc.org/imwp/download.asp?ContentID=14027, the issue came up in three of the sessions and there were heated discussions about how to tackle telepresence interoperability. The conclusion was that IMTC had expertise in signaling protocols (H.323) but not in the issues around multi-codec systems.
In February’09, fellow blogger John Bartlett wrote on NoJitter about the need for interoperability to enable business-to-business (B2B) telepresence and I replied on Video Networker http://videonetworker.blogspot.com/2009/03/business-to-business-telepresence.html, basically saying that proprietary mechanisms used in some telepresence systems create obstacles to interoperability.
In April’09, Bob Dixon from Ohio State and OARnet invited all telepresence vendors to the session ‘Telepresence Perspectives and Interoperability’ at the Spring Internet2 conference http://events.internet2.edu/2009/spring-mm/agenda.cfm?go=session&id=10000509&event=909. He chaired the session and, in conclusion, challenged all participating vendors to demonstrate interoperability of generally available products at the next Internet2 event. All vendors but HP were present. Initially, everyone agreed that this was a great idea. Using Internet2 to connect all systems would allow vendors to test without buying each other’s expensive telepresence systems. Bandwidth would not be an issue since Internet2 has so much of it. And since the interoperability effort would be driven by an independent third party, i.e. Bob Dixon, there would be no competitive fighting.
In June’09, I participated in the session ‘Interoperability: Separating Myth from Reality’ at the meeting of the Interactive Multimedia & Collaborative Communications Alliance (IMCCA) during InfoComm in Orlando, Florida http://www.infocommshow.org/infocomm2009/public/Content.aspx?ID=984&sortMenu=105005, and telepresence interoperability was on top of the agenda.
During InfoComm, Tandberg demonstrated a connection between their T3 telepresence system and the Polycom RPX telepresence system through the Tandberg Telepresence Server. The problem with such demos is always that you do not know how much of it is real and how much is what we call ‘smoke and mirrors’. For those not familiar with this term, ‘smoke and mirrors’ refers to demos that are put together by modifying products and using extra wires, duct tape, glue and other high-tech tools just to make it work for the duration of the demo. The main question I had about this demo was why a separate product like the Tandberg Telepresence Server was necessary. Couldn’t we just use a standard MCU with some additional layout control to achieve the same or even better results? To answer these questions, we needed an independent interoperability test. Ohio State, OARnet, and Internet2 would be the perfect vehicle for such a test; they are independent and have a great reputation in the industry.
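The core of what ‘layout control’ would have to do on top of a standard MCU can be sketched quite simply: map each camera stream from the far-end multi-codec room onto the local screen in the same spatial position, so the remote room appears geometrically correct. The sketch below is purely hypothetical; the `Stream` type, room names, and function are illustrative and do not correspond to any real MCU or telepresence API.

```python
# Hypothetical sketch of multi-codec layout control on a standard MCU.
# All names (Stream, assign_layout, "T3") are illustrative assumptions,
# not a real product API.

from dataclasses import dataclass

@dataclass
class Stream:
    room: str
    position: str   # "left", "center", or "right" camera in the far-end room

def assign_layout(remote_streams: list[Stream],
                  local_screens: list[str]) -> dict[str, Stream]:
    """Place each remote camera stream on the local screen with the same
    spatial position, preserving the far-end room's left-to-right geometry."""
    by_position = {s.position: s for s in remote_streams}
    return {pos: by_position[pos] for pos in local_screens if pos in by_position}

# A three-codec far-end room mapped onto a three-screen local room:
remote = [Stream("T3", "left"), Stream("T3", "center"), Stream("T3", "right")]
layout = assign_layout(remote, ["left", "center", "right"])
print({pos: s.room for pos, s in layout.items()})
```

The point of the sketch is that the mapping itself is straightforward; the open question at the time was whether the necessary stream identification and control signaling could be done with standard protocols on an ordinary MCU, rather than requiring a dedicated box.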
Stay tuned for Part 2 about the challenges to telepresence interoperability … http://videonetworker.blogspot.com/2009/10/part-2-telepresence-interoperability.html
Labels: Internet2, OARnet, Ohio State University, telepresence