Building an Infrastructure for Synchronous and Asynchronous Video Streaming Degree Programs
by Stuart B. Gordon
April 1, 2002

This seminar was presented at the 34th NTU Engineering Conference, Las Vegas, NV, on April 8, 2002, and at the EDUCAUSE 2002 Conference, Atlanta, GA, on October 3, 2002.

Introduction

How many of you currently use, or plan to use, video streaming in your mix of distance learning delivery methods? May I see a show of hands? Thank you. How many of you are required to add new programs within the constraint of limited resources? Thank you.

Our Challenge

Old Dominion University in Norfolk, Virginia, was forced to answer these two questions four years ago. Our TELETECHNET distance learning program began in the 1980s, and by the mid-1990s it had developed into the largest distance learning network of its kind in the country. Any student in the Commonwealth of Virginia was within 50 miles of access to a four-year degree. That changed in 1998, when our president mandated that we make our programs available to students in their homes and businesses using the Internet and streaming media, a technology that was just being introduced at the time.

Needless to say, we were very surprised to hear the mandate and were not entirely sure how to turn it into reality. We did realize, however, that we would have to add new technology to our existing infrastructure to make it happen. So our engineers, technicians, and designers put their heads together and built an answer to the mandate.

TELETECHNET History

Before I describe the various pieces of the infrastructure that we developed, let me explain how TELETECHNET works so you will understand why we made certain decisions when developing the video streaming architecture.

Distance learning has been an integral part of Old Dominion University since 1984. TELETECHNET is the brand name we use for a variety of distance learning networks that utilize different technologies. The implementation of the satellite delivery network in 1994 caused a dramatic increase in the scope of our distance learning activities.

Currently, the University broadcasts approximately 300 live, interactive courses per year in up to 30 undergraduate and graduate degree programs to more than 60 sites throughout Virginia, as well as to the District of Columbia and a half dozen other states in the U.S. Our network utilizes one-way video and two-way audio for student interactivity. Most of our students take the first two years of their undergraduate degrees at a community college campus; they then remain at the community college site and complete their last two years with Old Dominion through interactive technology.

The university also offers both undergraduate and graduate degree programs via two-way video. We use our satellite network and CD-ROM to deliver undergraduate programs to naval personnel on bases and on ships, and a graduate degree program to submarine officers while deployed under the world’s oceans. We’re also establishing a new network between the United States and Turkey to deliver medical and engineering programs.

We have now added video streaming to our mix of technologies. For the current academic year, 2002-2003, we estimate that TELETECHNET will register over 20,000 course enrollments.

Streaming Infrastructure

Part of our original directive was to develop a video streaming architecture that piggybacked on our TELETECHNET satellite system, because that system was successful, had an excellent reputation, and was accredited. In 1998, accrediting boards knew nothing about this new delivery method called video streaming and were not sure it would provide an acceptable education for distance learning students. So we built an infrastructure that would give streaming students the same educational experience our satellite students received.

In each of our six video streaming-capable classrooms, cameras and microphones capture the learning experience and feed the signals to a sub-control room. Incoming questions from students at each distant site are fed through an audio bridge and mixed in the sub-control room as well. The audio and video are also recorded on a digital video (DV) tape deck in the sub-control room; this tape serves as a backup for the video streaming process if the archive encoder has a problem during the live class delivery. The sub-control signals are routed through Master Control, where supervisors monitor all incoming and outgoing signals for satellite delivery.

As the signals are routed to the MPEG-2 encoders for satellite delivery, they are simultaneously routed to a set of three video streaming encoders. We have five sets of three encoders, making fifteen total, plus one hot-swappable spare. To be precise, the video signal is sent directly to the encoder, while the audio signal first passes through a dbx unit that strips out the high and low frequencies before it reaches the encoder.

We made the decision early in the project to offer only a broadband, 220 kbps stream. Two of the encoders are set up to provide a redundant high-bandwidth live stream to our Real Media server for synchronous streamed classes. The third encoder is set up to create an archived Real Media file for asynchronous video-on-demand delivery.
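
To make the division of labor within one encoder set concrete, here is a minimal configuration sketch; the identifiers and field names are illustrative assumptions, not our actual settings.

```python
# Hypothetical description of one encoder set; names and values are illustrative only.
ENCODER_SET = {
    "stream_bitrate_kbps": 220,  # single broadband target rate
    "encoders": [
        {"id": "enc-1", "role": "live-primary",   "output": "Real Media server"},
        {"id": "enc-2", "role": "live-redundant", "output": "Real Media server"},
        {"id": "enc-3", "role": "archive",        "output": "local Real Media file"},
    ],
}
```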

Each encoder is a Gateway Pentium III, 600 MHz computer with 256 MB of RAM, a 30 GB hard drive, an Osprey 100 video digitizing card, and an audio capture card. We added an ADS TV Superscan NTSC-out card so that the encoding process can be monitored by the sub-control room operator. Each encoder is attached to a 100 Mbps Ethernet connection for delivery of the data signals to the Real Media server. Each encoder runs a stripped-down version of Windows NT 4.0, Real Producer Plus encoder software, and Stream Scheduler, our video streaming automation software.

Stream Scheduler Development

We began to create Stream Scheduler almost as soon as we started to build the video streaming infrastructure. We realized that once we figured out how to stream, we would have to replicate the process and scale up rapidly, but might not have the personnel to do so. We needed to automate to save time and money. We also realized that the procedure of turning on the encoder, encoding the stream, turning off the encoder and transmitting the archive to the server would become monotonous very quickly. We had to automate to save our sanity!

Stream Scheduler is actually five products in one and handles all aspects of the synchronous and asynchronous video streaming process. It comprises the following components (a structural sketch in code follows the list):

  1. a scheduling database
  2. an encoding engine
  3. an FTP transmission program
  4. a communications manager
  5. a post-production development program.
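
To show how these five pieces could fit together at run time, here is a minimal structural sketch in Python; the class and method names are invented for illustration and are not our actual source code.

```python
import time

class StreamScheduler:
    """Illustrative skeleton only; component and method names are hypothetical."""

    def __init__(self, db, encoder, ftp, comms, reencoder):
        self.db = db                # 1. scheduling database (one record per class stream)
        self.encoder = encoder      # 2. encoding engine (drives the live/archive encoders)
        self.ftp = ftp              # 3. FTP transmission program (moves archives to the server)
        self.comms = comms          # 4. communications manager (event log, pager alerts)
        self.reencoder = reencoder  # 5. post-production re-encode manager

    def run(self):
        """Poll the local copy of the schedule and drive the other components."""
        while True:
            for record in self.db.streams_due_to_start():
                self.encoder.start(record)   # live stream plus local archive
            for record in self.db.streams_due_to_stop():
                self.encoder.stop(record)
                self.ftp.queue(record)       # archive is transferred when time allows
            time.sleep(30)                   # check the schedule twice a minute
```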

The scheduling database is created
The first step is to create a centralized database of the courses to be streamed for a specific semester. This database resides on a server and is downloaded to each encoder at pre-determined times during the day. It contains a unique record for each class to be streamed, with the data we need to generate and identify each stream.
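
As an illustration of what one such record might hold, here is a hypothetical layout; the field names are assumptions, not the actual TELETECHNET schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StreamRecord:
    """Hypothetical record for one scheduled class stream (field names are illustrative)."""
    course_id: str         # identifier for the class being streamed
    encoder_set: str       # which set of three encoders handles this class
    class_start: datetime  # scheduled class start; the live stream begins earlier
    class_end: datetime    # scheduled end of the satellite time slot
    live_stream_name: str  # name of the live stream on the Real server
    archive_filename: str  # local file name for the archived Real Media file
```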

Stream Scheduler encodes streams
Based on the encoder’s internal clock, Stream Scheduler then starts the live stream for a particular class. At Old Dominion, we start all streams 10 minutes before the actual class begins so that as students sign in, they can verify that their Real Player is working properly.

Stream Scheduler also stops the stream at the conclusion of the class period. We allow the live stream to continue for 3 minutes after the end of the satellite transmission time slot to accommodate faculty who run longer than the allotted time. Stream Scheduler monitors the status of the Real server during encoding and reports an error if the server becomes unavailable.
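
The timing rule itself is simple; a sketch of it, building on the hypothetical record above, might look like this.

```python
from datetime import datetime, timedelta

LEAD_IN = timedelta(minutes=10)  # live stream starts 10 minutes before class
OVERRUN = timedelta(minutes=3)   # live stream runs 3 minutes past the time slot

def should_be_streaming(record, now=None):
    """Return True while the live stream for this class should be running.

    Assumes a record with class_start/class_end fields, as in the sketch above;
    this illustrates the timing rule only, not the production implementation.
    """
    now = now or datetime.now()
    return record.class_start - LEAD_IN <= now <= record.class_end + OVERRUN
```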

Stream Scheduler transmits streams via FTP to server
The live streams are automatically transmitted to the Real server as they are created. The archive file is created and stored on the encoder’s local hard drive.

When the class session is completed, Stream Scheduler checks to see if another class is scheduled to start on that encoder set. If one is, the archive file is placed in a queue for transmission later, when there is adequate time. If there is no subsequent class, Stream Scheduler transmits the archived file to the Real server via the File Transfer Protocol (FTP). Each 2-hour, 45-minute lecture creates a file of 275-300 megabytes that takes 10-15 minutes to transfer to the Real server.

The Stream Scheduler FTP program establishes a connection with the server, and transmits the file. It also checks for errors and other problems during the transmission, closes the connection when the transfer is complete, and records all of this data in a transmission log for reporting.
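
A minimal sketch of that FTP step in Python follows; the host, credentials, and directory are placeholders, and the real program also manages the queueing and retry behavior described above.

```python
import ftplib
import logging
import os

log = logging.getLogger("stream_scheduler.ftp")

def transfer_archive(local_path, host, user, password, remote_dir):
    """Send one archived Real Media file to the media server over FTP.

    Host, credentials, and paths are placeholders; error handling is deliberately simple.
    """
    try:
        with ftplib.FTP(host) as ftp:
            ftp.login(user, password)
            ftp.cwd(remote_dir)
            with open(local_path, "rb") as f:
                ftp.storbinary("STOR " + os.path.basename(local_path), f)
        log.info("Transferred %s to %s", local_path, host)
        return True
    except ftplib.all_errors as exc:  # covers FTP protocol and socket errors
        log.error("Transfer of %s failed: %s", local_path, exc)
        return False
```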

Stream Scheduler communicates status and errors
Stream Scheduler is a totally automated system that does not require any human interaction. Everything that takes place in Stream Scheduler is recorded as an event in a log. This log provides extensive data for creating reports, for monitoring runtime conditions (encoder status, server status, network accessibility, etc.), and for troubleshooting problems that occur.

For example, when an encoder encounters a problem, Stream Scheduler sends a digital message to our on-call engineer’s pager. With this information, the engineer at home can correct the problem by telephone, working with on-site personnel who are not video streaming specialists.
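
Here is a hedged sketch of that reporting path; the email-to-pager gateway address and mail relay are assumptions for illustration, since the talk does not describe the actual paging mechanism.

```python
import logging
import smtplib
from email.message import EmailMessage

log = logging.getLogger("stream_scheduler")

# Assumed for illustration: an email-to-pager gateway address for the on-call engineer.
ONCALL_PAGER = "oncall-engineer@pager.example.edu"

def report_error(event, details):
    """Record an error event in the log and page the on-call engineer."""
    log.error("%s: %s", event, details)
    msg = EmailMessage()
    msg["From"] = "stream-scheduler@example.edu"
    msg["To"] = ONCALL_PAGER
    msg["Subject"] = "Stream Scheduler error: " + event
    msg.set_content(details)
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)
```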

Re-Encode Manager handles post-production encoding
When errors take place or when archive files have to be changed, Re-Encode Manager enables us to fix the problems in our post-production studio. Re-Encode Manager uses the same database of semester course information to encode the DV tape master. It was developed using the ActiveX control provided by Real Networks as part of the Real Software Development Kit, so it takes advantage of the codecs and features of the Real Networks family of products.

Return on Investment

Stream Scheduler has taken an enormous investment of time to develop. We continue to create new pieces as dictated by the program’s requirements, but the return on investment has far exceeded the development cost.

  • Stream Scheduler has cost approximately $40,000 to develop over the past 2.5 years.
  • We have streamed an average of thirty 3-hour courses each semester for each of the past two years.
  • Accounting for the live streams and the post production time, we estimate that we have saved 750 hours of personnel time per semester for the past two years.
  • At an average hourly rate of $12, we have saved a total of $54,000, and realized net savings of $14,000.
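
As a back-of-the-envelope check of these figures (assuming three semesters per academic year, which is not stated in the talk but is how the per-semester hours reconcile with the totals):

```python
# Rough check of the savings figures; the three-semesters-per-year count is an assumption.
hours_per_semester = 750
semesters = 3 * 2              # two years of streaming
hourly_rate = 12               # dollars
development_cost = 40_000      # dollars over 2.5 years

gross_savings = hours_per_semester * semesters * hourly_rate   # 54,000
net_savings = gross_savings - development_cost                 # 14,000
print(gross_savings, net_savings)
```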

We plan to re-engineer Stream Scheduler into a generic commercial product that can assist other universities in their media streaming and distance learning initiatives. This re-engineering has already begun, and we expect the software to be available in the first quarter of next year.

Your Backyard

Now that I have described our video streaming challenges and solutions, how can YOU use this information in your own backyard?

From a technical standpoint, the video streaming puzzle has three pieces for you to put together (a summary sketch follows the list):

  1. Acquire the Signal: As you have seen, we were able to utilize the distance learning broadcast infrastructure we already had in place to acquire the signals in each of our classrooms and send them to our encoders. You may already have this component in place as well. If not, you will have to determine how you are going to capture the classroom experience (cameras, microphones, audio-visual tools, switching system, etc.) and send those signals to your encoder.
  2. Encode the Signal: Many of your video streaming technical decisions must be made here.
    • You must decide what hardware and software to use to encode your video signal, and which streaming platform (Real Media, Windows Media, or QuickTime) makes the most sense for your audience.
    • You must decide if you will offer live streams only, archived streams only, or both.
    • You must decide if you will encode for broadband students only, or for students who access the stream over both dial-up and broadband connections.
    • And you must decide who in your organization will be responsible for the daily operation of your encoders. This last decision, for us, resulted in the development of our Stream Scheduler software, which permitted us to both automate the process and control its quality.
  3. Distribute the Signal: Once you have captured the video signal and encoded it, you must decide how to distribute it. At ODU, we chose to purchase the Real Server software and to control the distribution process ourselves. You may choose to follow a similar path, or you may decide to partner with an outside video streaming distribution company to deliver your streams to your students. We use an outside firm for special events if the anticipated demand is too great for our servers so that our regular audience isn’t adversely affected. Many of these distribution companies can also handle the entire acquisition-encoding-distribution process if you do not wish to develop it yourself.
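
One way to keep these choices explicit is to write them down as a small plan before buying anything. The sketch below is illustrative only, with example values rather than recommendations.

```python
# Illustrative "streaming plan" capturing the three pieces above; values are examples only.
STREAMING_PLAN = {
    "acquire": {
        "source": "existing classroom broadcast infrastructure",  # cameras, mics, switching
    },
    "encode": {
        "platform": "Real Media",        # or Windows Media, QuickTime
        "modes": ["live", "archive"],    # live only, archive only, or both
        "bitrates_kbps": [220],          # add a low rate if dial-up students must be served
        "operations": "automated (Stream Scheduler-style software)",
    },
    "distribute": {
        "server": "in-house Real Server",
        "overflow": "outside distribution firm for large special events",
    },
}
```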

Since adding video streaming to our TELETECHNET distance learning program, we have discovered a way to offer complete degree programs to students on their personal computers at work and at home. As we’ve built the infrastructure to support video streaming, we’ve had the opportunity to develop Stream Scheduler, a new breed of automation software that enables us to add new programs within the constraint of limited resources. We will continue to build upon the excellent reputation of our TELETECHNET distance learning program, and explore new ways to use video streaming to expand our markets and increase our enrollments.
