
Timeshifting A Conference: Can we all agree? Please? August 21, 2009

Posted by gordonwatts in computers, Conference, DeepTalk, Video.

A video feed or recording of a big physics conference is a mixed blessing.


If there is a video recording of a huge conference – like DPF – it would run to hundreds of hours. Many of the parallel sessions describe work that is constantly being updated, so it isn’t clear how long the video would stay relevant if you posted it. I’ve seen conferences post video of only the plenary sessions and skip the parallel sessions, I imagine for exactly this reason.

I definitely appreciate it when one of the big conferences does furnish video or streaming. But I have a major problem: time shifting. Even if I’m awake during the conference, it is rare that I can devote real time to watching it. Or if there is a special talk, I might have to try to arrange my schedule around it. But come on, folks – we’ve solved this problem, right? Tivo!?!? Or for us old folks, it is called a VCR!!!

Which brings me to the second issue with conference video: formats. For whatever reason the particle physics world has mostly stuck with RealMedia in one form or another. Ugh. I was badly burned back in the day by the extra crap that RealMedia installed on my machine, so I’m gun-shy now. But the format is also hard to manipulate. I tried a recent version of their player (maybe about 6 months ago) and it has a nice recording feature – exactly what I need here. But I couldn’t figure out how to convert its stored format to mp4 or anything else I could download to my mp3 player! There are some open-source implementations out there, but I’ve never encountered one good enough to reliably parse these streams.

This year’s Lepton-Photon is trying something new. They are streaming in RealMedia, but they also have an mp4 stream. And the free VLC player can play it. What is better, VLC can record it! And convert it! Hooray!!! I can now download and convert these talks and listen/watch them on my commute to work and back, which is perfect for me (the picture above is a screen capture of the stream in VLC). The picture isn’t totally rosy, however. VLC seems to lose the stream every now and then, so while I’m recording I have to watch the player like a hawk and restart it. Sometimes it will go two hours between drops, other times just 10 minutes. It would be nice if it would auto-restart.
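Absent that feature, a small wrapper script can fake the auto-restart. Here is a minimal sketch: it re-launches whatever recording command you hand it each time the process exits, and gives up after a few back-to-back quick deaths, which usually means the stream itself has ended. The `cvlc` command line in the comment is purely illustrative – the real stream URL and VLC output options would need to be filled in.

```python
import subprocess
import time

def record_with_restart(cmd, max_quick_exits=3, min_uptime=5.0):
    """Re-run `cmd` every time it exits, so a dropped stream restarts itself.

    If the process dies within `min_uptime` seconds several times in a row,
    assume the stream is really over and stop. Returns the number of runs.
    """
    runs = 0
    quick_exits = 0
    while quick_exits < max_quick_exits:
        start = time.time()
        subprocess.run(cmd)  # blocks until the recorder exits
        runs += 1
        if time.time() - start < min_uptime:
            quick_exits += 1  # died almost immediately - probably no stream
        else:
            quick_exits = 0   # a good long run resets the failure count
    return runs

# Hypothetical usage - adjust the URL and VLC options for the real stream:
# record_with_restart(["cvlc", "http://example.org/lp09-stream.mp4",
#                      "--sout", "#std{access=file,mux=mp4,dst=talk.mp4}"])
```

With something like this running, a two-hour session records unattended; the only tuning is deciding how many instant failures mean "the talk is over" rather than "the stream hiccuped."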

Which brings me to the last problem. Discoverability. I really like the way my DeepTalk project puts up a conference as a series of talks. But the only reason it works is because the conference is backed by a standard agenda/conference tool, Indico. My DeepTalk tools can interface with that, grab the agenda in a known format, and render it. We have no such standard for video.

Wouldn’t it be great if everyone did it the same way? You could point your iTunes/Zune/RealMedia/whatever tool at a conference; it would figure out the times the conference ran, schedule a recording of the streams, or, if the video was attached, download the data. You’d come back after the conference was over, click “put conference on my mp3 player,” jump on that long plane flight to Europe, and drift off to sleep to the dulcet sounds of someone describing the latest update to the W mass and how it has moved the most probable Higgs mass a few GeV lower.
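The download half of that dream already has standard plumbing: a podcast-style RSS feed with one enclosure per talk is exactly what iTunes and friends know how to subscribe to. As a sketch (the conference name, talk title, and URL below are made up), a tool that scraped an Indico agenda could emit something like:

```python
import xml.etree.ElementTree as ET

def talks_to_podcast_rss(conference, talks):
    """Turn a list of talks into a minimal podcast-style RSS 2.0 feed.

    Each talk is a dict with "title", "url" (to an mp4 file), and "date"
    (an RFC 822 date string, as podcast clients expect).
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = conference
    for talk in talks:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = talk["title"]
        # The <enclosure> element is what podcast clients actually download.
        ET.SubElement(item, "enclosure", url=talk["url"],
                      type="video/mp4", length="0")
        ET.SubElement(item, "pubDate").text = talk["date"]
    return ET.tostring(rss, encoding="unicode")

# Hypothetical talk list - a real tool would pull this from the agenda:
feed = talks_to_podcast_rss("Lepton-Photon 2009", [
    {"title": "W mass update", "url": "http://example.org/wmass.mp4",
     "date": "Mon, 17 Aug 2009 09:00:00 GMT"},
])
```

The point is not the dozen lines of XML but the convention: if every conference published a feed like this next to its agenda, every existing podcast client would become a conference time-shifter for free.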

Would that be bliss, or what!?



1. chimpanzee - August 21, 2009

For the Strings ’07 conference, I assisted them in getting their .mp4 videos (converted to iPod/iPhone-compatible .mp4) into a “Strings ’07” iTunes video-podcast I set up for them.


That way, conference attendees could subscribe to the video-podcast, (selectively) download lectures to their iPod/iPhone, and watch them at their convenience – or watch them leisurely on an Apple TV in the living room.

Strings ’07 was using Apple’s QTSS (QuickTime Streaming Server), and did an excellent job of recording and uploading to the web within a few hours of each lecture. They were really “with it” in terms of turnaround and using the available technology. It seems Lepton-Photon ’09 is also tech-savvy; they used two solutions (RealMedia and mp4 over VLC). It’s a competitive market out there, which creates these multiple solutions and a seemingly tortuous task for end-users, who want a SINGLE UNIFIED solution, not a buffet of formats and playback methods.

There was a recent development in the business world, which may change the landscape. Google bought On2/VP8:

For those who love “competition” when it comes to codecs – the competitive war just heated up. Google just bought On2/VP8 and apparently is going to open-source VP8. Is it an end run around H.264 – a way to monopolize HTML5 video via Chrome, Google, YouTube, Mozilla and Opera? Jan Ozer ran tests several months back which showed that VP8 is at least as good as – or better than – H.264.

A few years ago I mentioned here Mark Cuban’s comment that “video over the web would be a 7 trillion dollar a year industry (or did he say 3 trillion?) in the near future.” There were many Doubting Thomases here at that point about handheld video devices. A year later came the start of several generations of the iPod, and then the iPhone/Touch. And now everyone and his brother and sister are churning out iPhone look-alikes. Then add 20 or so brands of netbooks using GPU and Broadcom chippies to run HD video. It’s no wonder that 20% of all television advertising has moved to the web in the last year or so.

Google has to keep producing more and more revenue (to keep stock prices up, or at least look like they are going to make more and more money), and one way to do that is with video businesses of one stripe or another. Whoever produces the “best” (cheaper, quicker and better) delivery system will rule. It’s no secret that M$, Job$ and Adobe/Fla$h all want the $7,000,000,000,000/year honey pot.

Is Google going to use HTML5 (with an open-source VP8) as a vehicle to drive their dominance? That’s the last thing that M$ or Job$ or Adobe/Fla$h wants. As you likely know, Apple and Google both took off their “kid gloves” with each other a few days ago. It will be interesting to see how M$, Job$ and Adobe/Fla$h realign themselves over the next few months (friend or foe?). Of course, I could be all wet (again).

I would expect that most universities and NGO’s would appreciate a “cheaper, quicker and better” method of delivering video – either to campuses and/or to the web in general.

Over 12 years ago I had this sign on my desk:
Choose two out of three: cheap, fast and good.
Can you now have all three?

It also comes down to the fact that if you want to play with the big boys (Adobe, Apple, Microsoft) you better have some good toys (and big balls) in the video delivery area. That was missing until now with Google.

The iPhone’s Safari browser won’t play Flash (Apple’s competitor), and Flash is the dominant video player in the market – i.e., it is how YouTube delivers its video content.

2. tim head - August 21, 2009

You mean something like: http://us.pycon.org/2009/conference/schedule/ ?

Something I don’t understand is why conferences like PyCon (and others organised by volunteers – I pick on PyCon just because I had the page open anyway) manage to produce such easy-to-navigate and comprehensive “archives” of a conference, while we in HEP struggle to have live streams from conferences.

3. Chip Brock - August 22, 2009

The question I have is the on-site effort and expense. Take the PyCon setup: any clue what synch software they used? Because of the zooming, they had a person with a camera. Maybe I’ve not noticed it before, but having the slides small and the person large is an interesting idea. With the slides separately available in full resolution, one could use the on-screen slide images just as a cue to tell you when to click through the full-size ones. Usually it’s the other way around, with the person very small and the slides larger. In fact, pedagogically, having the viewer manipulate something during the talk would keep them in the game, so to speak.

I want to do something like this in lectures. We’re going to try to make a live and a remote version of a course on HEP for general-education students, and I’m currently trying to research the delivery options for synching the human and the slides. Mediasite seems the best, but it’s expensive.

4. tim head - August 22, 2009

The short answer to what they used for synchronization is: I don’t know.
The slightly longer one: Ask on the python mailing list.

Why don’t you just record the video from the camera and the input to the projector? This would seem like an easy way to get synchronised slides.

My guess is that the cameras were manned by volunteers.

5. Time shifting a Conference: Video Formats « Life as a Physicist - August 31, 2009

[…] wanted to change the topic a little bit – motivated by two things: Lepton/Photon and a comment by chimpanzee. And sorry if this gets a little technical… I’m on a rant […]

6. Gordon Watts - August 31, 2009

Thanks for the comments – I was away for a full week enjoying bumming around the French countryside with the family. Replies will come slowly – some on the blog…

7. Time shifting Video: Recording « Life as a Physicist - September 6, 2009

[…] September 6, 2009 Posted by gordonwatts in Conference, computers. trackback In my first post on video there were a few comments on the effort required to record the video in the room. The basic […]
