 Home / 3D Theory & Graphics / DVD playback technical question
 
Archive Notice: This thread is old and no longer active. It is here for reference purposes. This thread was created on an older version of the flipcode forums, before the site closed in 2005. Please keep that in mind as you view this thread, as many of the topics and opinions may be outdated.
 
jag

February 28, 2005, 02:03 PM

Hey,


I need to be able to play a DVD at 1920x1080. I also need access to each frame to do my image manipulations (high dynamic range), which raises the question: would I be able to do this in software, or does it need to be done through an onboard hardware decoder?


If I try to do it in software, it theoretically would not work because of bus limitations (1920x1080 x 30 frames/sec works out to something like 120 megs a second).
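A quick back-of-the-envelope check of that bandwidth figure (hypothetical numbers, assuming uncompressed frames and no chroma subsampling):

```python
# Uncompressed-frame bandwidth for 1080p playback at two pixel depths.
width, height = 1920, 1080
fps = 30

# 24 bits per pixel (8 each for R, G, B):
mb_24bpp = width * height * 3 * fps / (1024 * 1024)
print(round(mb_24bpp))  # 178 MB/s

# 16 bits per pixel, which lands near the "120 megs" figure quoted above:
mb_16bpp = width * height * 2 * fps / (1024 * 1024)
print(round(mb_16bpp))  # 119 MB/s
```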

I saw the 30-inch G5 display the other day and it was playing a DVD at something like 2100x1700+, cropping only about 300x300 pixels, so it was a lot more than 1920x1080. This display only works with a few NVidia cards, which means it's going through the hardware to the display without touching the PC bus. In hardware, is there a way I can get access to the frame-by-frame data after it is decoded by the GPU?

 
Nils Pipenbrinck

February 28, 2005, 04:36 PM

No

 
Reedbeta

February 28, 2005, 06:43 PM

Why do you need to play a DVD at such a huge resolution? DVDs are only stored at 720x480, you know. Why can't you do your manipulations on the 720x480 image and then use the GPU to blow it up to the size of your display?

 
jag

February 28, 2005, 09:43 PM

Nils: Thanks for the insight.

reedbeta: Well, the display we are working with only supports 1920x1280. Right now we would blow it up a bit and stick in some bars at the top and bottom, but ideally for HD we would need this to be 1920x1280 in real time, as well as for post production in the movie industry.

Blowing up would defeat the purpose of creating quality images by stretching the dynamic range of the image. Since our display has the effect of giving a 16-bit range per pixel, we would actually like to start out with 16-bit 1920x1280 images (current displays only support 8bpp) and work with that all the way through. To be real time, I've figured there's no way in software unless we have a multi-RAID configuration; I think it was something like 200 megabits per second we needed. Basically I just want to know how current hardware is able to play an HD movie at high resolutions like that of the G5 I talked about. Hopefully with the current configuration we would be able to somehow access the data per frame as well, to do our own stuff before passing it on to the display.

 
Victor Widell

March 01, 2005, 04:08 AM

So you show an image of 720x480, 24bit on a 1920x1280, 48bit display?

Makes no sense. There will be no more information to show than in the original input, hence: no enhancement. You could try some fractal analysis of the image to add artificial detail, but that's not really a realtime algorithm.

 
Sp0rtyt

March 01, 2005, 04:32 AM

This guy is drowning in his own technical jargon. Get a freakin clue Jag.

 
jag

March 01, 2005, 09:31 AM

I said it has the effect of giving a 16-bit color range; we need to do per-pixel calculations at 1920x1280 to get this. If we scale up a 720x480 after we've worked with it, it will just show up as garbage, or we can only show a portion of the screen at a time. It's not a 48-bit display; you're thinking RGB, this is per pixel. Right now we can only display 0-255 per pixel (brightness values), but ours does this with the addition of an array of LEDs at the back (also 0-255 brightness values). Mathematically the algorithm does give a theoretical output of a 16-bit color range. Hopefully someone experienced with the hardware decoder stuff can reply.
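As a sketch of what "pixel value times LED value" could mean (purely illustrative; it assumes linear responses and a hypothetical 4000 cd/m2 peak, not the actual, unpublished algorithm):

```python
# Sketch of dual modulation: displayed luminance is the product of an
# 8-bit LED backlight level and an 8-bit LCD pixel value.
# Linear responses are assumed here for illustration only.

def effective_luminance(lcd, led, peak=4000.0):
    """Map an (lcd, led) pair in 0..255 to luminance in cd/m^2."""
    transmittance = lcd / 255.0      # fraction of backlight passed
    backlight = led / 255.0 * peak   # LED zone output in cd/m^2
    return transmittance * backlight

# Full-on pixel over a full-on LED zone:
print(effective_luminance(255, 255))  # 4000.0
# Dim pixel over a dim zone reaches far below one 8-bit step of 4000:
print(effective_luminance(1, 1))      # ~0.0615
```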

 
Rui Martins

March 01, 2005, 10:07 AM

> I said it has the effect of giving a 16-bit color range; we need to do per-pixel calculations at 1920x1280 to get this. If we scale up a 720x480 after we've worked with it, it will just show up as garbage, or we can only show a portion of the screen at a time. It's not a 48-bit display; you're thinking RGB, this is per pixel. Right now we can only display 0-255 per pixel (brightness values), but ours does this with the addition of an array of LEDs at the back (also 0-255 brightness values). Mathematically the algorithm does give a theoretical output of a 16-bit color range. Hopefully someone experienced with the hardware decoder stuff can reply.

You seem to have two problems:
- you have to convert the RGB info (8R8G8B, 24Bits), into some hardware output format (Unknown to us)
- you need to scale a DVD movie (720x480) up to your hardware Display Resolution (1920x1280)

These are two distinct problems, and they should be handled distinctly too.
I'm assuming that the post processing you want to do is the Color Conversion, and not some special FX of some kind.

Since the only clue you give about the hardware is the resolution and how you control brightness (LED array), we can't help you much here; we don't even know the color composition of the system. It could be three 255-level LED arrays, one for each RGB component, or something completely different.

As for the scaling, you have several options, ranging from naive scaling, where you just do a linear stretch of each pixel (square block, rectangle), to bilinear or trilinear scaling, which uses neighboring pixels' info to improve the output.
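A minimal sketch of the bilinear case (pure Python, single channel, illustrative only):

```python
# Bilinear upscaling of a single-channel frame stored as a flat,
# row-major list. Each output pixel blends its four nearest sources.

def bilinear_scale(src, src_w, src_h, dst_w, dst_h):
    """src is a row-major list of src_w*src_h values; returns dst_w*dst_h."""
    dst = []
    for y in range(dst_h):
        fy = y * (src_h - 1) / (dst_h - 1) if dst_h > 1 else 0.0
        y0 = int(fy); y1 = min(y0 + 1, src_h - 1); wy = fy - y0
        for x in range(dst_w):
            fx = x * (src_w - 1) / (dst_w - 1) if dst_w > 1 else 0.0
            x0 = int(fx); x1 = min(x0 + 1, src_w - 1); wx = fx - x0
            top = src[y0*src_w + x0]*(1-wx) + src[y0*src_w + x1]*wx
            bot = src[y1*src_w + x0]*(1-wx) + src[y1*src_w + x1]*wx
            dst.append(top*(1-wy) + bot*wy)
    return dst

# A 2x2 ramp blown up to 3x3: the centre becomes the corner average.
out = bilinear_scale([0, 100, 100, 200], 2, 2, 3, 3)
print(out[4])  # 100.0
```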

If my assumptions are correct, I suggest you dump 720x480 frames to the hardware and then scale them there. Color conversion can be done either before or after sending the data to the hardware, which implies a software or hardware implementation respectively. My suggestion: if the color conversion is simple enough and the pixel depth is less than the original (24->16?), then convert before sending to hardware, since this makes the frame size smaller. If not, implement it in hardware.

I wish I had toy projects like this to do; they are usually a lot of fun.

As for the screen you saw, they are probably scaling the image in hardware, since most boards today support that; hence they process the DVD movie at 720x480 and then send it to the hardware to be scaled to fill the screen.

 
Nils Pipenbrinck

March 01, 2005, 10:59 AM

folks,

It does not work: the DVD data gets transferred in YCbCr format to the GPU, and the GPU does the colorspace conversion and upscale. Not a big deal for a GPU.
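For reference, that conversion is roughly this (full-range BT.601 shown for simplicity; DVDs actually use studio-range levels):

```python
# Full-range BT.601 YCbCr -> RGB, the kind of colour-space conversion
# a GPU performs during DVD playback.

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr_to_rgb(255, 128, 128))  # (255, 255, 255) white
print(ycbcr_to_rgb(0, 128, 128))    # (0, 0, 0) black
```

With cb and cr at their neutral value of 128, the chroma terms vanish and the output is the grey level y, which is a quick sanity check on the coefficients.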

You won't be able to do the same on the CPU, however. I guess it's possible to do the scale and format conversion in main memory, but you will never transfer that amount of data to the graphics card, not even on a PCI-Express system (and I'm talking about raw 24-bit; if I'm not mistaken he's talking about a system where he's transferring 32 bits: 24 RGB + superbright).

AND: do your math. Just modulating the backlight of a pixel won't give you 16 bits of luminosity unless the backlight LED has a linear dynamic range 256 times higher than the display's. The bits simply don't add up.
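One way to see this: the product of two 8-bit values produces far fewer distinct output levels than a true 16-bit signal, because many products collide (e.g. 2*6 == 3*4):

```python
# Count the distinct luminance levels an 8-bit pixel value times an
# 8-bit backlight level can actually produce. Collisions mean the
# result is well short of 2^16 = 65536 distinct levels.
levels = {p * b for p in range(256) for b in range(256)}
print(len(levels) < 2**16)  # True
print(max(levels))          # 65025 (= 255 * 255)
```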

If you want to do serious hd-video postproduction look here:

http://www.avid.com/products/video/





 
Victor Widell

March 02, 2005, 05:24 AM

Ahhh! I've read about this technique (using a grid of LEDs to enhance contrast on a LCD display).

You really should not need to send 16 bits per channel per pixel to the screen. The grid of LEDs behind the LCD will not give you true "High Dynamic Range", just some more contrast. This means the good old 8 bits per channel per pixel will look just fine. Also, remember that your source (the DVD) is not HDR either, so even if you could show HDR on your display, you wouldn't have any HDR data to show.

 
jag

March 03, 2005, 09:30 PM

Victor:

We are using 8 bits per channel, but 8 bits for the LED values as well. The algorithm has already been implemented; you can come see our displays at SIGGRAPH 2005 in LA. Google Sunnybrook Technologies as well; last year's SIGGRAPH editorials talk about us, or visit www.sunnybrooktech.com. It's not just higher contrast, it IS a higher dynamic range.

The display takes in an 8/16-bit source and runs on it to create values for the LEDs, so it does not matter if the data is HDR or not; we are only using 8 bits per pixel for luminosity as output.


Nils: I know it seems impossible, I thought so too, but I had a technical overview of the algorithm and how it achieves a higher dynamic range, and the math does add up. I can't get into details yet, but hopefully you'll see the results as a high-end consumer product in a few years; until then we are focusing on post production, medical, and other niche markets. Just think of it this way: each pixel has the 0-255 pixel value and the 0-255 LED value attached to it. The result is a very high luminance range measured in cd/m2 (4000 white, 0.05 black); the range is something like 20x more than a conventional display. Measuring output in light intensity is the true indicator of a higher dynamic range. Trust me, once you see it you will want one, but the sticker price is still a little high... for now anyways...
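Taking the quoted cd/m2 figures at face value, the contrast ratio does land around 16 doublings:

```python
import math

# Contrast ratio implied by the quoted white and black luminance levels.
white = 4000.0   # cd/m^2
black = 0.05     # cd/m^2
ratio = white / black
print(ratio)                       # 80000.0
print(round(math.log2(ratio), 1))  # 16.3 doublings of luminance
```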


Jag

 
This thread contains 11 messages.
 
 