On Tue, Mar 31, 2015 at 05:02:20AM -0400, Yang Kuankuan wrote:
> Besides, since you are doing dw_hdmi cleanups, I want to point out another bug that relates to the HDMI CTS test. There is something wrong with the General Control Packet: the encoder color depth is currently 8-bit packing mode, so only 24 bits per pixel video is supported, and in this case the CD field in the GCP should be set to "Color Depth not indicated".
I'm not sure I follow.
According to the iMX6 documentation, setting the CD field to either 0000 or 0100 indicates that the color depth is 24 bits per pixel, 8 bits per component, 8 bit packing mode - there's no documented difference between these.
Are you saying that you wish to pass something other than 24 bpp video to your HDMI encoder?
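For reference, the CD values that the General Control Packet itself can carry look like this as I understand them; the enum and its names below are purely illustrative, not symbols from the driver or the iMX6 headers, with 0x0 being the "not indicated" case you mention:

/*
 * Illustrative only: CD field values carried in the HDMI General
 * Control Packet.  These names are not defined anywhere in dw_hdmi.
 */
enum gcp_color_depth {
        GCP_CD_NOT_INDICATED = 0x0,     /* Color Depth not indicated */
        GCP_CD_24BPP         = 0x4,     /* 24 bits/pixel, 8 per component */
        GCP_CD_30BPP         = 0x5,     /* 30 bits/pixel */
        GCP_CD_36BPP         = 0x6,     /* 36 bits/pixel */
        GCP_CD_48BPP         = 0x7,     /* 48 bits/pixel */
};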
> In the end we should keep *csc_color_depth (HDMI_CSC_SCALE)* and *color_depth (HDMI_VP_PR_CD)* at zero; if the code is modified like this, the GCP test would pass:
From what you're describing, you want CD field = 0 and CSC_SCALE = 0.
It looks like hdmi_video_packetize() will set the CD field to zero if hdmi_data.enc_color_depth = 0, but that would cause hdmi_video_sample() and hdmi_video_csc() to fail. Maybe those two functions should be fixed to accept a color depth of zero, and maybe you need to set enc_color_depth to 0?
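To make that concrete, here is a minimal sketch of the mapping I would expect once a depth of zero is accepted. The helper below is hypothetical; as far as I can see the driver open-codes this in hdmi_video_sample() and hdmi_video_csc(), which currently return early for any depth they do not recognise:

#include <linux/errno.h>

/*
 * Hypothetical helper, not current driver code: map enc_color_depth to
 * the per-component depth the video sampler and CSC get programmed for.
 * A depth of 0 ("not specified") behaves like 8 bits per component
 * instead of making the callers bail out early.
 */
static int hdmi_effective_color_depth(unsigned int enc_color_depth)
{
        switch (enc_color_depth) {
        case 0:                         /* unspecified: treat as 8-bit/24bpp */
        case 8:
                return 8;
        case 10:
        case 12:
        case 16:
                return enc_color_depth;
        default:
                return -EINVAL;         /* unsupported depth */
        }
}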
Interestingly, HDMI_CSC_SCALE_CSC_COLORDE_PTH_24BPP is defined to be zero, but again, in the iMX6 data, it could take a value of either 0x00 or 0x40. I think hdmi_video_csc() should set this to 0x00 if hdmi_data.enc_color_depth = 0, and 0x40 for hdmi_data.enc_color_depth = 8.
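Something along these lines, in other words. The helper name is hypothetical; the only values it relies on are the 0x00 and 0x40 quoted above, and the deeper modes are deliberately left out of the sketch:

#include <linux/errno.h>

/*
 * Hypothetical sketch of the csc_colordepth selection for HDMI_CSC_SCALE:
 * 0x00 and 0x40 both mean 24bpp in the iMX6 data, so use 0x00 when no
 * color depth was specified and 0x40 when 8 bits per component was
 * explicitly requested.
 */
static int hdmi_csc_scale_color_depth(unsigned int enc_color_depth)
{
        switch (enc_color_depth) {
        case 0:
                return 0x00;            /* color depth not indicated */
        case 8:
                return 0x40;            /* 24 bits/pixel, 8 per component */
        default:
                return -EINVAL;         /* 10/12/16-bit handling unchanged */
        }
}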