12 Comments
ianken - Sunday, August 24, 2008
Intel has notoriously sucked for HTPC applications. They constantly promise all this awesome media-centric stuff, then deliver integrated solutions that just don't work.
ianken - Sunday, August 24, 2008
Replying to my own post, one more thing: dynamic picture-adjustment BS features are lame. Video is a well-specced technology. How you decode and render broadcast video is not up for interpretation. Any "dynamic" BS feature is just that.
As long as they allow you to turn it off, then fine. Anyone who's struggled with AMD's "AVIVO" video-mangling noise reduction knows what I mean.
LoneWolf15 - Friday, August 22, 2008
"Unlike previous Intel chipsets, the G45 series offers full hardware decode capabilities for MPEG2, VC-1/WMV9, and now AVC/H.264. CPU utilization rates are down significantly now with H.264 dropping down around 20% compared to 75% in our testing, results similar to Intel."Don't you mean "results similar to ATI"?
(P.S. I'd have used the Quote button, but it doesn't seem to be working for me).
ltcommanderdata - Friday, August 22, 2008
I believe they meant that their own independent results were the same as Intel's. So Intel's claims about the G45 are accurate.
iwodo - Thursday, August 21, 2008
I hope a few questions can be answered in the G45 article. Using 20% CPU for full decode is a lot better than the 75% of previous Intel integrated graphics. But which CPU was that measured on? A Core 2 Duo E8xxx? Or a Celeron?
Compared to other IGPs that can decode 1080p with single-digit CPU usage, it isn't that much better.
And did Intel promise to put more resources into driver development? Their hardware can only ever be as good as their drivers. Even though Intel has nearly 50% of the graphics market share, they put the LEAST effort into their drivers.
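(For reference, a minimal sketch of how such CPU-utilization-during-playback figures can be sampled at home, assuming the psutil library; the 30-second window and one-second interval are arbitrary choices, not what any reviewer actually used.)

```python
# Minimal sketch: sample system-wide CPU utilization during playback.
# Assumes the psutil library; window/interval values are arbitrary.
import psutil

def sample_cpu(duration_s: int = 30, interval_s: float = 1.0) -> None:
    """Print average and peak CPU usage over the sampling window."""
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent(interval=...) blocks and measures over the interval.
        samples.append(psutil.cpu_percent(interval=interval_s))
    print(f"avg {sum(samples) / len(samples):.1f}%, "
          f"peak {max(samples):.1f}% over {len(samples)} samples")

if __name__ == "__main__":
    sample_cpu()  # start video playback first, then run this script
```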
gwynethgh - Thursday, August 21, 2008
"However, 1080P/24Hz playback support is still not working properly"Any clues when this will be corrected? Is it a G45 hardware bug or just software
Badkarma - Thursday, August 21, 2008
Gary,
I'd like to see an option for desktop levels to be set to 16-235 as well as video levels to 16-235. With current drivers for ATI/NVIDIA, and I assume Intel, desktop levels are always 0-255, while video can be set to 16-235. However, when connected to a TV that is expecting video levels (16-235), desktop work or games will always be clipped. It'd be great to have the option to set everything to video levels when outputting to a TV set.
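(To make the clipping concrete, here is a minimal sketch of the 0-255 to 16-235 remapping being requested; the function names are illustrative, not any driver's actual API.)

```python
# Minimal sketch of full-range (0-255) <-> video-range (16-235) mapping.
# Function names are illustrative; real drivers do this per component
# in hardware, not in Python.

def full_to_video(value: int) -> int:
    """Map a full-range 8-bit component onto video levels 16-235."""
    return 16 + round(value * 219 / 255)

def video_to_full(value: int) -> int:
    """Expand video levels back to full range; out-of-range values
    are clipped -- the crushed blacks/whites described above."""
    clipped = min(max(value, 16), 235)
    return round((clipped - 16) * 255 / 219)

if __name__ == "__main__":
    print(full_to_video(0), full_to_video(255))  # 16 235
    # Full-range desktop output sent unconverted to a video-levels TV:
    # everything at or below 16 renders as pure black, losing detail.
    print(video_to_full(8))  # 0
```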
sprockkets - Thursday, August 21, 2008
Intel seems oblivious to the fact that the CoreAVC software codec has been beating "hardware" solutions for some time. Not saying I'd rather have that than a G45, because Intel has a G45 mini-ITX board for $150.
Chriz - Thursday, August 21, 2008
Did Intel happen to mention at these meetings why their Intel-branded P43/P45 and G43/G45 motherboards are not available anywhere, while most boards from other manufacturers are?
JarredWalton - Thursday, August 21, 2008
"Also featured is version 4 of DPST (Display Power Saving Technology) that dynamically controls picture brightness by influencing backpanel lighting."So I have to ask, is there *anyone* out there that likes dynamic contrast/backlighting algorithms? Every display/laptop I've tested that offers the technology results in an unpleasant darkening/lightening of the screen. It's extremely distracting and one of the first things to get shut off if it comes enabled by default.
ltcommanderdata - Thursday, August 21, 2008
Maybe Apple's algorithms are less aggressive, but I've never had a problem with how OS X controls the backlight. At least it darkens the screen accurately in low-light conditions. When it's bright again, I usually just bump the brightness to max. Although I know my MacBook Pro seems to use two light sensors, one on each side of the laptop, compared to an ASUS M50VM I have, which only uses one in the center. I think the two sensors mean the backlight doesn't change unless both detect a lighting change; since they are on opposite sides, that more likely indicates an actual global lighting change, rather than one sensor that may just get occasionally shadowed.
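(A minimal sketch of the two-sensor logic being described — adjust only when both sensors agree — with a hypothetical threshold; Apple's actual algorithm is not public.)

```python
# Sketch of "adjust only when both sensors agree" debounce logic.
# The threshold and sensor readings are hypothetical.

THRESHOLD_LUX = 50.0  # minimum change treated as a real lighting change

def should_adjust(prev: tuple, cur: tuple) -> bool:
    """Return True only if both ambient-light sensors saw a significant
    change in the same direction (a global change, not a shadow)."""
    d_left = cur[0] - prev[0]
    d_right = cur[1] - prev[1]
    significant = abs(d_left) > THRESHOLD_LUX and abs(d_right) > THRESHOLD_LUX
    same_direction = (d_left > 0) == (d_right > 0)
    return significant and same_direction

if __name__ == "__main__":
    print(should_adjust((400, 420), (120, 140)))  # True: room lights dimmed
    print(should_adjust((400, 420), (120, 415)))  # False: one sensor shadowed
```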
ltcommanderdata - Thursday, August 21, 2008
If they had a question period, it would be interesting to ask whether the GMA X4500 or other Intel IGPs can support GPGPU operation. The rumours that Apple may want to move away from Intel chipsets are centered around Intel IGPs not being able to support OpenCL. Given Apple and Intel's seemingly friendly relationship, it'll be interesting to see if it's derailed by something as small as IGP feature support.
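(For anyone wanting to test this themselves once drivers ship: a short sketch that enumerates OpenCL platforms and devices via the pyopencl bindings, assuming an OpenCL runtime is installed. An IGP whose driver exposes no OpenCL platform simply won't show up.)

```python
# Sketch: list OpenCL platforms/devices with pyopencl. A GPU whose
# driver ships no OpenCL implementation will not appear at all.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        # device.type is a bitfield; check for the GPU bit.
        kind = "GPU" if device.type & cl.device_type.GPU else "other"
        print(f"  {device.name} ({kind}), "
              f"{device.max_compute_units} compute units")
```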