40 Comments
Herrterror - Tuesday, January 12, 2010 - link
Man, PC case designers are really waiting for that stopped clock to be right again.
This is why I buy exclusively from Lian Li. I want something that looks like an adult owns it. How many buyers today really care what's under the hood rather than how it looks? Judging from the popularity of Apple products, a diminishing number. But I guess WoW players pay the bills, so...
DrApop - Sunday, January 10, 2010 - link
Unless you are a major gamer, who needs these huge ATX cases these days? Manufacturers need to take a hint from the growth of laptops and netbooks... smaller is better.
The Mac Mini is beautiful, the regular Mac is an all-in-one (no huge box on the floor), and Dell has the Zino.
Newegg has cr@p for small cases - we hardly use CD/DVD drives anymore, let alone a floppy disk drive. Why do the mini/small cases have two external drive openings? It looks stupid!
Give me a nice 8x8x3 (or smaller) case with a SINGLE slot for a DVD/Blu-ray drive (I don't need any freakin' floppy drive!). Perhaps an external power supply, or an internal one at 200-250 watts. A nice board with an AMD or Intel dual/triple core, 2 GB of memory, and a 250-500 GB drive, and I would be happy (as long as the board does not cost 2x what a micro board costs!).
Give me a small, nice looking case! That is what the vast majority of users actually need...and would actually purchase.
Do I really need a 3-foot-tall case with 5 external bays, a 600-watt power supply, 8 fans, and lights all flashing, ready to take off into space?
Mr Perfect - Monday, January 11, 2010 - link
Silverstone released a new Mini-ITX case from the Sugo line at CES, but I have only seen one SFF site mention it so far: http://sffclub.com/index.php?option=com_content&am...
It fits a 5970, so it's only a matter of time before some Mini-ITX fans cram a high-end gaming rig in there. :) I'm looking into this myself.
Calin - Monday, January 11, 2010 - link
Gaming and overclocked rigs will use whatever cooling you can throw at them.
You could also imagine someone wanting to have both a DVD-RW and a Blu-ray unit, plus two hard drives. That makes a normal ATX case necessary (most mATX cases would be very cramped, have inferior airflow, be hard to work in, might not fit big tower-style coolers, and so on).
FlyTexas - Monday, January 11, 2010 - link
There are a few cases (no pun intended) when you need that multi-bay, lights-everywhere, ready-for-liftoff case...
1. Gaming rigs that have big, long video cards installed and need lots of room for cooling the highest-end overclocked CPUs.
2. Servers with lots of hard drives in them.
Totally - Sunday, January 10, 2010 - link
[quote]The demo also used NVIDIA's stereoscopic 3D technology - 3D Vision. We're hearing that the rumors of a March release are accurate, but despite the delay Fermi is supposed to be very competitive (at least 20% faster than 5870?). The GeForce GTX 265 and 275 will stick around for the first half of the year as Fermi isn't expected to reach such low price/high volume at the start of its life.[/quote]
GTX 265?
spacedude - Sunday, January 10, 2010 - link
Did you guys get a chance to talk to any of the mobo manufacturers about the Foxconn socket problems?
filotti - Sunday, January 10, 2010 - link
What's the deal with the Dell tablet and all the secrecy surrounding it?
Zool - Sunday, January 10, 2010 - link
"We're hearing that the rumors of a March release are accurate, but despite the delay Fermi is supposed to be very competitive (at least 20% faster than 5870?"That 20% is quite pessimistic. For nvidia just 20% faster for 40% (also at those die areas the defects doesnt increase lineary) more transistors would be worse than gt200. And we doesnt even take into acount power draw and heat of the gt300. Second time to make the same mistake is quite stupid i think.
Mr Perfect - Monday, January 11, 2010 - link
Unfortunately, not all of that 40% of new die area is for graphics. A fair bit of it is going to be for the GPGPU stuff. They're not making the same mistake twice, just seeing if they can make a new one. ;)
Zool - Tuesday, January 12, 2010 - link
I think that GF100 is different from the Tesla die and that they have two GPU dies this time: the Tesla die with maxed-out DP calculations, and a similar GF100 die without the DP functionality. This site - http://www.semiconductor.net/article/438968-Nvidia... - says the chip is 3.2 billion transistors and that they have problems with defect rates at TSMC at those sizes.
Two 1.5-billion-transistor chips connected on one package would have much lower defect rates than GF100. NVIDIA can dream about giant chips with zero defects, but I think the future is in smaller dual- and quad-core GPUs that can be manufactured effectively. A similar road to the one CPUs took years ago, with one difference: GPUs are massively parallel, so performance could stay near linear with added cores.
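The "defects don't increase linearly with die area" argument can be made concrete with a quick back-of-the-envelope sketch. A simple Poisson yield model is assumed here, and the defect density and die area below are made-up round numbers for illustration, not actual TSMC or NVIDIA figures:
[code]
import math

# Simple Poisson yield model: P(zero fatal defects) = exp(-area * defect_density).
# Both numbers below are assumed, illustrative values only.
D0 = 0.4           # defects per cm^2 (guess for a young 40 nm process)
BIG_DIE = 5.3      # cm^2, rough size of a ~3-billion-transistor GPU

def poisson_yield(area_cm2, defect_density):
    """Fraction of dies expected to have zero fatal defects."""
    return math.exp(-area_cm2 * defect_density)

y_big  = poisson_yield(BIG_DIE, D0)        # one monolithic die
y_half = poisson_yield(BIG_DIE / 2, D0)    # each of two half-size dies

print(f"Monolithic die yield: {y_big:.0%}")    # ~12%
print(f"Half-size die yield:  {y_half:.0%}")   # ~35%
# Bad half-size dies can be discarded *before* pairing on a package,
# so good silicon per unit of wafer area improves by roughly:
print(f"Improvement: {y_half / y_big:.1f}x")   # ~2.9x
[/code]
Real multi-die packaging adds its own interconnect and assembly costs, of course, so this only illustrates the defect-rate side of the trade-off.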
Calin - Monday, January 11, 2010 - link
There are some issues here:
1. NVIDIA wants Fermi to be faster overall, but it's possible it won't be faster than the 5870 in DX11 for now. Drivers will help in the long run, and game engines optimised for Fermi will also help (maybe a lot).
2. NVIDIA wants Fermi to succeed on the desktop, but their big hopes are for Fermi in the supercomputing arena. They might have chosen a poorer-performing design for the desktop in order to crush everything in the race for teraflops.
lopri - Saturday, January 9, 2010 - link
That GMA is finally on par with equivalent offerings from AMD/NV (about which I still have my doubts) is the most exciting news of CES 2010 for you?
Someguyperson - Monday, January 11, 2010 - link
As a side note, the integrated graphics being able to handle AutoCAD isn't very impressive: my high school computers with integrated GMA 950 graphics could handle AutoCAD and AutoCAD Inventor; however, they would BSOD if you made ~10,000 lines and tried to zoom in and out really quickly. It was pretty fun.
Calin - Monday, January 11, 2010 - link
Intel is fighting in the graphics market on price alone (integrated hardware and drivers offered at very low price, with quality to match). AMD and NVIDIA fight in this market with not-so-low prices but better quality.
This will bring at least two advantages:
1. Most computers will come with better integrated graphics (Intel owns more than half the integrated graphics market).
2. NVIDIA and AMD are forced to either fight a price war with Intel (which they won't do in the integrated graphics department) or offer better products.
I for one am expecting the better products :)
- Sunday, January 10, 2010 - link
He's been drinking too much of the Intel Kool-Aid - it's now a reflex.
asH
lopri - Saturday, January 9, 2010 - link
It's a question, btw. :) I thought you had covered it prior to CES.
formulav8 - Sunday, January 10, 2010 - link
Like some others, I'm not too impressed by Intel's new graphics in itself.
The graphics core has direct access to the memory controller and STILL can't beat two-year-old solutions except in a few areas. To me that is pretty pathetic.
Obviously, though, others are quite impressed, like Anand. But I would expect much more from a GPU with on-die memory controller access, even Intel's typical trash.
Jason
JarredWalton - Sunday, January 10, 2010 - link
I don't think anyone caught the two links Anand posted. The two Intel items that Anand was most impressed with are the wireless HD transmitter and the Moorestown smartphone demonstration. Arrandale and the new Intel IGP aren't among the most impressive items from CES.
Personally, I'm moderately impressed that Intel has finally put some real effort into improving their IGP. This is evidenced by the fact that a 20% increase in the number of shader cores/pipelines resulted in up to a 100% increase in performance.
The GMA 900/950 series was horrible in comparison to the NVIDIA and ATI IGPs of the time (Radeon Xpress 1200/1250 and GeForce 6100). The next-generation GMA 3000 stuff was still less than 1/4 the performance of the newer NVIDIA/ATI IGPs (Xpress 3200 and 7100 series). Then we had a lot of stagnation: the 8100 and 9300/9400 from NVIDIA and the HD 3200/4200 from ATI really didn't do much more than their previous IGPs. The GMA 4500MHD has much of the HD decode support, and with appropriate drivers it's only about half the performance of ATI/NVIDIA IGPs. Now they have closed the gap, and I'll be surprised if ATI and NVIDIA do much more than a 10-20% performance boost on IGPs (though naturally ATI will have a DX11 IGP).
When you get right down to it, though, all IGPs still pretty much suck for 3D performance. They can handle minimum detail settings at 800x600 most of the time, and on less demanding titles you can even get 1366x768 with medium detail. Any $100 discrete GPU ends up being at least twice as fast, and with hybrid graphics really kicking off now that Win7 is out, if you want 3D performance there's really no reason not to find a laptop with a discrete GPU.
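The "20% more units, up to 100% more performance" point above can be sanity-checked with quick arithmetic. The EU counts below (10 for a GMA 4500-class part, 12 for the new Intel HD Graphics) are treated as assumed illustrative figures:
[code]
# Rough sanity check of "20% more shader units, up to 100% more performance".
old_eus, new_eus = 10, 12              # assumed EU counts, for illustration only
unit_gain = new_eus / old_eus          # 1.2x units (+20%)
perf_gain = 2.0                        # "up to 100%" faster overall

per_unit_gain = perf_gain / unit_gain  # ~1.67x per EU
print(f"Per-unit throughput gain: {per_unit_gain:.2f}x")
# So most of the speedup has to come from clocks, architecture, and drivers
# rather than from simply adding execution units.
[/code]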
GeorgeH - Monday, January 11, 2010 - link
@JW:
While I agree with you that the new IGP is impressive for an Intel product, cheering it on is a little bit like cheering on a paraplegic’s first bunt while other players are busy hitting home runs.
In the past, when there were other options, I would have been less critical, but now that it’s Intel’s IGPs and chipsets or nothing, things become more serious. I’m worried that Intel is only going to provide “adequate” solutions, bumping up performance and features simply to match competitors when they get too far ahead instead of significantly advancing the market on their own. As an example, when is there going to be an Intel chipset with SATA 6Gbps or USB 3.0? When is their IGP going to get DX11?
I realize that for the foreseeable future discrete will always be the best way to go, but with so much of the market only using IGPs, their performance will always limit and delay what gets developed for the rest of us. I’m not talking about games so much as things like Microsoft’s Aero interface and sweet little Mini-ITX HTPC platforms (maybe not the best examples, but they should illustrate the point).
As an aside, I did catch the links. I always value Anand’s (and AnandTech’s) opinions on what is significant and impressive even if I don’t agree with you – otherwise I wouldn’t be here. ;)
JarredWalton - Tuesday, January 12, 2010 - link
Really, it's a balancing act in many ways. More transistors mean more power, so you don't want every graphics chip out there to ship with 128 SPs/stream processors/pipelines. Ideally, you'd make it so you can totally shut down the unneeded parts of the chip, but even so... what do you do if you want to make an ultraportable laptop and the power envelope is 15W for the CPU and GPU? You need a chip that will never use more than that amount of power (i.e. never generate more than that amount of heat).
So the way I see it, while we will continue to see improvements to IGPs over the coming years, they will never be a replacement for discrete GPUs. We're now at the point where they're able to do just about everything we need, such as H.264 video decoding, 2560x1600 output, etc. We could get 12-bit color processing and two dual-link capable outputs (or HDMI 1.4), but for gaming we'll always want a discrete GPU.
Which brings us to the final point: now that we have ATI and NVIDIA both supporting GPU switching so you can turn the discrete GPU on and off, the IGP should be viewed more as a power saving solution than a performance part. You want battery life and low power, you use the IGP (whether it's Intel or someone else); when you want 3D performance, you enable the discrete GPU.
Now the only thing I really need from Intel is Arrandale ULV... and drivers that work in every situation (i.e. all games where the hardware meets the requirements run without crashing, utilities like DXVA Checker don't fail, and that sort of thing).
lopri - Monday, January 11, 2010 - link
Thank you for the clarification. Yeah, I thought GMA couldn't possibly be an exciting subject even for Anand, its technical prowess (or the lack thereof) notwithstanding. I couldn't even 'give away' a G35-based system to a neighbor's kid for fear of possible embarrassment.
No complaints about the improvement, but I have a couple of questions on the new Intel HD Graphics:
1) Does 2560x1600 work?
2) Dual displays?
Normally I wouldn't expect such things on GMA, but I thought I saw these being 'advertised' by Intel. I haven't seen any review verifying them. I'd much appreciate if you could tell me how/whether these work.
semo - Monday, January 11, 2010 - link
I don't know why Anand finds Intel stuff the most exciting tech at this year's CES. The thing that I still find exciting is SSDs. Finally, after so many years of computing, we are getting quick, responsive computers (both desktops and laptops). For years people have been looking at hourglasses and beach balls wondering what's taking so long. Finally we have a solution in the works (and it has nothing to do with CPUs or memory, because they aren't the bottlenecks anymore).
The other thing I just saw is the Livescribe pen: http://lifestyle.hexus.net/content/item.php?item=2... This thing is amazing. I know this site doesn't do peripherals, but this thing is much more than that (much better than the Asus keyboard, I would say).
Oh and 3D TVs... blegh.
PorscheRacer - Saturday, January 9, 2010 - link
20 years later they caught up; yeah, I'd be excited too. It's been a long time coming.
GeorgeH - Saturday, January 9, 2010 - link
Caught up to what? 780G/785G? Terrific, Intel is where AMD was two years ago. Once AMD drops the 8-series chipsets, I'd say there's a better-than-good chance Intel will once again find itself where it's most comfortable - years behind the curve.
Calin - Monday, January 11, 2010 - link
This move might force AMD to improve the performance of the 8-series integrated graphics even further - remember that the 3200-based integrated graphics in the AMD 690 (I think) had only half the power of the discrete card based on the same chipset.
I look forward to a time when integrated graphics are at least as powerful as the entry-level discrete card.
strikeback03 - Monday, January 11, 2010 - link
Wouldn't that make that discrete card worthless? As in, why would anyone release a card that only matches integrated graphics?
And on a side note on smartphone graphics, I would have no intention of playing 3D games on my phone and would hope the graphics could be shut down to preserve battery life.
Calin - Monday, January 18, 2010 - link
That would make the discrete card worthless for any system that _already has_ integrated graphics at that level. For the rest (cough, Intel IGPs, cough), it would still be an important upgrade.
nubie - Monday, January 11, 2010 - link
That won't happen until it gets the same memory bandwidth as a real video card.
I don't think there are many "sideport" memory motherboards.
Alternatively, you could use triple-channel memory and have one stick exclusively for the graphics (as well as sharing the regular memory). Then you would be in the market for really fast DDR3 in sizes of 256MB to 1GB.
Intel seems to be on the "right" track in giving the GPU the lowest memory latency, but only for casual gaming and anything else the GPU can do.
I would like to wait until they have on-core NB and graphics, instead of just on-package.
Calin - Monday, January 18, 2010 - link
I'm just saying...
Entry-level card (6200-based): 350 MHz clock, a 64-bit bus, and 256 MB of video RAM (275 MHz DDR2), with access to up to 512 MB of system RAM.
Compare with an IGP from AMD (Radeon X1250 in the AMD 690G): 400 MHz clock and 64-bit access to lowly PC2-5300 (DDR2-667) memory in a base configuration.
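For a rough sense of what those two configurations mean for peak memory bandwidth, here is a quick back-of-the-envelope calculation. It assumes the 275 MHz DDR2 figure is the base clock (550 MT/s effective) and that the IGP sees a single shared 64-bit channel, as described above; the figures are illustrative, not benchmarks:
[code]
# Rough peak-bandwidth arithmetic behind the comparison above.
def peak_bw_gbs(bus_bits, transfers_per_s):
    """Peak bandwidth in GB/s = bus width in bytes * transfer rate."""
    return (bus_bits / 8) * transfers_per_s / 1e9

# Entry-level discrete card: 64-bit bus, DDR2 at 275 MHz base (550 MT/s),
# with the memory dedicated to the GPU.
card = peak_bw_gbs(64, 550e6)       # ~4.4 GB/s, all for graphics

# 690G-class IGP: one 64-bit channel of DDR2-667 (PC2-5300),
# shared with the CPU and everything else in the system.
igp = peak_bw_gbs(64, 667e6)        # ~5.3 GB/s, shared

print(f"Discrete card, dedicated: {card:.1f} GB/s")
print(f"IGP, shared with CPU:     {igp:.1f} GB/s")
[/code]
The raw numbers end up close, but the discrete card's bandwidth is dedicated while the IGP's channel is contended by the CPU - which is exactly why "sideport" or dedicated memory keeps coming up.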
jigglywiggly - Saturday, January 9, 2010 - link
Eee Keyboard == useless
Totally - Sunday, January 10, 2010 - link
Yeah, it's just dumb. There were a lot of uses I could think of until they said it can only drive the embedded PC, which cut the number of uses I had for it to zero. Which raises the question: why stuff it in a keyboard in the first place instead of just making it some kind of ultra-SFF nettop with a screen?
GeorgeH - Saturday, January 9, 2010 - link
More like a missed opportunity. Had they integrated a KVM so that the keyboard could also drive a regular PC, this could have been a useful, if niche, gaming accessory. There are a few low-intensity apps (such as IM'ing and browsing FAQs/walkthroughs) that it would be nice to run on a little "side screen" so that you aren’t constantly tabbing out of your game. If they made the screen able to "break out" and be positioned anywhere, so much the better.
As it is, though, calling it useless is being charitable.
ksherman - Sunday, January 10, 2010 - link
I dunno... With the built-in ability to stream video to a TV, I could see people using it like those old-school internet-browsing-on-your-TV devices. Niche, sure. Useless, no.
Thanks for providing some pictures, Anand - we haven't seen enough CES coverage yet. This has been the most memorable show since the dot-com bust days.
The features in those Antec ATX cases are fantastic; shame the looks are hideous :(
buzznut - Sunday, January 10, 2010 - link
Nice article. I have to agree, I don't care for the direction Antec is going with their cases. The whole plastic grid thing just looks cheesy. Ugly.AznBoi36 - Sunday, January 10, 2010 - link
You can see pretty much everything about CES at Engadget: http://www.engadget.com/ces
I wonder whose idea it was to place the screen on the side of the Eee Keyboard? It looks extremely awkward to use that way. You'd think you'd automatically place the screen on top as a detachable unit.
Or just get rid of it altogether, which is what I think the other keyboard next to it is. I don't need extra buttons on a random screen on my keyboard.