Monday, October 09, 2006

Power Draw

I spoke not too long ago about heat. The amount of heat in a case is directly linked to how much power a machine draws: the more power drawn, the higher the heat output. With last week's news of two new graphics cards from NVIDIA, I felt I'd touch on the insane power requirements the industry plans to throw at us.

I remember reading a story a while back about a reviewer who was trying to get a top-of-the-line system with some new components up and running. The system had the now-normal complement of two video cards, a dual-core CPU, and the other standard parts. The funny thing was that he didn't have an issue getting the machine to power on; he had an issue keeping it on. As soon as he entered a benchmark or other high-stress application, the machine drew enough power to actually trip the circuit breaker in his office. The only fix was to run an extension cord from an outlet on another circuit and power the system with two power supplies.

Current systems in the Enthusiast category, or Exotic as I like to call it, can require upwards of 600 watts of power. That's ten 60-watt light bulbs, and that's just the output from the PSU. Its AC input may well exceed 750 watts. Now, these are only the most Exotic of the Exotic, but with NVIDIA's recent announcement, things may change.
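That input figure follows from the fact that power supplies aren't perfectly efficient: the wall draw is the DC output divided by the efficiency. Here's a quick back-of-the-envelope sketch, assuming a typical mid-2000s efficiency of around 80 percent (my assumption for illustration, not a measured spec):

```python
# AC draw from the wall for a given DC load, assuming ~80% PSU
# efficiency. The efficiency figure is an illustrative assumption.
def wall_draw_watts(dc_output_watts, efficiency=0.80):
    return dc_output_watts / efficiency

print(wall_draw_watts(600))  # 750.0 -- 600 W out means ~750 W from the wall
```

A less efficient supply only makes the gap worse; at 70 percent, that same 600-watt load pulls over 850 watts from the outlet.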

The next generation of graphics processors will indeed be hungrier than anything we've seen. Current cards can draw upwards of 150 watts, but this new category of cards will push the limits of 250 watts. What's worse, these new cards will be doing more processing, which means the draw will be more constant than with current cards. With a pair of these, you could easily be pushing 500 watts just in video processing.

While the Exotic market will feel the power draw more than the standard consumer will, the roll-out of Windows Vista promises to push power use up another notch across the board. Preliminary testing with high-end equipment shows a minimal power increase under the new Windows, but the percentages will be much higher on consumer equipment as it struggles to get the fancy Aero interface drawn.

What power requirements will come in the future is still up in the air. Core 2 Quad, AMD's 4x4 initiative, SLI, and CrossFire all bring incredible performance advancements to our table, but at the cost of electricity. At least you won't need a space heater anymore to keep your feet warm; just put your PC under the table.

Friday, September 15, 2006

Vista All Over

It seems Windows Vista is all the talk these days. Just about everyone knows that there's a new Windows on the way, but many people don't know what it's all about. I'm here to tell you, it's almost usable as of Release Candidate 1, which shipped out a couple weeks ago, but your hardware selections are going to either make it or break it.

In order to make things easier on smaller OEMs, NVIDIA has launched their Essential Vista program. When you see the Essential Vista logo, you should be fairly confident that you can run Vista in all of its Aero Glass glory, with performance close to that of Windows XP.

For the past few months, I've been testing Vista, from the early betas all the way through the current RC1. With RC1 came working HDTV output, and thus it's been running almost non-stop on my home machine. For the record, my home machine is one that would fit into the Essential Vista program, with a GeForce 7900GT video card and an nForce4 SLI chipset.

As usual, Microsoft is bundling the latest third-party drivers with its new operating system. This makes setup a breeze on NVIDIA hardware. There was one driver update for the PCI Express bridge, but other than that, everything is there. The first boot came up with a properly configured display, even on my HDTV. Sound, networking, everything was good right out of the box.

Performance with this build is very good. It is definitely not as snappy as XP, but most of that is by design. Windows now morph around the screen as you minimize and maximize them, much like in Apple's OS X. This adds a little waiting time, but hopefully we'll see a TweakUI-esque utility to control the duration of effects like that.

So the moral of the story is that you don't have to spend a ton to get a machine right now that will work with Windows Vista. All you need is to talk to someone who has experience with the OS to make sure you get everything you'll need. The NVIDIA Essential Vista program helps you out even more by giving you a good starting point. In fact, I would look for this logo over a "Windows Vista Capable" sticker any time.

Monday, September 04, 2006

Coprocessors all over!

A few months back, I predicted that coprocessors were about to make a comeback. They have in the corporate world, with Cray's introduction of the XD1 supercomputer, which uses Xilinx's Virtex FPGAs alongside AMD's Opterons. I went a few steps further and predicted that coprocessors would start to make some waves on the desktop side, just like the 3D accelerators we take for granted these days.

So today I'm reading some news and come across a new company called Aiseek. They make a product called Intia, a hardware/software combo that aims to bring a whole new level of realism to computer games. This product has yet to really get picked up, and honestly, I hope it never does.

Even though I like the idea of better AI in games, I don't think we need another $300 card to make it happen. Just like the Ageia PhysX card, which has yet to show off any real improvements, I think this Intia card is just a bunch of hogwash.

What we need is a field-programmable gate array (FPGA) that can be programmed on the fly to do whatever acceleration is needed. Say you're playing a game that has very high physics demands, turn that FPGA into a physics accelerator and you're off. Shutdown that game and start up one that needs more AI, not a problem. Doing some serious Photoshop? We have that covered too.

ATI is starting to make some moves in that direction. They want to use a second or third graphics card for its floating-point abilities to help speed up Havok's new physics engine. There's also been talk of using the GPU's power for complex protein-folding research. If we can do this without another card, I'm all for it, but don't make me buy a card that I only need for one game.

One add-on board I haven't talked about yet is the Killer NIC from Bigfoot Networks. Now this, I think, is actually a good product, as network processing is something everyone can use. As broadband speeds near 10Mbps, our aging machines are having trouble sorting out all that data. Purchasing a card that takes over that load could potentially help everyone out.

So the coprocessor is back, and I'm sure that PhysX, the Killer NIC, and Intia are just the tip of the iceberg. Let's just hope we don't get stuck swapping cards depending on the game we want to play.

Thursday, August 24, 2006


It really seems like Dell is always coming up in the news for something. This time, it's the 4.1 million laptop batteries that were just recalled. What's great is that when a company as huge as Dell recalls that many parts, it's all over the news and everyone wants to talk about it. So here ya go: my take on the Dell battery situation.

First off, this isn't the first time Dell has recalled laptop parts such as batteries or power adapters. In fact, I did a little research and found I had registered for a recall almost a year ago, probably in response to a December recall of 35,000 batteries. Those were also recalled for heat issues, just like the current batch.

What inspired me to write this article was a certain reaction from the Professional Inventors Alliance. Their president, Ronald Riley, was recently quoted as saying, "Dell does not in my opinion have the engineering expertise which other companies who actually advance the technology have." He went on to say that Dell cuts costs across the board, including in the R&D department, in order to keep prices down, and that that practice is why batteries are igniting; with a little innovation, the problem would not have happened. It seems Lenovo, HP, Gateway, and the others who use Sony batteries would agree.

That brings me back to my last topic: heat. (In case you were wondering: 23 minutes on the phone, a tech shows up 4 days later and can't fix it, and a new machine arrives about a week after that. That's next-day on-site service for ya.) Dell's own XPS 700, the biggest dog they've ever launched, was repeatedly pushed back because of heat issues. Is it their fault, or Intel's and NVIDIA's for making such hot products? In a desktop, it's Dell's fault, because I can build one that runs just fine.

But what about laptops? There are only about five or six actual laptop manufacturers now. Dell gets most of its machines made by a company named Quanta, which basically takes a stock chassis and puts pretty Dell plastic all over it. Quanta then takes that same chassis and sells it to Alienware, who puts Alienware plastic all over it. So is it the plastic? No. The true answer is actually pretty simple:

Dell sold millions of laptops last year. They are the laptop kings, and there's significant hatred for them because of support issues like the one I just went through. When The Inquirer broke a story about an Inspiron bursting into flames, it was only a few days before more and more stories about exploding Dells showed up. Shortly thereafter, Dell bit the bullet and dropped ~$250 million in a PR stunt.

That's all there is to it. Whether it's cell phones, laptops, PDAs, or anything else with a rechargeable battery, they are all vulnerable to battery failure. In a cell phone, the battery is hidden away, and when it blows, it probably doesn't do too much damage. In a laptop, though, the battery is mostly exposed, and when it catches, it burns like the dickens. Yes, there are design cues we should take to keep damage minimal, but these things are just going to blow; there's no way around it.

So the big question is: "What about laptops on planes, now that we know they blow up?" Well, I'll tell you one thing: the answer isn't to kick all Dells off planes. That would only take care of Dell's share of the problem. I think it's going to require a redesign of both the batteries and the machines they plug into.

Friday, August 04, 2006

Man, it's hot!

Yeah, I know, I'm pretty much stating the obvious here. But heat is currently a big deal in computers. The faster and more complex our machines get, the more heat they generate, and the harder they have to work to push it out.

Getting the heat out is the most important part of chassis design; however, the major manufacturers have apparently forgotten that golden rule.

I was working with a particular design from the biggest manufacturer out there and was utterly shocked by what I found. This particular machine has a serious airflow issue, and we can thank the new BTX format for that. This design, which the big guys have taken to recently, does make for a very quiet machine, that's true. They can use just one fan to effectively cool the processor, and supposedly all the other components inside.

The problem is that under heavy I/O load, like installing a large program, this design really doesn't do very well. See, the one big fan blows air over the processor first and foremost. While that's great for the processor, it's not so great for everything else inside, as the air that's supposed to be cooling the other components is now very hot. The old design, ATX, had the air move over everything else first, then reach the CPU and exit the case.

So in this particular instance, I was installing Peachtree Accounting 2007 on this brand-new machine and it could never make it through the install. Not only would it spontaneously reboot, it wouldn't run at all for about 3-5 minutes afterwards, because the chipset was just too hot.

The funny thing is that I experienced this exact same problem in a machine I recently built and had to install a side fan just to keep the thing running. But since I do extensive testing on each and every machine in near-85-degree conditions, I located the issue and fixed it before the machine left the bench.

I wonder how the poor call center person I have to call to fix the other one will react...

Friday, July 28, 2006


With all the hubbub surrounding the Core 2 from Intel, some of you may have missed the bigger news in the industry: AMD is planning to buy ATI in a $5.4 billion purchase. If you look around even a little, you'll see that this deal was recently approved by AMD's stockholders, and pending SEC approval, it looks like it's good to go.

Actually, if you've been following the Core 2 launch yesterday, you will no doubt have heard of the emergency revamping of machines from ATI graphics to NVIDIA graphics. There was supposed to be an ATI CrossFire exhibition right on stage using the Intel Bad Axe motherboard; however, early this week those machines were pulled. You really can't blame Intel; the media would have blown it up.

What's more interesting to me about the merger is the direction AMD will move in with the addition of GPUs and high-performance chipsets to their lineup. Back in January, I wrote about the return of the coprocessor. In that article, I said we would begin to see field-programmable gate arrays (FPGAs) that could fit into the standard Opteron socket to speed up things such as Java applications. We have indeed seen that happen: Cray has announced a supercomputer using a mix of FPGAs and Opterons.

Of course, how many of us use multi-million-dollar supercomputers? I went on to describe how the movement would affect the average consumer. I talked of NVIDIA creating chips to accelerate TCP/IP processing and encryption, or rendering chips to accelerate CAD and 3D modeling. I still think this is in the works; however, it may be an ATI product now.

Then on to the standard desktop. I can see that in just a few years there will be the option to use socketed graphics chips, physics chips, audio/video processors, or network accelerators in the home machine. The beauty of these chips will be their upgradability. No more having to buy an entire PCB complete with power-management circuitry, memory, headers, and the like; you can just buy the chip and swap it out on the board.

What this has to do with the merger is that since AMD has access to HyperTransport, so does ATI. AMD is already talking heavily about multi-socket motherboards with their 4x4 initiative, and it will not be difficult to make an ATI physics processor for gaming. As usual, everything will start in the enthusiast market and trickle its way down to the home user.

Overall, I'm very excited about the future of computing. Windows Vista is a huge change from XP, and any time you have such an impressive change, you will have hardware adapting to best utilize what's available. It's a great time to get into computers!

Wednesday, July 19, 2006

Intel's Big Strike

While not shipping to vendors yet, Intel's newest Core 2 Duo chip is making quite a splash all over the industry. Unfortunately, as this is not a hardware review site, I haven't had hands-on experience with one yet, but what I've read backs up everything I've ever said. Everyone in the industry is claiming a revolution in processing power.

A lot of people have asked me if this chip was worth waiting for, and honestly, I think it might just be, for certain groups. Fact is, under normal circumstances the Core 2 is not all that much faster than the latest AMD offerings, maybe 5 to 10 percent. However, once the variables of overclocking come into play, this thing is killer.

The Core 2 to get is the E6600. This is the least expensive chip with the full 4MB of cache on board. At stock speeds, this chip runs at 2.4GHz on a 266MHz system bus. The fastest of the Core 2 line currently is the X6800, running a cool 2.93GHz on the same bus. For a quick reference, CPU speed is derived by multiplying the bus frequency by a multiplier. In the case of the X6800, that multiplier is 11; the E6600's is 9. To get the $319 E6600 to the $999 X6800's speed, you simply need to get the bus to go faster. For 3GHz, you'll need a 333MHz bus, which is exactly what Intel's 975X chipset was designed to push.
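The clock-speed arithmetic above is easy to sketch out. The multipliers come from the chips themselves; everything else is just multiplication (note the published bus numbers are rounded, so the products come out a hair under the marketing figures):

```python
# Core 2 clock speed = front-side bus frequency x the chip's multiplier.
# 266 and 333 are the rounded bus numbers used in the text; the real
# bus runs at 266.67/333.33 MHz, hence the slight shortfall.
def core_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(core_clock_mhz(266, 9))   # E6600 stock: ~2.4 GHz
print(core_clock_mhz(266, 11))  # X6800 stock: ~2.93 GHz
print(core_clock_mhz(333, 9))   # E6600 on a 333 MHz bus: ~3 GHz
```

Same formula, different bus: that's the whole overclocking trick in one line.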

Reports on the web show that bus speeds on some motherboards reach the 450MHz range, giving an impressive 4GHz with this little chip. Interestingly enough, this little guy can run up there with a little more voltage and proper cooling. At 4GHz, there's nothing even close to this chip.

So what this means for the consumer is that enthusiast companies will be putting out super-clocked machines that obliterate the competition. It also means there's a lot of headroom for an X6900 and beyond; it will be very easy for Intel to ship faster chips when they need to.

Cheers to Intel for finally pushing the processor market forward.

Tuesday, June 20, 2006

The Apple-Intel Experience

If you've read the other posts around here, you may be thinking I'm kinda like John Hodgman in his new commercial for Apple, where he claims to be a PC. (By the way, John is one of my favorite authors and Daily Show contributors, and you should check him out.) I am in no way a "fan-boi" of either camp. Computers are to me what hammers and saws are to a carpenter, or what guns are to Robert De Niro in Ronin. It's a toolbox, and I use what I need.

With my position over here at the newspaper, I find myself constantly needing both a Mac and a PC. When the Intel-based Macs started shipping en masse, I figured one would be the perfect tool to drop in my toolbox, and I asked very nicely if I could have one. One of the reasons I got it was to compare the performance of this machine to the G5 models that preceded it. While I wrote a massive six-page evaluation for the company, complete with graphs, charts, and tables, I figured I'd drop a couple of the important points here in case anyone has thought "I'm gonna get a Mac" recently.

As anyone who is into Apple Computer, Intel Corp, or the stock market in general knows, Apple is attempting to turn around a three-year slump by switching from IBM's PowerPC line of processors to Intel's Core line. This change brings a need to recompile all of our programs into what Apple is calling a "Universal Binary." You can find out just how many Universal apps you have by looking in the file properties.

Apple has bundled in a little program called Rosetta to make the transition easier. However, even though the emulation software is very good, it's still emulation, and still quite slow. With Adobe promising not to release any Universal Binaries until Creative Suite 3 ships, and Quark requiring an upgrade to version 7, most media companies are going to steer clear until they need to update their software anyway.

Luckily, with Boot Camp, we can just buy and install a copy of Windows XP and use everything at light speed. This honestly is the most functional environment, in my humble opinion, as there is no Rosetta to slow you down. Everything just works like it should, which is nice. Granted, the webcam doesn't work, and the buttons feel a little weird on the keyboard, but other than that, everything's just peachy.

So should you buy a new Mac or not? Maybe... For a professional deployment, I would have to say no. For a home environment? I can't say I wouldn't. Even Dell would have trouble beating the price on this thing for what you get. Remember, it's a keyboard, mouse, monitor, speakers, and a computer that ends up faster than most towers out there in everyday use. Not to mention, it's a very attractive package.

Tuesday, June 13, 2006

Windows Vista On Hand

I recently picked up a copy of Vista Beta 2 to play around with on a couple of different machines. I've used several versions, from Beta 1 on up through the various CTPs released to this point, but the public release of Beta 2 means the product is just about done. There will probably be a Release Candidate 1 and 2 after this, and then the retail version should ship to manufacturers by the end of December. With my earlier piece describing what you want in a PC today, I felt an update would be good.

First off, there are two ways to run Vista: with Aero Glass, or with just plain Aero. Once you see Aero Glass in action, you'll probably want to be able to run it. It's just freaking cool. Without a video to show it in action, it's hard to convey what a great user interface Microsoft has put together for us. Windows kind of spin down to the taskbar when you minimize them, and pop back up quickly when you want them back. On one system I installed the software on, the movements were silky and very nice; on the other, not so much.

Just to recap, Microsoft currently requires a modern 1.0GHz CPU for Vista Premium. After watching the beating that was put on my Athlon 64 at 2.6GHz (a tick above the 4000+ model rating), I highly recommend a dual-core CPU. My system was literally stopped dead a few times while just listening to music and surfing the web. With one core taking the overhead, you'll have the other left for real work.

Microsoft also requires 1GB of system memory. Both systems I used have a lowly gigabyte of RAM, and it really should be 2GB for this OS. Vista is huge, constantly using about 50% of my memory. I've seen usage go as high as 75%, and I think that included the 1.5GB page file. Whether Vista utilizes all of your memory any better than XP did remains to be seen, but having the extra will always be a good idea, especially if you're going to be running Office 2007.

The last requirement we really need to worry about is the "graphics processor that runs Windows Aero" bit, and just below it, the 128MB of graphics memory. The main difference between the two systems I used was the graphics processor, and man, it's a big one. One system used an integrated GeForce 6150, the highest end of the integrated GPU spectrum, with 128MB of shared memory. The other has a GeForce 6800 with 256MB of dedicated memory. Both are from the same generation, with the 6150 significantly cut down: much slower memory and a fraction of the pixel pipelines of the 6800. On the 6800, everything was silky smooth; on the 6150, things were choppy. It's not quite a disaster, but it's annoying to see one system run smoothly and the other not.

One reason the 6150 system ran so much slower is the monitor used with it: a 17" LCD panel with a native resolution of 1280x1024. That's a whopping 1.3 megapixels, and probably too much for a 128MB shared frame buffer. At 1024x768 or lower, things would probably smooth out a bit, but Vista is really designed for widescreen monitors anyway, at 1280x720 or higher. You can tell because when Sidebar is set to stay on top, everything looks out of whack on a 4:3 display.
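Here's a rough sketch of that pixel math. The 32-bit color depth and double buffering are my assumptions for illustration; Aero's actual per-window memory use is more involved than this:

```python
# Pixel count and a naive desktop framebuffer estimate.
# 4 bytes/pixel (32-bit color) and 2 buffers (front + back) are
# assumed values for illustration, not Aero's real bookkeeping.
def megapixels(width, height):
    return width * height / 1_000_000

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(megapixels(1280, 1024))      # ~1.31 MP, the "whopping 1.3" above
print(framebuffer_mb(1280, 1024))  # 10.0 MB just for front/back buffers
```

And that's before Aero keeps a separate surface for every open window, which is exactly where a 128MB shared buffer starts to hurt.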

You may have thought it was all bad news, but wait... there is something good to come from this: Vista is remarkably stable so far. Previous builds were about as buggy as you would expect beta Windows to be, but Beta 2 has surpassed my hopes. If it weren't for an issue with my LCD HDTV not hooking up right, I would probably be using it 24/7 at home. There are some other little things, like the network connection dropping out every now and then, and the sound controls being less than stellar, but things will get better with time. The fact is, the redesigned device driver model is working great. Where in XP you would see a blue screen and a reboot, now you simply hear the device being unhooked and reinstalled, all automagically.

So you may need a new machine to run this beast, but if you have the performance, you'll be very happy. Now if I can just get this TV hooked up, I'll be all set.

Wednesday, May 31, 2006

What's to come with Firefox 2.0

As a true goober, or professional geek, I try to check out everything I can before my users start asking me questions about it. This includes everything from new platforms, like Apple's Intel Macs and AMD's new AM2 socket, all the way through to new software. Today, I'm looking at a preview of Mozilla's Firefox 2.0, currently known as Bon Echo.

One of the handiest things this release will bring us is text-field spell checking. While most people are thinking, "Why do you need spell checking in a web browser?", I can personally attest to its helpfulness when posting things like blog entries and forum posts. While my grammar may still be a little sub-par, at least I know my spelling will be good without cutting and pasting into TextEdit first.

Other big additions to the Firefox realm are an improved search field and a couple of security features. Soon Firefox will compare a website's URL against a list maintained by Google to help prevent phishing. Also, by using McAfee's SiteAdvisor, we can see if a website is known to sell your email address to spammers and the like.

All in all, this release of Bon Echo is very stable and has enough enhancements for me to finally drop Safari as my browser of choice on Mac OS. I will also be using it exclusively on my Windows machines, and I can only hope the speed enhancements carry over. Already in development is Firefox 3.0, currently known as Minefield. I have used it; however, without Universal support for Mac OS X, it was just painfully slow. Once Firefox 2 is out, I'll start playing with that one more.

Friday, May 26, 2006

Boone Night Sky

One of the things us techy guys seem to love is electricity, and this shot captures nature's most impressive electrical display: a burst of lightning. I took it with a fairly old Sony digital camera, and yes, with a new SLR I would've gotten much better results.

There are two important factors in catching a shot like this on your own: a long shutter speed and patience. I personally took about 30 different shots before I nailed this one, each an 8-second exposure. There can be no movement while the shutter is open, so I had the camera set up on a window sill to keep it still.

With a decent camera, you can have a lot of fun trying to catch nature in the act. If you're going to be shopping for a new camera any time soon, remember that most of us have no need for more than about 5 megapixels. This camera is actually a 3-megapixel model, which has always had enough quality for me. Of course, I don't print images very often, but a 4"x6" printout would probably look great. For 8"x10" prints, you'll need at least 8 megapixels.
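Those megapixel figures follow from a standard print-density rule of thumb: about 300 dots per inch for photo-quality output. The 300 DPI figure is my assumption here, not anything from a camera spec sheet:

```python
# Megapixels needed to print a given size at a given density.
# 300 DPI is the common photo-quality rule of thumb (an assumption
# for illustration; lower DPI looks fine at arm's length).
def megapixels_for_print(width_in, height_in, dpi=300):
    return (width_in * dpi) * (height_in * dpi) / 1_000_000

print(megapixels_for_print(4, 6))   # ~2.2 MP, so a 3 MP camera is plenty
print(megapixels_for_print(8, 10))  # ~7.2 MP, hence "at least 8"
```

The takeaway: resolution needs grow with print area, not screen viewing, which is why 5 megapixels covers almost everybody.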

Monday, May 22, 2006

The Best Buy

Back when I was a young teenager, I used to run for the Sunday paper and check out all of the electronics stores' latest offerings. I knew even then that I would never buy a pre-packaged computer, but I still used the ads as a benchmark for what a machine should cost. When I went to pick up my first tower, custom built for me by a local shop, I was fairly shocked by what I got for the money. Needless to say, I felt a bit cheated, at least until I needed some help. You do pay a little extra for local support, but it was totally worth it for me. The best part was that for just a couple hundred dollars a year in upgrades, I never had to buy a whole new package again.

Fast forward to today. I opened up the Best Buy catalog the other day and was kinda surprised at what I saw. Having been bombarded by Dell's recent $299 deals, I expected the retail guys to keep up. Their lowest-priced offering was an eMachines box at $379, and that package doesn't even include all the handy upgrades Dell throws at you for free, like the flat-panel upgrade and the free printer. So the question of the day is: "What's the catch?"

The catch is what's under the hood of these packages. Dell buys such a large amount of product that they have to clear house at the end of every quarter to keep the stockholders happy. In fact, spring is always the best time to buy a cheap computer, as it's the slowest time of the year. This year, that rule applies double.

Next year, Microsoft will release Windows Vista. If you haven't heard of it, it's a revolutionary leap from Windows XP, some five years in the making. I have personally used some pre-release versions of Vista, and it was a little painful to see a $3,000 machine slow to a crawl. In fact, a recent study showed that as many as half the machines in circulation today will not even meet the minimum requirements for Vista, much like that $299 Dell. Luckily, I'm here to tell you what not to waste your money on this year.

Budget machines come in a variety of flavors, but for most manufacturers this means cutting back on parts as much as possible to maximize profit. Integrated graphics save at least $50 per machine, and they're the first thing to watch out for. Microsoft's new Windows will want a dedicated graphics card with some pretty serious horsepower. While you don't need one now, you do want to make sure you have a PCI Express x16 slot for expansion when you move to Vista.

Second up is 64-bit computing. Right now it seems a little silly to need 64-bit computing, and, in fact, it is. Vista changes everything, though, and 4GB of memory will fill up quickly. Today's machines should all have 1024MB of memory, but for true Vista readiness, look for 2048MB. Also make sure your processor is 64-bit, like AMD's offerings or select Intel processors, so that when Vista comes, you'll be ready to go.
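The reason 4GB matters comes straight from address arithmetic: a 32-bit processor can address at most 2^32 bytes, so that's a hard ceiling no matter how much RAM you plug in. A one-liner check:

```python
# A 32-bit address space tops out at 2**32 bytes, exactly 4 GiB.
# A 64-bit space raises that ceiling far beyond any 2006 desktop.
addressable_gib_32 = 2**32 / 1024**3
addressable_gib_64 = 2**64 / 1024**3

print(addressable_gib_32)  # 4.0
print(addressable_gib_64)  # 17179869184.0, i.e. about 16 billion GiB
```

So a 64-bit CPU today is cheap insurance against hitting that wall the day you upgrade.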

Lastly is the advent of dual-core processors. For the most part, the difference between processors in the same series is minimal, with only those who really want the best of the best paying for the top of the line. With dual-core processors, like the Athlon X2 or Pentium D, there is a huge jump from their single-core counterparts. Vista is designed to utilize these even better, with advancements to DirectX and the like. I highly recommend paying a bit more now to get one. Windows XP will see a nice speedup as well.

While there is so much more to touch on about buying a machine, the main point is that the cheapest thing just won't cut it this year. The security enhancements and user interface in Vista will make it worth having, so don't buy something now that won't run it.