I was going to write a sort of comparison between the Sony, Toshiba, and IBM Cell processor and the recently announced PWRficient from PA Semi, but I felt I should first take a look at recent years' changes in the microprocessor industry.
Until a few years back, every processor targeted at the desktop and entry to mid-level server markets was designed and optimized to execute single-threaded applications sequentially as fast as possible. The mantra was to increase the raw performance of the processor when running a single-threaded application. Even tasks that could be massively parallelized were mostly coded as single threads to optimize their execution on those processors. Not that the programmers lacked the skill, as some like to think, or that it's a hard task to break up an application into multiple threads that can run in parallel. Any self-respecting programmer should be able to do that.
This push for raw clock performance led to a quick jump in clock speeds that was poorly matched by the development of silicon manufacturing technology. By poorly matched I am not referring to gate switching speeds, which enable higher clock rates, but rather to preserving the operational power efficiency of those processors. This rather crazy hype in raw performance by means of increasing clock speed was finally stopped by astronomically high thermal dissipation: current microprocessors have power densities approaching those of a nuclear reactor when measured per unit of area. Back in the mid 90s, leakage accounted for about 5% of the total power a processor consumed, if not less. Now leakage figures are over 60% of the total power that current processors require to operate, which leads to the ridiculously large, and often noisy, cooling solutions needed just to keep them running. And even then, these processors run at temperatures so high that it would take only a few seconds to fry them if the fan on that 750g heat sink failed, despite the heat sink having a surface area of over 1 square meter.
Hitting these simple physics barriers, or more precisely, knowing that they would hit them sooner than expected, the microprocessor giants went back to the drawing board and decided to change the way things work. The new marketing theme they introduced, albeit too late (better late than never, right?), was parallelism. Even then, the move was to tape out products as soon as possible without necessarily redesigning those products the way they are supposed to be made. Basically, all they did was glue together a pair of the processors they were already selling and let the marketing department find a way to convince the public that this was, as always, the way things were meant to be. There was no real going back to the drawing board to design a truly parallel processor that could take the computing world to the next level.
If you ask me: why would anyone want a 3.8GHz single-core processor that demands a lot of power, and hence a lot of cooling, yet has poor Instructions Per Cycle (IPC) figures, instead of a 400-500MHz processor comprised of 8 or more cores or execution engines that are really and efficiently integrated (and not glued) together? Think of the graphics cards on the market today and their impressive performance figures, cut down the transistor count by a quarter (which also cuts manufacturing costs), and you will get an idea of what I am talking about.
Now, what would an old design like the 486 need to have redesigned in order to keep up with today's standards? I don't think that matters much as long as you keep the transistor count per core low. First, the instruction set would need to be brought up to today's standards, adding SSE (1, 2, and 3) and maybe even 64-bit extensions, with the execution units redesigned to execute those instructions effectively. Add some extra registers, plus power management abilities like dynamic clock throttling and the ability to turn off unused functional units and entire cores. Throw in a nice 2MB cache shared between the cores and a wide memory interface for very high bandwidth memory access, like a dual-channel, 128-bit-wide memory controller (even with cheap DDR400, that would give 12.8GB/s of total memory bandwidth, though memory modules would have to be installed in quadruples), and you will have a very happy processor that is really capable of giving the best current desktop processors a run for their money.
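That 12.8GB/s figure is simple arithmetic; here is a back-of-the-envelope sketch, under my own assumed interpretation of the layout (four 64-bit DDR400 modules feeding a 128-bit-wide dual-channel controller):

```python
# Rough memory bandwidth math for the hypothetical controller above.
# Assumptions (mine, for illustration): DDR400 performs 400 million
# transfers per second, each 64-bit module moves 8 bytes per transfer,
# and the 128-bit-wide dual-channel controller reads 4 modules at once.

TRANSFERS_PER_SEC = 400_000_000   # DDR400: 400 MT/s
BYTES_PER_TRANSFER = 8            # one 64-bit module = 8 bytes
MODULES_IN_PARALLEL = 4           # 2 channels x 128 bits = 4 x 64-bit modules

bandwidth_gb_s = TRANSFERS_PER_SEC * BYTES_PER_TRANSFER * MODULES_IN_PARALLEL / 1e9
print(bandwidth_gb_s)  # 12.8
```

The same total also falls out if you view it as two channels of 128 bits each; either way, the bus width times the transfer rate gives 12.8GB/s.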
Now, some people may argue about the inadequacy of such a processor for applications like games, office applications, and such. My reply is that if the designers of those applications went back to the drawing board, they would find various ways to exploit parallelism in their applications. Take a word processor for example: one core could handle spell checking, another text formatting, a third the user interface, and you could even throw grammar checking in the face of a fourth core. None of those tasks is a heavy load on a processor by itself; it's the combination that is. All those parts of an application can run in parallel while an extra core or two take care of the operating system to keep user responsiveness high, very high I would dare say. Even games could benefit greatly from parallel machines; it's just that there wasn't a drive on the hardware side to push game programmers to exploit that on the software side.
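As a toy illustration of that word-processor split, here is a minimal Python sketch. The spell, formatting, and grammar "engines" are deliberately crude stand-ins, and every function name is my own hypothetical one, not any real word processor's API:

```python
# Toy sketch of the word-processor split described above: each background
# task runs on its own thread while the main ("UI") thread stays free.
import threading

DICTIONARY = {"the", "quick", "brown", "fox"}  # tiny stand-in dictionary

def spell_check(text):
    # Flag any word not in our toy dictionary.
    return [w for w in text.split() if w not in DICTIONARY]

def format_text(text):
    # Stand-in "formatting": trim and capitalize.
    return text.strip().capitalize()

def grammar_check(text):
    # Stand-in "grammar rule": count double spaces.
    return text.count("  ")

def run_task(fn, text, results, key):
    results[key] = fn(text)  # each thread writes its own distinct key

text = "teh  quick brown fox"
results = {}
threads = [
    threading.Thread(target=run_task, args=(fn, text, results, key))
    for key, fn in [("spelling", spell_check),
                    ("formatting", format_text),
                    ("grammar", grammar_check)]
]
for t in threads:
    t.start()
for t in threads:
    t.join()  # the UI thread would normally keep handling events here
print(results)
```

On a multi-core machine each of these threads could land on its own core; the point is only that the tasks are independent, so nothing forces them into one sequential stream.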
The technology to make such a product has long existed, and due to lower clock speeds, such solutions would run on very low power figures compared to today's mainstream processors even without implementing any form of power management.
In my next post, I want to talk about a new processor architecture that has been making the news for the past few days, which is PWRficient from PA Semi, and the way it approaches parallel thread execution in comparison to IBM’s Cell.
This is my first trial in the world of blogging. I have been a computer and technology geek for over a decade now. Actually, I have been a savvy computer user since the days of the 8086 and DOS v3.3, and I can honestly say that the computing world has come a very long way since then in pricing, functionality, and ease of use.
I’ve been using PDAs for the last 4 years, including a number of PDA-phone devices. Currently I am using an HP iPAQ 6315, which is a nice consolidation device, but still not THE consolidation PDA-phone.
My first trial with a PDA-phone was the Treo 180, with its Palm OS and monochrome LCD. It was VERY awkward to hold up to my head to carry on a phone conversation. The device felt and looked like an early 90s brick phone. But that was long ago, when the idea of a consolidation device was still in its early days.
Now we have many PDA-phones, or smartphones as some people like to call them, yet you still cannot find the perfect device, even though it's not that hard to make one with today's technology. THE consolidation device for me would have the following features:
-tri- or quad-band GSM radio.
-Wi-Fi radio (802.11b or g).
-SD slot (preferably SDIO compatible).
-3″ or larger touch LCD screen with at least 320×240 resolution and 16-bit color.
-integrated thumb keyboard with backlight.
-decent picture quality camera (at least VGA) with the ability to record video.
-a 400MHz processor.
-at least 64MB of RAM (128MB is always nice).
-weight around 170g (6oz).
As for the OS that powers such a device, it could be either Windows Mobile or Palm OS; I think each of those OSs has its own strengths and weaknesses.
This leads me to a discussion of the devices currently available on the market. Let's start with the Treo 650, which is a very attractive device when it comes to looks, carries one of the best (if not the best) thumb keyboards in a PDA-phone, and has a quite impressive screen as well. But it's still not the consolidation device that can solve all your mobility needs, because it lacks integrated WiFi, or even the ability to add it through an SD card, even though WiFi is, in my opinion, a very important aspect of a consolidation device. Also, the 144MHz processor in the Treo is somewhat underpowered for fancy applications like Skype (assuming there were a Palm OS version of Skype).
Next we have the HP iPAQ 6300 series, which, if it weren't for the rather slow processor and the lack of backlight in the keyboard, would have been an almost perfect consolidation device. I say almost because the snap-on keyboard makes the device awkward to hold and carry when snapped in, and it feels like a small brick when carried in a pocket. HP should have made the keyboard for the 6300 much thinner and should have added a backlight. Other than that, I wouldn't have any major complaint about this device.
While we're talking about HP, let's look at the new 6500 series iPAQs. While the so-called Mobile Messenger looks and feels much better than the 6300, it comes with its own shortcomings. First, there is that square 240×240 pixel LCD. Why on earth did HP choose such a resolution? I would understand that the device couldn't fit a 4:3 LCD and the engineers at HP had to compromise on a square one, but they could have chosen an LCD with a resolution of 320×320 instead, which would have given the user more "desktop" real estate than even the 320×240 screens traditionally used in PDAs. The next issue with the 6500, which in my opinion is more critical than the low-resolution screen, is the absence of WiFi. With hot spots growing in number by the day, WiFi has become very important for staying connected. The engineers followed the 6300 design and put the SD slot on the side of the device, but here that is a much bigger drawback than it was in the 6300: if someone wanted to add a WiFi SD card, it would make the device very uncomfortable to hold and use, to say the least. Then there is the integrated GPS, a nice feature by itself, but factor in that reception in a metropolitan area is rather weak, add the further weakening of the signal by the car in which you will be using it, and you end up with a solution that is practically useless. Finally, we have the rather small battery capacity, or should I say short battery life, especially when compared to the 6300.
OK, having finished with the HP line, let's move to the other major manufacturer of consolidation devices in the Windows Mobile market, which is HTC (T-Mobile, O2, i-Mate, Orange, Qtek, and others brand HTC products and sell them as their own). First we have the HTC Himalaya (i-Mate PDA2, T-Mobile MDA II, O2 XDA II, Qtek 2020), which is a very well designed piece of equipment apart from the fact that, again, it lacks WiFi access and any sort of keyboard. Then we have the HTC Blue Angel (i-Mate PDA2k, T-Mobile MDA III, O2 XDA IIs, Qtek 9090, and recently the Siemens SX66), which was the other candidate for my consolidation device before I bought the 6300. What turned me away from the Blue Angel was its rather short battery life, consuming 40-50% of its juice for each hour of WiFi or GPRS surfing. Then there is its rather heavy weight of 212g (7.5oz), not to mention the bulky feel it gives when held in your hand. Apart from that, the unit is very nice. It has a nice backlit keyboard, a 400MHz XScale processor, 128MB of RAM, a VERY nice arsenal of programmable keys, and WiFi and BT (though BT 1.1, not 1.2, which isn't a major issue for me). All in all, it has almost every feature that a consolidation device should have. Note that there is a CDMA version of the device that sells as the Audiovox 6600, featuring almost the same specifications as its GSM cousin; the differences are the CDMA radio and the lack of WiFi.
Recently HTC released two new models, the Universal (i-Mate JasJar) and the Magician (i-Mate K-JAM). The Universal is the successor to the Blue Angel. It boasts a very nice design that reminds one of a tablet PC with its swivelling screen. The specification sheet is rather impressive, including a swivelling VGA screen, EDGE, WiFi, BT, a rather large and comfortable backlit keyboard, Windows Mobile 5.0, 96MB of ROM, and 128MB of RAM, but all this comes at the cost of a hefty 285g (10oz), which is a big turn-off in my opinion. Forget about dropping this brick into the pocket of your shirt, or even a light jacket. I think the Universal is a Windows Mobile laptop wannabe.
The only device on the market now that comes close to my "ideal" consolidation device is the HTC Magician (i-Mate K-JAM, Qtek 9100). The device is small and light enough to remind you of a phone. It comes with EDGE, an 802.11b/g WiFi radio (which, to HTC's credit, is a first), Bluetooth 2.0, a 1.3-megapixel camera (though not that good), a nice backlit keyboard, WM 5.0, a 195MHz TI OMAP processor, 128MB of ROM, and 64MB of RAM. The most impressive part is that all of that is packed into a 160g (5.6oz) device. Quite impressive. The only complaint about it so far is the not-so-good picture quality of the camera. While its power consumption figures are better than the Blue Angel's, it's still not up to the job of getting a heavy user through the day without having to run to the nearest power outlet. My other complaint with the device is the small 2.8″ screen, which makes it a bit harder to read text, compose emails or SMS messages, or browse the net.
While there isn't much competition in the Palm OS arena, I think Windows Mobile has swept the smartphone market because Palm didn't do its homework well back when it was the dominant player in the game. So Palm is the one to blame: while Microsoft continued to develop and enhance its Windows CE based OSs (Pocket PC, Smartphone, and now Windows Mobile), Palm seemed to have reached a plateau with the now rather old version 5 of Palm OS.
And this concludes my first blog post. I would be glad to hear any comments you may have.