View Full Version : Which Notebook is Better: Mac or PC?


Scott Marvel
February 23rd, 2006, 01:09 PM
I have an HVX 200 on order and could use advice about which laptop computer to buy so that I can transfer data from the P2 card to a laptop.
I edit on Final Cut, but the new MacBook Pro laptops don't have PC card slots. Could I get a cheap PC laptop, download the data to it, and then transfer the data to my Mac later?
Any suggestions?

Zack Birlew
February 23rd, 2006, 07:39 PM
Ooooooh, that's a tough one, Scott. Currently there are plans for a PCMCIA adapter for the newer Intel MacBook slots, but I haven't heard of any of them actually coming out yet. The trouble I see is that you're already using FCP, so if you could, I'd suggest you wait a bit until your HVX200 actually comes in, as I hear they're taking a while to arrive.

If you need a laptop RIGHT NOW, then you'll have to go with a PC laptop and do as you've said: download to the PC and transfer to the Mac later.
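If it helps, here's roughly what that offload step looks like as a little Python sketch. This is just an illustration, not any official P2 tool, and the drive letters are made up for the example; the main point is to copy the whole card (its CONTENTS folder holds the MXF clips and their metadata) into a fresh, time-stamped folder so nothing gets overwritten when you reuse cards:

import datetime
import pathlib
import shutil

# Made-up paths -- point these at wherever the P2 card and your backup drive mount.
CARD = pathlib.Path("E:/")              # P2 card in the laptop's PC card slot or reader
BACKUP = pathlib.Path("D:/P2_offload")  # external drive you'll later hook up to the Mac

# Give each offload its own time-stamped folder so reused cards never overwrite old footage.
stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
target = BACKUP / ("card_" + stamp)

# Copy the entire card, CONTENTS folder and all, so the clip structure stays intact
# for later import; skip the Windows housekeeping folders that would trip up the copy.
shutil.copytree(
    CARD,
    target,
    ignore=shutil.ignore_patterns("System Volume Information", "$RECYCLE.BIN"),
)
print("Offloaded card to", target)

Once the external drive is hooked up to the Mac, it just sees a normal folder of files, and you can move or import from there.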

Robert Lane
February 23rd, 2006, 09:13 PM
Scott,

In another thread (actually in a few) I've mentioned several reasons why you should not get a MacBook Pro - yet. I don't want to retype all that stuff again, but suffice it to say that if you're on the FCP platform and need a laptop you should jump on the last of the PowerBooks before they completely go away.

It will be months before the MacBook Pro & FCP become a stable platform, and even then you'll still be missing some very important I/O ports that the PowerBooks have and the MacBook Pros don't.

Read my post about configuring a PowerBook for capturing HVX footage; there's plenty of info about what you need to know.

Zack Birlew
February 23rd, 2006, 10:02 PM
Ah yes! How could I forget!? Thanks, Robert, you've helped Scott more than I did! =)

Man I'm getting too old for this stuff...

Dean Harrington
February 24th, 2006, 02:49 AM
I'm thinking about a dual-core AMD laptop for Serious Magic green-screen work. I use FCP on a Mac and would only be using the laptop for DirectX-intensive work and for authoring to Windows Media Player.

Robert Lane
February 24th, 2006, 07:17 AM
Dean,

I'm not knocking the PC platform you're daydreaming about, but you should be wary of any AMD-based platform.

I used to own an IT/PC consulting firm on the side from my main business; the systems we saw in for repair most often, regardless of whether they were towers or laptops, were AMD-based.

AMD has always given Intel fits, especially in the late '90s when AMD often out-clocked them and chip prices were less than half that of a comparable Intel. However, that speed has always come at a high price, namely heat, and lots of it. We always considered AMD to be the tuner-kid with a nitrous bottle in his Civic: hit the button and you've got blazing speed, but often you'd burn up the motor in the process.

And that's exactly what happens with AMD systems: the chips run so continuously hot that they literally pop the capacitors on motherboards. It was common even as recently as last year to see an AMD system fail within its first year. And of course, laptops have far less venting than towers, so we saw tons more dead laptops than towers.

My advice is not to get starry-eyed over a dual-core AMD solution; if you're going to be using any PC laptop, make sure you're looking at an Intel-based chipset. In our experience, the most reliable laptops with the fewest issues were either IBM ThinkPads, Fujitsu LifeBooks or the high-end Sony Vaio (aluminum case). Everything else was a crapshoot.

Dean Harrington
February 24th, 2006, 07:40 AM
I'll keep that advice in mind. I wasn't aware of the AMD heat problem!

Zack Birlew
February 24th, 2006, 10:56 AM
Hmmm, that's strange, Robert. I haven't heard of heat problems related specifically to AMD since the old Athlon XP days. Those did have issues far beyond mere heat problems, but things are much better these days with the AMD Athlon 64s, X2s, FXs, and Opterons. The capacitor thing I have no idea about, unless you're mistaking the heat issue for the faulty-capacitor drama that unfolded when a rash of motherboards were given faulty capacitors out of factories in Korea or somewhere else in Asia; I can't remember exactly, but it happened last year. Other problems were with earlier motherboard designs where they placed the capacitors too close to the socket. Things have come a long way since then. As far as modern heat issues go, that's most likely due to user error or faulty third-party manufacturers.

If you do want to go with an AMD solution, I suggest going with a good brand-name dealer like Alienware, Falcon Northwest, Voodoo PC, or Polywell.

Jeff Kilgroe
February 24th, 2006, 11:59 AM
And that's exactly what happens with AMD systems: the chips run so continuously hot that they literally pop the capacitors on motherboards. It was common even as recently as last year to see an AMD system fail within its first year. And of course, laptops have far less venting than towers, so we saw tons more dead laptops than towers.

The current crop of AMD64 (FX and X2) CPUs all run cooler than any of the current Intel desktop CPU offerings. Prescott-core Intels average a good 10 to 15 degrees C hotter than AMD X2 chips. There were a lot of problems with AMD MP/XP CPUs "overheating" and systems going down, however this was actually not a CPU issue, but an AMD 76n chipset issue. The northbridge on these mainboards would fry itself and often take CPUs or other components with it, even with boards from bigger names like Tyan, Asus and MSI. Combine that with faulty USB controllers, improperly grounded USB interfaces and buggy AGP interfaces, and it's no wonder AMD got out of the chipset business.

Right now, AMD has the best CPU options for desktop systems... Hands down, no comparison. Intel's dual-core implementation is inefficient and has several bottlenecks, including the brain-damaged approach of non-linked cache, forcing CPU-to-CPU communication to exit out onto the external CPU bus. In other words, the Pentium D was a half-assed, hack-job solution to get something to market ASAP. The new Core Duo isn't much better. Take another look at the numbers that Apple is claiming for the Core Duo and put a little thought into it... Hmmm... It's two years newer than the G5 and about the same speed. Yes, the new MacBook Pro is "4X faster" than a PowerBook G4. Considering it has dual CPU cores, and each one is 2X as fast as the nearly 5-year-old G4 chip, I should hope it's at least 4X as fast.

As for AMD system failures... Yes, quite common with AMD XP/MP CPUs and 760/760MP[X] chipsets. Not anywhere near as common with FX and X2 CPUs, and most of those failures come from inadequate power supplies rather than overheating CPUs and components. I see all too many PC vendors shipping dual-core AMD X2 systems with a 7800GTX video card and only a 400W PSU in the thing. That system is doomed to fail within the first 250 hours of intense gaming. I build about 75 to 100 systems each year, many of them for our internal use here (render farm, workstations). All the overheat problems come from the newer Intel boxes with Prescott and newer derivative-core CPUs. The Northwood-core systems and equivalent Xeon-based options were great... And faster per clock cycle for most operations too. I have 22 AMD 4800+ dual-core systems running as a render cluster right now. They grind on animation frames 24/7 and every single one maintains temps lower than 119F for both the CPU and internal box temp. I can't get below 127F on a Pentium D without using a peltier or liquid system.

The Core Duo chips in PC notebooks (the same exact chip as in the new Macs) are failing to impress the PC crowd. It's the same Centrino chip we've all had for the last couple of years... No speed increases, no improvements. Just now there are two of them glued together sharing the same CPU bus, oh joy. Intel will have CPU linkage via the L1 and L2 cache in their upcoming dual-CPU offerings for desktops/workstations due later this year (think summer/fall), along with some other enhancements. Hopefully this will help relieve Intel's downward spiral. Apple may have made a mistake committing themselves to exclusive Intel usage. It could be that the PC will continue to offer the most CPU power for the money, since AMD will have quad-core CPUs by the end of the year and Apple won't be using them.

As for mobile notebooks... The AMD Turion mobile CPU just can't seem to get off the ground, even though it is a superior CPU to the Pentium M in many ways, especially for applications like video, graphics, etc... It has nearly 3X the FPU performance of the Centrino and better power management. For some reason the whole Pentium M/Centrino platform has a captivated audience right now and nobody is willing to explore other options. ...Too bad. AMD will have the dual-core Turion in full mass production within 2 months too. The only mainstream PC vendor even offering it right off will be HP, and only in a few select system configs. Intel hit the mobile PC marketing jackpot with the whole Centrino campaign. Most people don't even realize that Centrino is nothing... Centrino is just Intel's trademarked name for combining an Intel mobile CPU with a wireless networking solution (doesn't matter which type or brand or anything) into a compact or mobile platform.

Jeff Kilgroe
February 24th, 2006, 12:05 PM
If you do want to go with an AMD solution, I suggest going with a good brand-name dealer like Alienware, Falcon Northwest, Voodoo PC, or Polywell.

I have no experience with Voodoo, but from what I've seen they're nothing more than an integrator that sells fancy-painted boxes. Nothing there to justify their insane price tags. Alienware I have dealt with and I won't go into details here, but people are welcome to mail me if interested. I wouldn't deal with that company ever again, and I'll gladly tell anyone else to steer well away from those guys. Horrible customer service, unethical business practices and overpriced systems built out of off-the-shelf components and placed into fancy boxes at a super-premium price tag. Good marketing on their part because they're now so well known... Crap products and service overall though.

Zack Birlew
February 24th, 2006, 12:05 PM
Oh, well, yeah. Alienware isn't the best, but their systems do perform well if there isn't a problem. Tech support is based around people who can hardly speak English and don't know what they're doing. Half the time they're just looking up the FAQ section of the site for a solution. But, as I've said, their systems do perform well when they work.

Robert M Wright
February 24th, 2006, 12:19 PM
I never understood why the heck Intel replaced the Northwoods with Prescotts. What were they thinking???

Jeff Kilgroe
February 24th, 2006, 12:21 PM
Oh, well, yeah. Alienware isn't the best, but their systems do perform well if there isn't a problem. Tech support is based around people who can hardly speak English and don't know what they're doing. Half the time they're just looking up the FAQ section of the site for a solution. But, as I've said, their systems do perform well when they work.

Yeah, that pretty much sums it up... I had a very bad experience with them on a business purchase of more than $45K. Not my decision, but I had to deal with the whole mess. Their tech support is worthless and they would do better to just sell tech support and warranties through a third party. Their systems do run well since they're one of the vendors that go all out to provide the latest and greatest. Unfortunately, they often do this at the expense of necessary compatibility testing and their prices are very much over-inflated. Any Joe Geek can buy all the same components and build the same system for a lot less money. But whatever... That has been Alienware's claim to fame -- as soon as a new video card is available for pre-order, Alienware has it in their system configurator. And by cracky, they'll sell it to you whether it will work properly with the rest of their system or not! :)

Jeff Kilgroe
February 24th, 2006, 12:32 PM
I never understood why the heck Intel replaced the Northwoods with Prescotts. What were they thinking???

Marketing... It's all marketing... Northwood hit the MHz wall at about 3.2GHz and yields of those were lower than expected. Intel was still trying to play the MHz game, so they reconfigured the P4 design to be more robust and heat tolerant with the intent of pushing speeds to 4GHz. So now they had the Prescott with a slightly different instruction pipe; it ran 15% hotter, but could tolerate nearly 35% more heat overall. Applications started seeing performance hits with Prescott (more than Intel anticipated), but instead of counting their losses and going back to the drawing board, Intel re-optimized their compilers and pushed forward with the Prescott and subsequent designs.

And we still don't have 4GHz P4 CPUs because Intel never got a lot of other things to work like their faster bus speeds. The current P4 design is flawed in that it bottlenecks itself at about 3.4GHz when on an 800MHz bus. Now take a 3.4GHz P4-D on that 800MHz bus where it has to share that 800MHz FSB for inter-CPU communication with two CPU cores. Stupid, stupid, stupid!!! This is why Apple/IBM/Motorola, HP/DEC, and AMD all went to the serial interconnect bus ("HyperTransport"). Bus speed is serialized and it increases along with CPU speed. Unfortunately dual-core AMD/G5/PA-RISC/etc.. CPUs with this bus implementation still all share a single bus for two CPUs. The G5 and AMD chips at least have shared cache and direct communication between the CPUs without using the front side bus. Upcoming AMD quad-core CPUs will incorporate some sort of multiple channel hypertransport implementation, which is all still top secret. All we can do is wait and see what shakes loose at E3 this year.

Robert M Wright
February 24th, 2006, 12:47 PM
Actually, here in the frozen tundra of Minnesnowta, we do appreciate the dual purpose chips (CPU/spaceheater).

Kevin Shaw
February 24th, 2006, 02:14 PM
I have an HVX 200 on order and could use advice about which laptop computer to buy so that I can transfer data from the P2 card to a laptop.

One option I'd consider for this purpose would be a dual-core PC laptop running Canopus Edius Broadcast software, which reportedly has some of the best support available for P2 MXF files. Contrary to what some here are saying, I've found the Intel dual-core processors to be reasonably effective for the price, and definitely more powerful than their single-core predecessors. Same should hold true for AMD dual-core laptops. An HVX200 with a dual-core laptop and a large external hard drive would be a fine video production setup, compared to what you could get for the same price just a couple of years ago.

Peter Ferling
February 24th, 2006, 02:35 PM
Marketing... It's all marketing... Northwood hit the MHz wall at about 3.2GHz and yields of those were lower than expected... All we can do is wait and see what shakes loose at E3 this year.

Agree. I'm still holding onto my dual Xeon box at 3.0GHz. I also have an HP at home with a 2.2GHz Northwood. I really don't see any benefit coming from the newer Intel models as of yet (other than the PCIe graphics), and these current boxes are getting old.

We have a deployed laptop solution where I work, and some of the very newest models are locking up when playing back my training videos. Twenty minutes or so into a WMV file and bam, freeze. Can't seem to figure it out. They played fine on last year's models, and the disks are error free, software versions are the same. Different disks, etc. The only similarity is the length of time. It has to be overheating chipsets (checking on that one).

If Intel doesn't fix the issue soon, I may have to relent and go AMD, and I'll go the HP route if I don't build it myself.

Robert M Wright
February 24th, 2006, 02:53 PM
When I go to dual cores, I'm going with the AMDs. Benchmarks I've seen show them just absolutely smoking the Intel P4Ds, at the same price points, at just about any type of task.

Jeff Kilgroe
February 24th, 2006, 04:32 PM
One option I'd consider for this purpose would be a dual-core PC laptop running Canopus Edius Broadcast software, which reportedly has some of the best support available for P2 MXF files. Contrary to what some here are saying, I've found the Intel dual-core processors to be reasonably effective for the price, and definitely more powerful than their single-core predecessors. Same should hold true for AMD dual-core laptops. An HVX200 with a dual-core laptop and a large external hard drive would be a fine video production setup, compared to what you could get for the same price just a couple of years ago.

Edius is cool.. As long as it has all the features you need, it's great and has the best support and performance for the HVX200 and MXF of anything on the PC (except maybe the very top Avid options for $$$$$ more). And yes, the dual core Intel chips do deliver more performance than the single core versions, provided your software is multithreaded and ready for multiple processors. As long as you don't saturate your CPU bus doing really heavy streaming video work in realtime, you won't notice too many bottleneck issues. I know I've been doing my share of Intel bashing in this thread, but I think I can find plenty of things to gripe about concerning AMD and Apple's G5 offerings too. I'm not loyal to any one platform, but when talking overall CPU power, AMD is where it's at (for now).

Robert M Wright
February 24th, 2006, 10:33 PM
Actually, Intel has never really produced the most powerful CPUs, at least not that I am aware of.

That was true even way back in the early 1980s, when IBM first decided they were going to compete with a little tiny company named Apple Computer, which rather unexpectedly had, almost single-handedly and overnight, created a surprisingly thriving little niche market for a novelty item called "the personal computer" (as seen in "Playboy" magazine - Apple's first national advertisement).

This little fruit company on the West coast, selling toy computers out of a garage, wasn't so much a threat to IBM as a bit of an embarrassment. After all, nobody respectable bought anything remotely resembling a computer, or even many typewriters, from anyone but true blue! Okay, there were a few miscreants who purchased oddball computers from a handful of little piss-ant rogues (and often got fired for doing so), but IBM was to computers what AT&T was to telephones, for goodness sakes!

IBM didn't particularly care if a little company in Silicon Valley sold a few souped-up calculators, but it was pretty gosh-darn embarrassing that, though they be just lowly toys, NOBODY was buying ANY of these squirrel-cage, so-called computers from IBM, because IBM didn't even have a product in this new arena!!! Obviously, this would never do.

Well, somebody sent out a memo I guess, that must have said something like, "Johnson, here's a few measly million. Go find some starving engineers and throw em some bones (contracts) to build us a fresh new one of these thingamajigs and bring it back so we can put our name on it, call it THE PC, and sell it." So, Johnson (or whatever his name was), being the standup, navy blue suited, white collar with a tie, IBM yes man that he undoubtedly was, prolly promptly fired back a memo saying "yes, sir" and "thank you sir" and got right to work (after all, Johnson wasn't the brightest bulb on the tree, and he was grateful to have a good job like this with a company that would never lay him off, and give him a nice little pension after he puttered around his modest little middle-management office for half his life).

Well, Johnson was pretty new, so his budget was pretty tight, and besides, even IBM didn't want to be squandering a bunch of its bazillions in readily available cash at the time, from owning the computer business lock, stock and barrel (since basically the beginning of time), on some minor little, face-saving, jazzed-up electric typewriter with a few extra wires on the inside, a couple lights on the outside, and a speaker that beeped at you when you turned it on.

Anyways, Johnson, despite his undying loyalty to IBM, being a little wet behind the ears (and a little soft between them), went about committing what became the biggest blunder, bar none, in IBM's illustrious history, and promptly sowed the seeds for IBM's eventual fall from the throne of the colossal computer empire that was IBM's and IBM's alone.

One relatively minor mistake in this epic blunder, was selecting Intel's new (at the time) 8086 microprocessor, to put inside, at the heart of the box that you plug the makeshift black and white television screen and the keyboard into. Well, this little microprocessor chip wasn't the slickest one around. Motorola made a much, much better one, the 6502, which was the "brains" (besides Steve Wozniak, not Jobs) in those little "apples" that started this whole mess in the first place. Johnson liked the 6502, but being a little strapped for cash, chose the Intel chip in a pinch, despite its being a rather poor performer, because Intel, being the struggling new kid on the block, needed the business and was willing to sell them just a little cheaper than the competing chips from far more established Motorola.

The story goes on, and the history of computers, and particularly personal computers, is full of interesting, and humorous (even if I didn't tell this one well) stories, but I think I will wind this little chapter up here.

Just a few years after Intel got the contract to provide those chips for IBM's new toy, NEC came out with a slightly faster version, at a lower price, that was part of the beginning of the (drum roll please) "Clone Wars" ...a multi-sequel story that eventually ends in the fall of the blue giant.

Intel never did produce the world's most powerful microprocessor chips, but I guess, at least at one time, they were pretty dang cheap, and that got them a contract to supply a few chips for an awkward, minor little product in the IBM lineup.

Jeff Kilgroe
February 24th, 2006, 11:28 PM
One relatively minor mistake in this epic blunder, was selecting Intel's new (at the time) 8086 microprocessor, to put inside, at the heart of the box that you plug the makeshift black and white television screen and the keyboard into. Well, this little microprocessor chip wasn't the slickest one around. Motorola made a much, much better one, the 6502

Er... Not trying to be the party pooper, but you might want to find a new source for your history lessons. The first "personal computer" did not come out of a garage, it came from IBM in January, 1975 -- over a year before Jobs and Wozniak released the Apple I. And it wasn't Intel based, either. Nobody bought the thing, so IBM scrapped the idea thinking it would never fly. The original PC (IBM PC Model 5100) used the 8-bit IBM PALM (Put All Logic in Microcode) CPU and initially shipped with 16KB of RAM - upgradable or configurable to 64KB. Street prices were as "low" as $5200 on this baby... The Model 5100 had a 5" monochrome screen integrated right into its typewriter-like chassis, with a tape drive.

Jobs and Wozniak introduced their creation on April 1, 1976 and it caught on because it was a bit more fun to use than other kit computers at the time. People simply liked it - it wasn't the most powerful kit computer, nor was it the first by any means and it took some effort to put it in a box and attach it to a display, etc... It wasn't a complete package like what was available from IBM at the time. Although, it was more powerful using the 6502 and in terms of CPU power at this price point, the main competitor was the Zilog Z80 CPU, which was available from a few other homebrew kit makers as well as the Tandy kits.

As the kit computers of the late '70s were proving to be quite popular, IBM decided to get back into the game of "personal computers". So they resurrected the PC Model 5100 in late '77 and shipped it standard with 64KB of memory, but the PALM CPU was now over 3 years old and not as powerful as the processors included with Apple kits or Radio Shack "Tandy" kit systems. Still they pressed ahead with the Model 5100 because it was still the only complete personal computer available and the only option you could actually go to a real store and buy. Apple was still selling systems via mail order and at trade shows or other geeky meetings like HAM/shortwave radio gatherings, etc.. IBM kept pushing the price of their M5100 down and while sales held steady, they were losing ground to the $500 kits. In 1980, the M5100 still had an MSRP of $2800. IBM didn't want to sacrifice their business model and didn't want to play the game of build-it-yourself kits so they went to the drawing board for a new system -- the IBM PC Model 5150. Now that the PALM CPU was nearly 5 years old, they needed a new CPU option, and fast... They took bids and a small semiconductor company called Intel happened to have what seemed like the right component for the right price -- the Intel i8088 microprocessor. It was a sweet little unit, capable of more than 10 times the number of operations per second than the PALM and super cheap to produce in quantity. It was also about 4X as powerful per clock cycle vs. the 6502 or Z80 CPUs, which were also about 4 years old at this point. The IBM Model 5150 was released in August, 1981 with 64KB of RAM and a 160KB 5.25" floppy drive as standard equipment and ran off the i8088 CPU at a whopping 4.8MHz. IBM introduced the M5150 at $2800, or about the same price point the M5100 was selling for, street price on the thing was close to $2100.

The Intel i8086 didn't enter the Personal Computer scene until later (like '85/'86) - it was the 16bit sibling of the 8088. And came clocked at 6 and 8 MHz, IIRC. The IBM PC clone market was also well established by the time the 8086 CPU entered the scene. The 8086 was later succeeded for a very brief time by the 80186 and then the 80286, which sold like hotcakes.

Jobs and Wozniak were by no means the first to make a Personal Computer, nor were they the first board kit makers either. They simply had a product that set them apart from the crowd - people really liked it. IBM killed themselves not by choosing Intel, but by not embracing the inexpensive kit model. Ironic that Apple started as a do-it-yourself computer and is now the only non-cloned computer system on the market, other than SGI. But SGI doesn't count anymore, IMO since they got de-listed from the NYSE and they're all but bankrupt.

Robert M Wright
February 25th, 2006, 12:53 AM
What I wrote wasn't meant to be a dry, literal (or anything like thorough) accounting of the history of computing, but rather, a bit of a lighthearted look back, basically from memory off the top of my head (no careful fact checking), being a little "liberal" with the facts for the sake of humor. The gist of it is founded in reality, but names and events have been altered by the age of my brain.

It occurred to me that Intel (and their chips) have played a huge part in this wonderful revolution in computing (and they deserve a big part of the credit), but there have almost always been competing chips that offered something more in the way of performance or lower price (often both). Mostly, the competition has been bowled over by Intel since they got that first huge break, by landing the IBM deal (no more Cyrix, and a lot of others who tried a hand at producing x86 chips, but wound up losing money and throwing in the towel).

Basically Intel did get their big break from IBM, and the chip they sold to IBM was nothing spectacular. The chip Intel sold to IBM was the 8088 as you mentioned, not the 8086 as I misstated in the previous post. I also misstated the Motorola chip IBM was considering. IBM was also looking at the much faster Motorola 68000 (a true 16 bit chip) at the time, but went with the much slower 8 bit 8088, for lower cost. Apple came out with a computer based on the 68000 later on. I believe the 8086 was a hybrid (16 bit internal architecture - 8 bits to the bus) and the 80186 was the first true 16 bitter from Intel (and yes, the 80286 was the 16 bitter that took off in the IBM AT). The Motorola chips generally seemed to be a better design, as I recall, particularly for assembly language programming. Some of the stuff about the x86s was pretty strange. It's been so long since I touched assembler for either series I don't really remember hardly any detail. I just recall the x86s to be a bit counter-intuitive.

IBM pretty much considered developing the "original" PC (the first one to really sell to business - model 5150 sounds right - the "original" IBM PC to the common man in the business marketplace), and shortly afterwards the PC-XT (both to compete with the Apple II, which was really making the market open up and start being taken seriously for business use), to be a relatively minor matter at the time, but it actually wound up sowing the seeds for their eventual loss of the iron grip they held on the computer market at the time (much like AT&T's marvelous invention, the transistor, sowed the initial seeds for their eventual disintegration). The fundamental error IBM made was not realizing how big a role the PC would really play in the future. They farmed out the design and the production of all of the components and didn't protect themselves by buying most (if any) of the patents and copyrights, which was wonderful for the world, but turned out to be somewhat akin to opening Pandora's box for IBM.

IBM's PC sold very well, but also opened them up to a new type of competition they had never encountered before (suddenly, almost anyone with a mind to do so could make and sell compatible machines, and did they ever!). IBM tried to put the lid back on the box with the PS/2, using some proprietary architecture, but that basically backfired on them, as it was too late to undo what had been done, when they effectively set an open architecture standard for others to build on (and compete with). So I tried to tell the story a little amusingly. Didn't mean to ruffle any feathers.

IBM also gave a little guy, by the name of Bill, a pretty nice break. How Microsoft won the contract to provide the operating system for the IBM PC (and without selling the rights to IBM!) has a little humor to it too (involves flying kites). New monopoly in town!

Robert Lane
February 25th, 2006, 01:14 AM
Guys, guys, guys...

I see this all too often on threads like this one; the "which is better" question gets asked and it ends up being a mix of historical, academic and usually emotionally charged debate.

I think the original question has been answered several times over and can be put to rest with one quick caveat:

Do your homework on all the options. Pick the solution that matches your budget, work requirements and hardware compatibility needs. Beyond that, it's all about personal opinion and experience.

Check out the Sticky from Chris about forum policy - this thread has broken that several times over.

'Nuf said.

Andrew Todd
February 25th, 2006, 01:36 AM
lol... thanks for the history lesson guys...
linux 4 life :)

Robert M Wright
February 25th, 2006, 01:52 AM
Don't you want to hear about how some engineers and programmers deciding to go fly kites made Bill Gates a Bazillionaire, Andrew? ...LOL :)

Jeff Kilgroe
February 25th, 2006, 02:00 AM
What I wrote wasn't meant to be a dry, literal (or anything like thorough) accounting of the history of computing, but rather, a bit of a lighthearted look back, basically from memory off the top of my head (no careful fact checking), being a little "liberal" with the facts for the sake of humor. The gist of it is founded in reality, but names and events have been altered by the age of my brain.

Sorry if I offended. I kinda figured that was the case, I just happen to be a little anal over such topics sometimes.

Mostly, the competition has been bowled over by Intel since they got that first huge break, by landing the IBM deal (no more Cyrix, and a lot of others who tried a hand at producing x86 chips, but wound up losing money and throwing in the towel).

Funny you should mention that, Cyrix producing x86 compatible CPUs actually was an IBM-funded venture and IBM owned controlling interest in Cyrix. Sales never took off (because their CPUs sucked beyond compare), and IBM started stamping their own name on them toward the end to see if it would help drive sales a bit. I don't think IBM has ever realized that their name is now a synonym for "has-been". Cyrix CPUs sucked so bad, that IBM wouldn't even put them in their own systems with the exception of the lowest bargain basement desktops sold at Office Depot.

The chip Intel sold to IBM was the 8088 as you mentioned, not the 8086 as I misstated in the previous post. I also misstated the Motorola chip IBM was considering. IBM was also looking at the much faster Motorola 68000 (a true 16 bit chip) at the time, but went with the much slower 8 bit 8088, for lower cost.

I know I'm picking nits, but the M68K CPU came much later than the 8088 -- it was actually released in late '86 and is a hybrid 16/32 bit chip. Motorola did approach IBM with the M68K, but it was around the time they adopted Intel's proposal for the 80386 (also a 16/32bit hybrid). IBM blindly continued with Intel and stayed the course. While it may have cost them a superior CPU, it also ensured backward compatibility with previous CPU and PC models... Something that no other computer maker has ever achieved for so long, and it's yet another key factor in the success of the IBM PC platform. The 68K found a happy life in the Atari ST line of computers, various Apple systems, the Sega Genesis game console (and later the Nomad handheld version), numerous stand-alone arcade units and the misfit Atari Jaguar game console. The M68K is quite possibly the most produced CPU of all time, as it (and low-power derivatives) is still used today in everything from home theatre remotes and cell phones to microwave ovens, scientific calculators and industrial applications like intelligent servo controls, stepper motors, etc.. And you're right, the 8086 did sit on an 8bit bus, but it was true 16bit internally.

The Motorola chips generally seemed to be a better design, as I recall, particularly for assembly language programming. Some of the stuff about the x86s was pretty strange. It's been so long since I touched assembler for either series I don't really remember hardly any detail. I just recall the x86s to be a bit counter-intuitive.

M68K assembler was dirt-simple. The x86 platform is simple too, and nobody thinks anything of it these days, but back then (in the 80's) it was radically different than other "RISC" style CPUs. Over the years a lot of people have always tried to compare RISC vs. CISC, but most never realized that true RISC CPUs died long ago and the M68K series were no less a CISC style architecture than the x86. There was also the factor of little vs. big endian. The x86 platform places the low-byte first followed by the high byte (little-endian). Most other CPUs back in the day placed the high byte first (big-endian). I used to have to listen to SGI zealots try to tell me that "PCs are less powerful because they're little-endian." ...When asked, they couldn't even tell me what "little-endian" actually meant.
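For anyone wondering what little- vs. big-endian actually looks like, here's a tiny Python illustration (nothing specific to any one CPU, just packing the same 32-bit value both ways and printing the resulting byte order in memory):

import struct

value = 0x12345678

# Little-endian (the x86 way): the low byte comes first in memory.
print(struct.pack("<I", value).hex())  # prints 78563412

# Big-endian (the 68K/PowerPC way): the high byte comes first.
print(struct.pack(">I", value).hex())  # prints 12345678

Same number either way; only the order the bytes land in memory changes, which is why neither scheme is inherently "more powerful" than the other.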

Didn't mean to ruffle any feathers.

Likewise.

IBM also gave a little guy, by the name of Bill, a pretty nice break. How Microsoft won the contract to provide the operating system for the IBM PC (and without selling the rights to IBM!) has a little humor to it too (involves flying kites). New monopoly in town!

Yep. They had other companies offering to provide the operating system too, but Bill (who is a far better businessman, with an uncanny eye for the future, than people credit him for) made IBM an offer they thought they couldn't refuse. IBM suits thought they were buying, for a price of almost nothing, a decent experimental OS (or the rights to use it temporarily) from a college kid. Little Billie shut out companies like Digital Research, Inc. that were offering full licenses to their operating systems, but at a much higher price tag. DRI provided the GEM OS for the Atari ST, TT and Falcon systems, as well as produced the competing DR-DOS for the PC and later a GUI similar (and at the time, superior) to Windows. But none of it caught on and DRI faded into obscurity with the death of Atari's computer systems.

Jeff Kilgroe
February 25th, 2006, 02:10 AM
Don't you want to hear about how some engineers and programmers deciding to go fly kites made Bill Gates a Bazillionaire, Andrew? ...LOL :)

Hehe... I think I'm going to sleep now. I look back at this thread and see all that I typed and I have only one question. "WHY???"

Robert M Wright
February 25th, 2006, 02:57 AM
I'm sure we have an audience now, eagerly awaiting further discussion on CPUs from the age of the dinosaurs and 8-track players. :)

I remember some things from back then pretty well, especially the things that made a difference to me at the time or struck me as hilarious, but I guess more memories have a good dose of blur. The weird deal with the 8086s has something to do with loading the most significant bit at the right then going left to least significant (backwards). Made you have to really think harder, not to get things royally screwed up. That was a while ago. Haven't programmed at that level in over 20 years! Don't really want to do it again either!

Those 68000s were out quite a bit earlier than you're recalling. I remember both my brother and I being quite disappointed when the IBM PC introduced in 1981 didn't have a 68000 under the hood. We would have bought one if they had, but when it was announced with an 8088 inside, no way. I think the first 68000 to go in a widely sold machine was in the Apple Lisa in '83 (pricey computer!), but it might have been the Atari ST. I don't really remember when the Atari came out, whether it was before or after the Lisa. Anyway, my brother bought an ST, and it was a pretty slick machine for its day. I built my first machine from parts in '87, with one of those NEC 8088 clones, and have hardly touched a non-PC computer since. Got into custom building x86s and upgrading them for a few years. That was real lucrative at first, but eventually the profit dried up as the Dells of the world put the crunch on the little guy.

I haven't been in computers hard-core in years now. I still build my own, and a few for family and friends, but that's about it. I love them for video. :) :)

Jeff Kilgroe
February 25th, 2006, 12:16 PM
I'm sure we have an audience now, eagerly awaiting further discussion on CPUs from the age of the dinosaurs and 8-track players. :)

What's sad is that I'm not old enough to remember a lot of this stuff.... I've just taught too many classes. Heh.

The weird deal with the 8086s has something to do with loading the most significant bit at the right then going left to least significant (backwards). Made you have to really think harder, not to get things royally screwed up.

The reversed bit/byte order... That was the "little-endian" approach of the x86 architecture I was talking about. It wasn't any better or worse and with the execution path of the first x86 CPUs, it made a lot of sense for it to work that way. Confused the heck out of a lot of programmers though.

Those 68000s were out quite a bit earlier than you're recalling.

Yeah, I had a brain-fart at 1:00AM. The M68000 CPU first went into production in '79. I don't even know where I got '86 for anything... In '84 we had the 68020, which found its way into more Amiga computers than any other system and then the next major update to the M68K family was the 68030 in '87 and the '040 in '91.

I haven't been in computers hard-core in years now. I still build my own, and a few for family and friends, but that's about it. I love them for video. :) :)

Robert M Wright
February 25th, 2006, 01:10 PM
I remember driving muscle cars too. Now that was fun! Guess I'm a relic. LOL