Which Notebook is Better: Mac or PC?
I have an HVX 200 on order and could use advice about which laptop computer to buy so that I can transfer data from the P2 card to a laptop.
I edit on Final Cut, but the new MacBook Pro laptops don't have PC Card slots. Could I get a cheap PC laptop, download the data to it, and then transfer the data to my Mac later? Any suggestions?
Ooooooh, that's a tough one, Scott. Currently there are plans for a PCMCIA adapter for the newer Intel MacBook slots, but I haven't heard of any of them actually shipping yet. The trouble I see is that you're already using FCP, so if you can, I'd suggest you wait a bit until your HVX200 actually comes in; I hear they're taking a while to arrive.
If you need a laptop RIGHT NOW, then you'll have to go with a PC laptop and do as you've said: download to the PC and transfer to the Mac later.
Scott,
In another thread (actually in a few) I've mentioned several reasons why you should not get a MacBook Pro - yet. I don't want to retype all that stuff again, but suffice it to say that if you're on the FCP platform and need a laptop, you should jump on the last of the PowerBooks before they completely go away. It will be months before the MacBook Pro and FCP become a stable platform, and even then you'll still be missing some very important I/O ports that the PowerBooks have and the MacBook Pros don't. Read my post about configuring a PowerBook for capturing HVX footage; there's plenty of info about what you need to know.
Ah yes! How could I forget!? Thanks, Robert, you've helped Scott more than I did! =)
Man I'm getting too old for this stuff...
I'm thinking .....
about a dual-core AMD laptop for Serious Magic green-screen work. I use FCP on a Mac and would only be using the laptop for DirectX-intensive work and for authoring to Windows Media Player.
Beware of the AMD solution
Dean,
I'm not knocking the PC platform you're daydreaming about, but you should be wary of any AMD-based platform. I used to own an IT/PC consulting firm on the side from my main business; the systems we saw in for repair the most, whether tower or laptop, were AMD chipsets. AMD has always given Intel fits, especially in the late '90s when AMD often out-clocked them and chip prices were less than half that of a comparable Intel. However, this speed has always come at a high price, that being heat, and lots of it.

We always considered AMD to be the tuner kid with a nitrous bottle in his Civic: hit the button and you've got blazing speed, but often you'd burn the motor in the process. And that's exactly what happens with AMD systems: the chips run so continuously hot that they literally pop the capacitors on motherboards. It was common even as recently as last year to see an AMD system fail within its first year. And of course, laptops have far less venting than towers, so we saw tons more dead laptops than towers.

My advice is not to get starry-eyed over a dual AMD solution; if you're going to be using any PC laptop, make sure you're looking at an Intel-based chipset. In our experience, the most reliable laptops with the fewest issues were the IBM ThinkPads, Fujitsu LifeBooks, or the high-end Sony VAIO (aluminum case). Everything else was a crapshoot.
Thanks, Robert .....
I'll keep that advice in mind. I wasn't aware of the AMD heat problem!
Hmmm, that's strange, Robert. I haven't heard of heat problems related specifically to AMD since the old Athlon XP days. Those did have issues far beyond mere heat problems, but things are much better these days with the AMD Athlon 64s, X2s, FXs, and Opterons. The capacitor thing I have no idea about, unless you're mistaking the heat issue for the faulty-capacitor drama that unfolded last year, where a rash of motherboards shipped with bad capacitors out of factories in Korea or somewhere else in Asia, I can't remember exactly. Other problems were with earlier motherboard designs that placed the capacitors too close to the socket. Things have come a long way since then. As far as modern heat issues go, those are most likely due to user error or faulty third-party manufacturers.
If you do want to go with an AMD solution, I suggest going with a good brand-name dealer like Alienware, Falcon Northwest, Voodoo PC, or Polywell.
Quote:
Right now, AMD has the best CPU options for desktop systems... Hands down, no comparison. Intel's dual-core implementation is inefficient and has several bottlenecks, including the brain-damaged approach of non-linked cache, forcing CPU-to-CPU communication to exit out onto the external CPU bus. In other words, the Pentium D was a half-assed, hack-job solution to get something to market ASAP. The new Core Duo isn't much better. Take another look at the numbers that Apple is claiming for the Core Duo and put a little thought into it... Hmmm... It's two years newer than the G5 and about the same speed. Yes, the new MacBook Pro is "4X faster" than a PowerBook G4. Considering it has dual CPU cores, and each one is 2X as fast as the nearly five-year-old G4 chip, I should hope it's at least 4X as fast.

As for AMD system failures... Yes, quite common with AMD XP/MP CPUs and 760/760MP[X] chipsets. Not anywhere near as common with FX and X2 CPUs, but most failures come from inadequate power supplies rather than overheating CPUs and components. I see all too many PC vendors shipping dual-core AMD X2 systems with a 7800 GTX video card and only a 400W PSU in the thing. That system is doomed to fail within the first 250 hours of intense gaming.

I build about 75 to 100 systems each year, many of them for our internal use here (render farm, workstations). All the overheating problems come from the newer Intel boxes with Prescott and newer derivative-core CPUs. The Northwood-core systems and equivalent Xeon-based options were great... and faster per clock cycle for most operations, too. I have 22 AMD 4800+ dual-core systems running as a render cluster right now. They grind on animation frames 24/7, and every single one maintains temps lower than 119F for both the CPU and the internal box. I can't get below 127F on a Pentium D without using a Peltier or liquid system.

The Core Duo chips in PC notebooks (the same exact chip as in the new Macs) are failing to impress the PC crowd. It's the same Centrino chip we've all had for the last couple of years... no speed increases, no improvements. Just now there are two of them glued together sharing the same CPU bus, oh joy. Intel will have CPU linkage via the L1 and L2 cache in their upcoming dual-CPU offerings for desktops/workstations due later this year (think summer/fall), along with some other enhancements. Hopefully this will help relieve Intel's downward spiral. Apple may have made a mistake committing themselves to exclusive Intel usage. It could be that the PC will continue to offer the most CPU power for the money, when AMD will have quad-core CPUs by the end of the year and Apple won't be using them.

As for mobile notebooks... The AMD Turion mobile CPU just can't seem to get off the ground, even though it is a superior CPU to the Pentium M in many ways, especially for applications like video, graphics, etc. It has nearly 3X the FPU performance of Centrino and better power management. For some reason the whole Pentium M/Centrino platform has a captivated audience right now and nobody is willing to explore other options. ...Too bad. AMD will have dual-core Turion in full mass production within two months, too; the only mainstream PC vendor even offering it right off will be HP, and only in a few select system configs. Intel hit the mobile PC marketing jackpot with the whole Centrino campaign. Most people don't even realize that Centrino is nothing... Centrino is Intel's trademarked name for combining an Intel mobile CPU with a wireless networking solution (doesn't matter which type or brand or anything) into a compact or mobile platform.
Oh, well, yeah. Alienware isn't the best, but their systems do perform well when there isn't a problem. Tech support is staffed by people who can hardly speak English and don't know what they're doing; half the time they're just looking up the FAQ section of the site for a solution. But, as I've said, their systems do perform well when they work.
I never understood why the heck Intel replaced the Northwoods with Prescotts. What were they thinking???
Quote:
And we still don't have 4GHz P4 CPUs because Intel never got a lot of other things to work, like their faster bus speeds. The current P4 design is flawed in that it bottlenecks itself at about 3.4GHz on an 800MHz bus. Now take a 3.4GHz P4-D on that 800MHz bus, where two CPU cores have to share that 800MHz FSB for inter-CPU communication. Stupid, stupid, stupid!!!

This is why Apple/IBM/Motorola, HP/DEC, and AMD all went to the serial interconnect bus ("HyperTransport"). Bus speed is serialized, and it increases along with CPU speed. Unfortunately, dual-core AMD/G5/PA-RISC/etc. CPUs with this bus implementation still share a single bus between two CPUs. The G5 and AMD chips at least have shared cache and direct communication between the CPUs without using the front-side bus. Upcoming AMD quad-core CPUs will incorporate some sort of multiple-channel HyperTransport implementation, which is all still top secret. All we can do is wait and see what shakes loose at E3 this year.
Actually, here in the frozen tundra of Minnesnowta, we do appreciate the dual-purpose chips (CPU/space heater).
Quote:
We have a deployed laptop solution where I work, and some of the very newest models are locking up when playing back my training videos. Twenty minutes or so into a WMV file and bam, freeze. Can't seem to figure it out. They played fine on last year's models, the disks are error-free, and the software versions are the same. I've tried different disks, etc. The only similarity is the length of time, so it has to be overheating chipsets (checking on that one). If Intel doesn't fix the issue soon, I may have to relent and go AMD, and I'll go the HP route if I don't build it myself.
When I go to dual cores, I'm going with the AMDs. Benchmarks I've seen show them just absolutely smoking the Intel P4Ds, at the same price points, at just about any type of task.
Actually, Intel has never really produced the most powerful CPUs, at least not that I am aware of.
Even way back in the early 1980s, when IBM first decided they were going to compete with a little tiny company named Apple Computer, which rather unexpectedly had, almost single-handedly and overnight, created a surprisingly thriving little niche market for a novelty item called "the personal computer" (as seen in "Playboy" magazine - Apple's first national advertisement). This little fruit company on the West coast, selling toy computers out of a garage, wasn't so much a threat to IBM as a bit of an embarrassment. After all, nobody respectable bought anything remotely resembling a computer, or even many typewriters, from anyone but true blue! Okay, there were a few miscreants who purchased oddball computers from a handful of little piss-ant rogues (and often got fired for doing so), but IBM was to computers what AT&T was to telephones, for goodness sakes!

IBM didn't particularly care if a little company in Silicon Valley sold a few souped-up calculators, but it was pretty gosh-darn embarrassing that, though they be just lowly toys, NOBODY was buying ANY of these squirrel-cage so-called computers from IBM, because IBM didn't even have a product in this new arena!!! Obviously, this would never do.

Well, somebody sent out a memo, I guess, that must have said something like, "Johnson, here's a few measly million. Go find some starving engineers, throw 'em some bones (contracts) to build us a fresh new one of these thingamajigs, and bring it back so we can put our name on it, call it THE PC, and sell it." So Johnson (or whatever his name was), being the standup, navy-blue-suited, white-collar-with-a-tie IBM yes man that he undoubtedly was, prolly promptly fired back a memo saying "yes, sir" and "thank you, sir" and got right to work. (After all, Johnson wasn't the brightest bulb on the tree, and he was grateful to have a good job like this with a company that would never lay him off, and would give him a nice little pension after he puttered around his modest little middle-management office for half his life.)

Well, Johnson was pretty new, so his budget was pretty tight, and besides, even IBM didn't want to squander a bunch of its bazillions in readily available cash at the time, from owning the computer business lock, stock, and barrel (since basically the beginning of time), on some minor little face-saving, jazzed-up electric typewriter with a few extra wires on the inside, a couple of lights on the outside, and a speaker that beeped at you when you turned it on.

Anyways, Johnson, despite his undying loyalty to IBM, being a little wet behind the ears (and a little soft between them), went about committing what became the biggest blunder, bar none, in IBM's illustrious history, and promptly sowed the seeds for IBM's eventual fall from the throne of the colossal computer empire that was IBM's and IBM's alone.

One relatively minor mistake in this epic blunder was selecting Intel's new (at the time) 8086 microprocessor to put inside, at the heart of the box that you plug the makeshift black-and-white television screen and the keyboard into. Well, this little microprocessor chip wasn't the slickest one around. Motorola made a much, much better one, the 6502, which was the "brains" (besides Steve Wozniak, not Jobs) in those little "apples" that started this whole mess in the first place.

Johnson liked the 6502, but being a little strapped for cash, chose the Intel chip in a pinch, despite it being a rather poor performer, because Intel, being the struggling new kid on the block, needed the business and was willing to sell them just a little cheaper than the competing chips from the far more established Motorola.

The story goes on, and the history of computers, and particularly personal computers, is full of interesting and humorous (even if I didn't tell this one well) stories, but I think I will wind this little chapter up here. Just a few years after Intel got the contract to provide those chips for IBM's new toy, NEC came out with a slightly faster version at a lower price, which was part of the beginning of the (drum roll, please) "Clone Wars"... a multi-sequeled story that eventually ends in the fall of the blue giant.

Intel never did produce the world's most powerful microprocessor chips, but I guess, at least at one time, they were pretty dang cheap, and that got them a contract to supply a few chips for an awkward, minor little product in the IBM lineup.
Quote:
Jobs and Wozniak introduced their creation on April 1, 1976, and it caught on because it was a bit more fun to use than other kit computers at the time. People simply liked it - it wasn't the most powerful kit computer, nor was it the first by any means, and it took some effort to put it in a box and attach it to a display, etc... It wasn't a complete package like what was available from IBM at the time. It was, though, more powerful thanks to the 6502, and in terms of CPU power at this price point, the main competitor was the Zilog Z80 CPU, which was available from a few other homebrew kit makers as well as the Tandy kits.

As the kit computers of the late '70s were proving to be quite popular, IBM decided to get back into the game of "personal computers." So they resurrected the PC Model 5100 in late '77 and shipped it standard with 64KB of memory, but the PALM CPU was now over three years old and not as powerful as the processors included with Apple kits or Radio Shack "Tandy" kit systems. Still, they pressed ahead with the Model 5100 because it was the only complete personal computer available and the only option you could actually go to a real store and buy. Apple was still selling systems via mail order and at trade shows or other geeky meetings like HAM/shortwave radio gatherings, etc. IBM kept pushing the price of their M5100 down, and while sales held steady, they were losing ground to the $500 kits. In 1980, the M5100 still had an MSRP of $2800.

IBM didn't want to sacrifice their business model and didn't want to play the game of build-it-yourself kits, so they went to the drawing board for a new system -- the IBM PC Model 5150. Now that the PALM CPU was nearly five years old, they needed a new CPU option, and fast... They took bids, and a small semiconductor company called Intel happened to have what seemed like the right component for the right price -- the Intel i8088 microprocessor. It was a sweet little unit, capable of more than 10 times the operations per second of the PALM and super cheap to produce in quantity. It was also about 4X as powerful per clock cycle vs. the 6502 or Z80 CPUs, which were also about four years old at this point. The IBM Model 5150 was released in August 1981 with 64KB of RAM and a 160KB 5.25" floppy drive as standard equipment, running off the i8088 CPU at a whopping 4.77MHz. IBM introduced the M5150 at $2800, about the same price point the M5100 was selling for; street price on the thing was close to $2100.

The Intel i8086 didn't enter the personal computer scene until later (like '85/'86) - it was the 16-bit sibling of the 8088 and came clocked at 6 and 8MHz, IIRC. The IBM PC clone market was also well established by the time the 8086 CPU entered the scene. The 8086 was later succeeded for a very brief time by the 80186 and then the 80286, which sold like hotcakes.

Jobs and Wozniak were by no means the first to make a personal computer, nor were they the first board-kit makers either. They simply had a product that set them apart from the crowd - people really liked it. IBM killed themselves not by choosing Intel, but by not embracing the inexpensive kit model. Ironic that Apple started as a do-it-yourself computer and is now the only non-cloned computer system on the market, other than SGI. But SGI doesn't count anymore, IMO, since they got de-listed from the NYSE and they're all but bankrupt.
What I wrote wasn't meant to be a dry, literal (or anything like thorough) accounting of the history of computing, but rather, a bit of a lighthearted look back, basically from memory off the top of my head (no careful fact checking), being a little "liberal" with the facts for the sake of humor. The gist of it is founded in reality, but names and events have been altered by the age of my brain.
It occurred to me that Intel (and their chips) have played a huge part in this wonderful revolution in computing (and they deserve a big part of the credit), but there have almost always been competing chips that offered something more in the way of performance or lower price (often both). Mostly, the competition has been bowled over by Intel since they got that first huge break by landing the IBM deal (no more Cyrix, and a lot of others who tried a hand at producing x86 chips but wound up losing money and throwing in the towel). Basically, Intel did get their big break from IBM, and the chip they sold to IBM was nothing spectacular.

The chip Intel sold to IBM was the 8088, as you mentioned, not the 8086 as I misstated in the previous post. I also misstated the Motorola chip IBM was considering. IBM was looking at the much faster Motorola 68000 (a true 16-bit chip) at the time, but went with the much slower 8-bit 8088 for lower cost. Apple came out with a computer based on the 68000 later on. I believe the 8086 was a hybrid (16-bit internal architecture - 8 bits to the bus) and the 80186 was the first true 16-bitter from Intel (and yes, the 80286 was the 16-bitter that took off in the IBM AT). The Motorola chips generally seemed to be a better design, as I recall, particularly for assembly-language programming. Some of the stuff about the x86s was pretty strange. It's been so long since I touched assembler for either series that I hardly remember any detail; I just recall the x86s being a bit counter-intuitive.

IBM pretty much considered developing the "original" PC (the first one to really sell to business - model 5150 sounds right - the "original" IBM PC to the common man in the business marketplace), and shortly afterwards the PC-XT (both to compete with the Apple II, which was really making the market open up and start being taken seriously for business use), to be a relatively minor matter at the time, but it actually wound up sowing the seeds for their eventual loss of the iron grip they held on the computer market (much like AT&T's marvelous invention, the transistor, sowed the initial seeds for their eventual disintegration).

The fundamental error IBM made was not realizing how big a role the PC would really play in the future. They farmed out the design and the production of all of the components and didn't protect themselves by buying most (if any) of the patents and copyrights, which was wonderful for the world, but turned out to be somewhat akin to opening Pandora's box for IBM. IBM's PC sold very well, but it also opened them up to a new type of competition they had never encountered before (suddenly, almost anyone with a mind to do so could make and sell compatible machines, and did they ever!). IBM tried to put the lid back on the box with the PS/2, using some proprietary architecture, but that basically backfired on them, as it was too late to undo what had been done when they effectively set an open architecture standard for others to build on (and compete with).

So I tried to tell the story a little amusingly. Didn't mean to ruffle any feathers. IBM also gave a little guy by the name of Bill a pretty nice break. How Microsoft won the contract to provide the operating system for the IBM PC (and without selling the rights to IBM!) has a little humor to it too (involves flying kites). New monopoly in town!
Not another one...
Guys, guys, guys...
I see this all too often on threads like this one; the "which is better" question gets asked, and it ends up being a mix of historical, academic, and usually emotionally charged debate. I think the original question has been answered several times over and can be put to rest with one quick caveat: do your homework on all the options. Pick the solution that matches your budget, work requirements, and hardware compatibility needs. Beyond that, it's all about personal opinion and experience. Check out the sticky from Chris about forum policy - this thread has broken it several times over. 'Nuf said.
lol... thanks for the history lesson guys...
linux 4 life :)
Don't you want to hear about how some engineers and programmers deciding to go fly kites made Bill Gates a Bazillionaire, Andrew? ...LOL :)
I'm sure we have an audience now, eagerly awaiting further discussion on CPUs from the age of the dinosaurs and 8-track players. :)

I remember some things from back then pretty well, especially the things that made a difference to me at the time or struck me as hilarious, but I guess most memories have a good dose of blur. The weird deal with the 8086s had something to do with loading the most significant bit at the right and then going left to least significant (backwards). Made you have to really think hard not to get things royally screwed up. That was a while ago. Haven't programmed at that level in over 20 years! Don't really want to do it again, either!

Those 68000s were out quite a bit earlier than you're recalling. I remember both my brother and I being quite disappointed when the IBM PC introduced in 1981 didn't have a 68000 under the hood. We would have bought one if it had, but when it was announced with an 8088 inside, no way. I think the first 68000 to go in a widely sold machine was in the Apple Lisa in '83 (pricey computer!), but it might have been the Atari ST. I don't really remember if the Atari came out before or after the Lisa. Anyway, my brother bought an ST, and it was a pretty slick machine for its day.

I built my first machine from parts in '87, with one of those NEC 8088 clones, and have hardly touched a non-PC computer since. Got into custom-building x86s and upgrading them for a few years. That was real lucrative at first, but eventually the profit dried up as the Dells of the world put the crunch on the little guy. I haven't been in computers hard-core in years now. I still build my own, and a few for family and friends, but that's about it. I love them for video. :) :)
I remember driving muscle cars too. Now that was fun! Guess I'm a relic. LOL