Which Notebook is Better, Mac or PC? - Page 2 at DVinfo.net


Panasonic P2HD / DVCPRO HD Camcorders
All AG-HPX and AJ-PX Series camcorders and P2 / P2HD hardware.

Old February 24th, 2006, 02:14 PM   #16
Inner Circle
 
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
Quote:
Originally Posted by Scott Marvel
I have an HVX 200 on order and could use advice about which laptop computer to buy so that I can transfer data from the P2 card to a laptop.
One option I'd consider for this purpose would be a dual-core PC laptop running Canopus Edius Broadcast software, which reportedly has some of the best support available for P2 MXF files. Contrary to what some here are saying, I've found the Intel dual-core processors to be reasonably effective for the price, and definitely more powerful than their single-core predecessors. Same should hold true for AMD dual-core laptops. An HVX200 with a dual-core laptop and a large external hard drive would be a fine video production setup, compared to what you could get for the same price just a couple of years ago.
Kevin Shaw
Old February 24th, 2006, 02:35 PM   #17
Trustee
 
Join Date: May 2005
Location: Wyomissing, PA
Posts: 1,141
Images: 57
Quote:
Originally Posted by Jeff Kilgroe
Marketing... It's all marketing... Northwood hit the MHz wall at about 3.2GHz and yields of those were lower than expected... All we can do is wait and see what shakes loose at E3 this year.
Agreed. I'm still holding onto my dual-Xeon box at 3.0GHz. I also have an HP at home with a 2.2GHz Northwood. I really don't see any benefit coming from the newer Intel models as of yet (other than the PCIe graphics), and these current boxes are getting old.

We have a deployed laptop solution where I work, and some of the very newest models are locking up when playing back my training videos. Twenty minutes or so into a WMV file and bam, freeze. Can't seem to figure it out. They played fine on last year's models, the disks are error-free, and the software versions are the same. I've tried different disks, etc. The only similarity is the length of time, so it has to be overheating chipsets (I'm checking on that one).

If Intel doesn't fix the issue soon, I may have to relent and go AMD, and I'll go the HP route if I don't build it myself.
Peter Ferling
Old February 24th, 2006, 02:53 PM   #18
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
When I go to dual cores, I'm going with the AMDs. Benchmarks I've seen show them just absolutely smoking the Intel P4Ds, at the same price points, at just about any type of task.
Robert M Wright
Old February 24th, 2006, 04:32 PM   #19
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Quote:
Originally Posted by Kevin Shaw
One option I'd consider for this purpose would be a dual-core PC laptop running Canopus Edius Broadcast software, which reportedly has some of the best support available for P2 MXF files. Contrary to what some here are saying, I've found the Intel dual-core processors to be reasonably effective for the price, and definitely more powerful than their single-core predecessors. Same should hold true for AMD dual-core laptops. An HVX200 with a dual-core laptop and a large external hard drive would be a fine video production setup, compared to what you could get for the same price just a couple of years ago.
Edius is cool. As long as it has all the features you need, it's great, and it has the best support and performance for the HVX200 and MXF of anything on the PC (except maybe the very top Avid options for $$$$$ more). And yes, the dual-core Intel chips do deliver more performance than the single-core versions, provided your software is multithreaded and ready for multiple processors. As long as you don't saturate your CPU bus doing really heavy streaming video work in realtime, you won't notice too many bottleneck issues. I know I've been doing my share of Intel bashing in this thread, but I think I can find plenty of things to gripe about concerning AMD and Apple's G5 offerings too. I'm not loyal to any one platform, but when talking overall CPU power, AMD is where it's at (for now).
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old February 24th, 2006, 10:33 PM   #20
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
Actually, Intel has never really produced the most powerful CPUs, at least not that I am aware of.

It goes way back to the early 1980s, when IBM first decided they were going to compete with a little tiny company named Apple Computer, which had, almost single-handedly and overnight, rather unexpectedly created a surprisingly thriving little niche market for a novelty item called "the personal computer" (as seen in "Playboy" magazine - Apple's first national advertisement).

This little fruit company on the West coast, selling toy computers out of a garage, wasn't so much a threat to IBM as a bit of an embarrassment. After all, nobody respectable bought anything remotely resembling a computer, or even many typewriters, from anyone but true blue! Okay, there were a few miscreants who purchased oddball computers from a handful of little piss-ant rogues (and often got fired for doing so), but IBM was to computers what AT&T was to telephones, for goodness sakes!

IBM didn't particularly care if a little company in Silicon Valley sold a few souped-up calculators, but it was pretty gosh-darn embarrassing that, though they be just lowly toys, NOBODY was buying ANY of these squirrel-cage, so-called computers from IBM, because IBM didn't even have a product in this new arena!!! Obviously, this would never do.

Well, somebody sent out a memo I guess, that must have said something like, "Johnson, here's a few measly million. Go find some starving engineers and throw em some bones (contracts) to build us a fresh new one of these thingamajigs and bring it back so we can put our name on it, call it THE PC, and sell it." So, Johnson (or whatever his name was), being the standup, navy blue suited, white collar with a tie, IBM yes man that he undoubtedly was, prolly promptly fired back a memo saying "yes, sir" and "thank you sir" and got right to work (after all, Johnson wasn't the brightest bulb on the tree, and he was grateful to have a good job like this with a company that would never lay him off, and give him a nice little pension after he puttered around his modest little middle-management office for half his life).

Well, Johnson was pretty new, so his budget was pretty tight, and besides, even IBM didn't want to squander a bunch of its bazillions in readily available cash at the time, from owning the computer business lock, stock and barrel (since basically the beginning of time), on some minor little, face-saving, jazzed-up electric typewriter with a few extra wires on the inside, a couple of lights on the outside, and a speaker that beeped at you when you turned it on.

Anyways, Johnson, despite his undying loyalty to IBM, being a little wet behind the ears (and a little soft between them), went about committing what became the biggest blunder, bar none, in IBM's illustrious history, and promptly sowed the seeds for IBM's eventual fall from the throne of the colossal computer empire that had been IBM's and IBM's alone.

One relatively minor mistake in this epic blunder, was selecting Intel's new (at the time) 8086 microprocessor, to put inside, at the heart of the box that you plug the makeshift black and white television screen and the keyboard into. Well, this little microprocessor chip wasn't the slickest one around. Motorola made a much, much better one, the 6502, which was the "brains" (besides Steve Wozniak, not Jobs) in those little "apples" that started this whole mess in the first place. Johnson liked the 6502, but being a little strapped for cash, chose the Intel chip in a pinch, despite it being a rather poor performer, because Intel, being the struggling new kid on the block, needed the business and was willing to sell them just a little cheaper than the competing chips from the far more established Motorola.

The story goes on, and the history of computers, and particularly personal computers, is full of interesting, and humorous (even if I didn't tell this one well) stories, but I think I will wind this little chapter up here.

Just a few years after Intel got the contract to provide those chips for IBM's new toy, NEC came out with a slightly faster version, at a lower price, that was part of the beginning of the (drum roll please) "Clone Wars" ...a multi-sequel story that eventually ends in the fall of the blue giant.

Intel never did produce the world's most powerful microprocessor chips, but I guess, at least at one time, they were pretty dang cheap, and that got them a contract to supply a few chips for an awkward, minor little product in the IBM lineup.
Robert M Wright
Old February 24th, 2006, 11:28 PM   #21
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Quote:
Originally Posted by Robert M Wright
One relatively minor mistake in this epic blunder, was selecting Intel's new (at the time) 8086 microprocessor, to put inside, at the heart of the box that you plug the makeshift black and white television screen and the keyboard into. Well, this little microprocessor chip wasn't the slickest one around. Motorola made a much, much better one, the 6502
Er... Not trying to be the party pooper, but you might want to find a new source for your history lessons. The first "personal computer" did not come out of a garage; it came from IBM in January, 1975 -- over a year before Jobs and Wozniak released the Apple I. And it wasn't Intel-based, either. Nobody bought the thing, so IBM scrapped the idea thinking it would never fly. The original PC (IBM PC Model 5100) used the 8bit IBM PALM (Put All Logic in Microcode) CPU and initially shipped with 16KB of RAM - upgradable or configurable to 64KB. Street prices were as "low" as $5200 on this baby... The Model 5100 had a 5" monochrome screen integrated right into its typewriter-like chassis with a tape drive.

Jobs and Wozniak introduced their creation on April 1, 1976, and it caught on because it was a bit more fun to use than other kit computers at the time. People simply liked it - it wasn't the most powerful kit computer, nor was it the first by any means, and it took some effort to put it in a box and attach it to a display, etc... It wasn't a complete package like what was available from IBM at the time. It was more powerful, though, using the 6502, and in terms of CPU power at this price point the main competitor was the Zilog Z80 CPU, which was available from a few other homebrew kit makers as well as the Tandy kits.

As the kit computers of the late '70s were proving to be quite popular, IBM decided to get back into the game of "personal computers". So they resurrected the PC Model 5100 in late '77 and shipped it standard with 64KB of memory, but the PALM CPU was now over 3 years old and not as powerful as the processors included with Apple kits or Radio Shack "Tandy" kit systems. Still they pressed ahead with the Model 5100 because it was still the only complete personal computer available and the only option you could actually go to a real store and buy. Apple was still selling systems via mail order and at trade shows or other geeky meetings like HAM/shortwave radio gatherings, etc.. IBM kept pushing the price of their M5100 down and while sales held steady, they were losing ground to the $500 kits. In 1980, the M5100 still had an MSRP of $2800. IBM didn't want to sacrifice their business model and didn't want to play the game of build-it-yourself kits so they went to the drawing board for a new system -- the IBM PC Model 5150. Now that the PALM CPU was nearly 5 years old, they needed a new CPU option, and fast... They took bids and a small semiconductor company called Intel happened to have what seemed like the right component for the right price -- the Intel i8088 microprocessor. It was a sweet little unit, capable of more than 10 times the number of operations per second than the PALM and super cheap to produce in quantity. It was also about 4X as powerful per clock cycle vs. the 6502 or Z80 CPUs, which were also about 4 years old at this point. The IBM Model 5150 was released in August, 1981 with 64KB of RAM and a 160KB 5.25" floppy drive as standard equipment and ran off the i8088 CPU at a whopping 4.8MHz. IBM introduced the M5150 at $2800, or about the same price point the M5100 was selling for, street price on the thing was close to $2100.

The Intel i8086 didn't enter the personal computer scene until later (like '85/'86) - it was the 16bit sibling of the 8088, and it came clocked at 6 and 8 MHz, IIRC. The IBM PC clone market was also well established by the time the 8086 CPU entered the scene. The 8086 was later succeeded for a very brief time by the 80186 and then the 80286, which sold like hotcakes.

Jobs and Wozniak were by no means the first to make a Personal Computer, nor were they the first board kit makers either. They simply had a product that set them apart from the crowd - people really liked it. IBM killed themselves not by choosing Intel, but by not embracing the inexpensive kit model. Ironic that Apple started as a do-it-yourself computer and is now the only non-cloned computer system on the market, other than SGI. But SGI doesn't count anymore, IMO since they got de-listed from the NYSE and they're all but bankrupt.
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old February 25th, 2006, 12:53 AM   #22
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
What I wrote wasn't meant to be a dry, literal (or anything like thorough) accounting of the history of computing, but rather, a bit of a lighthearted look back, basically from memory off the top of my head (no careful fact checking), being a little "liberal" with the facts for the sake of humor. The gist of it is founded in reality, but names and events have been altered by the age of my brain.

It occurred to me that Intel (and their chips) have played a huge part in this wonderful revolution in computing (and they deserve a big part of the credit), but there have almost always been competing chips that offered something more in the way of performance or lower price (often both). Mostly, the competition has been bowled over by Intel since they got that first huge break, by landing the IBM deal (no more Cyrix, and a lot of others who tried a hand at producing x86 chips, but wound up losing money and throwing in the towel).

Basically Intel did get their big break from IBM, and the chip they sold to IBM was nothing spectacular. The chip Intel sold to IBM was the 8088 as you mentioned, not the 8086 as I misstated in the previous post. I also misstated the Motorola chip IBM was considering. IBM was also looking at the much faster Motorola 68000 (a true 16 bit chip) at the time, but went with the much slower 8 bit 8088, for lower cost. Apple came out with a computer based on the 68000 later on. I believe the 8086 was a hybrid (16 bit internal architecture - 8 bits to the bus) and the 80186 was the first true 16 bitter from Intel (and yes, the 80286 was the 16 bitter that took off in the IBM AT). The Motorola chips generally seemed to be a better design, as I recall, particularly for assembly language programming. Some of the stuff about the x86s was pretty strange. It's been so long since I touched assembler for either series that I hardly remember any detail. I just recall the x86s to be a bit counter-intuitive.

IBM pretty much considered developing the "original" PC (the first one to really sell to business - model 5150 sounds right - the "original" IBM PC to the common man in the business marketplace), and shortly afterwards the PC-XT (both to compete with the Apple II, which was really making the market open up and start being taken seriously for business use), to be a relatively minor matter at the time, but it actually wound up sowing the seeds for their eventual loss of the iron grip they held on the computer market (much like AT&T's marvelous invention, the transistor, sowed the initial seeds for their eventual disintegration).

The fundamental error IBM made was not realizing how big a role the PC would really play in the future. They farmed out the design and the production of all of the components and didn't protect themselves by buying most (if any) of the patents and copyrights, which was wonderful for the world, but turned out to be somewhat akin to opening Pandora's box for IBM. IBM's PC sold very well, but it also opened them up to a new type of competition they had never encountered before (suddenly, almost anyone with a mind to do so could make and sell compatible machines, and did they ever!). IBM tried to put the lid back on the box with the PS/2, using some proprietary architecture, but that basically backfired on them, as it was too late to undo what had been done when they effectively set an open architecture standard for others to build on (and compete with). So I tried to tell the story a little amusingly. Didn't mean to ruffle any feathers.

IBM also gave a little guy, by the name of Bill, a pretty nice break. How Microsoft won the contract to provide the operating system for the IBM PC (and without selling the rights to IBM!) has a little humor to it too (involves flying kites). New monopoly in town!
Robert M Wright
Old February 25th, 2006, 01:14 AM   #23
Go Go Godzilla
 
Join Date: Nov 2004
Location: Scottsdale, AZ USA
Posts: 2,788
Images: 15
Not another one...

Guys, guys, guys...

I see this all too often on threads like this one: the "which is better" question gets asked, and it ends up being a mix of historical, academic, and usually emotionally charged debate.

I think the original question has been answered several times over and can be put to rest with one quick caveat:

Do your homework on all the options. Pick the solution that matches your budget, work requirements, and hardware compatibility needs. Beyond that, it's all about personal opinion and experience.

Check out the Sticky from Chris about forum policy - this thread has broken that several times over.

'Nuf said.
__________________
Robert Lane
Producer/Creator - Bike Pilots TV
Robert Lane
Old February 25th, 2006, 01:36 AM   #24
Major Player
 
Join Date: Oct 2005
Location: Saint John, CANADA
Posts: 633
lol... thanks for the history lesson guys...
linux 4 life :)
__________________
video : xl2 / letus35xl / bogen 503
photo- canon 1dmkII - bronica etrsi
Andrew Todd
Old February 25th, 2006, 01:52 AM   #25
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
Don't you want to hear about how some engineers and programmers deciding to go fly kites made Bill Gates a Bazillionaire, Andrew? ...LOL :)
Robert M Wright
Old February 25th, 2006, 02:00 AM   #26
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Quote:
Originally Posted by Robert M Wright
What I wrote wasn't meant to be a dry, literal (or anything like thorough) accounting of the history of computing, but rather, a bit of a lighthearted look back, basically from memory off the top of my head (no careful fact checking), being a little "liberal" with the facts for the sake of humor. The gist of it is founded in reality, but names and events have been altered by the age of my brain.
Sorry if I offended. I kinda figured that was the case, I just happen to be a little anal over such topics sometimes.

Quote:
Mostly, the competition has been bowled over by Intel since they got that first huge break, by landing the IBM deal (no more Cyrix, and a lot of others who tried a hand at producing x86 chips, but wound up losing money and throwing in the towel).
Funny you should mention that: Cyrix producing x86-compatible CPUs actually was an IBM-funded venture, and IBM owned a controlling interest in Cyrix. Sales never took off (because their CPUs sucked beyond compare), and IBM started stamping their own name on them toward the end to see if it would help drive sales a bit. I don't think IBM has ever realized that their name is now a synonym for "has-been". Cyrix CPUs sucked so bad that IBM wouldn't even put them in their own systems, with the exception of the lowest bargain-basement desktops sold at Office Depot.

Quote:
The chip Intel sold to IBM was the 8088 as you mentioned, not the 8086 as I misstated in the previous post. I also misstated the Motorola chip IBM was considering. IBM was also looking at the much faster Motorola 68000 (a true 16 bit chip) at the time, but went with the much slower 8 bit 8088, for lower cost.
I know I'm picking nits, but the M68K CPU came much later than the 8088 -- it was actually released in late '86 and is a hybrid 16/32 bit chip. Motorola did approach IBM with the M68K, but it was around the time they adopted Intel's proposal for the 80386 (also a 16/32bit hybrid). IBM blindly continued with Intel and stayed the course. While it may have cost them a superior CPU, it also ensured backward compatibility with previous CPU and PC models... Something that no other computer maker has ever achieved for so long, and yet another key factor in the success of the IBM PC platform. The 68K found a happy life in the Atari ST line of computers, various Apple systems, the Sega Genesis game console (and later the Nomad handheld version), numerous stand-alone arcade units, and the misfit Atari Jaguar game console. The M68K is quite possibly the most produced CPU of all time, as it (and low-power derivatives) is still used today in everything from home theatre remotes and cell phones to microwave ovens, scientific calculators, and industrial applications like intelligent servo controls, stepper motors, etc.. And you're right, the 8086 did sit on an 8bit bus, but it was true 16bit internally.

Quote:
The Motorola chips generally seemed to be a better design, as I recall, particularly for assembly language programming. Some of the stuff about the x86s was pretty strange. It's been so long since I touched assembler for either series that I hardly remember any detail. I just recall the x86s to be a bit counter-intuitive.
M68K assembler was dirt-simple. The x86 platform is simple too, and nobody thinks anything of it these days, but back then (in the 80's) it was radically different than other "RISC" style CPUs. Over the years a lot of people have always tried to compare RISC vs. CISC, but most never realized that true RISC CPUs died long ago and the M68K series were no less a CISC style architecture than the x86. There was also the factor of little vs. big endian. The x86 platform places the low-byte first followed by the high byte (little-endian). Most other CPUs back in the day placed the high byte first (big-endian). I used to have to listen to SGI zealots try to tell me that "PCs are less powerful because they're little-endian." ...When asked, they couldn't even tell me what "little-endian" actually meant.
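To make the little vs. big endian distinction concrete, here is a minimal sketch (not from the original discussion) using Python's standard `struct` module, packing the same 16-bit value both ways:

```python
import struct

value = 0x1234  # a 16-bit value with a distinct high byte (0x12) and low byte (0x34)

little = struct.pack("<H", value)  # little-endian, as on x86: low byte stored first
big = struct.pack(">H", value)     # big-endian, as on the M68K: high byte stored first

print(little.hex())  # "3412"
print(big.hex())     # "1234"
```

The bytes in memory differ, but each CPU reads back the same value; byte order is a storage convention, not a measure of power.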

Quote:
Didn't mean to ruffle any feathers.
Likewise.

Quote:
IBM also gave a little guy, by the name of Bill, a pretty nice break. How Microsoft won the contract to provide the operating system for the IBM PC (and without selling the rights to IBM!) has a little humor to it too (involves flying kites). New monopoly in town!
Yep. They had other companies offering to provide the operating system too, but Bill (who is by far a better businessman, with an uncanny eye for the future, than people credit him for) made IBM an offer they thought they couldn't refuse. IBM suits thought they were buying, for almost nothing, a decent experimental OS (or the rights to use it temporarily) from a college kid. Little Billie shut out companies like Digital Research, Inc. that were offering full licenses to their operating systems, but at a much higher price tag. DRI provided the GEM OS for the ATARI ST, TT and Falcon systems, as well as the competing DR-DOS for the PC and later a GUI similar (and at the time, superior) to Windows. But none of it caught on, and DRI faded into obscurity with the death of ATARI's computer systems.
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old February 25th, 2006, 02:10 AM   #27
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Quote:
Originally Posted by Robert M Wright
Don't you want to hear about how some engineers and programmers deciding to go fly kites made Bill Gates a Bazillionaire, Andrew? ...LOL :)
Hehe... I think I'm going to sleep now. I look back at this thread and see all that I typed and I have only one question. "WHY???"
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old February 25th, 2006, 02:57 AM   #28
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
I'm sure we have an audience now, eagerly awaiting further discussion of CPUs from the age of the dinosaurs and 8-track players. :)

I remember some things from back then pretty well, especially the things that made a difference to me at the time or struck me as hilarious, but I guess most memories have a good dose of blur. The weird deal with the 8086s has something to do with loading the most significant bit at the right then going left to least significant (backwards). Made you have to really think harder, not to get things royally screwed up. That was a while ago. Haven't programmed at that level in over 20 years! Don't really want to do it again either!

Those 68000s were out quite a bit earlier than you're recalling. I remember both my brother and I being quite disappointed when the IBM PC introduced in 1981 didn't have a 68000 under the hood. We would have bought one if it had, but when it was announced with an 8088 inside, no way. I think the first 68000 to go in a widely sold machine was in the Apple Lisa in '83 (pricey computer!), but it might have been the Atari ST. I don't really remember whether the Atari came out before or after the Lisa. Anyway, my brother bought an ST, and it was a pretty slick machine for its day. I built my first machine from parts in '87, with one of those NEC 8088 clones, and have hardly touched a non-PC computer since. Got into custom building x86s and upgrading them for a few years. That was real lucrative at first, but eventually the profit dried up as the Dells of the world put the crunch on the little guy.

I haven't been in computers hard-core in years now. I still build my own, and a few for family and friends, but that's about it. I love them for video. :) :)
Robert M Wright
Old February 25th, 2006, 12:16 PM   #29
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Quote:
Originally Posted by Robert M Wright
I'm sure we have an audience now, eagerly awaiting further discussion of CPUs from the age of the dinosaurs and 8-track players. :)
What's sad is that I'm not old enough to remember a lot of this stuff.... I've just taught too many classes. Heh.

Quote:
The weird deal with the 8086s has something to do with loading the most significant bit at the right then going left to least significant (backwards). Made you have to really think harder, not to get things royally screwed up.
The reversed byte order... That was the "little-endian" approach of the x86 architecture I was talking about. It wasn't any better or worse, and with the execution path of the first x86 CPUs it made a lot of sense for it to work that way. Confused the heck out of a lot of programmers, though.

Quote:
Those 68000s were out quite a bit earlier than you're recalling.
Yeah, I had a brain-fart at 1:00AM. The M68000 CPU first went into production in '79. I don't even know where I got '86 for anything... In '84 we had the 68020, which found its way into more Amiga computers than any other system and then the next major update to the M68K family was the 68030 in '87 and the '040 in '91.

Quote:
I haven't been in computers hard-core in years now. I still build my own, and a few for family and friends, but that's about it. I love them for video. :) :)
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old February 25th, 2006, 01:10 PM   #30
Inner Circle
 
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
I remember driving muscle cars too. Now that was fun! Guess I'm a relic. LOL
Robert M Wright


DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network