View Full Version : How is Vegas 7.0e working for you?
Eric Shepherd May 27th, 2007, 06:50 PM Because System Restore is not perfect. I always disable it after doing a clean install. Frees up drive space and a minimal amount of resources.
The problem is, Sys Restore isn't a complete duplicate of stuff, it's a partial duplicate. So you can revert back to an earlier time, but not everything reverts back. XP is the best version of Windows to date (including Vista), but it's not perfect.
Dana Salsbury May 27th, 2007, 10:16 PM Danny, you were prophetic about the card. I replaced it with a Dynex 3 Port IEEE1394 and tried it again -- it did the trick! It's also detecting the camera and capturing.
I'm still wary of going to 7.0e, but am ecstatic that I have a working system again.
-Dana Nathan Salsbury
Danny Fye May 28th, 2007, 04:43 AM Danny, you were prophetic about the card. I replaced it with a Dynex 3 Port IEEE1394 and tried it again -- it did the trick! It's also detecting the camera and capturing.
I'm still wary of going to 7.0e, but am ecstatic that I have a working system again.
-Dana Nathan Salsbury
I am glad that something works properly now and I am glad I was able to help!
As for Vegas 7e, well, I may have spoken too quickly.
The initial test I did was with an *.m2t file that was 48 minutes and it worked fine. In fact I just did another test with it to be sure and it worked fine again. I deleted the *.SFK file to make sure that the peaks were re-built.
I just did a church service with 2 cams and the files are 1 hour and 5 minutes for each file.
Now for the problem. If I load those files in Vegas 7e and build the peaks (be it the normal peaks or 8-bit), Vegas stops when it gets to 99%. If I let it sit for 20-plus minutes, an error dialog comes up saying Vegas needs to shut down.
I divided the files approximately in half and the peaks build just fine.
So there seems to be a limit (at least on my system) to how large/long an m2t file can be before the building of the peaks craps out.
At least there is a work-around by cutting the files in half.
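That cut-in-half workaround can be scripted. The sketch below is a rough illustration, not anything Vegas-specific: HDV .m2t files are MPEG-2 transport streams built from fixed 188-byte packets, so cutting on a packet boundary leaves both halves structurally valid (the decoder may still drop a partial GOP right at the cut).

```python
import os

TS_PACKET = 188            # MPEG-2 transport stream packet size
CHUNK = TS_PACKET * 4096   # copy in packet-aligned chunks

def split_m2t(path, out_a, out_b):
    """Split an .m2t file roughly in half on a 188-byte packet boundary."""
    size = os.path.getsize(path)
    # Round the midpoint down to a packet boundary.
    mid = (size // 2) // TS_PACKET * TS_PACKET
    with open(path, "rb") as src:
        with open(out_a, "wb") as dst:
            remaining = mid
            while remaining:
                buf = src.read(min(CHUNK, remaining))
                dst.write(buf)
                remaining -= len(buf)
        with open(out_b, "wb") as dst:
            while True:
                buf = src.read(CHUNK)
                if not buf:
                    break
                dst.write(buf)
    return mid
```

Both halves together are byte-identical to the original, so nothing is lost; Vegas just gets two shorter files to build peaks for.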
So it looks like we need Vegas 7F.
Danny Fye
www.dannyfye.com
www.vidmus.com/scolvs
Dana Salsbury May 28th, 2007, 10:28 AM I think there is a lot of truth to that. Building peaks seems to be the greatest problem that Vegas has. I forget, are you on 'd' or 'e'?
Danny Fye May 28th, 2007, 11:47 AM I think there is a lot of truth to that. Building peaks seems to be the greatest problem that Vegas has. I forget, are you on 'd' or 'e'?
I am on 'e'. Even with the backups and all I didn't discover this one until way too late to go back. Even so, I have had various problems with peaks using earlier versions. So backups can be extremely helpful but nothing is guaranteed.
I think what Vegas needs is better error trapping that would allow it to simply skip over problem areas of a file, or, when it hits a limit, end the process cleanly instead of hanging up and crashing.
I'm sure there will be posts on the forums about Vegas not building peaks for parts of the file(s) and/or the end of the file(s), but that should still help somewhat with troubleshooting, because one can then better determine whether the problem is in the file or in Vegas itself. It would also help with updates and fixes for Vegas.
As for now and at least on my system I will simply feed Vegas smaller bites of HDV so it won't choke on it. ;)
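The skip-and-continue error trapping described above could look something like this sketch. Everything here is hypothetical (the decoder is a stand-in, not Vegas's), but it shows the shape of the idea: on a bad block, record the offset and resync to the next marker instead of aborting the whole scan.

```python
# Hypothetical sketch of "skip and continue" error trapping for a
# peak-build-style scan: decode a file block by block; on a corrupt
# block, log its offset and resync instead of crashing.

SYNC = 0x47  # MPEG-2 TS sync byte, used here as the resync marker

def decode_block(buf):
    # Stand-in decoder: treat any block not starting with the sync
    # byte as corrupt. A real peak builder would decode audio here.
    if not buf or buf[0] != SYNC:
        raise ValueError("corrupt block")
    return sum(buf) % 256  # dummy "peak" value

def build_peaks(data, block=188):
    peaks, bad = [], []
    i = 0
    while i < len(data):
        try:
            peaks.append(decode_block(data[i:i + block]))
            i += block
        except ValueError:
            bad.append(i)  # remember where the trouble was
            # Resync: scan forward for the next sync byte.
            nxt = data.find(bytes([SYNC]), i + 1)
            if nxt == -1:
                break  # no more usable data: stop cleanly, don't hang
            i = nxt
    return peaks, bad
```

The `bad` list is the troubleshooting payoff: it tells you whether the problem is in the file (offsets recorded) or in the scanner itself (nothing recorded, yet it still fails).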
I am using Infiniticam to edit the cams so instead of one multi-cam-prep project I am using two of them. I can then copy the resultant events to the main project and finish it up there. I already did the first part this morning with the church service. After lunch I will copy the events to the main project and then finish it all up.
I never use the multicam project(s) for Infiniticam for the final project. I always copy the resultant events over to the main project so I don't have to deal with the clutter of the multicam project(s).
Danny Fye
www.dannyfye.com
www.vidmus.com/scolvs
Dave Haynie May 29th, 2007, 07:37 AM I've been chasing an issue with Vegas 7 since the fall. I actually installed Vegas 7e in the hope that was the solution, but alas, it wasn't.
Over time, I've determined that the problem seems to be limited (at least on my system) to Vegas rendering to WMV/HD output. Could well be anything related to WMV, but I'm only using WMV for 1080i/1080p output right now.
The first problem I found was that I was getting really slow renders. This proved to be that, crazily enough, I was thrashing my main video drive (composed of three SATA drives in a RAID 0 configuration). In other words, the drive simply couldn't keep up with the demands, despite the heavy CPU use in rendering. This was a combination of HDV, CineForm, and 6-8 Mpixel JPEG files on input, with the aforementioned WMV/HD (1080i) output. Targeting any other disk for output solved this problem, which at worst could make a render take days longer than it ought to, and which also seemed to make things unusually crashy.
With that solved, I still had occasional problems rendering to WMV/HD, usually resulting in crashes... the whole-system, blame-it-on-a-random-device-driver kind, which are really gnarly to debug. Again, only related to WMV rendering... I could (and did) output the whole project to CineForm files without any incident... and those CineForm files rendered well to WMV/HD. I did, at one point, boost my swap space, based on a theory that the WMV codec had some memory leak, which might be aggravated when Vegas has all this extra memory needed for rendering large JPEGs to video, etc.
I have since had one crash, again on WMV/HD rendering, in the last month or so, but it's largely under control. I've recently updated a few of the system resources, just in case there were other related problems/aggravations, but chasing these things down is often hit and miss.
Hopefully, if Vegas is actually doing things to contribute to this kind of instability, they'll find it and fix it... soon. I'm totally not used to Vegas being any kind of problem; I've been using Vegas since the Vegas Pro (audio only) days, and it's been one of the most reliable programs I've used, in any field.
Dave Haynie May 29th, 2007, 07:43 AM I do know Vegas isn't terribly error resistant... and that's somewhat to be expected; you don't usually get speed and error-checking together. I had a non-HD project, actually rendering a WMV download as a DVD for a family member. The WMV had a number of small glitches in it... these show up red on the timeline, and more often than not, Vegas crashes when you try to render said video. The problem, of course, was actually finding these... sometimes only a few frames were disturbed, over several hours of video.
If you get glitches during HDV capture/transfer, it's very reasonable to expect the same kind of bad behavior. I have found Sony's built-in HDV capture less reliable on my system (nForce4 motherboard, AMD64x2 CPU) than HDVSplit in this regard.
For single-pass renders to MPEG (for that DVD), I was actually able to use the incomplete MPEG file to track down the nearly hidden glitch. If you're crashing during HDV renders, try looking for glitches in a similar way; maybe try rendering the whole timeline to something relatively fast like CineForm.
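Dave's glitch hunt generalizes to a binary search. This is just a sketch of the idea, with `renders_ok(start, end)` as a hypothetical callback you would implement yourself (say, by rendering just that frame range to something fast like CineForm and checking whether it completes):

```python
# Localize a single bad spot on the timeline by bisection: each test
# render of half the remaining range cuts the search space in two.
# `renders_ok(start, end)` is a hypothetical user-supplied test that
# returns True if frames [start, end) render cleanly.

def find_glitch(start, end, renders_ok):
    # Precondition: the range [start, end) contains exactly one glitch.
    while end - start > 1:
        mid = (start + end) // 2
        if renders_ok(start, mid):
            start = mid   # first half is clean; glitch is in the second
        else:
            end = mid     # glitch is in the first half
    return start          # frame index of the glitch
```

For three hours of video at 29.97 fps (~324,000 frames), that's about 18 short test renders instead of scrubbing the whole timeline by eye.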
Danny Fye May 29th, 2007, 08:53 AM The first problem I found was that I was getting really slow renders. This proved to be that, crazily enough, I was thrashing my main video drive (composed of three SATA drives in a RAID 0 configuration). In other words, the drive simply couldn't keep up with the demands, despite the heavy CPU use in rendering. This was a combination of HDV, CineForm, and 6-8 Mpixel JPEG files on input, with the aforementioned WMV/HD (1080i) output. Targeting any other disk for output solved this problem, which at worst could make a render take days longer than it ought to, and which also seemed to make things unusually crashy.
----------------------------------------------
I did, at one point, boost my swap space, based on a theory that the WMV codec had some memory leak, which might be aggravated when Vegas has all this extra memory needed for rendering large JPEGs to video, etc.
--------------------------------
Hopefully, if Vegas is actually doing things to contribute to this kind of instability, they'll find it and fix it... soon. I'm totally not used to Vegas being any kind of problem; I've been using Vegas since the Vegas Pro (audio only) days, and it's been one of the most reliable programs I've used, in any field.
If you have a software RAID, then the heavy load on the processor would significantly reduce the effectiveness of the RAID, it seems. My questions in another thread concerning RAID, and my subsequent study of it, have led me to believe that if one is going to use RAID, it really needs to be a true hardware RAID with no software involved at all. A true RAID controller costs at least $300.00 from what I have found.
I always let Windows size the swap file instead of fixing it. I used to use a fixed size, but I would always run into problems having done so, even if I set it to four times the amount of RAM I have.
As for fixing the bugs in Vegas, they also have to track down the causes and duplicate them to be able to fix them. When I did troubleshooting while working on televisions and other electronics, at times there would be one of those "dogs," as we called them, that would take a lot of extra time to figure out -- especially the ones that were intermittent. So some of these problems will take somewhat longer to solve than others.
Danny Fye
www.dannyfye.com
www.vidmus.com/scolvs
Dave Haynie May 30th, 2007, 01:37 AM If you have a software RAID, then the heavy load on the processor would significantly reduce the effectiveness of the RAID, it seems. My questions in another thread concerning RAID, and my subsequent study of it, have led me to believe that if one is going to use RAID, it really needs to be a true hardware RAID with no software involved at all. A true RAID controller costs at least $300.00 from what I have found.
You're of course talking about a "smart" RAID controller (e.g., one with its own CPU that implements the disc spanning or, usually, more complex RAID protocols such as RAID 5's distributed model) rather than a "dumb" RAID controller or an OS-managed stripeset (i.e., no CPU of its own). There's really no such thing as a purely "hardware" RAID; they all use both hardware and software. In my case, my primary video RAID is running on a dedicated Silicon Image controller, which is of course managed by BIOS-level software at boot time and by a Windows driver at runtime.
Thing is, a level 0 RAID uses a very trivial amount of code... no more CPU overhead than your normal filesystem (in fact, in a Windows stripeset, it IS your normal filesystem, just partitioned across multiple drives). Companies like Silicon Image sell smart controller chips, which add on-chip processors much the same way most SCSI controllers went "smart" back in the 1990s. These start at $5 for the chip itself... not a whole implementation, but also not something that needs to cost $300 in order to be real.
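To illustrate just how trivial level 0 striping is: mapping a logical block to a physical (drive, block) pair is nothing but integer arithmetic. A minimal sketch, assuming simple round-robin striping with one stripe unit per block:

```python
# Why a RAID 0 stripeset needs so little code: the address mapping
# is a modulo and an integer division, nothing more.

def raid0_map(logical_block, n_drives):
    drive = logical_block % n_drives            # which member drive
    physical_block = logical_block // n_drives  # block within that drive
    return drive, physical_block
```

That is essentially the entire "RAID engine" for level 0; real implementations add stripe-unit sizes and error handling, but no parity math and no meaningful CPU load.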
What's obvious about any RAID, when you think about it, and very nicely demonstrated by the slowdown I experienced, is that a RAID is going to be slower at seeking than a single drive. Basically, on a single drive, your seek time will average out to the average seek time of the drive... sometimes slower, sometimes faster. With a RAID, however, your seek time per logical block is going to be the worst-case seek of the N drives in your array that comprise that logical block. This may be somewhat offset by the fact that your logical blocks are now N times larger.
But when I'm jumping around between 50GB CineForm files, dozens of 6 and 8 Mpixel JPEGs, etc., that's lots of seeking. The reason the problem was obvious was simple: with all this stuff going on, reading the drive, compositing, rendering to WMV/HD, I was seeing dead cycles... sometimes the CPU use didn't rise above 70-80%. The only reason that could happen is some kind of I/O blocking... and the only reason for I/O blocking would be that I had somehow become hard-drive bound rather than CPU bound. While I certainly have a fast CPU (well, it's 2006-fast, an AMD64x2 4400), it's insane to believe the drives should be that slow.
And of course, they weren't slow... the raw transfer rate was fine; the problem was the high seek-time component. Pointing the output at a different array (or a single drive, it doesn't matter) was enough to eliminate any CPU dead space in there... and the extra day or so of rendering, even when it didn't crash.
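The worst-case-of-N seek argument can be checked with a back-of-envelope simulation. Crudely model each member drive's seek time as uniform on [0, 1]; a full-stripe access then waits for the slowest drive, i.e. the max of N samples, whose mean is N/(N+1). So three striped drives average ~0.75 per stripe versus ~0.5 for a single drive:

```python
# Crude simulation of the seek argument above: the per-stripe seek
# time of an N-drive RAID 0 is the maximum of N per-drive seeks.
# With uniform [0, 1] seeks, the mean of the max is N / (N + 1).

import random

def mean_stripe_seek(n_drives, trials=100_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.random() for _ in range(n_drives))
    return total / trials
```

The uniform distribution is of course a simplification; real seek-time distributions differ, but the qualitative conclusion (more striped drives means a longer average wait per random access) holds for any distribution.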
As for fixing the bugs in Vegas, they have to also track down the causes and duplicate them to be able to fix them.
Of course they do... this is what software developers spend much of their time doing (and hardware developers... I've done both professionally, but tend toward hardware). They don't fix bugs they don't know about; they do try to fix bugs they know about, and the best possible description of the bug is the key to reporting it... you want to give the developer the ability to reproduce it from your description.