View Full Version : New DIY HD Cinema Camera Project


Pages : 1 2 [3] 4 5 6 7 8 9 10

Jamie Varney
July 9th, 2007, 12:49 PM
Just in case anyone has missed it, there is a thread over in the AVCHD forum about the Apitek GO-HD (http://www.dvinfo.net/conf/showthread.php?t=95227). This camera uses the Ambarella chip to do 720p. While the footage is pretty good you can tell the compression is quite high, but as Wayne mentioned, whether it is usable depends on exactly how programmable the chip is.

Take Vos
July 9th, 2007, 01:39 PM
Going for a chip design could be quite cool.

Say for example you take an IIDC or GigE camera like the Pike, which already does all the communication with the sensor. Both IIDC and GigE broadcast their data, which is interesting.

Now have a chip with a FireWire 800 or GigE interface and an eSATA interface. Simply record all the frames received on the bus and send them to the disk. You don't even need a filesystem if you like; just think of the disk as a single file that you write to linearly. This chip does not need to be that fast either.

A second chip would have a FireWire 800 or GigE interface plus DVI/VGA and USB. Its job would be to do simple debayering and zooming, and to overlay things like numbers and icons. It would listen to a USB HID device, so that you can control settings like the shutter time, and send commands to the recording chip to start and stop recording.

The nice thing about this arrangement is that it would allow a fully distributed system. You could have a second recorder chip, to record to multiple disks for safety, or a second preview chip for the director.

Once the recording chip works it would not need more development afterward, which would make the system stable and reliable; that is nice in an already expensive production. Maybe at some point you would want to add playback capability.

I do expect continued development on the preview chips, adding features, increasing preview quality.

It would be an interesting project, but I think right now I'll stick with the software route.

Jose A. Garcia
July 9th, 2007, 05:28 PM
It sounds great! We need to know the price of the chip though, and also what's needed for the whole system to work. I mean, it wouldn't be as easy as connecting the sensor to the chip and the chip to a SATA disk and that's all, would it? We need to study the hardware/software for the controller/viewfinder part too.

I wanted to ask something else, not Ambarella related. Is there any way to overclock a USB interface? I mean, the bottleneck when capturing directly to the computer is the USB; the sensor itself actually goes about twice as fast at full HD. If there's a way to make the interface go just a little faster, we wouldn't have any problem capturing 1920x800.

This is really getting better with each new post. I'm seriously thinking about keeping the demo board; I was going to return it because of the USB. We've got a week to find out if the board-chip-disk solution is actually possible.

By the way, Wayne... you say the Ambarella chip has a sensor interface. Does that mean you just need to plug in the sensor itself, or do you have to connect the whole demo board?

John Wyatt
July 9th, 2007, 06:59 PM
I've been busy and haven't had much time to follow the threads lately: good to see this debate still going on with a feeling of real purpose. I feel excited again about some of the possibilities people are talking about here. Finding a low-cost solution has taken longer than I (and most other people) thought, but I still keep the faith in home-made HD, if only because I still can't get the images I want from a commercial camcorder I can afford to buy.

As usual the main problem is getting HD down onto the HDD, so perhaps GigE will finally solve it. Until then I am still a fan of Bayer RAM recording, though this seems to be going out of fashion at the moment! True, you have the restricted duration of single shots, so if you want to make a movie with lots of long dialogue shots, this recording method is hopeless. But what if you don't want to make a movie like that? The durations I am using are between 30-45 seconds, which is surprisingly long enough to get most shots (modern cinema grammar has sped up, perhaps thanks to MTV). It's longer than the single takes I got with my spring-driven Bolex H16 in the early 1980s! And it's good discipline to plan out a shot and shoot it within that time span if possible. Each take is automatically split as a single clip, so at the end of the session you convert them to colour AVI, assess the clips, and delete the takes you don't want (saving HDD space with these large clips).

The durations I use are slightly less than I can actually get, because at the maximum durations there was a slight "kick" towards the end of the shot which lost sync with the sound. I assume this is the Sumix software not being able to handle the larger clips from 2 GB of RAM. It seems from earlier posts here that Sumix may be updating their software to accommodate larger RAM, so the safe maximum duration might get extended. Rob Scott coded his own RAM-recording software for the M73 camera which extended the RAW recording duration using RAM buffering (writing data to HDD while still capturing to RAM). See Rob's thread here on Alt Imaging called "Interest in open-source capture software for Sumix M73". I'm not sure of the status of Rob's project right now -- Rob, if you're checking this thread, perhaps you could give us an update?

My own preference at the moment regarding bandwidth problems is to shoot a more modest frame size. I'm currently shooting tests at around 1400 or 1600 width, which is my preferred method of "compression". With the M72 camera, 1600 is the full sensor width, so the field of view of the lens is not narrowed. The post workflow is to take the uncompressed clip, export an uncompressed still sequence (TIF, TGA, whatever), and then finally use Photoshop to enlarge all the frames in the sequence to 2K width. Because you start with an uncompressed frame, and use bicubic quality to make the enlargement, the results may be more acceptable than onboard real-time JPEG of a larger frame.

All the best,
John.

Take Vos
July 9th, 2007, 11:44 PM
Jose,

Maybe you will be able to overclock the USB chips, but you want to double the speed, and that will probably not be possible. Maybe you can run dual USB, or use DVI, or your own form of SDI, or simply GigE.

Cheers,
Take

Steven Mingam
July 10th, 2007, 12:53 AM
The post workflow is to take the uncompressed clip, export an uncompressed still sequence (TIF, TGA, whatever), and then finally use Photoshop to enlarge all the frames in the sequence to 2K width. Because you start with an uncompressed frame, and use bicubic quality to make the enlargement, the results may be more acceptable than onboard real-time JPEG of a larger frame.

You really should search Google for "AviSynth"... You could use a superior upsizing algorithm (Lanczos, B-spline...) rather than bicubic, and you wouldn't have the hassle of processing each frame separately (even with a batch process it's cumbersome).

Wayne Morellini
July 10th, 2007, 01:53 AM
Just in case anyone has missed it, there is a thread over in the AVCHD forum about the Apitek GO-HD (http://www.dvinfo.net/conf/showthread.php?t=95227). This camera uses the Ambarella chip to do 720p. While the footage is pretty good you can tell the compression is quite high, but as Wayne mentioned, whether it is usable depends on exactly how programmable the chip is.

That is probably their simplest version. It would be interesting to find out how much PC processing a top render-quality codec would take at that data rate.


Jose,

It most likely will not be as simple as that: you would have to program the Ambarella interface to suit, and maybe add external circuits to connect the two.

Overclocking USB would be interesting, but in reality, why not use GigE and a GigE caddy? The problems with USB are the peaks and the processor drain, but with an embedded design you can have a section handling the USB and buffering the data peaks before and after it. That equals nearly 50 MB/s max throughput (8-bit 1080p24/25, or 720p50, without compression).
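Wayne's "nearly 50 MB/s" figure checks out with plain arithmetic. A quick sketch (the helper name is just for illustration; 8-bit bayer data, one byte per photosite, and decimal megabytes are assumed):

```python
# Sanity-check of the "nearly 50 MB/s" uncompressed figures in the post.
# Assumes 8-bit bayer data (1 byte per photosite) and decimal megabytes.

def bandwidth_mb_s(width, height, bytes_per_pixel, fps):
    """Bytes per second for an uncompressed stream, in MB/s (10^6 bytes)."""
    return width * height * bytes_per_pixel * fps / 1e6

print(bandwidth_mb_s(1920, 1080, 1, 24))  # 1080p24: ~49.8 MB/s
print(bandwidth_mb_s(1920, 1080, 1, 25))  # 1080p25: ~51.8 MB/s
print(bandwidth_mb_s(1280, 720, 1, 50))   # 720p50:  ~46.1 MB/s
```

All three land within a few percent of 50 MB/s, which is why the interface choice (USB peaks vs. GigE) matters so much here.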


Take,
Interesting solution. I did find an HDMI-to-wavelet chip solution that included a USB-like interface, and tried to talk them into making a USB capture solution, but nothing came of it. Somebody else could do it, though.

John Wyatt
July 10th, 2007, 03:21 AM
Steve -- thanks for reminding me about AviSynth. I remember looking at that a while ago but didn't go into the upsizing quality because I thought Photoshop would be superior. There are also Photoshop plug-ins like Genuine Fractals and S-Spline (called something else now, can't remember) which can also be used as an action in Photoshop -- I just walk away and do something more interesting while that's going on. I'll check out the latest version of AviSynth...
Thanks.

Take Vos
July 10th, 2007, 03:47 AM
You really should search Google for "AviSynth"... You could use a superior upsizing algorithm (Lanczos, B-spline...) rather than bicubic, and you wouldn't have the hassle of processing each frame separately (even with a batch process it's cumbersome).

Actually, you should replace the interpolation part of a debayer algorithm with this upscaler. As the debayer algorithm is already interpolating (scaling is interpolation) you will get a better quality result. Scaling once to the correct size is better than doing it in two steps.

I think it would be quite easy to change the Adaptive Homogeneity ... Algorithm, to do this.
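As a toy illustration of folding the scaling into the debayer interpolation -- a hypothetical 1-D sketch, not the AHD code -- take green samples that exist only on even photosites and interpolate them once, straight to the output width, instead of debayering first and resizing second:

```python
# 1-D sketch: green photosites exist only at even source positions (GRGR...).
# Instead of (1) interpolating the missing greens and then (2) resizing,
# we sample the sparse green grid once, directly at each output position.

def green_direct(greens, out_width):
    """greens[i] is the value at source position 2*i (even pixels only).
    Linearly interpolate straight to out_width samples spanning the row."""
    src_span = 2 * (len(greens) - 1)           # last green sits at 2*(n-1)
    out = []
    for j in range(out_width):
        x = j * src_span / (out_width - 1)     # output pixel's source coordinate
        i = min(int(x // 2), len(greens) - 2)  # left green sample index
        t = (x - 2 * i) / 2                    # fraction between the two greens
        out.append((1 - t) * greens[i] + t * greens[i + 1])
    return out

# A linear ramp is reconstructed exactly, whatever the output width:
print(green_direct([0.0, 2.0, 4.0], 5))  # [0.0, 1.0, 2.0, 3.0, 4.0]
```

A real implementation would use the debayer's edge-directed weights instead of plain linear interpolation, but the point stands: one resampling pass to the final size, not two.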

Cheers,
Take

Jose A. Garcia
July 10th, 2007, 06:39 AM
Take, I don't want to double the speed of the USB interface, I just want to speed it up a bit. I already get 18-20 fps at 1920x800. It would be just a matter of 40% or 50% more to get to 25-26 fps, and then fixing it to 24 fps.

The question is how can we do all that?

I mean, we're talking about different solutions but almost all of them require hardware/software engineering. For example, I've got my demo board: how can we modify that board so it has a GigE interface instead of USB? And if we do it, we also need new software to capture from the new interface.

That's why I was talking about overclocking the USB, because using parts that we already have is much easier than building new parts, putting them all together and developing new software.

If we want to stay with the camera-to-laptop solution, which IMHO is the best and easiest way, I see two different options:

I can capture 24 fps bayer at 1600x666 over USB without any problem, and it's direct-to-HDD capture so we have no time issues. I still prefer capturing to RAM though, because with these things faster is always better. We'd have to develop a piece of software that can read the bayer results and convert them to RGB 2048x858 in just one step while debayering. That would be Take's job if he wants to do it. Upscaling is very common even in top pro cameras, so if we know this upscaling process would give us much better quality than any pro camera, it's OK by me. We could use any of the great interpolation algorithms out there.

The other option would be to overclock the USB interface to get 24 fps at 2048x858 or at 1920x800. The first case would need about 75% more speed; the second would be easier and more stable, as it only needs about 50% more.

This way we would have 2K or near-2K bayer, but full HD either way, and we would just need to debayer in post. The good part of this option is that we have more resolution to start with.

There's in fact another option that's a little more difficult, but a good one nonetheless:

If we connect the demo board to the Ambarella chip and then stream to the computer over USB, we could have lossless 2K directly on the laptop. We would need a way to connect the board to the chip, a USB interface from the chip to the computer, and software to control the camera and capture. If we can do this one, we could easily turn it into a portable solution using a mini PC and a small LCD.

Whichever option we choose, we have to work together. The best part is that we can start with the simplest solution and then upgrade it until we have a standalone camera.

So which do you think is the simplest option?

Jose A. Garcia
July 10th, 2007, 09:14 AM
I did it.

I changed the USB clock from 125 Hz to 500 Hz. The camera works, and now it captures 2048x858 at about 25 fps, already debayered, to uncompressed AVI.

The question is... is it bad for the camera? I mean, I already spent 900 euros on the board. I used a patch used by pro gamers to increase mouse performance, and the guy who programmed the tool says it's completely safe. In fact the program has an option to set the clock to 1000 Hz, and that's the only option the programmer says is not safe.

I'll post a full res clip soon.

Wayne Morellini
July 10th, 2007, 11:47 AM
Is that related to the actual bus speed, or to a switching speed for polling, etc.?

Jose A. Garcia
July 10th, 2007, 12:12 PM
What do you mean?

Jamie Varney
July 10th, 2007, 02:38 PM
I did it.

I changed the USB clock from 125 Hz to 500 Hz. The camera works, and now it captures 2048x858 at about 25 fps, already debayered, to uncompressed AVI.

The question is... is it bad for the camera? I mean, I already spent 900 euros on the board. I used a patch used by pro gamers to increase mouse performance, and the guy who programmed the tool says it's completely safe. In fact the program has an option to set the clock to 1000 Hz, and that's the only option the programmer says is not safe.

I'll post a full res clip soon.

I was just thinking about this patch because a gamer friend of mine used to use it. Every electronic part has tolerances for how fast it can actually perform, and typically they are set low 'to be safe.' As long as nothing is getting hot on either your computer or your camera, I think you are probably okay. My bigger concern is the integrity of the data you are receiving: by overclocking you may be introducing noise or corrupt data into the stream. Or maybe not.

After talking to one of my friends I have given up on the gumstix idea. We are now talking about going purely with a CMOS-FPGA-HDD solution. Neither of us has ever attempted anything like this before, but that is kinda what makes it fun. Once I have some money to spare I think I am going to buy a Xilinx Spartan-3E dev board. But once again that could be a while.

Oh, and Jose, that website you gave me didn't have the full datasheet for the Micron MT9P031, only the product overview. It seems you have to sign an NDA with Micron to get the full datasheet :-(

Steven Mingam
July 11th, 2007, 01:46 AM
Actually, you should replace the interpolation part of a debayer algorithm with this upscaler. As the debayer algorithm is already interpolating (scaling is interpolation) you will get a better quality result. Scaling once to the correct size is better than doing it in two steps.

I think it would be quite easy to change the Adaptive Homogeneity ... Algorithm, to do this.

Cheers,
Take

Well, I've got a working implementation of Directional Filtering with A Posteriori Decision (http://www.dei.unipd.it/~mutley/Paper_pdf/Menon_Andriani_IEEE_T_IP_2007.pdf); the AviSynth filter is also written and needs testing.
It's faster than AHD because you don't have to convert the data to CIE color space (a very slow operation involving a cube root) to calculate the homogeneity map, but the result should be quite similar.
(And I've written it in a way that lets me easily add MMX code where it matters.)
Creating an AviSynth filter using the internal resizers shouldn't be too hard, but I wanted to see if I could write something from a scientific paper.

Take Vos
July 11th, 2007, 02:36 AM
Hello Steven,

According to the AHD paper they also tried YUV with good results, so I am using YUV. I am doing camera_RGB->rec709_RGB->rec709_YUV using 4x4 matrices.

Also, the conversion of RGB to YUV or RGB to CIELab is not the limiting factor; it is the number of convolutions that have to be performed, including running the image three times through a median filter.

BTW, I am running the complete AHD on the graphics card. It runs at about 25% of real time (slower than real time), on a MacBook Pro.

Your directional filtering algorithm also looks very much like AHD, except that the decision seems to be made before interpolating red and blue.

It took me quite a long time to figure out what the AHD paper was trying to explain, even though the actual algorithm is quite easy to explain in detail. I wish these math people would write papers with engineers in mind.

Cheers,
Take

Steven Mingam
July 11th, 2007, 03:08 AM
Hello Steven,

According to the AHD paper they also tried YUV with good results, so I am using YUV. I am doing camera_RGB->rec709_RGB->rec709_YUV using 4x4 matrices.

Also, the conversion of RGB to YUV or RGB to CIELab is not the limiting factor; it is the number of convolutions that have to be performed, including running the image three times through a median filter.

Well, convolution is only integer math, which is very fast on current CPUs and vectorizable. CIELab needs cube roots and divisions, which are still a lot slower (17 cycles for a div vs 0.33 cycles for a mov/add on a Core 2 Duo, and I'm not talking about the memory penalty of the lookup table access).

Your directional filtering algorithm also looks very much like AHD, except that the decision seems to be made before interpolating red and blue.

That's the point of the algorithm: reduce the number of convolutions needed by deciding before interpolation instead of after, like AHD does.

It took me quite a long time to figure out what the AHD paper was trying to explain, even though the actual algorithm is quite easy to explain in detail. I wish these math people would write papers with engineers in mind.
Same here :D
I lost a lot of hair before figuring it out, and then I was like "whaat? is that all???"

edit: BTW, if anybody has some raw bayer footage, I would gladly download it so I can test the algorithm on different video material.

Take Vos
July 11th, 2007, 07:16 AM
Steve,

When I get the camera running correctly, I will be able to put some footage up. However, the reseller of the Pike doubts that the artifacts are abnormal.
He basically says: "you shouldn't do gamma correction".

So to prove that this is wrong I will have to make a light sensitivity graph (x: exposure time, y: AD output value) for a couple of pixels. It looks to me like 50% of the pixels are not linear in mid-light.
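The linearity test described above could look something like this (a hedged sketch with made-up numbers; real data points would come from the camera at each exposure time):

```python
# Sketch of the linearity test: for one pixel, record the AD output at several
# exposure times, fit a straight line, and look at how far each sample deviates
# from it. Large mid-range deviations would back up the claim that some pixels
# are not linear. All the data below is invented for illustration.

def fit_line(xs, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def max_deviation(xs, ys):
    """Largest absolute residual from the best-fit line."""
    slope, icept = fit_line(xs, ys)
    return max(abs(y - (slope * x + icept)) for x, y in zip(xs, ys))

exposures = [1, 2, 3, 4, 5]        # exposure times, ms (hypothetical)
linear_px = [10, 20, 30, 40, 50]   # a well-behaved pixel
kinked_px = [10, 20, 38, 40, 50]   # a bump in the mid-range

print(max_deviation(exposures, linear_px))  # ~0: fits the line perfectly
print(max_deviation(exposures, kinked_px))  # clearly larger: suspect pixel
```

Plotting the residuals against exposure (rather than just the raw curve) makes a mid-range non-linearity much easier to see.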

Cheers,
Take

Cole McDonald
July 11th, 2007, 09:20 AM
Any overclocking is safe provided you can cool the parts enough... the heat goes up quickly though; you'll be liquid-cooling the thing before too long ;)

Luddite question: is there a reason FireWire (IEEE 1394) isn't being considered (or was considered, then dismissed)?

Jose A. Garcia
July 11th, 2007, 09:38 AM
Steve, I can provide you with raw bayer material from my cam. Do you want anything in particular? Resolution? Colors? Lighting?

Just tell me where I can upload it.

Jose A. Garcia
July 11th, 2007, 07:48 PM
Here's another clip. 2k(2.39:1)@24fps.

I know this can make little or no sense to you so I'll explain.

The clip is actually a message for a friend of mine. He's a film student and knows about this project. He's looking forward to shooting one of his short films with this camera once it's finished or at least ready to film.

He's so excited about this that every time we start talking about the camera he says "Come on... Make me cry" (of happiness, of course). He always says he can't believe he'll be able to shoot 2K Digital without having to spend thousands.

So in the clip, the message I write on the piece of paper means "Now you can start crying".

You can download two versions:

http://www.cus-cus.net/dani/Test02-2k.wmv
http://www.cus-cus.net/dani/Test02-2k.mov

Jamie Varney
July 11th, 2007, 08:55 PM
Very nice footage, Jose! I keep getting more and more excited about the possibilities!

Anyway, is it possible to tell which FPGA the Micron demo board uses? I am just looking for a part number to get a rough idea of what kind of processing power is required.

Jose A. Garcia
July 12th, 2007, 03:07 AM
I can't upload the demo board manual, where you can find all the hardware descriptions. You can find it at www.framos.co.uk by clicking on "Products", "CMOS sensors", "Demo Boards" and "Demo Camera System".

Wayne Morellini
July 12th, 2007, 03:45 AM
What do you mean?

Polling is how often it checks per second, which I understand is done by software in USB, and is one of its problems that slows down performance, but it would yield somewhat better consistency in data rate. Or it could be a master timing clock that regulates the actual speed.

Things are set to a lower speed, but also made to tolerances. Overclocking can reduce the life of the part, and there was something else I can't remember. I'm not saying not to do it, just that things are not as simple as they might seem sometimes.

Steven Mingam
July 12th, 2007, 04:16 AM
Steve, I can provide you with raw bayer material from my cam. Do you want anything in particular? Resolution? Colors? Lighting?

Just tell me where I can upload it.

Well, anything that could cause problems: lots of color, edges. For the resolution, 1080p, 720p and the ones you're planning to use (but for performance reasons, you should _really_ use a mod16 resolution).
If you don't have any server space, uploading will be an issue though... can you access http://dl.free.fr ? It's not overloaded with ads... ("file to send" > "fichier à envoyer")

Thank you very much !
(btw do you have the full datasheet for the micron sensor ?)

Take Vos
July 12th, 2007, 04:27 AM
Jose,

Are you using gamma correction on your footage?

Jose A. Garcia
July 12th, 2007, 04:43 AM
No. I wanted to know what the sensor could do by itself. The software does have controls for gamma, contrast, gain, white balance... But that clip's not corrected in any way. I must say I really like the results.

Take Vos
July 12th, 2007, 05:04 AM
Could you please make one with gamma correction, in low-light conditions? Keep gain and bias, etc., at 0.

I want to know if what the Pike is doing is the same. I am already doing gamma correction, and the company says that that is the problem, which is completely weird, as everything should be gamma corrected.

Cheers,
Take

Jose A. Garcia
July 12th, 2007, 05:14 AM
Wayne, I'm thinking about adding a big computer fan to the final design to avoid very high temperatures, but if anyone knows of a better approach, please say so. I wouldn't like to lose 900 euros and the possibility of shooting 2K because of that.

Jose A. Garcia
July 12th, 2007, 05:47 AM
Take and Steven... I'll shoot something for you asap.

Steven Mingam
July 12th, 2007, 05:56 AM
Well, you are not "overclocking" any hardware device; you are just forcing Windows to look more often at the USB port to see if something's happening... By decreasing the time it takes to react to USB activity you gained a bit of fluidity, but nothing will burn because of that. The hardware is still running at the same frequency...

[edit] Err, in fact that's not quite right. The USB master polls the bus every 1 ms (1000 Hz) (that's why it sux, BTW ;)), and that's why gamers overclock their USB mice: to get the refresh rate of the mouse to something better than 125 Hz. But you shouldn't be affected by this kind of hack, since you're not using a mouse. I wonder why you gained something by doing so...
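A rough way to see why the polling interval could matter for a camera stream: the same data rate has to fit into fewer, larger chunks when polling is slow. (The real USB 2.0 host works in 125 microsecond microframes, so this is only an illustration; the 2048x858 @ 25 fps figure is taken from Jose's post, with 8-bit bayer assumed.)

```python
# Back-of-envelope: bytes that must move per polling interval to sustain the
# stream Jose reported, at various polling rates. Slower polling means each
# interval has to carry a bigger burst, which stresses buffering.

STREAM = 2048 * 858 * 25          # bytes/s for 8-bit bayer 2048x858 @ 25 fps

for poll_hz in (125, 500, 1000):
    per_poll = STREAM / poll_hz   # bytes per polling interval
    print(f"{poll_hz:4d} Hz -> {per_poll / 1024:8.1f} KiB per interval")
```

At 125 Hz each interval must carry roughly 343 KiB; at 500 Hz, about 86 KiB, which could plausibly sit better with the driver's buffer sizes.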

Jose A. Garcia
July 12th, 2007, 05:59 AM
Great then. I was worried about that.

Jose A. Garcia
July 12th, 2007, 05:57 PM
I've been talking with an Omnivision representative. Apart from the sensor and the demo board, they offer different solutions to make the camera portable.

We've been talking about different options, but two of them sound very interesting.

One of them includes the board, a fanless mini-PC embedded system and a small touchscreen LCD. The mini-PC is really small and it has 1 GB of RAM and up to a 1.8 GHz processor. It also includes 4 USB ports and a CompactFlash slot or a 40 GB HDD.

The other option exchanges the mini-PC for a panel PC, so the touchscreen LCD is part of the whole computer.

The conversation made me think, and I've been looking at mini-ITX computers and carputers (computers for cars). I found many different options at very good prices that could perfectly well be used to build a standalone camera. Many of those computers boot in just a few seconds, and if they run just Windows and the camera software, I think they can handle the capture without any problem.

What do you think?

Igor Babic
July 13th, 2007, 01:01 AM
http://www.commell.com.tw/Product/SBC/LS-371.HTM

Jose A. Garcia
July 13th, 2007, 03:37 AM
Wow! Do you know the price?

Wayne Morellini
July 13th, 2007, 10:12 AM
Well, you are not "overclocking" any hardware device; you are just forcing Windows to look more often at the USB port to see if something's happening... By decreasing the time it takes to react to USB activity you gained a bit of fluidity, but nothing will burn because of that. The hardware is still running at the same frequency...

[edit] Err, in fact that's not quite right. The USB master polls the bus every 1 ms (1000 Hz) (that's why it sux, BTW ;)), and that's why gamers overclock their USB mice: to get the refresh rate of the mouse to something better than 125 Hz. But you shouldn't be affected by this kind of hack, since you're not using a mouse. I wonder why you gained something by doing so...

If it is polling, then it may be that synchronisation with the PC is better, and more closely matched to when the camera issues data. This would result in a little increase in efficiency and less wastage of bandwidth. (In case anybody is wondering what this means: frames are rolled out after the shutter and other processes, so the data has to come out in a peak; for a 180-degree shutter this will likely double the bandwidth requirement. Then 10-bit words often take up 16 bits and are not packed for transmission; you can start kissing 720p goodbye if that is the case as well.) Cameras without a memory buffer and packing are the problem.

Jose A. Garcia
July 13th, 2007, 11:17 AM
To be honest, Wayne, I didn't understand a thing of what you just said. I increased the frequency, so now it goes faster (it reads more times per second), and my biggest concern was whether that process could damage the camera over time, at least faster than normal use.

Also, I have bad news. I just received an email from Micron. They say they apologize for the mistake, but the framerate cannot be fixed to an exact value. I just can't believe it. That basically means the board can't be used to shoot movies. The only way to decrease the framerate if it goes faster than 24 fps is by decreasing the electronic shutter frequency, and that means a more noticeable rolling shutter.

In a few weeks I'll have the Omnivision board here for testing, but I seriously doubt it's better than the Micron.

Juan M. M. Fiebelkorn
July 13th, 2007, 08:11 PM
I think it would be better for you to start experimenting with an Elphel camera instead of trying any proprietary solution out there.
Maybe at first the learning curve is harder, but seeing how much progress you have made up to now, I believe it would be the best path.

Jamie Varney
July 13th, 2007, 10:32 PM
I think it would be better for you to start experimenting with an Elphel camera instead of trying any proprietary solution out there.
Maybe at first the learning curve is harder, but seeing how much progress you have made up to now, I believe it would be the best path.

Sorry Juan, but I think that is kind of a dim view. Jose is showing some serious progress, and if we don't keep experimenting, how will we ever know what is going to work best for us?

Jose A. Garcia
July 14th, 2007, 04:09 AM
Hi Juan,

If you read through the thread, you'll see the Elphel has always been one of my options. In fact, the new 353 uses exactly the same 5 MP Micron sensor I have, and you can set the fps you want.

There are three things I don't really like about the Elphel:

- First, it's a complete, already-made product.

- Second, it compresses the image. It does it very well, but even the RAW JPEG is compressed. You cannot choose to extract a RAW bayer sequence from it.

- Third, and I know this sounds bad, Linux. I've got Windows and OSX Tiger installed on my computer and, if possible, I don't want to have to boot Linux every time I want to shoot something. In fact, if I have to choose between the two systems, I'd choose OSX, because I'll end up compositing in Shake and editing in Final Cut.

Now, I've decided I want to stay with the demo board if possible. I really like the image, the motion and the ease of use, so:

Take and Steven, if you really want to help this project, here is what we need to develop:

- A very simple recording tool that can control gain, resolution, shutter, binning, gamma, contrast, clock, white balance... and VERY important, FPS.

- A converter from RAW bayer to any usable lossless codec.

I'll try to find everything you need. Datasheets, manuals, actual software... I'll record RAW sequences for you. I can test Windows and OSX software here. Anything to get this project done. Just ask.

Jose A. Garcia
July 14th, 2007, 04:27 AM
Here is the first RAW sequence: 159 frames captured in Micron RAW format. It was debayered in real time during capture (Laroche-Prescott), so I think it will just need RAW decoding. It also has a txt file with all the sensor info and settings used during the capture. If you need another one in bayer format, just ask.

I'm also adding another file with all the PDFs that came with the board. All schematics, manuals, datasheets...

http://www.cus-cus.net/dvinfo/capture.rar

http://www.cus-cus.net/dvinfo/doc.rar (Not active)

It's uploading right now. It'll finish in about 10-15 min.

Jose A. Garcia
July 14th, 2007, 04:32 AM
Hey! I also found many C++ code samples! They explain how to program simple capture tools and things like that.

I think I'll just compress the whole Micron CD so you can have everything you need.

I'll post a link when it's ready.

Jose A. Garcia
July 14th, 2007, 05:12 AM
http://www.cus-cus.net/dvinfo/microncd.rar

Here it is.

Ivan Hamer
July 14th, 2007, 06:49 AM
Why do you want to do lossless encoding?
A single 7200 rpm SATA harddisk is fast enough to sustain bayer 1920x800@24fps,14bits uncompressed.

Take,

I am not getting how this is possible...
1920 x 800 x (4 x 14 / 8) x 24 = 258,048,000 bytes/sec, or about 246 MB/s.
The "(4 x 14 / 8)" in the calculation is two green, one red and one blue channel per pixel at 14 bits, converted to bytes.
The best SATA drives (15K rpm) can sustain about 130 MB/s. For a regular 7200 rpm drive, about 60-70 is all you can count on.

Ivan

Take Vos
July 14th, 2007, 07:20 AM
Hello Ivan,

Your calculation is off by a factor of 4; on a bayer sensor the green 1, green 2, blue and red samples each occupy a separate pixel, and with some fancy interpolation algorithm you can reconstruct all three color components per pixel. Effectively there is a 1:3 compression ratio.

Also, as my own camera has a FireWire 800 connection, my limitations are different: 1800 x 750 x 14 bit @ 24 fps, or 1920 x 800 x 12 bit @ 24 fps. Actually I could do a little more height in 12 bit, or go wider with a larger sensor.
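Take's correction (one sample per photosite, not four) can be checked against Ivan's own numbers. A sketch, with `bayer_rate` just a name chosen here:

```python
# Data rate for uncompressed bayer 1920x800 @ 24 fps at 14 bits per photosite.
# Ivan assumed 4 color samples per pixel; on a bayer sensor each photosite
# records a single sample, so the per-pixel cost is 14/8 bytes, not 4*14/8.

def bayer_rate(width, height, bits_per_photosite, fps):
    """Bytes per second for raw bayer data (one sample per photosite)."""
    return width * height * (bits_per_photosite / 8) * fps

rate = bayer_rate(1920, 800, 14, 24)
print(rate / 2**20)      # ~61.5 MiB/s: within reach of a single 7200 rpm disk
print(rate * 4 / 2**20)  # ~246 MiB/s: Ivan's 4x figure
```

The corrected ~61.5 MiB/s is still above what many 7200 rpm drives of the era sustained, but close enough that a fast single disk or a small RAID makes it plausible, which is the point of Take's reply.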

Ivan Hamer
July 14th, 2007, 07:52 AM
Hello Ivan,

Your calculation is off by a factor of 4; on a bayer sensor the green 1, green 2, blue and red samples each occupy a separate pixel, and with some fancy interpolation algorithm you can reconstruct all three color components per pixel. Effectively there is a 1:3 compression ratio.

Also, as my own camera has a FireWire 800 connection, my limitations are different: 1800 x 750 x 14 bit @ 24 fps, or 1920 x 800 x 12 bit @ 24 fps. Actually I could do a little more height in 12 bit, or go wider with a larger sensor.

Oh, this is good news for me then. My plan was to put a sensor and an FPGA on a mini-PCI board which would go into a Via nano-ITX with RAID support. I figured it would only be possible with at least 2x compression in the FPGA, but now I am thinking it might be doable with no compression.

Steven Mingam
July 14th, 2007, 08:13 AM
If you need another one in Bayer format, just ask.
Please, I want to test my debayer algorithm ;)
(In fact, the best would be to have the same sequence both in bayer and debayered by Micron so I can compare... it might be difficult, but there is no point continuing to work on my algorithm if it's bad :))

And about the Elphel compressing video: well, you could bypass almost everything but the Huffman compression in the FPGA and get losslessly compressed bayer video. That's why it's a good start: you've got the right hardware and the source code of the camera; you just need the skills to modify it a bit.

Jose A. Garcia
July 14th, 2007, 08:24 AM
http://www.cus-cus.net/dvinfo/captureRawBayer.rar

Here it is. RAW and Bayer. Uploading now. May take 10 min or so.

Jose A. Garcia
July 14th, 2007, 06:08 PM
But Steve, wouldn't it be even better if we developed specific software for the demo board? We already have working hardware, and the results look better than I thought. We just need to control the fps and develop a simple tool to read and convert the raw sequences (and even debayer them using the best possible algorithms), and we've got our camera! I mean, there are few cameras out there that can deliver 2K, and the image and motion feel on this one is just great.

We're almost there!

Jose A. Garcia
July 14th, 2007, 06:51 PM
Going back to the optical part of the camera, I'd like to know where to buy very sharp C-mount lenses. Don't get me wrong, I like the soft, movie-like look the camera has now, but when the adaptor is added the image will pass through three lenses (C-mount, achromat and 35mm lens) plus the ground glass, so if the final image is a bit soft I want the 35mm lens to be responsible for that. The rest must be as sharp as possible.

I don't care if the C-mount lens I choose is second-hand as long as it's very sharp. In fact, it will be much cheaper used.

Where can I look for it?