VGA to GigE Vision or IIDC converter ??
On the topic of Pleora boards and the like - there are tons of high quality VGA machine vision cameras lying around... Has anyone worked on an RGB to GigE Vision or IIDC converter?
Just wondering what it would take - perhaps I could attempt one myself.
Changing mouse settings on linux
FYI - Linux allows USB on steroids too :-)
http://www.linux-gamers.net/modules/...WTO+USBPolling
The Windows utility is called USB Mouserate switcher.
Maybe, but for what? Geode CPUs are very slow.
So I just finished reflowing the camera chip onto my head board PCB:
http://goosetech.homelinux.com/photo.php?photoid=6
Looks good! I won't be able to get around to coding up the interface for a while (exam week doth approach), but hopefully over Christmas we'll see the first pictures out of this puppy! Also, for processors, I'm looking at the BF537. It's a 500 MHz Analog Devices DSP with two built-in MACs, which means I'll have a myriad of avenues for encoding.
So what happened to this project?
Did it end, move, or did people lose interest?
Just curious, as I was waiting for more info on any progress. Have there been any new chips or revelations in the last five months? Bob
I guess Sumix offered what most of us were looking for, and as I've got a lot to shoot and little time to do it, I'll just start by buying the 12a2c cam. After that I'll think about finishing this project.
Actually, I'm still working away. I've got my head board working with a logic analyzer implemented inside the FPGA dev board I'm using. This means I can see pixel data in byte form coming from the imager at 48 MHz. The current challenge is storing this data stream, so I've been working with an SDRAM (8MB) to buffer a frame so I can attempt to offload it to a PC and get a "frame".
Full-time student status makes progress on this task a little slow ;-) One thing I'm also planning is to experiment with a Cypress part (FX2) to stream raw data across USB 2.0. So is anyone else still working on anything?
Also, out of curiosity mainly, would anyone out there be interested in a kit of my final revision of the project? My current laundry list of pieces for the kit/final revision is:
- Blackfin BF537 DSP
- Cyclone II FPGA
- 64MB SDRAM
- some amount of FLASH for firmware...
- 100BaseT Ethernet
- MT9T001 Micron imager
I figure between the Blackfin and the DSP blocks in the Cyclone there would be more than enough headroom to run some good compression schemes to dump the video across Ethernet (or USB 2.0). I'm rambling... bye for now!
I've been thinking of a great (at least to my knowledge) way to do a digital cinema camera. If you take a magazine from a 16mm (or 35mm, for that matter) camera that has the sprocket mechanism in the magazine (like the NPR) and put all your digital cinema DIY products in there, you'd essentially have a complete camera for very little money and work.
All the "camera" stuff would already be done in the body. You could just snap your digital magazine onto your body and shoot digitally, or your ordinary magazine for film. Lens mount, shutter, optical viewfinder, you name it, in camera. The magazine just holds the sensor, computer, recording media, battery and connectors (maybe a small LCD on the side for info). You could even have a switch that is triggered by the film feeder sprocket so that the computer knows which frame rate the camera is running at. Some kind of weird fusion of the computer and mechanical ages.
Now, I do not know how a CMOS sensor behaves, but imagine that you take information from it at a regular rate, say 24/s, and you have a mechanical rolling shutter in front of it?! If so, it seems it would work perfectly with software that takes information from the sensor every time the switch I wrote about above registers a new frame. It seems unnecessary to reinvent camera functions for a digital cinema camera when you could more easily just change the media.
You'd essentially just attach your S16-sized CMOS to the pressure plate inside the magazine. (Maybe with some sort of protection that keeps the pressure plate with the CMOS inside the magazine until you snap it in place, so that you won't scratch the sensor during attachment. I don't know exactly how that would be done, but it seems doable.) If you don't have that much money you could just buy an R16 camera and shoot cropped until you can afford to turn it into S16.
Now I might be missing some major problems here, as I'm not a sensor expert by any means, but to me it seems like the perfect solution.
Alex, may I ask: if you're already controlling the sensor with the FPGA, why do you need a computer? The ideal solution would be to have an independent camera.
Klas -- I remember several years ago there was an attempt to convert 35mm SLR still cameras to digital by placing a panel-type unit (which had a sensor in it) into the standard 35mm pressure plate of the camera. I think the product changed names at least once but was called something like "efilm" or "SiliconFilm". The product didn't really take off once DSLRs became more affordable.
I wonder if we could use an idea like that: just take an SLR still camera body and find a sensor which could be mounted in there, cabled out for dumping the data to a laptop or small computer? Like a glorified webcam.
I know some of these left-field ideas are more problematic than just getting an existing video camera! But I think the spirit of these Alt Imaging threads is to ask questions and see if there's equipment and components out there, probably designed for other purposes, that might be used in some way to create DIY cameras (at a low cost). Sometimes I think a camera producing distinctive images can be more interesting than commercial cameras that produce technically better images, though obviously it depends on the type of projects you have in mind for them.
I have two old Canon SLR cameras ready... If we place the sensor in the right position we'd also have autofocus if needed.
The only problem I see is that this is just a case. We'd still need either an APS-C or a full 35mm sensor + sensor head + FPGA or similar to have the camera.
Position accuracy is critical for the sensor. Without absolute perpendicularity of the sensor to the optical axis, the camera will not work at short depth of focus, as you will have focus on one side of the frame and not on the other. You need a laser autocollimator to adjust the position to an accuracy of a few microns for any quality camera. Protecting the sensor from noise is even more difficult, and something of an art. Any noise on the power supply feeding the sensor will show up in the images.
Yes, that's another problem...
But again, the main problem is not the case; the main problem is getting the camera to work.
“This is just a case. We'd still need either an APS-C or a full 35mm sensor + sensor head + FPGA” -- I agree. It depends how far back along the process you begin: if you start with a board/sensor, putting it inside an existing camera with a known lens system might be a step forward (though as Farhad says, alignment accuracy might exceed what an amateur can achieve). If you start with a camera head, you already have all that and will be looking at data writing and storage solutions, software, codecs, etc. The way forward for DIY cameras (the best starting point) still seems unclear.
I once took apart a cheap webcam and put the sensor inside a 35mm SLR, but I couldn't get a good picture!
Though the sensor is a bit thicker than film, I guess, so you're right, there would probably be some adjustments needed to get the right focal plane. But that could be done by measuring and checking where your focus is, if you could do some smart micro-collimation, like on the Brevis. As for still photo adapters, they do exist. Not for your regular Nikon, but Hasselblad has digital backs that you can attach to your old film cameras.
I'm considering having a shot at this with the MT9P401 sensor and the cy7c68013a USB 2.0 chip. From what I've read in the cy7c68013a datasheet, it should be possible to connect it directly to the MT9P401 (using GPIF), so there is really no need for an FPGA. This chip runs at 48 MHz, so 1440*1080@30fps is the maximum. Would be nice to know if the MT9P401 is pin compatible with the MT9P031...
You can't use huffyuv or similar algorithms (effectively) unless you perform debayering, so I think it's probably better to work on the Bayer data. So I have been considering http://www.eurasip.org/Proceedings/E...rs/a5p-j05.pdf / http://ltswww.epfl.ch/ltsftp/ICIP2007/pdfs/0200353.pdf . The problem with this is that it's quite a computationally heavy algorithm at these data rates. I'm also a little worried that the Blackfin might not have wide enough IO to the SRAM to run this kind of compression. The rates are 253.68 MB/s for the Blackfin with a 16-bit bus clocked at 133 MHz, and 381.47 MB/s for the TMS320C6413 with a 32-bit bus clocked at 100 MHz (the maximum allowed). It probably takes something like 5 times the input data rate to run this kind of compression algorithm.
Also, I do not yet know whether it's computationally possible to do this in real time. The TMS320C6413 at max speed would give about 32 MACs per pixel for 1920*1080@30fps, assuming all operations work on 16-bit data. Double that for 8-bit data. The compression ratio is about 0.5 - 0.6 in the "average" case, and probably something like 0.3 - 0.4 if compressing the delta between predicted and past frames with this algorithm. I haven't yet implemented the ffmpeg codec, so I don't know what the exact compression ratios would be for video. The big problem with lossless compression is that compression cannot be guaranteed if the scene is just too complex.
The final version would probably have:
- some fast DSP
- Sony PSP screen
- ARM7 CPU
- SATA/IDE HD
- http://www.dealextreme.com/details.dx/sku.7533
It's very unlikely the final version becomes reality though.
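To make the bandwidth reasoning above concrete, here is a minimal Python sketch of the headroom check described in the post. The 5x figure is the post's own rough rule of thumb, and the peak bus numbers ignore SDRAM refresh and turnaround overhead, so treat the output as an estimate rather than a verdict on either chip.

```python
# Rough headroom check: can the DSP's external bus carry ~5x the raw sensor
# stream, as the post estimates a compression pass needs? (Sketch only.)
MIB = 1024 ** 2

def bus_mib_per_s(bus_bits, clock_hz):
    # Peak external-bus bandwidth, ignoring refresh/turnaround overhead.
    return bus_bits / 8 * clock_hz / MIB

def sensor_mib_per_s(width, height, fps, bytes_per_pixel):
    return width * height * fps * bytes_per_pixel / MIB

stream = sensor_mib_per_s(1920, 1080, 30, 1)   # 8-bit Bayer, ~59.3 MiB/s
needed = 5 * stream                            # the post's ~5x traffic estimate

for name, bw in [("Blackfin, 16-bit @ 133 MHz", bus_mib_per_s(16, 133e6)),      # ~253.7 MiB/s
                 ("TMS320C6413, 32-bit @ 100 MHz", bus_mib_per_s(32, 100e6))]:  # ~381.5 MiB/s
    verdict = "enough headroom" if bw > needed else "too tight"
    print(f"{name}: {bw:.1f} MiB/s peak vs ~{needed:.1f} MiB/s needed -> {verdict}")
```

Run as written, the Blackfin figure comes out below the ~5x requirement and the C6413 figure above it, which is the concern the post raises.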
Hey, that sounds really good!
I have a few questions:
- Why 30fps? If we're trying to build a cinema camera we'd need 24fps.
- Are other resolutions possible? You talk about 1440x1080. What about 1920x800@24fps?
- Is this a PC-less camera?
- Lossless compression would be good but it's not a must. Low lossy compression is OK too if it stays between 4:1 and 8:1. New versions of Cineform are 4:1 and the RED One compresses even more. What about Motion JPEG2000? And visually lossless RAW?
- Are you considering sharing the cy7c68013a code you write with the community, so everyone can buy all the parts needed and build a working cam?
- What can I do to make your life easier?
Thanks a lot. It really looks like we're finally getting somewhere.
1920*800*24/1000000 would be 36.864 MHz, which is fine as well. In theory, any resolution and fps with w*h*fps/1000000 < 48 should be fine (see the sketch below). Bits per pixel is limited by the transfer speed of USB 2.0. I'm not sure how advanced the cy7c68013a is, so maybe only 8 bpp and 16 bpp (remaining bits ignored) are possible.
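For reference, that ceiling can be checked with a couple of lines of Python. This just restates the w*h*fps < 48 MHz rule from the post above; the 48 MHz figure is the one quoted there, not something taken from the Cypress datasheet.

```python
FX2_PIXEL_CLOCK_HZ = 48e6   # the 48 MHz ceiling quoted in the post

def fits_fx2(width, height, fps):
    """True if the mode's pixel rate stays under the quoted 48 MHz clock."""
    return width * height * fps < FX2_PIXEL_CLOCK_HZ

print(fits_fx2(1440, 1080, 30))   # True  (46.7 MHz, the stated maximum mode)
print(fits_fx2(1920,  800, 24))   # True  (36.9 MHz)
print(fits_fx2(1920, 1080, 24))   # False (49.8 MHz, just over the limit)
```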
The first version with the cy7c68013a would plug into a USB port. The second version would probably have the cy7c68013a as well. The cy7c68013a is the same Cypress FX2 that Alex Stewart was talking about earlier. According to that paper, JPEG2000's compression ratio isn't as good as this algorithm's. I'm not familiar with JPEG2000, but I'd think it's also more computationally intensive than this. One candidate for near-lossless compression could be http://ieeexplore.ieee.org/Xplore/lo...418.pdf?temp=x . I do not have access to IEEE, so I couldn't tell. It might be possible to come up with a good (fast) lossy algorithm by taking some of the ideas proposed in the papers I posted earlier. It's not the compression ratio that's the big problem here; it's finding an algorithm that can achieve a ~0.5 (fixed) compression ratio using low cost and easily available components. A simpler compression algorithm means there will be less compression. The TMS320C6413 costs about 25 euro, which is about as high as I'm willing to go.
Anyway, building a standalone compressing version is way more difficult and time consuming than just plugging a sensor into a PC via USB. I'd hate to be realistic here, but this project will probably die once the USB version is done. The compressing version would require team work and a plan that everyone interested agrees to. Alex Stewart's version is too complex and expensive for me. The MT9P401 pin layout is my main concern now. BTW, Alex Stewart got a picture.
Had another look at DSPs and found the TMS320DM6441. It's a little faster than the TMS320C6413 and has an ARM9 CPU, USB 2.0, ATA, a 100 Mb EMAC, and a DDR2 interface. It also has all sorts of bells and whistles meant for video processing. At 30 euro it seems like a good candidate. This one has a 0.8mm ball pitch while the TMS320C6413 has 1.0mm, and the ball layout looks much worse than the TMS320C6413's.
You can easily modify Huffyuv to work with Bayer data: just assume you are feeding it a 32-bit image. You get 4 planes at a quarter resolution each - two of green, one red, and one blue. The problem is you will be limited to 8 bits per color channel, and you will also need to generate new statistical Huffman tables for the new data.
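A minimal sketch of that plane split, assuming an RGGB mosaic (the actual Bayer order depends on the sensor and readout window); each returned plane could then be handed to an 8-bit-per-channel codec as if it were one channel of a four-plane image:

```python
import numpy as np

def bayer_to_planes(raw):
    """Split a Bayer mosaic into four quarter-resolution planes (RGGB assumed)."""
    r  = raw[0::2, 0::2]   # red sites
    g1 = raw[0::2, 1::2]   # green sites on red rows
    g2 = raw[1::2, 0::2]   # green sites on blue rows
    b  = raw[1::2, 1::2]   # blue sites
    return r, g1, g2, b

# Example: a fake 1920x1080 mosaic; each plane comes out 540x960.
mosaic = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
for name, plane in zip("R G1 G2 B".split(), bayer_to_planes(mosaic)):
    print(name, plane.shape)
```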
And some of the prediction schemes would not work as well, because the physical distance between comparable pixels is higher. You also couldn't perform the conversion to the color difference domain (r-g, b-g), since you have no green pixel at that position. Simple debayering of those pixels might work, but I doubt you'd get as good compression as huffyuv normally gives. It might still be a good solution, especially considering how simple/fast huffyuv actually is.
I'm not that worried about the possible patent issues, since most aspects of it can be replaced with similar methods. Only the context prediction scheme is really important. Need to ask about that. I got my modified version to run at 3 fps on my Core Duo @ 1.8 GHz (using one core) without MMX, SSE, etc. optimizations, so it doesn't seem to be fast enough. A GPGPU implementation would probably be fast enough for compression. Decompression doesn't parallelize that well, so it would be slower.
Anyway, I realized there is really no point in doing compression on the standalone version, because modern computers can't perform real-time compression to any reasonable format anyway. That is, unless you have some other reason to cut bandwidth requirements. Cheapo hard drives can write ~50-70 MB/s (depending on head position), which would be enough. I'm wondering if it would be possible to drive a hard drive directly with an ARM7 CPU and the sensor. Blanking times could be used to perform activity on the drive with the ARM7 (assuming it's fast enough to react :). It would require that ATA drives can accept several sectors of data without intervention between frames. According to http://home.nikocity.de/andymon/hfg/Alya/alya.html , 8-bit mode might not be supported on modern drives. An LCD would need an FPGA or CPLD though.
Checking resistances on the MT9P401 revealed that it is likely pin-compatible with the MT9P031.
Update from the DSP front
Long time no post for me... Since my last post I've graduated college, found a paying job, and gotten ready for my wedding... which is this Saturday :)
And also, as I'm sure some of you noticed, I got my system to cough up an image!!! The color is awful because the i2c bus is not hooked up (i.e. I'm using 100% default gains and white balance). I did not make much progress due to a nasty kernel panic that the PPI driver on the Blackfin would kick out... Happily, that issue is solved now.
So!! At the moment I'm rigging the i2c bus up so I can mess with the gain, then I'll be piping my data into VLC to stream some video. This is mainly for debugging; the BF can only encode at about 10 fps into MP4 at this resolution. Once that works I'll be attaching an HDMI and IDE interface to the Blackfin.
On the architecture of my system: the idea is for the FPGA to do the heavy lifting (i.e. decode the Bayer and convert to YUV), so the data stream from the FPGA will be raw full-res YUV data. This gets piped into two ADV202s, which plop out a JPEG2000 frame that the DSP dumps to the storage medium... or streams across Ethernet... The nice thing is, if, say, I want it to aid in compression, the DSP has enough oomph, but so does the FPGA. Hence the hardware, once I'm done, is modular enough to support just about any software and cinema workflow you could think of... (PS: I plan to sell the boards at $600 ea.) (PPS: My genius 3D animator brother is also doing up a spiffy case I'll have milled out of billet aluminum... RED One here I come :p )
Please forgive the horrid spelling and formatting, I'm typing this on an iPod touch since my laptop is in a box... somewhere... (The sad thing is I can ping it, so I know it's near...) Bye for now folks!!! I'll be back with hopefully some video clips in about three weeks :)
Waiting for some equipment so I can make the PCBs.
Marlin F-131c vs HV20
Just wondering if anyone has compared the IBIS-sensor Marlin F-131c cameras with an HV20 in terms of quality? I'm sure it's worse than the CCD MVCs, but how does it compare to consumer HD camcorders?
Thanks!
Software for DIY Digital Cinema camera
I am developing software for DI editing, color correction, sound mixing, and filmout to 35mm movie film, for use with DIY Digital Cinema cameras.
I was wondering if someone has a list of uncompressed cameras that can shoot RAW sensor data at better than 8 bits, 1280x720p or better, at exactly 24fps for sync-sound shooting? The Aptina MT9P401I12STC sensor seems like a good choice for such a camera. Could you summarize the current status of your project, and how others could get results usable for Digital Cinema like the results you are getting, i.e. what needs to be purchased, who do you get it from, what software needs to be loaded, and how does one get the settings right?
Anyone who has a Digital Cinema camera in a state that lets them shoot 1280x720p or more, at 24fps, 12-bit or so, RAW sensor data, please e-mail me directly so we can stay in contact as I develop my post production system more. I would like to develop some extra software, de-Bayer and such, to support low cost Digital Cinema cameras, so users/"beta testers" of my programs would have more low cost Digital Cinema camera options.
If you have any questions about my "freeish" software or need support using it with DIY Digital Cinema cameras, feel free to e-mail me directly or through here.
Dan Hudgins
tempnulbox [at] yahoo [dot] com
The official DANCAD3D (tm) Beta Test Web site.
I may also be able to help with some suggestions for some circuit/firmware additions to aid in having your camera work well with my programs, and in general for use as a Digital Cinema camera.
Buffered camera for USB speed issues?
I have been thinking that it would be useful to make a Digital Cinema camera based on the Aptina MT9P401I12STC sensor to get 2592x1102x12bit@24fps and 2048x1152x12bit@24fps RAW sensor data. If you cannot get 2592x1102 to work then 2560x1080 may.
At any rate, to overcome the limits of the USB speed, you can build a "shoot and dump" camera that has enough FLASH or DRAM to store about a minute of shooting as RAW 12-bit sensor data; then, after the shot is finished, the data in the buffer can be downloaded through the slow USB. What that gets you is high quality images without any problems with the computer's hard disk speed or dropped frames.
A maximum record time of one minute per shot is fine for feature filmmaking, since normal feature films do not have any single shots longer than one minute, as edited in the finished film.
If you make the buffer memory swappable, you can run the camera in two modes, "shoot and dump" or "shoot and swap". If you are shooting in "shoot and swap" mode you can put the buffer card back in to download it later. Being able to swap also lets you shoot many takes rapidly, then download them later. Being able to "shoot and dump" lets you download while you are setting up for the next take. The download time would only be about three or four times the shoot time, so it should work well.
Such a camera could be an open project so that various people could make improvements to the PCB circuit and layout. I would be interested both in using such a camera to help with my Digital Cinema software development, and in supporting such a camera for use with the DI software I am developing.
The reason for trying for 2592x1102x12bit@24fps is to be able to downsize and squeeze the images for 35mm filmout using my film recorder program, to get fewer artifacts in the end result, and to use the largest sensor area possible to overcome the resolution issues of the C/CS mount lenses that can be used.
Please feel free to e-mail me about such a camera, or anything that can shoot true RAW sensor data well at 24fps, at more than 8 bits and high resolution. I would like to develop working workflows for such cameras so that people can use the cameras that work well with my programs. I have tested my suggested workflow for the RED ONE (tm) Digital Cinema camera, and have a suggested workflow for the Acam dII (tm) camera on my Web site, but I will need to wait for sample DNG frame sets from Acam dII (tm) to confirm that it works, or to make adjustments. At any rate, it should not be too hard to add other RAW mode cameras to the supported workflows.
Dan Hudgins
tempnulbox [at] yahoo [dot] com
The official DANCAD3D (tm) Beta Test Web site.
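As a sanity check on the buffer size and the "three or four times the shoot time" download estimate above, here is the arithmetic in a short Python sketch. The ~30 MB/s figure for practical USB 2.0 throughput is an assumption added for illustration, not a number from the post.

```python
# One minute of "shoot and dump" at 2592x1102, 12-bit packed (1.5 bytes/pixel), 24fps.
width, height, fps, bytes_per_pixel = 2592, 1102, 24, 1.5

rate_bytes = width * height * fps * bytes_per_pixel      # sensor data rate in bytes/s
buffer_bytes = rate_bytes * 60                            # one-minute buffer

usb2_practical = 30e6                                     # assumed ~30 MB/s real-world USB 2.0
download_s = buffer_bytes / usb2_practical

print(f"sensor rate: {rate_bytes / 1e6:.1f} MB/s")        # ~102.8 MB/s
print(f"buffer for 60 s: {buffer_bytes / 1e9:.2f} GB")    # ~6.17 GB of DRAM/FLASH
print(f"download time: {download_s:.0f} s (~{download_s / 60:.1f}x the shoot time)")
```

Under that assumed USB rate the dump takes roughly 3.4x the shoot time, which lines up with the estimate in the post.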
Thanks for the interest
Thanks to the DVINFO readers who have contacted me about my software and the subject of uncompressed cameras using the Aptina sensors. I expect there will be some more developments in this area. If you know of a camera project that might be useful with my DI system, please let me know more so I can look into what cameras would be a good choice for doing Digital Cinema with the software I am developing to help people make motion pictures digitally...
I have been doing some tests with my DANCINEL.EXE (tm) program to transfer digital frames to 35mm movie print stock, and by using my "spread gamma" feature the results look good so far, shooting off a 1600x1200 LCD monitor with time exposures of about 4 seconds and longer, depending on the filters used and the f/stop of the lens. My film recorder program also supports CRT monitors up to 2048x1536x32bpp (24bpp + 8 bits padding, as that's the way the video boards work).
ROSCO #22 Filter
As a follow up to my post about doing some filmout tests, I found that a ROSCO #22 filter (reddish orange) gives close to the correct color balance for 35mm Eastman Kodak Vision color print film (2383/2393 stock) when shooting off a 1600x1200 LCD monitor. The total exposure time is about 120 to 180 seconds at T/2.8; when using my DANCINEL.EXE (tm) in three-exposure "gamma spread" mode you set the animation motor delay to 45 to 60 seconds. For printing black and white on Eastman Kodak 2302 the exposure is about 4 x 3 seconds (12 seconds total) at T/8 with no filter.
The main problem shooting color print stock off a monitor is the lack of red light. I am looking into ways to deal with this issue, but for now you can use tri-color or color separation modes if you have the time. If you want help with the gamma adjustments feel free to contact me; my email is above.
Uncompressed Digital Cinema Cameras?
Are there any DIY Digital Cinema Camera projects with uncompressed true RAW sensor data under development now?
I have been working on de-Bayer software to go with the freeish DI/NLE/CC/MIX software I am working on, and if anyone has sample sensor data RAW files, I can look into adding support for their camera(s) before I release my new program, maybe this summer or so.
I have updated my "freeish" NLE's CC control screens to work better on some Windows XP Home (tm) computers that had issues with the higher resolution video modes; it's up on my web site for download now. It works with frame files up to 6K or more.
Dan Hudgins
tempnulbox [at] yahoo [dot] com
The official DANCAD3D (tm) Beta Test Web site.
San Francisco, CA USA
Has anyone heard from Jose A. Garcia?
Hi,
Things are moving forward on the software and camera front. I was talking to Jose A. Garcia, but did not hear back from him after my last email; has anyone heard from him in the last few months?
Thanks for any info you have. Since this was his project thread, I thought I would let him know about some developments, which I can't go into in public quite yet.
Dan Hudgins
tempbulbox [at] yahoo [dot] com
The official DANCAD3D (tm) Beta Test Web site.
San Francisco, CA USA
Dan, have you seen the newly announced USB3 cameras from Point Grey?
Point Grey | The World's First USB 3.0 Camera
Preliminary pricing on the 32S2 model will be about $800 US and the 63S4 will be around $2000 US.
Re: New DIY HD Cinema Camera Project
Hi Arnie,
Thanks for the link about the USB 3.0 camera.
I made an SSD test program, and some people there with fast computers were able to record, it seems, 2592x1102x12bit @ 24fps using two SSDs in a RAID. There may be some glitch issues, and a 64-bit OS may work better with a quad core, but it might work. With "normal" older computers it may be hard to record above about 1280x720p uncompressed. Cabled cameras are a bit harder to use in the field, but for studio use it could work.
The de-Bayer program I have been working on for Cinema use is getting close to release. Someone sent me a SUMIX RAW file, but it was not uncompressed "RAW": it was poorly white balanced, gamma adjusted, and noise filtered. Once the sensor data is mucked up, you cannot undo that. It seems you have to go into the SUMIX software and check and uncheck some things before it saves actual RAW sensor data in 12-bit format. If someone has a real 12-bit RAW SUMIX file, I would like to look at that data, and it would help me with the presets for the de-Bayer processing. I do have the commands to break up the SUMIX RAW into RAW frames, for shots up to 2GB long, and they gave me their 12-bit packing method so I can unpack using that, although I need real RAW sensor data to check and see if everything is working right. At 1920x1080 x 1.5 bytes per pixel at 24fps, 2GB is about 26 seconds, so that is enough to shoot a movie, as few if any shots are longer than that. With other cameras that shoot RAW frames there is no "limit", as my program can process shots over an hour long that way.
I also have support for the SI-2K (through their DNG convertor), Kinor-2K (as DNG), ACAM dII (as DNG), and some other cameras that make RAW but are not on the market yet. My de-Bayer should be able to process any normal 8, 12, or 16-bit RAW Bayer or monochrome data to BMP 8/24bpp, DPX 10/16bpc, CIN 10bpc, or TIF 16bpc.
Anyway, if you know of any other low cost cameras that can shoot RAW at 24fps, please let me know so I can look into supporting them also if I am able. When I get my de-Bayer program posted on my Web site it will be a free download for "beta testing"; between now and then people can contact me about supporting cameras or doing sample conversions.
I was talking to Jose A. Garcia about making his type of camera "real" but have not heard from him for a while, but there has been progress...
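The two-SSD RAID figure above lines up with the raw numbers; a quick sketch, using the ~50-70 MB/s per-drive write speed mentioned earlier in the thread as the single-drive baseline:

```python
def raw_rate_mb_s(width, height, fps, bytes_per_pixel):
    # Uncompressed sensor data rate in MB/s (decimal megabytes).
    return width * height * fps * bytes_per_pixel / 1e6

full  = raw_rate_mb_s(2592, 1102, 24, 1.5)   # ~102.8 MB/s, 12-bit packed
hd720 = raw_rate_mb_s(1280,  720, 24, 1.5)   # ~33.2 MB/s

single_drive = 60.0   # MB/s, middle of the ~50-70 MB/s range quoted earlier in the thread

print(f"2592x1102@24p: {full:.1f} MB/s -> needs ~{full / single_drive:.1f} drives (hence the RAID)")
print(f"1280x720@24p:  {hd720:.1f} MB/s -> fits a single ordinary drive")
```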
Re: New DIY HD Cinema Camera Project
The USB camera is really interesting. I'm wondering how much work it would be to push everything (processing and controls) into software, and replace all the expensive proprietary hardware with commodity parts.
If you could bundle up a simple board to convert the raw signal to (say) Adobe's Cinema DNG, control the camera, and feed a RAID system, then it would be possible to build a small, lightweight rig.
However, it's probably worth keeping an eye on Intel's Thunderbolt / Light Peak technology, which is even faster than USB 3.0. And if they ever release the optical version, you could have a 100 metre cable, which gives a tethered camera a lot of flexibility. The Register has some coverage:
Intel throws Thunderbolt - The Register
Intel: 'PC makers took the light out of Light Peak' - The Register
And there's the Intel product page for the truly geeky... Thunderbolt Technology
Re: New DIY HD Cinema Camera Project
Dan, are you aware of the Elphel cameras & the Apertus open source cinema project that's affiliated with it? I think that Elphel is currently using an Aptina sensor. The camera is open hardware with open source software. Apertus is trying to develop the software along with an open hardware controller.
Elphel, Inc. | Imaging solutions with Free software and open hardware
Home page | Apertus - Open Source Cinema
Re: New DIY HD Cinema Camera Project
Quote: [Graeme Sutherland, If you could bundle up a simple board to convert the raw signal to (say) Adobe's Cinema DNG, control the camera, and feed a RAID system then it would be possible to build a small, lightweight rig.]
The new software I plan to release soon already converts True RAW sensor data to DNG, and also does bad pixel removal (interpolation) and FPN subtraction. The main problem with DNG is Tag 50721, the XYZ to RGB color matrix. There does not seem to be any document that tells how to find the right XYZ values based on shooting test charts. It is possible to guess the "right" matrix values, but the results from DNG conversion are not good. Direct conversion to 16bpc DPX and TIF files is a MUCH better option than doing anything with DNG files for output.
I also have DNG input, but I use it as RAW data and ignore the tags that just muck up the image processing, and use my own "film-like" processing video path that seems to be producing results better than what I have seen posted from DNG conversion in other programs. I sent some samples of their sample DNG data to Acam dII and asked for permission to show my version of their images, but have not heard anything back from them. When my program is out you may be able to use it to process the Acam dII sample DNG files and see for yourself if any other DNG convertor does a better job; if you find one, let me know, I would like to look at what you think looks better about their conversion. My processing is full manual adjustment, which can help avoid issues with pink highlights and such. If anyone has their own Acam dII (or any other True RAW cinema camera) and wants me to do a sample conversion for them, they can contact me.
The main problem with recording on a computer is the speed of the SSD, and the glitches caused by the OS and hardware being interrupted. An all-hardware solution using a CF card array would make a much more compact camera. The software support I have developed solves all of the image processing issues once you save the true sensor data without glitches.
Quote: [Arnie Schlissel, Dan, are you aware of the Elphel cameras & the Apertus...]
I spoke to them on their forum about adding a True RAW 12-bit port to their 8-bit compressed camera, so that the full 12-bit sensor data could be extracted for recording on an external recorder using DRAM. It seems they were not interested in talking with me about what would be needed for producing a True RAW recording camera. Major camera makers are going to 12 and 14 bits; their insisting that 8 bits is "all that is needed", or whatever they say, is non-productive as far as where my projects are headed. More than 8 bits are needed just to hide all banding in digital images, and if you are going to grade with heavy ISO boost and S-curves then there is no other result than histogram gaps if you start with just 8 bits. 8-bit linear data will not even gamma correct to 2.2 without gaps opening in the shadow areas. Also, it is not a good idea to try to de-Bayer compressed data, and any claim that the compression is 100% lossless (no change to the sensor data ADC values) at compression ratios of 10:1 or more (compared to the packed sensor data bits or delta encoding) is just out of the picture.
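The point about 8-bit linear data not surviving a 2.2 gamma correction is easy to demonstrate numerically. A tiny sketch, using just the plain power-law mapping (no camera-specific curve):

```python
import numpy as np

linear = np.arange(256)                                            # 8-bit linear code values
graded = np.round(255 * (linear / 255) ** (1 / 2.2)).astype(int)   # gamma 2.2 correction

print("codes used:", np.unique(graded).size, "of 256")              # well under 256 -> histogram gaps
print("dark end:", list(zip(range(5), graded[:5].tolist())))        # 0->0, 1->21, 2->28, ... big jumps
```

The large jumps at the dark end are exactly the shadow-area gaps described above; starting from 12-bit data leaves enough codes to fill them in.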
If they want to take all the compression "stuff" out of their design and record the actual sensor data as-is, 100% unaltered, and save it as-is to files named RAW, in numbered frame files like
00000000.RAW
00000001.RAW
00000002.RAW
then all that is needed, after I release my new de-Bayer program, is to enter the RAW setup values for unsupported cameras in the manual setup prompt, with values like:
X pixel size
Y pixel size
Data format: 8, 12 (two types supported now), or 16 bits
Bytes per pixel: 1, 1.5, 2
Shift left for MSB alignment
and so on. There are prompts for X and Y mirror and Bayer order, as some sensors start on a different color pixel.
I just added support for SUMIX (tm) cameras, but when I asked SUMIX (tm) for some sample True RAW sensor data shot under calibrated lighting they did not reply, as far as I know. Someone sent me a sample that had gamma correction and clipping, white balance done badly, median filter noise reduction, and such done to the data, so there is nothing I can do with that to make a preset for that camera yet. Someone would need to send me a sample of actual RAW sensor data, not Bayer data that has been "goofed" with and so is unusable for extracting my idea of quality results.
==
I was talking to Jose A. Garcia about the camera development; has anyone heard from him lately?
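As a rough illustration of what the "bytes per pixel 1.5" and "shift left for MSB alignment" setup values above imply, here is a sketch that unpacks 12-bit packed data into MSB-aligned 16-bit samples. The byte order is hypothetical; real vendor packings (SUMIX's included) differ, so the bit shuffling would need to match the camera's documented format.

```python
import numpy as np

def unpack_12bit(buf, width, height):
    """Unpack 12-bit RAW stored at 1.5 bytes/pixel into MSB-aligned 16-bit samples.

    Assumed (hypothetical) packing of two pixels per 3 bytes:
      byte0 = p0 bits 11..4
      byte1 = p0 bits 3..0 in the high nibble, p1 bits 11..8 in the low nibble
      byte2 = p1 bits 7..0
    """
    triplets = np.frombuffer(buf, dtype=np.uint8).reshape(-1, 3).astype(np.uint16)
    p0 = (triplets[:, 0] << 4) | (triplets[:, 1] >> 4)
    p1 = ((triplets[:, 1] & 0x0F) << 8) | triplets[:, 2]
    pixels = np.empty(p0.size * 2, dtype=np.uint16)
    pixels[0::2], pixels[1::2] = p0, p1
    return (pixels << 4).reshape(height, width)    # shift left 4 bits -> MSB-aligned 16-bit

# Example with a tiny fake frame: 4x2 pixels -> 12 packed bytes.
frame = unpack_12bit(bytes(range(12)), width=4, height=2)
print(frame)
```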
Re: New DIY HD Cinema Camera Project
I am a documentary shooter, so while I am always excited about these projects, the wind comes out of my sails at the talk of any kind of tethering. This is my question: why can't we just connect a 2.5" SSD housing to these cameras to capture the data stream? Is there some way to pre-load whatever software controls are on the PC into an on-board card (and eliminate the PC entirely)?
What I am looking for is an ENG camera that uses SSDs as a capture medium, and takes advantage of the fast transfer speeds to decrease compression a little and increase the bitrate of the footage. Can one of the knowledgeable people in this forum help me understand what is missing?
Re: New DIY HD Cinema Camera Project
I don't think there is any reason that such a camera could not be made...
For ENG use, though, rolling shutter issues need to be dealt with, either by shooting at 48 or 72fps for 24fps results, or by using a sensor with a global shutter, which can increase the camera cost. For tripod use where the camera does not move fast (such as for some kinds of feature film making), the rolling shutter is less of an issue at 180 degrees...
Re: New DIY HD Cinema Camera Project
The main issue that has come up in talking with various camera makers working on True RAW recording cameras is that they seem not to want to make a "shoot-then-dump" camera, because the concept of a camera with a maximum record time of 30 seconds seems to limit the market for it.
But for feature film production all you need is a camera that can shoot shots up to 30 seconds, as very few movies have any shots longer than 30 seconds.
So it seems the lowest cost Digital Cinema Camera would be one that is like a web cam, but has a 30 second buffer built in using DRAM, since FLASH memory can wear out over time. DRAM also has an advantage in that it may have no bad addresses; FLASH generally requires mapping out the bad addresses, which makes its implementation more complex.
Since all the True RAW data writes to the buffer, the link to download the data in non-real-time can be slower than the ADC output, so even USB/LAN can be used for 2.5K data. A netbook can read a sub-sample of the data flow from the sensor to the buffer as the monitor display, then download the data after the buffer is full or a shot is finished, passing that data on to an external TB HDD plugged into another USB port etc. Without a buffer between the sensor and the downloading computer, it is very hard to record full uncompressed sensor data on a "lightweight" computer.
After having worked with uncompressed sensor data in the de-Bayer program I am developing, I think it's better than various compressed data formats in that there are NO extra artifacts added, and it's not that hard to deal with the time and cost of working with uncompressed DI now that TB disks are falling in price so much and second hand computers for the "render farm" are fast enough to get the work done in less than a year.
If you were cutting the picture (workprint and then negative conform) and sound (multi-track magfilm) yourself, it would probably take longer than making an uncompressed DI at home, and the DI will probably give you better results in the end, since you would not have as much dust and scratches etc. It still takes time to sort through the thousands of shots and sound takes and get things right, no matter if you are working with film or in digital frames, at least if you want the best results you can get DIY.
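The "netbook reads a sub-sample for the monitor display" idea could be as simple as tapping one Bayer quad out of every few and doing a crude one-pixel demosaic. A sketch, assuming an RGGB mosaic and MSB-aligned 16-bit samples (both assumptions, since the post does not pin down the data format):

```python
import numpy as np

def quick_preview(raw, step=4):
    """Decimated RGB preview: sample one RGGB quad every `step` quads in each direction."""
    r  = raw[0::2 * step, 0::2 * step]
    g1 = raw[0::2 * step, 1::2 * step]
    g2 = raw[1::2 * step, 0::2 * step]
    b  = raw[1::2 * step, 1::2 * step]
    h = min(a.shape[0] for a in (r, g1, g2, b))
    w = min(a.shape[1] for a in (r, g1, g2, b))
    g = (g1[:h, :w].astype(np.uint32) + g2[:h, :w]) // 2          # average the two greens
    rgb16 = np.dstack([r[:h, :w], g.astype(np.uint16), b[:h, :w]])
    return (rgb16 >> 8).astype(np.uint8)                          # crude 16 -> 8 bit for display

# Example: a fake 2592x1102 frame decimates to roughly a 138x324 thumbnail.
frame = np.random.randint(0, 65536, (1102, 2592), dtype=np.uint16)
print(quick_preview(frame).shape)
```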