DPChallenge Forums >> General Discussion >> Black Silicon
10/13/2008 05:25:00 PM · #1
Black Silicon

500 times more sensitive. Howz that work out in stops? Are we gonna have a 6D or a D4 with a 1 million ISO?
10/13/2008 06:15:15 PM · #2
Originally posted by fir3bird:

Howz that work out in stops? Are we gonna have a 6D or a D4 with a 1 million ISO?


Don't hold your breath. That 500x number is certainly a "marketing number"; in other words, it's probably true under certain special conditions, but wildly optimistic for most applications.
Today's camera sensors are already quite efficient. A perfectly efficient sensor, generating one electron for each incident photon, would have a QE (Quantum Efficiency) of 1.0 (100%). Today's sensors are as much as 45% efficient (Reference, see Table 2), so 2x (1 stop) is about the theoretical maximum improvement that can be achieved through sensitivity improvement alone.
Where the improvement might be more important would be in solar power, where improvement in broad-band QE (including IR and visible) means the possibility of capturing/converting a far greater portion of the power falling on the chip, and needing to reject a much smaller fraction of that power in waste heat.
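Just to put numbers on it, here's a back-of-the-envelope sketch (plain Python; the 500x claim and the ~45% QE figure are the ones discussed above):

```python
import math

claimed_ratio = 500            # the "500x more sensitive" marketing figure
qe_today = 0.45                # ~45% quantum efficiency of current sensors

# Each stop is a doubling, so stops = log2 of the sensitivity ratio.
claimed_stops = math.log2(claimed_ratio)

# If QE went from 45% all the way to a perfect 1.0, the gain would be:
headroom_stops = math.log2(1.0 / qe_today)

print(f"500x would be {claimed_stops:.1f} stops")
print(f"QE headroom from 45% to 100% is only {headroom_stops:.2f} stops")
```

So the marketing claim amounts to roughly 9 stops, while QE improvement alone can only buy a bit over 1 stop.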
10/13/2008 06:24:17 PM · #3
Originally posted by kirbic:

Originally posted by fir3bird:

Howz that work out in stops? Are we gonna have a 6D or a D4 with a 1 million ISO?


Don't hold your breath. That 500x number is certainly a "marketing number"; in other words, it's probably true under certain special conditions, but wildly optimistic for most applications.
Today's camera sensors are already quite efficient. A perfectly efficient sensor, generating one electron for each incident photon, would have a QE (Quantum Efficiency) of 1.0 (100%). Today's sensors are as much as 45% efficient (Reference, see Table 2), so 2x (1 stop) is about the theoretical maximum improvement that can be achieved through sensitivity improvement alone.
Where the improvement might be more important would be in solar power, where improvement in broad-band QE (including IR and visible) means the possibility of capturing/converting a far greater portion of the power falling on the chip, and needing to reject a much smaller fraction of that power in waste heat.


Geez... what are you, some kind of mechanical engineer? Oh, yeah... you are.
10/13/2008 06:33:48 PM · #4
Originally posted by kirbic:

Where the improvement might be more important would be in solar power, where improvement in broad-band QE (including IR and visible) means the possibility of capturing/converting a far greater portion of the power falling on the chip, and needing to reject a much smaller fraction of that power in waste heat.

So why haven't they invented warp drive yet?

10/13/2008 11:08:24 PM · #5
Originally posted by kirbic:

Originally posted by fir3bird:

Howz that work out in stops? Are we gonna have a 6D or a D4 with a 1 million ISO?


Don't hold your breath. That 500x number is certainly a "marketing number"; in other words, it's probably true under certain special conditions, but wildly optimistic for most applications.
Today's camera sensors are already quite efficient. A perfectly efficient sensor, generating one electron for each incident photon, would have a QE (Quantum Efficiency) of 1.0 (100%). Today's sensors are as much as 45% efficient (Reference, see Table 2), so 2x (1 stop) is about the theoretical maximum improvement that can be achieved through sensitivity improvement alone.
Where the improvement might be more important would be in solar power, where improvement in broad-band QE (including IR and visible) means the possibility of capturing/converting a far greater portion of the power falling on the chip, and needing to reject a much smaller fraction of that power in waste heat.


It's scary... I kind of get what you're saying.

What if, and I'm just being hypothetical here since I don't know anything about black silicon, the spikes mentioned in the article somehow gather the light more effectively than standard silicon does, while still keeping to the QE 1.0 limit?

In other words, what if the spikes are kind of like a net that captures the light more efficiently, whereas the old flat silicon skips light like a rock on water.

Does this make sense?
10/14/2008 06:08:46 PM · #6
Originally posted by Man_Called_Horse:



What if, and I'm just being hypothetical here since I don't know anything about black silicon, the spikes mentioned in the article somehow gather the light more effectively than standard silicon does, while still keeping to the QE 1.0 limit?

In other words, what if the spikes are kind of like a net that captures the light more efficiently, whereas the old flat silicon skips light like a rock on water.

Does this make sense?


Yep, that's exactly what happens. But remember, if you are Spongebob, trapping jellyfish with a net, the greatest number of jellyfish you can net is the total jellyfish in the swarm.
Same with photons; there are only so many photons to be captured, and a QE of 1.0 means you are capturing them all... there are no more left to capture. Today's sensors capture nearly half the total photons available, and that is quite a feat.
One real area left for improvement is getting rid of the color filter array. If we don't filter, we have to differentiate between colors *somehow* and that could be done a number of ways. Foveon currently does it by leveraging the fact that light of different wavelengths penetrates silicon to different depths. They use detectors positioned at different depths to do this. In effect, they are still filtering, but they can capture and use some of the "filtered" photons. The results have not measured up to the best Bayer pattern sensors, though.
Another, much more sophisticated way to do this would be to measure the energy of each photon detected, rather than just counting presence. The energy of the photon is its "color." This is *much* more difficult than it sounds.
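For what it's worth, the "energy of the photon is its color" relation is just E = h*c/lambda. A quick sketch (the example wavelengths are illustrative round numbers, not from any particular sensor):

```python
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon, E = h*c/lambda, in electron-volts."""
    return H * C / (wavelength_nm * 1e-9) / EV

# Rough visible-band examples: shorter wavelength = higher energy
for name, nm in [("red", 650), ("green", 550), ("blue", 450)]:
    print(f"{name:5s} {nm} nm -> {photon_energy_ev(nm):.2f} eV")
```

The differences are only fractions of an electron-volt, which is part of why per-photon energy measurement is so much harder than it sounds.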
10/14/2008 06:14:54 PM · #7
Good analogies to help those of us lacking in physics understanding (though I do understand to a small degree, since it's required for most engineers anyway). Go Spongebob!
10/14/2008 06:31:42 PM · #8
Don't sensors have to be FLAT to make a sharp image?
10/14/2008 06:45:47 PM · #9
Originally posted by kirbic:


Same with photons; there are only so many photons to be captured, and a QE of 1.0 means you are capturing them all... there are no more left to capture. Today's sensors capture nearly half the total photons available, and that is quite a feat.


How do you calculate/measure the number of photons striking the sensor?
10/14/2008 07:55:55 PM · #10
Originally posted by fir3bird:

Originally posted by kirbic:


Same with photons; there are only so many photons to be captured, and a QE of 1.0 means you are capturing them all... there are no more left to capture. Today's sensors capture nearly half the total photons available, and that is quite a feat.


How do you calculate/measure the number of photons striking the sensor?


Given the measured intensity of light (in lux or some other quantifiable unit), the number of photons per unit area per second can be calculated. Measuring light intensity is like measuring the speed with which a bucket fills up in a downpour, for example 5 cm per hour. Knowing that rate, you can calculate how many drops per second are required to fill it at that rate. The drops per second is like the rate of photon arrival.
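As a rough sketch of that calculation, assuming monochromatic 555 nm green light (where, by definition of the lumen, 1 W of light is 683 lumens — real scene light is broadband, so this is only an order-of-magnitude estimate):

```python
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
LM_PER_W_555 = 683.0    # luminous efficacy at 555 nm (definition of the lumen)

illuminance_lux = 1.0   # 1 lux = 1 lumen per square meter
wavelength = 555e-9     # assume monochromatic green for simplicity

power_w_per_m2 = illuminance_lux / LM_PER_W_555   # radiant power density
photon_energy_j = H * C / wavelength              # energy per photon
photons_per_s_per_m2 = power_w_per_m2 / photon_energy_j

print(f"{photons_per_s_per_m2:.2e} photons per second per m^2 at 1 lux")
```

That works out to roughly 4 x 10^15 photons per second per square meter for each lux.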
We can take the raindrop analogy further... if we stick a note card out in a light rain shower for a short time, then pull it in and count the raindrops that have hit it, then repeat the test many times (each with the same "exposure time"), we'll find that the number of drops is not the same on all cards, but varies randomly. Photon arrival works *exactly* the same way; in fact, the statistical distribution of raindrop counts follows the same rules as photon counts do!
Now imagine that you expose an array of note cards to the rain shower, and count the drops on each card. Same result as before, counts vary. This is the same effect as "photon arrival noise" which is a very large component of the noise we see in very high ISO images... not the "fixed pattern" noise, but the noise that's different from frame to frame.
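The note-card experiment can even be simulated: photon (and raindrop) counts follow a Poisson distribution, whose standard deviation is the square root of the mean. A sketch using Knuth's sampling algorithm (the 400-drops-per-card mean is an arbitrary illustration):

```python
import math
import random

random.seed(42)

def poisson_sample(lam):
    # Knuth's algorithm: multiply uniform randoms until the product
    # drops below exp(-lam); the number of factors needed is the sample.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

MEAN_DROPS = 400   # assumed average drops per card per "exposure"
counts = [poisson_sample(MEAN_DROPS) for _ in range(5000)]

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(f"mean = {mean:.1f}, std = {var ** 0.5:.1f}, sqrt(mean) = {mean ** 0.5:.1f}")
```

The measured scatter comes out close to sqrt(400) = 20 drops, which is exactly the shot-noise behavior: the relative noise shrinks only as the square root of the signal, which is why starved, high-ISO exposures look so noisy.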
10/14/2008 07:57:59 PM · #11
Originally posted by tapeworm_jimmy:

Don't sensors have to be FLAT to make a sharp image?


The "spikes" they are talking about are incredibly tiny, so tiny that you probably could not even see them with an optical microscope. As far as the light is concerned, the sensor would still be flat.