Technically speaking, all graphics packages are based on "8 bit" technology. An image on a computer is (usually) made up of four channels: red, green, blue and alpha. Each channel contains a greyscale image with 256 levels of greyness; 256 levels = 8 bits.
That's not the same as saying they're using 8-bit CPU instructions, though. They're not. The code has kept up with modern CPUs and languages.
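To make the channel layout described above concrete, here is a minimal sketch (using numpy, which neither comment mentions) of a four-channel 8-bit image:

```python
import numpy as np

# A tiny 2x2 RGBA image: four channels, each an 8-bit greyscale plane.
# uint8 holds values 0..255, i.e. 256 levels = 8 bits per channel.
img = np.zeros((2, 2, 4), dtype=np.uint8)
img[..., 0] = 255   # red channel fully on
img[..., 3] = 255   # alpha channel fully opaque

levels = np.iinfo(img.dtype).max + 1
print(levels)  # 256
```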
The format you describe is the standard, but there is also a higher-quality standard where each channel is 16 bits. This is supported to some extent by many graphics packages; certainly Photoshop has had fairly decent 16-bit channel support since Photosho
So the GP's question makes perfect sense: does GIMP now support 16-bit channels?
If either you or the GGP had read the article, you'd know the answer is no: it's been pushed back. Actually, most cameras don't operate with 16-bit/channel color. I know mine has 12 bits/channel in its raw format. That still means I lose a tiny little bit when processing with an 8-bit application, but I honestly can't see any difference. And I've kept my raw images for later, should I wish to redo them. It's not so much the end produc
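For a sense of scale, here is a rough sketch of the worst-case error when a 12-bit sample is squeezed into 8 bits and expanded back (the arithmetic is generic, not tied to any particular camera's raw format):

```python
# 12 bits = 4096 levels, 8 bits = 256 levels, so roughly 16 raw levels
# collapse into each 8-bit level. The round trip is off by at most
# ~8 of 4096 levels, i.e. about 0.2% of full scale.

def to_8bit(v12):
    return round(v12 * 255 / 4095)

def back_to_12bit(v8):
    return round(v8 * 4095 / 255)

worst = max(abs(v - back_to_12bit(to_8bit(v))) for v in range(4096))
print(worst)  # 8
```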
Most video cards only do 32-bit "true color", that is, 8 bits each for R, G, B and an 8-bit alpha. You won't see any difference between 8, 12 and 16 bit per channel images with most cards. If you have a fancier card with well-configured drivers, you might be able to see the difference.
Yes, my screen is limited to 8 bpc. But if I export it to 8 bpc or 16 bpc TIFF, perform the same operations and print it, I don't see the difference. At least not in any sense of "this is the original, this is the lo
Every time you do some kind of convolution on an image, rounding takes place. If you start with an 8-bit channel, the resulting image after several convolutions will have lost a significant amount of information. If you use 16 bits per channel, you will lose less. It seems rather quaint, however, that the GIMP does destructive editing. A better design would store the sequence of transforms you apply to an image, and then run them at a higher colour resolution.
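A toy illustration of that accumulation (a 1% brightness boost stands in for a real convolution here; the numbers are illustrative only): rounding to 8-bit integers after every step drifts away from doing the arithmetic at full precision and rounding once at the end.

```python
# Apply a 1% brightness boost 20 times: once rounding to 8-bit
# integers after every step, once keeping float precision and rounding
# only at the end. Per-step rounding accumulates -- e.g. a pixel at 50
# has 50 * 1.01 = 50.5, which Python's round() (ties to even) sends
# straight back to 50, so it never moves at all.
pixels = list(range(256))

eight_bit = pixels[:]
exact = [float(p) for p in pixels]
for _ in range(20):
    eight_bit = [min(255, round(v * 1.01)) for v in eight_bit]
    exact = [min(255.0, v * 1.01) for v in exact]

once = [min(255, round(v)) for v in exact]
drift = max(abs(a - b) for a, b in zip(eight_bit, once))
print(drift > 0)  # True: the two results disagree
```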
This is not that bad, really. You can't actually do most operations if you can't see how they'll turn out. And the rounding issue when convolving is solved by using more resolution, not more colors.
You won't see any difference between 8, 12 and 16 bit per channel images with most cards.
This is 99% true for image VIEWING programs. However, for image MANIPULATION programs (like the GIMP), it's a very different story.
Say for example you have a photograph that is underexposed such that the brightest pixel is 25% gray. For the sake of argument let's deal with a grayscale image (or just one channel of an RGB image).
On a histogram, all the 'bars' for this underexposed picture will be bunched up on the left side, o
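To finish the thought the truncated comment is making, here is a sketch of why stretching an underexposed 8-bit channel posterizes. It assumes the brightest pixel is 25% grey (value 63 of 255), with the analogous cutoff (16383 of 65535) for a 16-bit channel:

```python
# Stretch the 0..63 range up to 0..255: only 64 distinct output values
# can occur, so the stretched histogram shows gaps ("combing").
# The same stretch on a 16-bit channel keeps 16384 distinct levels.
levels_8 = {round(v * 255 / 63) for v in range(64)}
levels_16 = {round(v * 65535 / 16383) for v in range(16384)}

print(len(levels_8), "of 256 possible 8-bit output levels")
print(len(levels_16), "of 65536 possible 16-bit output levels")
```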
I wanted to second this comment. I've been hoping for 16-bits-per-channel support in GIMP for a long time. I read somewhere that this was going to be a feature in the 2.0 release -- whoops! My digital camera puts out 12 bits per channel. Photoshop can handle this, but the GIMP can't. Astronomical CCDs typically produce 16 bits or more. This precision is vital in astronomy because there is detail at multiple intensities (the classic example is the nucleus and outer extents of a galaxy). Often tricks like logari
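The truncated "logari" presumably refers to logarithmic scaling. Assuming so, here is a sketch of why it helps with high-dynamic-range astronomical data (the 1%-of-full-scale "faint pixel" is an invented example):

```python
import math

# Map a 16-bit intensity (0..65535) to an 8-bit display value,
# linearly vs. logarithmically. The linear map crushes faint detail
# (e.g. a galaxy's outer regions) to near-black in order to keep the
# bright nucleus in range; the log map spreads the faint end out.
def linear8(v16):
    return round(v16 * 255 / 65535)

def log8(v16):
    return round(255 * math.log1p(v16) / math.log1p(65535))

faint = 655  # ~1% of full scale
print(linear8(faint))  # 3   -- nearly black
print(log8(faint))     # 149 -- easily visible
```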
"Protozoa are small, and bacteria are small, but viruses are smaller
than the both put together."
8-bit graphic ? (Score:2, Redundant)
If I am right, may I know if the new 2.4 version has any improvement on this front?
Thanks!
true color and your display (Score:1)
Nvidia 7800 has up to 128 bpp [nvidia.com]. So do other fancy cards.
My crummy nforce4 has no such options, even with the nvidia driver. This is no big deal to me now.