8-bit graphic ? (Score:2, Redundant)
Technically speaking, all graphics packages are based on "8-bit" technology. An image on a computer is (usually) made up of four channels: red, green, blue, and alpha. Each channel is a greyscale image with 256 levels of grey, and 256 levels = 8 bits.
If I am right, may I know whether the new 2.4 release makes any improvement on this front?
Thanks!
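The four-channels-of-8-bits layout described above can be sketched in a few lines. The ARGB packing order and the function names here are illustrative assumptions, not any particular library's API:

```python
# Sketch: an RGBA pixel as four 8-bit channels packed into one 32-bit value.
# The ARGB byte order is an assumption; real pixel formats vary.
def pack_argb(r, g, b, a):
    assert all(0 <= c <= 255 for c in (r, g, b, a))  # 8 bits per channel
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb(pixel):
    return ((pixel >> 16) & 0xFF,  # red
            (pixel >> 8) & 0xFF,   # green
            pixel & 0xFF,          # blue
            (pixel >> 24) & 0xFF)  # alpha

p = pack_argb(255, 128, 0, 255)  # opaque orange
print(hex(p))           # 0xffff8000
print(unpack_argb(p))   # (255, 128, 0, 255)
```

Four channels at 8 bits each is exactly the 32-bit "true color" a video card works with.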
Re:8-bit graphic ? (Score:2)
That's not the same as saying they use 8-bit CPU instructions, though. They don't. The code has kept up with modern CPUs and languages.
Re:8-bit graphic ? (Score:3, Informative)
The format you describe is the standard, but there is also a higher-quality standard in which each channel is 16 bits. It is supported to some extent by many graphics packages; certainly Photoshop has had fairly decent 16-bit-channel support for several versions now. So the GP's question makes perfect sense: does GIMP now support 16-bit channels?
Re:8-bit graphic ? (Score:2)
And if either you or the GGP had read the article, the answer is no. It's been pushed back. Actually, most cameras don't operate with 16 bit/channel color. I know mine has 12 bits/channel in the raw format. That still means I lose a tiiiny little bit when processing with a 8 bit application, but I honestly can't see any difference. And I've kept my raw images for later, should I wish to redo them. It's not so much the end produc
true color and your display (Score:1)
Most video cards only do 32-bit "true color": 8 bits each for R, G, and B, plus an 8-bit alpha. You won't see any difference between 8-, 12-, and 16-bit-per-channel images on most cards. With a fancier card and well-configured drivers, you might be able to tell the difference.
Re:true color and your display (Score:3, Informative)
This is 99% true for image VIEWING programs.
However, for image MANIPULATION programs (like the GIMP) it's a very different story.
Say, for example, you have a photograph that is underexposed, such that the brightest pixel is 25% grey. For the sake of argument, let's deal with a greyscale image (or just one channel of an RGB image).
On a histogram, all the bars for this underexposed picture will be bunched up on the left side, occupying only the bottom quarter of the range (levels 0-63 out of 0-255). If you stretch the levels to fill the full range in an 8-bit editor, those 64 input levels can only map to 64 distinct output levels, leaving gaps in the histogram and visible banding. At 16 bits per channel, the same stretch still leaves thousands of levels to work with.
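The level-loss argument can be demonstrated directly. A minimal sketch, where the linear "levels" stretch and the sample data are illustrative assumptions:

```python
# Sketch of the precision argument: an underexposed image whose brightest
# pixel is 25% grey, stretched to full range at 8-bit vs 16-bit depth.
def stretch(values, old_max, new_max):
    # Linear "levels" stretch: map [0, old_max] onto [0, new_max].
    return [round(v * new_max / old_max) for v in values]

# 8-bit workflow: the image data only occupies levels 0..63 (25% of 255).
eight_bit = list(range(64))                  # every level the image contains
stretched8 = stretch(eight_bit, 63, 255)
print(len(set(stretched8)))                  # 64 distinct levels -> histogram gaps

# 16-bit workflow: the same scene occupies levels 0..16383 (25% of 65535).
sixteen_bit = list(range(16384))
stretched16 = stretch(sixteen_bit, 16383, 65535)
print(len(set(stretched16)))                 # 16384 distinct levels -> smooth
```

After the 8-bit stretch, only one output level in four is populated, which is exactly the "combing" you see in an editor's histogram.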
Re:true color and your display (Score:2)
I wanted to second this comment: I've been hoping for 16-bits-per-channel support in GIMP for a long time, and I read somewhere it was going to be a feature in the 2.0 release -- whoops! My digital camera puts out 12 bits per channel; Photoshop can handle this, but not the GIMP. Astronomical CCDs typically produce 16 bits or more, and that precision is vital in astronomy because there is detail at multiple intensities (the classic example is the nucleus and the outer extents of a galaxy). Tricks like logarithmic scaling often help to visualize such an image. Sure, there are specialized astronomical imaging programs that handle this, but for amateurs the GIMP would be the perfect tool, if only it supported higher bit depths.
In regular photography much the same needs arise, except that they are not so overwhelmingly important. It's not uncommon to want to bring out some detail in the shadows while not saturating lighter parts of an image. Then there are those super-high-contrast LCDs that have been coming out...
I don't see why more people don't think this is a big deal. Images with 16 bits per channel have been around for at least 15 years.
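The logarithmic-scaling trick mentioned above can be sketched as follows; the particular curve and the 16-bit maximum are illustrative assumptions, not how any specific astronomy package does it:

```python
import math

# Sketch: logarithmic scaling of high-dynamic-range (e.g. 16-bit CCD) data
# down to an 8-bit display range, so both a bright galaxy nucleus and its
# faint outer extents stay visible at once.
def log_scale_to_8bit(value, max_value=65535):
    # Map [0, max_value] -> [0, 255] with a log curve that lifts the shadows.
    return round(255 * math.log1p(value) / math.log1p(max_value))

for v in (0, 100, 1000, 10000, 65535):
    print(v, '->', log_scale_to_8bit(v))
```

A linear mapping would crush level 100 of 65535 to display value 0; the log curve keeps that faint detail well above black while still reaching 255 at full scale.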