Eric Bohlman wrote:
jake <ja**@gododdin.demon.co.uk> wrote in
news:Kt**************@gododdin.demon.co.uk:
Actually, I find that most browsers (IE/Mozilla/Netscape/Opera) do
quite a good job of re-scaling the image downwards (i.e. automatic
'down sampling'). The secret seems to be to produce the image big
enough so that, for most common browser size settings, the browser
is *always* re-sampling downwards.
The larger size of the image means that the compression needs to be
higher, and so the image is probably not quite as good as a
fixed-sized one -- but it's really not too bad at all for all
practical purposes.
It works well with flexible pages: as the screen shrinks, the
image shrinks to maintain the same proportion (which may or may not
be a good thing, depending upon what you're aiming for).
It should not, however, be used to create thumbnails. The distortion
then is usually severe, download time is *badly* affected, and
browsers (it's happened to me with both MSIE and Opera in various
versions) tend to scroll slowly or jerkily, or even crash, when
viewing pages with lots of "dumbnails."
I guess the point is that browsers should not be relied upon to resize
images *drastically*.
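The proportional shrinking described above amounts to fitting the image's natural size into the available width, and only ever scaling downwards. A minimal sketch in Python (the function name and figures are invented for illustration, not anything a browser actually exposes):

```python
def scaled_size(natural_w, natural_h, max_w):
    """Fit an image into a column max_w pixels wide, preserving
    the aspect ratio.  Never scale upwards -- if the image already
    fits, the browser has no resampling to do."""
    if natural_w <= max_w:
        return natural_w, natural_h
    scale = max_w / natural_w
    return max_w, round(natural_h * scale)

# A 2000 x 1000 image placed in an 800-pixel-wide flexible column:
print(scaled_size(2000, 1000, 800))  # (800, 400)
```

As long as the source image is bigger than any column it will be placed in, every rendering is a downward resample, which is the case the browsers handle well.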
I agree with all of the above, for the current image standards such as JPEG,
GIF, PNG. (SVG is obviously intended to scale, but isn't properly supported
yet and doesn't really cover what is being talked about here).
I think it will change with JPEG2000. I've just been discussing elsewhere some
possibilities. I hope I have understood enough about it:
http://groups.google.com/gr*****************************@newsfep1-gui.server.ntli.net
<extract>
A key is the progressive nature of the serialisation of the compressed image.
The first part of the data stream has a low quality/resolution version of the
image, then later parts of the stream progressively add detail. (I haven't got
as far as finding out whether the first version has low colour as well as low
detail, but I think that is a possibility).
Suppose a photographer puts (say) a 2000 x 1000 JPEG2000 on the web site. A
user viewing that full size on a fast enough network will take the whole lot,
and first see a low resolution version that gradually improves. On a lower
speed network, a user may choose to stop the stream part way through, and make
do with the lower resolution image. Or ....
As far as I can tell, another intention is that the UA can display the image
over a smaller area of the screen (as the width & height in the "img" can say
at the moment). So if the UA shows the above picture at 1000 x 500, it will
only need to take part of the stream in order to achieve high resolution at
the small size. In other words, downward-scalability combined with bandwidth
optimisation. Thumbnails would simply be images formed from just the first
part of the data stream and displayed on a small area of the screen. They
would not be separate things on the server, nor would they need any different
protocol or even parameters - except ceasing the stream earlier.
</extract>
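The idea in the extract can be sketched with a toy model: treat the stream as a sequence of layers, each of which adds resolution, so that stopping the stream early still yields a complete (lower-resolution) image. The layer table and byte counts below are invented for illustration; this is the general shape of progressive refinement, not the actual JPEG2000 codestream format:

```python
# Toy model of a progressive image stream.  Each entry says:
# once this many bytes have arrived, this resolution is decodable.
LAYERS = [
    (10_000,  (250, 125)),
    (40_000,  (500, 250)),
    (120_000, (1000, 500)),
    (400_000, (2000, 1000)),
]

def resolution_after(bytes_received):
    """Best resolution decodable from a truncated stream."""
    best = (0, 0)
    for needed, res in LAYERS:
        if bytes_received >= needed:
            best = res
    return best

def bytes_needed_for(target_w):
    """Smallest prefix of the stream giving at least target_w
    pixels of width -- stop fetching once this much has arrived."""
    for needed, (w, _) in LAYERS:
        if w >= target_w:
            return needed
    return LAYERS[-1][0]  # need the whole stream

# A UA showing the 2000 x 1000 image at only 1000 x 500 can cease
# the stream after the third layer:
print(bytes_needed_for(1000))  # 120000
```

A thumbnail, in this model, is just the first layer: the same URL, the same stream, stopped very early and drawn small.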
Apart from the fact that we are some way away from having widespread support
for JPEG2000, there is the problem identified by this thread. How will this
extra flexibility be influenced by CSS? We must surely want these effects to
be handled by CSS without needing constructs that only work for JPEG2000. So
the CSS, the mark-up, the JPEG2000, the UA, etc., all need to cooperate to make
the image fit naturally into place, according to page-design intentions and
user's accessibility / scalability / performance requirements.
I can't yet put those pieces together in my mind to see which way this will go
with JPEG2000.
--
Barry Pearson
http://www.Barry.Pearson.name/photography/ http://www.BirdsAndAnimals.info/ http://www.ChildSupportAnalysis.co.uk/