[ome-users] ome-users Digest, Vol 45, Issue 9 - -4. OMERO compression and duplicate storage (Ghislain Bonamy)

Ghislain Bonamy GBonamy at gnf.org
Sun Dec 7 22:04:20 GMT 2008


Frans,
 
Thanks for the answer. I agree with you that JPEG 2000 offers state-of-the-art compression when it comes to images. This is why I have implemented JPEG 2000 compression in Bio-Formats for the TIFF and OME files (readers and writers); this should hopefully be released soon. I am also trying to implement a solution to allow for variable bit-depth compression. For instance, images that are on a 16-bit scale but only use the first 8 bits should be saved as 1 byte per pixel and not 2, which would increase compression by a further factor of 2. I would also like to generalize this so that images that use 12 bits of data are compressed as 12-bit images and not 16-bit ones (if this is possible, it would certainly save a lot of space, and would be particularly useful for lossy compression). Finally, I intend to implement lossy JPEG 2000 compression on top of this variable bit-depth format, which should allow roughly 10-fold compression without many artifacts, in particular for images that are dim.
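The variable bit-depth idea above can be sketched in a few lines: if every pixel of a nominally 16-bit image fits in 8 bits, store 1 byte per pixel instead of 2 before any codec-level compression is applied. This is an illustrative sketch only, not Bio-Formats' actual implementation; `pack_pixels` and `unpack_pixels` are hypothetical names.

```python
import struct

def pack_pixels(pixels):
    """Pack nominally 16-bit pixel values into 1 byte each when they all fit.

    Returns (bytes_per_pixel, packed_data).
    """
    if max(pixels) < 256:
        # All values fit in 8 bits: halve the storage.
        return 1, bytes(pixels)
    # Otherwise store big-endian unsigned 16-bit values.
    return 2, struct.pack(f">{len(pixels)}H", *pixels)

def unpack_pixels(bytes_per_pixel, data):
    """Invert pack_pixels, recovering the original pixel list."""
    if bytes_per_pixel == 1:
        return list(data)
    return list(struct.unpack(f">{len(data) // 2}H", data))
```

A real implementation would also record the chosen depth in the file header so readers know how to interpret the buffer.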
 
I would be most interested to hear how you compress your 3D data! Do you compress one slice at a time, or do you somehow use inter-slice compression, as in video codecs, where information from multiple slices is combined to improve the overall compression? If so, how could this be implemented for multi-stack TIFFs and the like, and built into Bio-Formats?
 
As for the storage solution, I am not quite sure how CASTOR operates or how expensive it is, but I have prompted our IT people to have a look at it. This could ultimately be the solution of choice!
 
Best,
 
 Ghislain Bonamy, PhD 
_______________________________
Genomic Institute of the
Novartis Research
Foundation
Functional Genomics, G214
+1 (858) 812-1534 (W & F)
+1 (858) 354-7388 (C)
www.gnf.org
 
HudsonAlpha Institute for Biotechnology
www.haib.org

________________________________

From: Cornelissen, Frans [PRDBE] [mailto:FCORNELI at its.jnj.com]
Sent: Sat 12/6/2008 4:55 AM
To: ome-users at lists.openmicroscopy.org.uk; Ghislain Bonamy
Subject: RE: ome-users Digest, Vol 45, Issue 9 - -4. OMERO compression and duplicate storage (Ghislain Bonamy)




   4. OMERO compression and duplicate storage (Ghislain Bonamy)

----------------------------------------------------------------------

Ghislain,


About: "store lossy images on disk":

We are currently using JPEG 2000 compression for this.
The advantage is that you do not even need to store two versions of your
image files: JP2 allows reading an image at any size or quality level
out of a single (lossless or lossy) compressed file.

JP2 is currently THE state-of-the-art way to get maximum flexibility AND
maximum compression ratio. For 2D biological images compressed
losslessly, factors of 2.5 to 4 are achievable (depending on the actual
image content); in 3D, quasi-lossless compression (PSNR > 45) at ratios
of 20 to 100 times is possible.


About:"slow tape system to store files"

You could have a look at CASTOR from Caringo (www.caringo.com), a
simple, cheap, but very efficient next-generation software system for
distributed storage on heterogeneous hardware, with very good
performance. It scales very well to more than 80 PB.
We have been testing it to store JPEG 2000 images for 18 months, and it
has worked without a flaw!
It is also in use at the Center for Inherited Disease Research (CIDR) at
Johns Hopkins University.



Message: 4
Date: Fri, 5 Dec 2008 10:10:54 -0800
From: "Ghislain Bonamy" <GBonamy at gnf.org>
Subject: [ome-users] OMERO compression and duplicate storage
To: <ome-users at lists.openmicroscopy.org.uk>
Message-ID:
        <F5A26DAD36F60843830631774C95CAE205807A58 at EXCH2.rec.gnf.org>
Content-Type: text/plain; charset="us-ascii"

Dear all,

I was wondering if OMERO has already implemented, is considering, or
would consider implementing a mechanism to store images in two different
formats while keeping their metadata linked.

I am working in a center where we are generating several TB of data a
month, and keeping all of our images on disk is becoming impossible. To
remedy this, we have a > 2 PB tape storage solution, which is however
very slow (it takes about 2-4 minutes to retrieve a file the first time,
until it has been de-queued). My idea would be to store lossy images on
disk for people to view and modify metadata, while keeping the
full-resolution image on our slow storage solution in case it needs to
be reanalyzed, for instance. The metadata would, however, be kept
identical between the two versions of each image.

Perhaps a way to do this would be to store the lossy file in OMERO and
keep in its metadata a link to the original image. If there is another,
better solution, please let me know.

In addition, does OMERO store the metadata in a compressed way (as well
as the images), or is there a way to have OMERO apply a script (for
instance, gzip compression) when images are imported and when an image
is queried?

Thanks a bunch for all your help and answers,

Best,

Ghislain Bonamy, PhD
__________________________________________

Research Investigator I
Genomic Institute of the Novartis Research Foundation
Department of Molecular & Cell Biology, room G214
10675 John Jay Hopkins Drive
San Diego CA 92121
USA

+1 (858) 812-1534 (W & F)
+1 (757) 941-4194 (H)
+1 (858) 354-7388 (M)
www.gnf.org

HudsonAlpha Institute for Biotechnology
www.hudsonalpha.org


_______________________________________________
ome-users mailing list
ome-users at lists.openmicroscopy.org.uk
http://lists.openmicroscopy.org.uk/mailman/listinfo/ome-users


End of ome-users Digest, Vol 45, Issue 9
****************************************





