[ome-users] OMERO and storing large amounts of data

Pasi Kankaanpää pkankaan at abo.fi
Wed Apr 17 14:02:08 BST 2019


Hi everyone,

We at Turku BioImaging and the Finnish Euro-BioImaging Node are 
currently making plans for an OMERO server setup, and some questions 
have come up in this context. The OMERO team recommended posting these 
questions here, in case they would also benefit others. Any experiences, 
thoughts, and comments would be greatly appreciated.

1) How well would you estimate OMERO handles large files, such as data 
from a light sheet microscope, in these two scenarios: a) a large amount 
of data consisting of numerous small files, and b) a large amount of data 
consisting of just one or a few very large files?

2) If we think of a 5-10 year time span and a total cumulative data 
volume of 10 petabytes over that period, would OMERO handle this without 
problems?

3) If we think of a scenario where 10-20% of the total storage capacity 
would be faster-access working storage and the rest more permanent 
archival storage (which can have slower access times), how would you 
envision OMERO working in this scenario, and what would you see as the 
pros and cons of these alternatives:

a) OMERO would run only on the working storage, not the archival storage 
(how would transfers between them take place?)

b) OMERO would run only on the archival storage, not the working storage 
(how would transfers between them take place?)

c) OMERO would run on both the working and archival storage (how would 
transfers between them take place?)

d) 100% of the capacity would be high-speed storage operated by OMERO, 
with no separate working and archival tiers

Thanks a lot in advance for any comments,

Pasi

--
Pasi Kankaanpää, PhD
Administrative Director, Turku BioImaging
Project Manager, Euro-BioImaging
Åbo Akademi University and University of Turku
Turku, Finland

pkankaan at abo.fi
