[ome-users] Problems with OMERO Imports

Christian Carsten Sachs c.sachs at fz-juelich.de
Thu Jan 7 09:45:37 GMT 2016


Hello Josh,

Thank you, and a happy New Year to you as well!
I've spent this week tinkering with the issue (unfortunately, each import 
takes ~1h to fail, so it's not a fast process ;)) and have some new 
findings:

First, I had our admin raise the memory settings on the production 
system; unfortunately, this did not solve the issue ...
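
(For completeness, and as an assumption on my part about the mechanism, 
since I did not run these commands myself: raising the Blitz heap 
normally amounts to something like

    omero config set omero.jvmcfg.heap_size.blitz 8g
    omero admin restart

and then checking the "Max Memory (MB)" line that JvmSettingsCheck 
prints into Blitz-0 after the restart; OS/path specifics aside.)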

Afterwards, I had our test system (a VM that I administer) grown 
(RAM- and HDD-wise) and started investigating the issue there.

Fortunately, I could replicate the problem. Unfortunately, after some 
testing it appears to be stranger than I feared:

When I imported the file without the '--auto_close --minutes_wait=0' 
parameters, it worked without error (I used the exact same call that had 
failed, minus the two options, and ran it manually) ...

Incredulous, I retried it immediately (with the two parameters, started 
by the script), and it failed.
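
(To make the difference explicit: the full call is in the CLI transcript 
I linked previously, and the target/dataset arguments are omitted here, 
but in essence the script-issued import that fails looks roughly like

    bin/omero import --transfer=ln_s --auto_close --minutes_wait=0 <path-to>/nd034.nd2

while the manual run that succeeded was the identical command line with 
'--auto_close --minutes_wait=0' removed.)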

Concerning the swallowed exception, are you sure the line in question is 
1882 (1886)?
I was also wondering about the many

ERROR [    ome.formats.OMEROMetadataStoreClient] (2-thread-2) Server error setting extended properties for Pixels:<num> Target file:<file>

errors without any further info, so I changed line 1594 in 
https://github.com/openmicroscopy/openmicroscopy/blob/v5.2.0/components/blitz/src/ome/formats/OMEROMetadataStoreClient.java#L1594 
to log.error(..., e) so that the exception is logged as well, and used 
the resulting custom-built jar ...
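
Paraphrasing the change from memory rather than quoting the source 
verbatim (variable names are approximated), it is essentially:

    // before: the exception is swallowed; only the message is logged
    log.error("Server error setting extended properties for Pixels:" + pixId + " Target file:" + file);

    // after (custom build): pass the exception so the stack trace shows up in Blitz-0
    log.error("Server error setting extended properties for Pixels:" + pixId + " Target file:" + file, e);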

Here's an example:

2016-01-06 18:32:25,202 ERROR [    ome.formats.OMEROMetadataStoreClient] (2-thread-2) Server error setting extended properties for Pixels:2451 Target file:sachs_202/2016-01/06/17-08-57.945/nd034.nd2
Ice.ObjectNotExistException: null
	at IceInternal.Outgoing.invoke(Outgoing.java:158) ~[ice.jar:na]
	at omero.api._MetadataStoreDelM.setPixelsFile(_MetadataStoreDelM.java:233) ~[blitz.jar:na]
	at omero.api.MetadataStorePrxHelper.setPixelsFile(MetadataStorePrxHelper.java:749) ~[blitz.jar:na]
	at omero.api.MetadataStorePrxHelper.setPixelsFile(MetadataStorePrxHelper.java:721) ~[blitz.jar:na]
	at ome.formats.OMEROMetadataStoreClient.setPixelsFile(OMEROMetadataStoreClient.java:1590) ~[blitz.jar:na]
	at ome.services.blitz.repo.ManagedImportRequestI.pixelData(ManagedImportRequestI.java:565) [blitz.jar:na]
	at ome.services.blitz.repo.ManagedImportRequestI.step(ManagedImportRequestI.java:406) [blitz.jar:na]
	at omero.cmd.HandleI.steps(HandleI.java:438) [blitz.jar:na]
	at omero.cmd.HandleI$1.doWork(HandleI.java:366) [blitz.jar:na]
	at omero.cmd.HandleI$1.doWork(HandleI.java:362) [blitz.jar:na]
	at sun.reflect.GeneratedMethodAccessor280.invoke(Unknown Source) ~[na:na]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:1.8.0_65]
	at java.lang.reflect.Method.invoke(Unknown Source) ~[na:1.8.0_65]
	at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307) [spring-aop.jar:3.0.1.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183) [spring-aop.jar:3.0.1.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150) [spring-aop.jar:3.0.1.RELEASE]
	at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:562) [server.jar:na]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at ome.security.basic.EventHandler.invoke(EventHandler.java:154) [server.jar:na]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:111) [spring-orm.jar:3.0.1.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:108) [spring-tx.jar:3.0.1.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) [server.jar:na]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) [server.jar:na]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop.jar:3.0.1.RELEASE]
	at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202) [spring-aop.jar:3.0.1.RELEASE]
	at com.sun.proxy.$Proxy72.doWork(Unknown Source) [na:na]
	at ome.services.util.Executor$Impl.execute(Executor.java:443) [server.jar:na]
	at omero.cmd.HandleI.run(HandleI.java:360) [blitz.jar:na]
	at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.8.0_65]
	at ome.services.util.Executor$Impl$1.call(Executor.java:484) [server.jar:na]
	at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.8.0_65]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [na:1.8.0_65]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.8.0_65]
	at java.lang.Thread.run(Unknown Source) [na:1.8.0_65]

You can see the Blitz-0 log covering both imports (the first, successful 
one, and the second, unsuccessful one) here:

https://fz-juelich.sciebo.de/index.php/s/ZKy1ccVzqqCjLn0/download

Any ideas what the problem might be? The strange thing is that 
'--auto_close --minutes_wait=0' seems to work fine with smaller files, 
and I'd like the import call within the script to return as quickly as 
possible ...

Thank you,
best regards
Christian Sachs

On 01/05/2016 08:54 AM, Josh Moore wrote:
> Happy New Year, Christian and others.
>
> On Fri, Dec 18, 2015 at 5:29 PM, Christian Carsten Sachs
> <c.sachs at fz-juelich.de> wrote:
>> Hello Josh,
>>
>> On 12/18/2015 09:58 AM, Josh Moore wrote:
>>>
>>> Good morning,
>>>
>>> On Thu, Dec 17, 2015 at 6:26 PM, Christian Carsten Sachs
>>> <c.sachs at fz-juelich.de> wrote:
>>>
>>>> On 12/17/2015 05:25 PM, Josh Moore wrote:
>>>>>
>>>>>
>>>>> On Thu, Dec 17, 2015 at 5:10 PM, Christian Carsten Sachs
>>>
>>>
>>>>>> Nikon ND2 image stacks (.nd2 files) are imported using the command line
>>>>>> client, in ln_s mode (called by a custom OMERO.dropbox-like script,
>>>>>> enforcing some additional naming rules etc.).
>>>>>
>>>>>
>>>>> Is the script available somewhere?
>>>>
>>>>
>>>> No, and we cannot easily upload it somewhere as it contains confidential
>>>> info ...
>>>
>>>
>>> Understood, but debugging without seeing what's going on can be
>>> difficult. Do you have log files from your script? Can you elide out
>>> the sensitive information?
>>>
>>
>> Here is the interaction with the OMERO cli tool:
>>
>> https://fz-juelich.sciebo.de/index.php/s/kUQMagLbBlPRQuk/download
>>
>> The Add_Annotation.py used is available here (third parties interested
>> in this, please be careful and don't blindly install it ...)
>> https://fz-juelich.sciebo.de/index.php/s/gzn9YU9QkfHvqW6/download
>>
>> The other script does not interact any further with OMERO besides the
>> commands given in the transcript above.
>
>
> Thanks for these. I noticed the use of --auto_close. That itself
> shouldn't cause problems as long as the number of imports is within
> reason, but combined with a very low memory setting it could mean
> you were not getting an exception that would otherwise have shown up
> earlier. If you are still having trouble after increasing the memory,
> you might want to try temporarily removing --auto_close and see if you
> get any more feedback.
>
> ...
>
>>>>>> Import proceeds like normal, and all positions are added to OMERO, but
>>>>>> the actual image data remains inaccessible. (Only the clock icon in
>>>>>> thumbnail view in the webclient).
>>>>>>
>>>>>> In Blitz-0 (or similarly the individual file log file), the following
>>>>>> errors are visible:
>>>>>>
>>>>>> (repeated for each plane)
>>>>>> 2015-12-17 16:07:10,458 ERROR [ ome.formats.OMEROMetadataStoreClient]
>>>>>> (2-thread-4) Server error setting extended properties for Pixels:8145
>>>>>> Target file:sachs_102/2015-12/17/15-07-53.074/nd009.nd2
>>>
>>>
>>> Unfortunately, that line:
>>>
>>>
>>> https://github.com/openmicroscopy/openmicroscopy/blob/v5.2.0/components/blitz/src/ome/formats/OMEROMetadataStoreClient.java#L1882
>>>
>>> swallows the exception. We could try to come up with a build that
>>> would print out what is going on, but it might be that another log
>>> file has related information. Could you possibly send us the
>>> var/log/master* log files? The PostgreSQL log files may also have some
>>> information.
>>>
>>>
>>
>> master.err and master.out are both empty and have not been touched
>> for some time (they're supposed to be in the same folder as Blitz-0,
>> right?)
>>
>> I can't access the Postgres log as of now.
>
> No problem.
>
>
>>>>> Is there any stack trace above this line? Perhaps send the whole log
>>>>> (dir)?
>>>>
>>>>
>>>> I did not see any stack trace, which is why I'm rather puzzled ...
>>>> The whole logfile is several hundred megabytes in size, so I've split off
>>>> the last import process (actually, everything since the last restart, as I
>>>> had asked our admin to restart the system right before attempting the last
>>>> import).
>>>>
>>>> As it's still too large to comfortably add as an attachment, please see
>>>> the following (temporary) link:
>>>>
>>>> https://fz-juelich.sciebo.de/index.php/s/zeltfZ4hfrJUMLU
>>>
>>>
>>> Downloaded, thanks, but you're right, there's not enough info there.
>>>
>>>
>>>> My wording may have been misleading: it has worked on both the
>>>> test and production systems; the production system was tested with the
>>>> import script and some datasets in late November, and everything worked
>>>> fine.
>>>> Yesterday we wanted to 'go live' and let users start to import data ...
>>>> Thus we're a bit puzzled that problems are occurring now ...
>>>
>>>
>>> One guess: is this file larger than others that were attempted
>>> previously? The memory setting of your server is not particularly high
>>> for production:
>>>
>>> 2015-12-17 15:03:48,260 INFO  [
>>> ome.services.util.JvmSettingsCheck] (      main) Max Memory (MB):   =
>>>    2185
>>>
>>
>> Good hint!
>> A little background here: a colleague initially set up the server and
>> has since left the institute; I 'inherited' the project and scripted some
>> necessities for our workflows, and the institute's admin is in physical
>> control of the machine and handles administrative tasks there, so I'm not
>> yet aware of every minute detail of the setup.
>>
>> The machine has 24 GB of physical RAM; I'll change the settings to give
>> OMERO access to more. Our image files (multi-position, multi-channel,
>> time-lapse microscopy) range from 25 to 250 GB, so the current settings
>> might really be too small.
>>
>> Maybe that even solves the problem.
>>
>> Unfortunately, I can't try it out now, and I'll be away for the
>> holidays. I'll get back to test it next year.
>
> Sounds good. Let us know how it goes.
> ~Josh.
>
>
>> Thanks a lot,
>> best Regards,
>> Christian Sachs
>>
>>> Cheers,
>>> ~Josh
>>>
>>>
>>>> PS: I forgot to mention that the file itself opens properly in
>>>> Fiji with current Bio-Formats.
>>>>
>>>> Thank you very much,
>>>> Best regards
>>>>
>>>> Christian Sachs
>>>>
>>>>>> We're using OMERO 5.2.0 on a Windows server (which has, to my
>>>>>> knowledge, been upgraded multiple times from earlier versions).
>>>>>> None of the last three files imported was imported properly; we have
>>>>>> told our users to stop imports until the problem is resolved.
>>>>>>
>>>>>> Thank you,
>>>>>> best regards
>>>>>>
>>>>>> Christian Sachs