
Re: [ice-dev] MOOSE Job error message

Robert,


The MOOSE Item is the only place where it is available now.


Jay


Jay Jay Billings
Oak Ridge National Laboratory
Twitter Handle: @jayjaybillings

From: ice-dev-bounces@xxxxxxxxxxx <ice-dev-bounces@xxxxxxxxxxx> on behalf of Smith, Robert W. <smithrw@xxxxxxxx>
Sent: Thursday, January 07, 2016 3:09 PM
To: ice developer discussions
Subject: Re: [ice-dev] MOOSE Job error message
 

Jay,


Is the Plant View still supposed to be accessible from anywhere other than the MOOSE Item? I need a way to open a Plant View in ICE to test the JavaFX version, and this is the only way left that I know of to create one.


(Even if not, I still have other work on the branch to keep me busy until Alex is back.)


Robert


From: ice-dev-bounces@xxxxxxxxxxx <ice-dev-bounces@xxxxxxxxxxx> on behalf of Jay Jay Billings <jayjaybillings@xxxxxxxxx>
Sent: Thursday, January 07, 2016 3:02 PM
To: ice developer discussions
Subject: Re: [ice-dev] MOOSE Job error message
 
If it still fails after merging, you can assume all is OK for now and move on until he gets back.

Jay

On Thu, Jan 7, 2016 at 2:44 PM, Smith, Robert W. <smithrw@xxxxxxxx> wrote:

Jay,


Not since he merged his Job Launcher branch into next, no. I'll merge and give it another try.


Thanks,


Robert


From: ice-dev-bounces@xxxxxxxxxxx <ice-dev-bounces@xxxxxxxxxxx> on behalf of Jay J. Billings <billingsjj@xxxxxxxx>
Sent: Thursday, January 07, 2016 2:41 PM
To: ice-dev@xxxxxxxxxxx
Subject: Re: [ice-dev] MOOSE Job error message
 
Robert,

Have you merged the latest changes from next into your branch? Alex has made a lot of changes to the way things are launched in ICE that might be in the nightly but not in your branch.

Jay

On 01/07/2016 02:35 PM, Smith, Robert W. wrote:

All,


I'm running into a strange problem while working with RELAP-7, and I was hoping someone with more MOOSE experience could tell me what the issue is. When I run a MOOSE workflow with a file in my branch, I get the error message below back from RELAP-7; when I run the same workflow using the latest unstable nightly, it completes fine. I shouldn't have made any changes to the way ICE and MOOSE communicate in my branch. Both versions are run with remote connections to the same machine, and comparing the files created on the remote server shows that both create identical files with the same permissions on the file and its containing folder, so I'm not sure what is stopping RELAP-7 from reading the file.


Any help would be appreciated,


Robert


'MOOSE Tree Validation' has encountered a problem. 


*** ERROR ***
Unable to open file "/home/r8s/ICEJobs/iceLaunch_20160107020534javafxBranchJob.i". Check to make sure that it exists and that you have read permission.
*** ERROR ***
Unable to open file "/home/r8s/ICEJobs/iceLaunch_20160107020534javafxBranchJob.i". Check to make sure that it exists and that you have read permission.

[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
0x00007f0ad954c03c in waitpid () from /lib64/libc.so.6
#0  0x00007f0ad954c03c in waitpid () from /lib64/libc.so.6
#1  0x00007f0ad94d1092 in do_system () from /lib64/libc.so.6
#2  0x00007f0adddeaafe in libMesh::print_trace(std::ostream&) () from /home/moose/moose/scripts/../libmesh/installed/lib/libmesh_opt.so.0
#3  0x00007f0ae007562c in MooseUtils::checkFileReadable(std::string const&, bool, bool) () from /home/moose/moose/framework/libmoose-opt.so.0
#4  0x00007f0adfa2473e in Parser::parse(std::string const&) () from /home/moose/moose/framework/libmoose-opt.so.0
#5  0x00007f0adfc17531 in MooseApp::setupOptions() () from /home/moose/moose/framework/libmoose-opt.so.0
#6  0x00007f0ae10e2c31 in Relap7App::setupOptions() () from /home/moose/relap-7/lib/librelap-7-opt.so.0
#7  0x00007f0adfc1577a in MooseApp::run() () from /home/moose/moose/framework/libmoose-opt.so.0
#8  0x000000000040ad58 in main ()
[0] /home/moose/moose/framework/src/utils/MooseUtils.C, line 97, compiled Sep 23 2015 at 12:02:10
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
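
One way to narrow this down is to reproduce the readability check that fails above directly on the remote host. A minimal Python sketch, assuming it is run on the remote machine as the same user the job runs under; the path is the one from the error message, so substitute whatever input file your launch actually creates:

    import os
    import stat

    # Path taken from the error message above; substitute the input file
    # your launch actually created.
    path = "/home/r8s/ICEJobs/iceLaunch_20160107020534javafxBranchJob.i"

    # The two things the error message asks about: existence and read
    # permission for the current user.
    print("exists:  ", os.path.exists(path))
    print("readable:", os.access(path, os.R_OK))

    # Permission bits and owner for the file and its containing folder,
    # since a folder the user cannot traverse also blocks the read.
    for p in (path, os.path.dirname(path)):
        if os.path.exists(p):
            st = os.stat(p)
            print(p, oct(stat.S_IMODE(st.st_mode)), "uid:", st.st_uid)
        else:
            print(p, "does not exist")

If this reports the file as readable, the difference is more likely in which user or node actually runs the binary than in the file itself.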





-- 
Jay Jay Billings
Oak Ridge National Laboratory
(865) 241-6308
Twitter Handle: @jayjaybillings

I only check email at 8:30, 12:30 and 16:30 every day. Please call me or send 
me a tweet if your message is urgent and I will take a look.




--
Jay Jay Billings
Oak Ridge National Laboratory
Twitter Handle: @jayjaybillings
