BUILD report for RGMQL on celaya2

This page was generated on 2019-04-09 13:19:37 -0400 (Tue, 09 Apr 2019).

Package 1338/1703

RGMQL 1.3.0
Simone Pallotta

Snapshot Date: 2019-04-08 17:01:18 -0400 (Mon, 08 Apr 2019)
URL: https://git.bioconductor.org/packages/RGMQL
Branch: master
Last Commit: 6b6f770
Last Changed Date: 2019-02-05 13:44:16 -0400 (Tue, 05 Feb 2019)

Hostname   OS / Arch                                INSTALL   BUILD      CHECK     BUILD BIN
malbec2    Linux (Ubuntu 18.04.2 LTS) / x86_64      OK        OK         OK                   UNNEEDED, same version exists in internal repository
tokay2     Windows Server 2012 R2 Standard / x64    OK        OK         OK        OK         UNNEEDED, same version exists in internal repository
celaya2    OS X 10.11.6 El Capitan / x86_64         OK        [ ERROR ]  skipped   skipped
merida2    OS X 10.11.6 El Capitan / x86_64         OK        OK         OK        OK

Summary

Package: RGMQL
Version: 1.3.0
Command: /Library/Frameworks/R.framework/Versions/Current/Resources/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
StartedAt: 2019-04-08 22:49:24 -0400 (Mon, 08 Apr 2019)
EndedAt: 2019-04-08 22:50:33 -0400 (Mon, 08 Apr 2019)
ElapsedTime: 68.6 seconds
RetCode: 1
Status:  ERROR 
PackageFile: None
PackageFileSize: NA

Command output

##############################################################################
##############################################################################
###
### Running command:
###
###   /Library/Frameworks/R.framework/Versions/Current/Resources/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
###
##############################################################################
##############################################################################


* checking for file ‘RGMQL/DESCRIPTION’ ... OK
* preparing ‘RGMQL’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building ‘RGMQL-vignette.Rmd’ using rmarkdown
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/04/08 22:50:30 INFO SparkContext: Running Spark version 2.2.0
19/04/08 22:50:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/04/08 22:50:31 INFO SparkContext: Submitted application: GMQL-R
19/04/08 22:50:31 INFO SecurityManager: Changing view acls to: biocbuild
19/04/08 22:50:31 INFO SecurityManager: Changing modify acls to: biocbuild
19/04/08 22:50:31 INFO SecurityManager: Changing view acls groups to: 
19/04/08 22:50:31 INFO SecurityManager: Changing modify acls groups to: 
19/04/08 22:50:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(biocbuild); groups with view permissions: Set(); users  with modify permissions: Set(biocbuild); groups with modify permissions: Set()
19/04/08 22:50:32 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
[last message repeated 15 times]
19/04/08 22:50:32 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:748)
19/04/08 22:50:32 INFO SparkContext: Successfully stopped SparkContext
Quitting from lines 237-238 (RGMQL-vignette.Rmd) 
Error: processing vignette 'RGMQL-vignette.Rmd' failed with diagnostics:
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
--- failed re-building ‘RGMQL-vignette.Rmd’

SUMMARY: processing the following file failed:
  ‘RGMQL-vignette.Rmd’

Error: Vignette re-building failed.
Execution halted
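
The diagnostics above include Spark's own remedy: give the driver an explicit bind address via spark.driver.bindAddress, or via the standard SPARK_LOCAL_IP environment variable that Spark consults when choosing a driver address. Below is a minimal workaround sketch in R, under two assumptions not confirmed by this log: that the Spark JVM started by RGMQL inherits the R process environment, and that init_gmql() is the initialization call the vignette reaches at lines 237-238 (its exact arguments are not shown here).

    ## Hedged workaround sketch: point the Spark driver at the loopback
    ## interface before any Spark JVM is started in this R session.
    ## SPARK_LOCAL_IP is a standard Spark environment variable; a JVM
    ## launched later from this session inherits it.
    Sys.setenv(SPARK_LOCAL_IP = "127.0.0.1")

    library(RGMQL)

    ## init_gmql() initializes RGMQL's Spark-backed GMQL engine; the
    ## no-argument call is illustrative only. See the vignette for the
    ## call actually made at lines 237-238 of RGMQL-vignette.Rmd.
    init_gmql()

Equivalently, exporting SPARK_LOCAL_IP=127.0.0.1 in the biocbuild user's environment before running the R CMD build command shown above would apply the same setting without modifying the vignette.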