
BUILD report for RGMQL on tokay1

This page was generated on 2020-04-15 12:29:41 -0400 (Wed, 15 Apr 2020).

Package 1422/1823: RGMQL 1.6.0
Maintainer: Simone Pallotta
Snapshot Date: 2020-04-14 16:46:13 -0400 (Tue, 14 Apr 2020)
URL: https://git.bioconductor.org/packages/RGMQL
Branch: RELEASE_3_10
Last Commit: 8b679b5
Last Changed Date: 2019-10-29 13:10:56 -0400 (Tue, 29 Oct 2019)

Hostname  OS / Arch                              INSTALL  BUILD      CHECK    BUILD BIN
malbec1   Linux (Ubuntu 18.04.4 LTS) / x86_64    OK       ERROR      skipped
tokay1    Windows Server 2012 R2 Standard / x64  OK       [ ERROR ]  skipped  skipped
merida1   OS X 10.11.6 El Capitan / x86_64       OK       ERROR      skipped  skipped

Summary

Package: RGMQL
Version: 1.6.0
Command: chmod a+r RGMQL -R && C:\Users\biocbuild\bbs-3.10-bioc\R\bin\R.exe CMD build --keep-empty-dirs --no-resave-data RGMQL
StartedAt: 2020-04-14 23:54:37 -0400 (Tue, 14 Apr 2020)
EndedAt: 2020-04-14 23:56:12 -0400 (Tue, 14 Apr 2020)
ElapsedTime: 94.9 seconds
RetCode: 1
Status:  ERROR  
PackageFile: None
PackageFileSize: NA
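
Note: the failing step can be rerun outside the build system with the same flags. A minimal sketch from an R session, assuming R is on the PATH and the RELEASE_3_10 sources of RGMQL have been cloned into the working directory:

    # Rebuild the package with the same options the builder uses
    # (the chmod step and the full Windows path to R.exe are omitted here).
    system2("R", c("CMD", "build", "--keep-empty-dirs", "--no-resave-data", "RGMQL"))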

Command output

##############################################################################
##############################################################################
###
### Running command:
###
###   chmod a+r RGMQL -R && C:\Users\biocbuild\bbs-3.10-bioc\R\bin\R.exe CMD build --keep-empty-dirs --no-resave-data RGMQL
###
##############################################################################
##############################################################################


* checking for file 'RGMQL/DESCRIPTION' ... OK
* preparing 'RGMQL':
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building 'RGMQL-vignette.Rmd' using rmarkdown
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/14 23:55:47 INFO SparkContext: Running Spark version 2.2.0
20/04/14 23:55:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/04/14 23:55:49 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
	at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:378)
	at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:393)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:386)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
	at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2430)
	at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2430)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2430)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
	at it.polimi.genomics.r.Wrapper$.initGMQL(Wrapper.scala:96)
	at it.polimi.genomics.r.Wrapper.initGMQL(Wrapper.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at RJavaTools.invokeMethod(RJavaTools.java:386)
20/04/14 23:55:50 INFO SparkContext: Submitted application: GMQL-R
20/04/14 23:55:52 INFO SecurityManager: Changing view acls to: biocbuild
20/04/14 23:55:52 INFO SecurityManager: Changing modify acls to: biocbuild
20/04/14 23:55:52 INFO SecurityManager: Changing view acls groups to: 
20/04/14 23:55:52 INFO SecurityManager: Changing modify acls groups to: 
20/04/14 23:55:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(biocbuild); groups with view permissions: Set(); users  with modify permissions: Set(biocbuild); groups with modify permissions: Set()
20/04/14 23:55:58 INFO Utils: Successfully started service 'sparkDriver' on port 50027.
20/04/14 23:55:59 INFO SparkEnv: Registering MapOutputTracker
20/04/14 23:55:59 INFO SparkEnv: Registering BlockManagerMaster
20/04/14 23:56:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/14 23:56:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/14 23:56:00 INFO DiskBlockManager: Created local directory at C:\Users\biocbuild\bbs-3.10-bioc\tmp\blockmgr-773a4539-4d71-4b64-aa05-703014cd30d4
20/04/14 23:56:00 INFO MemoryStore: MemoryStore started with capacity 114.6 MB
20/04/14 23:56:00 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/14 23:56:03 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/14 23:56:03 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.29.0.5:4040
20/04/14 23:56:03 INFO Executor: Starting executor ID driver on host localhost
20/04/14 23:56:03 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50088.
20/04/14 23:56:03 INFO NettyBlockTransferService: Server created on 172.29.0.5:50088
20/04/14 23:56:04 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/04/14 23:56:04 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 172.29.0.5, 50088, None)
20/04/14 23:56:04 INFO BlockManagerMasterEndpoint: Registering block manager 172.29.0.5:50088 with 114.6 MB RAM, BlockManagerId(driver, 172.29.0.5, 50088, None)
20/04/14 23:56:04 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 172.29.0.5, 50088, None)
20/04/14 23:56:04 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 172.29.0.5, 50088, None)
GMQL Server is up
Quitting from lines 431-433 (RGMQL-vignette.Rmd) 
Error: processing vignette 'RGMQL-vignette.Rmd' failed with diagnostics:

--- failed re-building 'RGMQL-vignette.Rmd'

SUMMARY: processing the following file failed:
  'RGMQL-vignette.Rmd'

Error: Vignette re-building failed.
Execution halted
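
Note: the vignette failure is reported with an empty diagnostic, but the log also shows Spark failing to locate winutils.exe on this Windows builder. A minimal sketch of a common Spark-on-Windows workaround, not a verified fix for this build: Spark looks for %HADOOP_HOME%\bin\winutils.exe, so pointing HADOOP_HOME at a directory containing a matching winutils.exe (the C:\hadoop path below is hypothetical) before the Spark context is created may avoid the IOException:

    library(RGMQL)
    # Assumes winutils.exe has been obtained and placed in C:\hadoop\bin (hypothetical path).
    Sys.setenv(HADOOP_HOME = "C:\\hadoop")
    # Initialize the local Spark-backed GMQL engine, as the failing vignette chunk does.
    init_gmql()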