Adam Learning 19: Running mvn test from Eclipse on Windows

Source: Internet | Editor: 程序博客网 | Date: 2024/06/05 07:57

The output was:
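The post never shows the command that produced the log below, so the following is only a sketch of the presumed invocation. The checkout directory is taken from the paths printed in the log; the goal is assumed to be plain `mvn test`, which in ADAM's build skips Surefire and runs the ScalaTest suites through scalatest-maven-plugin, matching the "Tests are skipped" / "scalatest-maven-plugin" lines in the output.

```shell
# Hypothetical invocation -- not shown in the original post.
# Directory comes from the source paths printed in the Maven log.
cd "D:/all/eclipse432/adam-2.10-0.19-git-bin/adam-2.10-0.19-git"

# Runs the whole reactor (parent, core, APIs, CLI); ScalaTest suites
# are executed by scalatest-maven-plugin during the test phase.
mvn test
```

From Eclipse, the equivalent is a "Maven test" run configuration on the project root.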

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.10
[INFO] ADAM_2.10: Core
[INFO] ADAM_2.10: APIs for Java
[INFO] ADAM_2.10: CLI
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent_2.10 ---
[INFO] Modified 0 of 199 .scala files
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: Core 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-core_2.10 ---
[INFO] Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-core\src\main\scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core_2.10 ---
[INFO] Modified 0 of 159 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-core_2.10 ---
[INFO] Test Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-core\src\test\scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 65 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-core_2.10 ---
Discovery starting.
Discovery completed in 3 seconds, 139 milliseconds.
Run starting. Expected test count is: 373
RichGenotypeSuite:
- different ploidy
- all types for diploid genotype
ReferenceUtilsSuite:
- unionReferenceSet: empty
- unionReferenceSet: one region
- unionReferenceSet: multiple regions on one contig, all overlap
- unionReferenceSet: multiple regions on one contig, some overlap
- unionReferenceSet: multiple regions on multiple contigs
MdTagSuite:
- null md tag
- zero length md tag
- md tag with non-digit initial value
- md tag invalid base
- md tag, pure insertion
- md tag, pure insertion, test 2
- md tag pure insertion equality
- md tag equality and hashcode
- valid md tags
- get start of read with no mismatches or deletions
- get start of read with no mismatches, but with a deletion at the start
- get start of read with mismatches at the start
- get end of read with no mismatches or deletions
- check that mdtag and rich record return same end
- get end of read with no mismatches, but a deletion at end
- CIGAR with N operator
- CIGAR with multiple N operators
- CIGAR with P operators
- Get correct matches for mdtag with insertion
- Get correct matches for mdtag with mismatches and insertion
- Get correct matches for mdtag with insertion between mismatches
- Get correct matches for mdtag with intron between mismatches
- Get correct matches for mdtag with intron and deletion between mismatches
- Throw exception when number of deleted bases in mdtag disagrees with CIGAR
- Get correct matches for mdtag with mismatch, insertion and deletion
- Get correct matches for mdtag with mismatches, insertion and deletion
- Get correct matches for MDTag with mismatches and deletions
- Get correct matches base from MDTag and CIGAR with N
- get end of read with mismatches and a deletion at end
- get correct string out of mdtag with no mismatches
- get correct string out of mdtag with mismatches at start
- get correct string out of mdtag with deletion at end
- get correct string out of mdtag with mismatches at end
- get correct string out of complex mdtag
- check complex mdtag
- move a cigar alignment by two for a read
- rewrite alignment to all matches
- rewrite alignment to two mismatches followed by all matches
- rewrite alignment to include a deletion but otherwise all matches
- rewrite alignment to include an insertion at the start of the read but otherwise all matches
- create new md tag from read vs. reference, perfect match
- create new md tag from read vs. reference, perfect alignment match, 1 mismatch
- create new md tag from read vs. reference, alignment with deletion
- create new md tag from read vs. reference, alignment with insert
- handle '=' and 'X' operators
- CIGAR/MD tag mismatch should cause errors
GenotypesToVariantsConverterSuite:
- Simple test of integer RMS
- Simple test of floating point RMS
- Max genotype quality should lead to max variant quality
- Genotype quality = 0.5 for two samples should lead to variant quality of 0.75
PairingRDDSuite:
2016-05-12 18:54:30 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-12 18:54:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an empty RDD returns an empty RDD
2016-05-12 18:54:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() < width returns an empty RDD
2016-05-12 18:54:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() == width returns an RDD with one element.
2016-05-12 18:54:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on a small RDD works correctly
2016-05-12 18:54:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding works correctly on a partitioned RDD
2016-05-12 18:54:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a simple sequence works
2016-05-12 18:54:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing an empty sequence returns an empty sequence
2016-05-12 18:54:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a sorted sequence works
2016-05-12 18:54:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds on an empty sequence returns an empty sequence
2016-05-12 18:54:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds gives us the right number and set of values
ADAMVariationRDDFunctionsSuite:
2016-05-12 18:54:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover samples from variant context
2016-05-12 18:54:58 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joins SNV database annotation
2016-05-12 18:54:59 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:01 WARN  :139 - Your hostname, xubo-PC resolves to a loopback/non-reachable address: fe80:0:0:0:482:722f:5976:ce1f%20, but we couldn't find any external IP address!
- can write, then read in .vcf file
SingleReadBucketSuite:
- convert unmapped pair to fragment
- convert proper pair to fragment
- convert read pair to fragment with first of pair chimeric read
FlankReferenceFragmentsSuite:
- don't put flanks on non-adjacent fragments
- put flanks on adjacent fragments
ReferencePositionSuite:
- create reference position from mapped read
- create reference position from variant
- create reference position from genotype
AlignmentRecordRDDFunctionsSuite:
2016-05-12 18:55:02 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sorting reads
2016-05-12 18:55:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts integer tag values correctly
2016-05-12 18:55:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag returns only those records which have the appropriate tag
2016-05-12 18:55:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag, when given a tag name that doesn't exist in the input, returns an empty RDD
2016-05-12 18:55:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTagValues counts distinct values of a tag
2016-05-12 18:55:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts tags in a SAM file correctly
2016-05-12 18:55:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to SAM and back to ADAM produces equivalent Read values
2016-05-12 18:55:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- SAM conversion sets read mapped flag properly
2016-05-12 18:55:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert malformed FASTQ (no quality scores) => SAM => well-formed FASTQ => SAM
2016-05-12 18:55:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to FASTQ and back to ADAM produces equivalent Read values
2016-05-12 18:55:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to paired-FASTQ and back to ADAM produces equivalent Read values
2016-05-12 18:55:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:12 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:55:12 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing a small sorted file as SAM should produce the expected result
2016-05-12 18:55:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:13 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:55:13 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing unordered sam from unordered sam
2016-05-12 18:55:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:13 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:55:13 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing ordered sam from unordered sam
2016-05-12 18:55:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:14 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:55:14 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single sam file back
2016-05-12 18:55:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:16 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:55:16 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single bam file back
FlattenerSuite:
- Flatten schema and record
ShuffleRegionJoinSuite:
2016-05-12 18:55:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Overlapping reference regions
2016-05-12 18:55:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Multiple reference regions do not throw exception
2016-05-12 18:55:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- RegionJoin2 contains the same results as cartesianRegionJoin
ConsensusGeneratorFromReadsSuite:
2016-05-12 18:55:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking search for consensus list for artificial reads
NucleotideContigFragmentRDDFunctionsSuite:
2016-05-12 18:55:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- generate sequence dict from fasta
2016-05-12 18:55:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from a single contig fragment
2016-05-12 18:55:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from a single contig fragment
2016-05-12 18:55:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from multiple contig fragments
2016-05-12 18:55:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from multiple contig fragments
2016-05-12 18:55:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment as FASTA text file
2016-05-12 18:55:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with description as FASTA text file
2016-05-12 18:55:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fields as FASTA text file
2016-05-12 18:55:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fragment number as FASTA text file
2016-05-12 18:55:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null number of fragments in contig as FASTA text file
2016-05-12 18:55:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments from same contig as FASTA text file
2016-05-12 18:55:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments with description from same contig as FASTA text file
2016-05-12 18:55:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment null fragment number
2016-05-12 18:55:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment number zero
2016-05-12 18:55:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge multiple contig fragments
FragmentConverterSuite:
- build a fragment collector and convert to a read
2016-05-12 18:55:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of discontinuous fragments, all from the same contig
2016-05-12 18:55:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of contiguous fragments, all from the same contig
2016-05-12 18:55:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of varied fragments from multiple contigs
InterleavedFastqInputFormatSuite:
2016-05-12 18:55:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample1.ifq->interleaved_fastq_sample1.ifq.output
2016-05-12 18:55:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample2.ifq->interleaved_fastq_sample2.ifq.output
2016-05-12 18:55:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample3.ifq->interleaved_fastq_sample3.ifq.output
2016-05-12 18:55:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample4.ifq->interleaved_fastq_sample4.ifq.output
ReferenceRegionSuite:
- contains(: ReferenceRegion)
- contains(: ReferencePosition)
- merge
- overlaps
- distance(: ReferenceRegion)
- distance(: ReferencePosition)
- create region from unmapped read fails
- create region from mapped read contains read start and end
- validate that adjacent regions can be merged
- validate that non-adjacent regions cannot be merged
- compute convex hull of two sets
- region name is sanitized when creating region from read
- intersection fails on non-overlapping regions
- compute intersection
- overlap tests for oriented reference region
- check the width of a reference region
AlignmentRecordConverterSuite:
- testing the fields in a converted ADAM Read
- converting a read with null quality is OK
- convert a read to fastq
- reverse complement reads when converting to fastq
- converting to fastq with unmapped reads
- converting a fragment with no alignments should yield unaligned reads
- converting a fragment with alignments should restore the alignments
IntervalListReaderSuite:
- Can read the simple GATK-supplied example interval list file
SingleFastqInputFormatSuite:
2016-05-12 18:55:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample1.fq->single_fastq_sample1.fq.output
2016-05-12 18:55:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample2.fq->single_fastq_sample2.fq.output
2016-05-12 18:55:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample3.fq->single_fastq_sample3.fq.output
2016-05-12 18:55:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample4.fq->single_fastq_sample4.fq.output
RegExpSuite:
- matches returns Some(matcher) when a complete match is found
- find returns Some(matcher) when a partial match is found
AttributeUtilsSuite:
- parseTags returns a reasonable set of tagStrings
- parseTags works with NumericSequence tagType
- empty string is parsed as zero tagStrings
- incorrectly formatted tag throws an exception
- string tag with a ':' in it is correctly parsed
MarkDuplicatesSuite:
2016-05-12 18:55:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- single read
2016-05-12 18:55:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at different positions
2016-05-12 18:55:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position
2016-05-12 18:55:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position with clipping
2016-05-12 18:55:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads on reverse strand
2016-05-12 18:55:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- unmapped reads
2016-05-12 18:55:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs
2016-05-12 18:55:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs with fragments
- quality scores
2016-05-12 18:55:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs that cross chromosomes
IndelRealignmentTargetSuite:
2016-05-12 18:55:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking simple realignment target
2016-05-12 18:55:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with deletion
2016-05-12 18:55:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with insertion
2016-05-12 18:55:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on same chr
2016-05-12 18:55:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on different chr throws exception
2016-05-12 18:55:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, same indel
2016-05-12 18:55:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, two different indel
2016-05-12 18:55:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from two disjoint reads
2016-05-12 18:55:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: one-by-one
2016-05-12 18:55:39 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: all-at-once (merged)
2016-05-12 18:55:39 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating indel targets for mason reads
TwoBitSuite:
- correctly read sequence from .2bit file
- correctly return masked sequences from .2bit file
- correctly return Ns from .2bit file
ConsensusSuite:
- test the insertion of a consensus insertion into a reference
- test the insertion of a consensus deletion into a reference
- inserting empty consensus returns the reference
DecadentReadSuite:
- reference position of decadent read
- reference position of decadent read with insertions
- build a decadent read from a read with null qual
- converting bad read should fail
2016-05-12 18:55:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:40 WARN  DecadentRead:64 - Converting read {"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null} to decadent read failed with java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null}). Skipping...
- convert an RDD that has an bad read in it with loose validation
2016-05-12 18:55:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:55:40 ERROR Executor:96 - Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)
    at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)
    at 
org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)    at scala.collection.immutable.List.foldLeft(List.scala:84)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)    at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)    ... 15 more2016-05-12 18:55:40 WARN  TaskSetManager:71 - Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, 
"mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)at org.apache.spark.scheduler.Task.run(Task.scala:88)at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)at java.lang.Thread.run(Unknown Source)Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)    at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)    at scala.collection.Iterator$class.foreach(Iterator.scala:727)    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)    at scala.collection.AbstractTraversable.map(Traversable.scala:105)    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)    at scala.collection.immutable.List.foldLeft(List.scala:84)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)    at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)    ... 
15 more2016-05-12 18:55:40 ERROR TaskSetManager:75 - Task 1 in stage 0.0 failed 1 times; aborting job- converting an RDD that has an bad read in it with strict validation will throw an errorAlphabetSuite:- test size of a case-sensitive alphabet- test apply of a case-sensitive alphabet- test reverse complement of a case-sensitive alphabet- test exact reverse complement of a case-sensitive alphabet- test size of a case-insensitive alphabet- test apply of a case-insensitive alphabet- test reverse complement of a case-insensitive alphabet- test exact reverse complement of a case-insensitive alphabet- DNA alphabet- map unknown bases to NFieldEnumerationSuite:2016-05-12 18:55:41 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".SLF4J: Defaulting to no-operation (NOP) logger implementationSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.- Empty projections are illegal2016-05-12 18:55:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:55:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- Simple projection on Read worksBroadcastRegionJoinSuite:- alternating returns an alternating seq of items- Single region returns itself- Two adjacent regions will be merged- Nonoverlapping regions will all be returned- Many overlapping regions will all be merged- ADAMRecords return proper references2016-05-12 18:55:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- Ensure same reference regions get passed together2016-05-12 18:55:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- Overlapping reference regions2016-05-12 18:55:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not 
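The `scala.MatchError: (D,Some(3))` above is thrown from `MdTag.scala` while parsing the read's `mismatchingPositions` field (`"3^C71"`). The test feeds a deliberately bad read: its CIGAR (`4S1M1D71M`) has only one aligned base before the deletion, while the MD tag claims three matches, which appears to be the inconsistency that loose validation skips and strict validation turns into a job abort. As a rough, hypothetical sketch of the MD-tag format only (not ADAM's actual `MdTag` code), an MD string tokenizes into match runs, deletions, and mismatched bases:

```python
import re

# SAM MD tag grammar (simplified): runs of digits are match counts,
# "^" followed by letters is a deletion of reference bases, and a lone
# letter is a mismatched reference base. "3^C71" = 3 matches, deletion
# of reference base C, then 71 matches.
MD_TOKEN = re.compile(r"(\d+)|\^([A-Za-z]+)|([A-Za-z])")

def parse_md(md):
    """Return (op, value) tuples: ('match', n), ('del', bases), ('mismatch', base)."""
    tokens, pos = [], 0
    for m in MD_TOKEN.finditer(md):
        if m.start() != pos:  # a gap means an unparseable character
            raise ValueError("malformed MD tag: %r" % md)
        pos = m.end()
        if m.group(1) is not None:
            tokens.append(("match", int(m.group(1))))
        elif m.group(2) is not None:
            tokens.append(("del", m.group(2)))
        else:
            tokens.append(("mismatch", m.group(3)))
    if pos != len(md):
        raise ValueError("malformed MD tag: %r" % md)
    return tokens

print(parse_md("3^C71"))  # [('match', 3), ('del', 'C'), ('match', 71)]
```

Walking these tokens in step with the CIGAR is where a consistency check like ADAM's can fail: a `D` CIGAR operator paired with a pending match count (here `(D,Some(3))`) has no legal interpretation.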
set.
- Multiple reference regions do not throw exception
2016-05-12 18:55:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- regionJoin contains the same results as cartesianRegionJoin
CoverageSuite:
- regionToWindows
- calculate empty coverage
- calculate coverage of one region
- calculate coverage of two regions
- calculate coverage of three regions
- calculate coverage of two adjacent regions
- calculate coverage of two nearby regions
- calculate coverage of three out-of-order regions
- calculate coverage of two regions which join at a window boundary
2016-05-12 18:55:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find empty coverage
2016-05-12 18:55:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of one region
2016-05-12 18:55:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions
2016-05-12 18:55:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three regions
2016-05-12 18:55:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two adjacent regions
2016-05-12 18:56:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two nearby regions
2016-05-12 18:56:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three out-of-order regions
2016-05-12 18:56:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions which join at a window boundary
VariantContextConverterSuite:
- Convert GATK site-only SNV to ADAM
- Convert GATK site-only SNV to ADAM with contig conversion
- Convert GATK site-only CNV to ADAM
- Convert GATK SNV w/ genotypes w/ phase information to ADAM
- Convert GATK SNV with different filters to ADAM
- Convert ADAM site-only SNV to GATK
- Convert ADAM site-only SNV to GATK with contig conversion
- Convert ADAM SNV w/ genotypes to GATK
- Convert GATK multi-allelic sites-only SNVs to ADAM
- Convert GATK multi-allelic SNVs to ADAM
- Convert gVCF reference records to ADAM
AttributeSuite:
- test SAMTagAndValue parsing
- Attributes can be correctly re-encoded as text SAM tags
SAMRecordConverterSuite:
- testing the fields in an alignmentRecord obtained from a mapped samRecord conversion
- testing the fields in an alignmentRecord obtained from an unmapped samRecord conversion
- '*' quality gets nulled out
ADAMContextSuite:
2016-05-12 18:56:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not fail on unmapped reads
2016-05-12 18:56:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not load a file without a type specified
2016-05-12 18:56:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM file
2016-05-12 18:56:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM with all attribute tag types
2016-05-12 18:56:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can filter a .SAM file based on quality
- Can convert to phred
- Can convert from phred
2016-05-12 18:56:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- findFiles correctly finds a nested set of directories
2016-05-12 18:56:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- loadADAMFromPaths can load simple RDDs that have just been saved
2016-05-12
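Among the ADAMContextSuite cases above are "Can convert to phred" and "Can convert from phred". The arithmetic behind any such conversion is the standard Phred definition, Q = -10·log10(p), where p is the error probability; the helper below is only an illustration of that formula, not ADAM's actual API:

```python
import math

def to_phred(p_error):
    # Phred quality: Q = -10 * log10(p), rounded to the nearest integer.
    return int(round(-10.0 * math.log10(p_error)))

def from_phred(q):
    # Inverse: p = 10 ** (-Q / 10).
    return 10.0 ** (-q / 10.0)

print(to_phred(0.001))  # 30 (1-in-1000 error chance)
print(from_phred(20))   # 0.01 (1-in-100 error chance)
```

This is also the encoding behind the `qual` string in the read dumps above: each character's ASCII value, minus an offset of 33 in modern FASTQ/SAM, is such a Q score.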
18:56:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .gtf file
2016-05-12 18:56:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .bed file
2016-05-12 18:56:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .narrowPeak file
2016-05-12 18:56:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:56:18 WARN  QueuedThreadPool:145 - 8 threads could not be stopped
- Can read a .interval_list file
2016-05-12 18:56:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .vcf file
2016-05-12 18:56:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 1
2016-05-12 18:56:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 2
2016-05-12 18:56:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 3
2016-05-12 18:56:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 4
2016-05-12 18:56:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 1
2016-05-12 18:56:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 2
2016-05-12 18:56:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 3
2016-05-12 18:56:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 4
2016-05-12 18:56:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- filter on load using the filter2 API
RichCigarSuite:
- moving 2 bp from a deletion to a match operator
- moving 2 bp from a insertion to a match operator
- moving 1 base in a two element cigar
- move to start of read
UtilSuite:
- isSameConfig
MDTaggingSuite:
2016-05-12 18:56:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags over boundary
2016-05-12 18:56:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads span full contig
2016-05-12 18:56:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment
2016-05-12 18:56:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads end inside last fragment
2016-05-12 18:56:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment and end inside last fragment
2016-05-12 18:56:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start and end in middle fragements
NormalizationUtilsSuite:
- cannot move an indel left if there are no bases to it's left
- move a simple indel to farthest position left until bases run out
- move a simple indel to farthest position left, past length of indel
- cannot move a left normalized indel in a short tandem repeat
- move an indel in a short tandem repeat
- move an indel in a short tandem repeat of more than 2 bases, where shift is not an integer multiple of repeated sequence length
- moving a simple read with single deletion that cannot shift
- shift an indel left by 0 in a cigar
- shift an indel left by 1 in a cigar
- do not left align a complex read which is already left aligned
SmithWatermanSuite:
- gather max position from simple scoring matrix
- gather max position from irregular scoring matrix
- gather max position from irregular scoring matrix with deletions
- score simple alignment with constant gap
- score irregular scoring matrix
- score irregular scoring matrix with indel
- can unroll cigars correctly
- execute simple trackback
- execute trackback with indel
- run end to end smith waterman for simple reads
- run end to end smith waterman for short sequences with indel
- run end to end smith waterman for longer sequences with snp
- run end to end smith waterman for longer sequences with short indel
- run end to end smith waterman for shorter sequence in longer sequence
- run end to end smith waterman for shorter sequence in longer sequence, with indel
FastaConverterSuite:
2016-05-12 18:56:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find contig index
- convert a single record without naming information
- convert a single record with naming information
2016-05-12 18:56:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert single fasta sequence
2016-05-12 18:56:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences
2016-05-12 18:56:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences; short fragment
2016-05-12 18:56:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert reference fasta file
IndelTableSuite:
- check for indels in a region with known indels
- check for indels in a contig that doesn't exist
- check for indels in a region
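The SmithWatermanSuite cases above exercise local alignment end to end, from matrix scoring to traceback. For readers unfamiliar with the algorithm, here is a minimal sketch of just the Smith-Waterman scoring recurrence (illustrative scoring constants, not necessarily ADAM's defaults, and no traceback/CIGAR step):

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
    """Fill the Smith-Waterman DP matrix and return the best local alignment score.

    Each cell is the max of: 0 (restart the alignment), the diagonal cell plus
    a match/mismatch score, or a gap extension from the left or upper cell.
    """
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(0, diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
            best = max(best, score[i][j])
    return best

print(smith_waterman_score("ACACACTA", "AGCACACA"))  # 12
```

A production aligner also records which predecessor each cell came from, then walks that traceback to emit an alignment (and, for reads, a CIGAR string); the tests named "execute simple trackback" cover exactly that part.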
without known indels
2016-05-12 18:56:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- build indel table from rdd of variants
RealignIndelsSuite:
2016-05-12 18:56:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking mapping to targets for artificial reads
2016-05-12 18:56:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking alternative consensus for artificial reads
2016-05-12 18:56:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking extraction of reference from reads
2016-05-12 18:56:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking realigned reads for artificial input
2016-05-12 18:56:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring
2016-05-12 18:56:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring for no mismatches
2016-05-12 18:56:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring after unpacking read
- we shouldn't try to realign a region with no target
2016-05-12 18:56:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- we shouldn't try to realign reads with no indel evidence
2016-05-12 18:56:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test OP and OC tags
BaseQualityRecalibrationSuite:
2016-05-12 18:56:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1
2016-05-12 18:56:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1 w/ VCF Sites
ProgramRecordSuite:
- convert a sam program record with no optional fields into a program record and v/v
- convert a sam program record into a program record and v/v
SequenceDictionarySuite:
- Convert from sam sequence record and back
- Convert from SAM sequence dictionary file (with extra fields)
- merge into existing dictionary
- Convert from SAM sequence dictionary and back
- Can retrieve sequence by name
- SequenceDictionary's with same single element are equal
- SequenceDictionary's with same two elements are equals
- SequenceDictionary's with different elements are unequal
- SequenceDictionaries with same elements in different order are compatible
- isCompatible tests equality on overlap
- The addition + works correctly
- The append operation ++ works correctly
- ContainsRefName works correctly for different string types
- Apply on name works correctly for different String types
- convert from sam sequence record and back
- convert from sam sequence dictionary and back
- conversion to sam sequence dictionary has correct sort order
GenomicPositionPartitionerSuite:
- partitions the UNMAPPED ReferencePosition into the top partition
- if we do not have a contig for a record, we throw an IAE
- partitioning into N pieces on M total sequence length, where N > M, results in M partitions
- correctly partitions a single dummy sequence into two pieces
- correctly counts cumulative lengths
- correctly partitions positions across two dummy sequences
2016-05-12 18:56:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:56:56 WARN  TaskSetManager:71 - Stage 0 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
2016-05-12 18:56:56 WARN  TaskSetManager:71 - Stage 1 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
- test that we can range partition ADAMRecords
2016-05-12 18:56:59 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test that simple partitioning works okay on a reasonable set of ADAMRecords
RichAlignmentRecordSuite:
- referenceLengthFromCigar
- Unclipped Start
- Unclipped End
- Illumina Optics
- Cigar Clipping Sequence
- tags contains optional fields
- Reference Positions
- read overlap unmapped read
- read overlap reference position
- read overlap same position different contig
GeneSuite:
2016-05-12 18:57:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from an Ensembl GTF file
2016-05-12 18:57:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from a Gencode GTF file
2016-05-12 18:57:03 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- chr20 gencode transcript sequences match the published sequences
MapToolsSuite:
- add two nonzero integer maps
- add two nonzero float maps
- adding an empty map is the identity
RecordGroupDictionarySuite:
- simple conversion to and from sam read group
- sample name must be set
- simple equality checks
Run completed in 2 minutes, 41 seconds.
Total number of tests run: 373
Suites: completed 52, aborted 0
Tests: succeeded 373, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
2016-5-12 18:56:25 INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
2016-5-12 18:56:25 INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: APIs for Java 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis_2.10 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-apis_2.10 ---
[INFO] Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-apis\src\main\scala added.
[INFO]
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis_2.10 ---
[INFO] Modified 0 of 4 .scala files
[INFO]
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-apis\src\main\resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-apis_2.10 ---
[INFO] Test Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-apis\src\test\scala added.
[INFO]
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-apis\src\test\resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-apis_2.10 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-apis_2.10 ---
Discovery starting.
Discovery completed in 377 milliseconds.
Run starting.
Expected test count is: 2
JavaADAMContextSuite:
2016-05-12 18:57:10 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-12 18:57:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:57:14 WARN  :139 - Your hostname, xubo-PC resolves to a loopback/non-reachable address: fe80:0:0:0:482:722f:5976:ce1f%20, but we couldn't find any external IP address!
- can read a small .SAM file
2016-05-12 18:57:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- can read a small .SAM file inside of java
Run completed in 10 seconds, 133 milliseconds.
Total number of tests run: 2
Suites: completed 2, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: CLI 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli_2.10 ---
[INFO]
[INFO] --- git-commit-id-plugin:2.2.0:revision (default) @ adam-cli_2.10 ---
[INFO]
[INFO] --- templating-maven-plugin:1.0-alpha-3:filter-sources (filter-src) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\target\generated-sources\java-templates added.
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source
(add-source) @ adam-cli_2.10 ---
[INFO] Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\src\main\scala added.
[INFO]
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli_2.10 ---
[INFO] Modified 0 of 36 .scala files
[INFO]
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\target\generated-sources\java-templates:-1: info: compiling
[INFO] D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\src\main\scala:-1: info: compiling
[INFO] Compiling 26 source files to D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\target\scala-2.10.4\classes at 1463050642158
[INFO] prepare-compile in 0 s
[INFO] compile in 20 s
[INFO]
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-cli_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\target\scala-2.10.4\classes
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-cli_2.10 ---
[INFO] Test Source directory: D:\all\eclipse432\adam-2.10-0.19-git-bin\adam-2.10-0.19-git\adam-cli\src\test\scala added.
[INFO]
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 9 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-cli_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-cli_2.10 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-cli_2.10 ---
Discovery starting.
Discovery completed in 458 milliseconds.
Run starting.
Expected test count is: 33
TransformSuite:
2016-05-12 18:57:46 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-12 18:57:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:57:50 WARN  :139 - Your hostname, xubo-PC resolves to a loopback/non-reachable address: fe80:0:0:0:482:722f:5976:ce1f%20, but we couldn't find any external IP address!
2016-05-12 18:57:52 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:57:52 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to unordered sam
2016-05-12 18:57:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-12 18:57:54 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:57:54 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to ordered sam
2016-05-12 18:57:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2016-05-12 18:57:57 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-12 18:57:57 WARN  AlignmentRecordRDDFunctions:521 -
Retrying as manual copy from the driver which will degrade performance.- unordered sam, to adam, to sam2016-05-12 18:57:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:57:58 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation2016-05-12 18:57:58 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.- unordered sam, to adam, to ordered samFlattenSuite:2016-05-12 18:57:58 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- can flatten a simple VCF fileFlagStatSuite:2016-05-12 18:58:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- Standard FlagStat testViewSuite:2016-05-12 18:58:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:02 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -f 0 -F 0 is a no-op2016-05-12 18:58:02 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:03 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- no -f or -F args is a no-op2016-05-12 18:58:03 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -f 4: only unmapped reads2016-05-12 18:58:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -F 4: 
only mapped reads2016-05-12 18:58:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -f 4 -F 8: unmapped reads with mapped mates2016-05-12 18:58:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -f 12: unmapped reads with unmapped mates2016-05-12 18:58:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -g 12: reads that are unmapped or whose mate is unmapped2016-05-12 18:58:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -F 12: mapped reads with mapped mates2016-05-12 18:58:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -g 36: unmapped reads or reads with mate on negative strand2016-05-12 18:58:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- -F 36: unmapped reads or reads with mate on negative strandPluginExecutorSuite:2016-05-12 18:58:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- take10 works correctly on example SAM2016-05-12 18:58:09 WARN  MetricsSystem:71 - Using 
default name DAGScheduler for source because spark.app.id is not set.- java take10 works correctly on example SAM2016-05-12 18:58:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- takeN works correctly on example SAM with arg of '3'ADAM2FastaSuite:2016-05-12 18:58:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- round trip FASTA to nucleotide contig fragments in ADAM format to FASTAFeatures2ADAMSuite:- can convert a simple BED file !!! IGNORED !!!2016-05-12 18:58:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.- can convert a simple wigfix fileAboutSuite:- template variables have been replaced- templated values are not emptyADAMMainSuite:- default command groups is not empty- module provides default command groups- inject default command groups when called via main- command groups is empty when called via apply- single command group- add new command group to default command groups- module restores default command groups when called via apply- custom module with single command group- custom module with new command group added to default command groupsAdam2FastqSuite:2016-05-12 18:58:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.2016-05-12 18:58:13 ERROR AlignmentRecordRDDFunctions:75 - Found 16 read names that don't occur exactly twice:    1x: 16Samples:    SRR062634.16445865    SRR062634.9119161    SRR062634.17190076    SRR062634.17969132    SRR062634.7301099    SRR062634.2087100    SRR062634.20911784    SRR062634.16769670    SRR062634.18958430    SRR062634.12099057    SRR062634.12606172    SRR062634.14985224    SRR062634.10448889    SRR062634.4789722    SRR062634.3203184    SRR062634.17698657- convert SAM to paired FASTQRun completed in 30 seconds, 260 milliseconds.Total number of tests run: 33Suites: completed 11, aborted 0Tests: succeeded 33, 
failed 0, canceled 0, ignored 1, pending 0All tests passed.[INFO] ------------------------------------------------------------------------[INFO] Reactor Summary:[INFO] [INFO] ADAM_2.10 .......................................... SUCCESS [  9.666 s][INFO] ADAM_2.10: Core .................................... SUCCESS [02:57 min][INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 13.568 s][INFO] ADAM_2.10: CLI ..................................... SUCCESS [ 56.670 s][INFO] ------------------------------------------------------------------------[INFO] BUILD SUCCESS[INFO] ------------------------------------------------------------------------[INFO] Total time: 04:17 min[INFO] Finished at: 2016-05-12T18:58:16+08:00[INFO] Final Memory: 61M/486M[INFO] ------------------------------------------------------------------------
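When the build produces logs this long, it can help to pull out just the ScalaTest summary lines ("Tests: succeeded N, failed N, ...") programmatically. The sketch below is illustrative only; the helper name and regex are my own and are not part of ADAM, ScalaTest, or Maven:

```python
import re

# Matches a ScalaTest summary line such as:
# "Tests: succeeded 33, failed 0, canceled 0, ignored 1, pending 0"
SUMMARY_RE = re.compile(
    r"Tests: succeeded (\d+), failed (\d+), canceled (\d+), "
    r"ignored (\d+), pending (\d+)"
)


def parse_scalatest_summaries(log_text):
    """Return one dict of counts per summary line found in the log text."""
    keys = ("succeeded", "failed", "canceled", "ignored", "pending")
    return [
        dict(zip(keys, map(int, m.groups())))
        for m in SUMMARY_RE.finditer(log_text)
    ]


if __name__ == "__main__":
    sample = "Tests: succeeded 33, failed 0, canceled 0, ignored 1, pending 0"
    print(parse_scalatest_summaries(sample))
```

Running this over a saved copy of the log above would yield one entry per module's test run (e.g. the adam-cli run with 33 succeeded and 1 ignored), which is a quick way to confirm that no module reported failures.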