ubuntu@ubuntu:~$ docker exec master benchmark
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/07/31 17:48:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/07/31 17:51:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/07/31 17:54:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Mahout: seqwiki
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/hadoop-2.10.1/bin/hadoop and HADOOP_CONF_DIR=/opt/hadoop-2.10.1/etc/hadoop
MAHOUT-JOB: /opt/mahout-0.13.0/mahout-examples-0.13.0-job.jar
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/07/31 18:26:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/07/31 18:26:48 INFO WikipediaToSequenceFile: Input: /user/root/wiki Out: /user/root/wiki-seq Categories: /root/categories All Files: false
21/07/31 18:28:45 INFO RMProxy: Connecting to ResourceManager at ubuntu/10.0.2.15:8032
21/07/31 18:29:53 WARN JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
21/07/31 18:31:31 INFO FileInputFormat: Total input files to process : 1
21/07/31 18:31:38 INFO JobSubmitter: number of splits:7
21/07/31 18:32:35 INFO JobSubmitter: Submitting tokens for job: job_1627751643629_0003
21/07/31 18:33:18 INFO Configuration: resource-types.xml not found
21/07/31 18:33:18 INFO ResourceUtils: Unable to find 'resource-types.xml'.
21/07/31 18:33:19 INFO ResourceUtils: Adding resource type - name = memory-mb, units = Mi, type = COUNTABLE
21/07/31 18:33:19 INFO ResourceUtils: Adding resource type - name = vcores, units = , type = COUNTABLE
21/07/31 18:34:01 INFO YarnClientImpl: Submitted application application_1627751643629_0003
21/07/31 18:34:10 INFO Job: The url to track the job: http://ubuntu:8088/proxy/application_1627751643629_0003/
21/07/31 18:34:10 INFO Job: Running job: job_1627751643629_0003
21/07/31 18:58:41 INFO Job: Job job_1627751643629_0003 running in uber mode : false
21/07/31 18:58:41 INFO Job: map 0% reduce 0%
21/07/31 19:39:09 INFO Job: map 1% reduce 0%
21/07/31 20:46:49 INFO Job: map 2% reduce 0%
21/07/31 21:53:19 INFO Job: map 3% reduce 0%
21/07/31 23:02:13 INFO Job: map 4% reduce 0%
21/08/01 00:10:53 INFO Job: map 5% reduce 0%
21/08/01 01:18:51 INFO Job: map 6% reduce 0%
21/08/01 02:29:57 INFO Job: map 7% reduce 0%
21/08/01 03:40:02 INFO Job: map 8% reduce 0%
21/08/01 04:48:35 INFO Job: map 9% reduce 0%
21/08/01 05:57:43 INFO Job: map 10% reduce 0%
21/08/01 07:09:06 INFO Job: map 11% reduce 0%
21/08/01 08:23:23 INFO Job: map 12% reduce 0%
21/08/01 09:32:41 INFO Job: map 13% reduce 0%
21/08/01 10:41:32 INFO Job: map 14% reduce 0%
21/08/01 11:50:18 INFO Job: map 15% reduce 0%
21/08/01 12:55:42 INFO Job: map 16% reduce 0%
21/08/01 14:02:37 INFO Job: map 17% reduce 0%
21/08/01 15:11:29 INFO Job: map 18% reduce 0%
21/08/01 16:19:01 INFO Job: map 19% reduce 0%
21/08/01 17:26:09 INFO Job: map 20% reduce 0%
21/08/01 18:34:10 INFO Job: map 21% reduce 0%
21/08/01 19:41:47 INFO Job: map 22% reduce 0%
21/08/01 20:48:44 INFO Job: map 23% reduce 0%
21/08/01 21:57:44 INFO Job: map 24% reduce 0%
21/08/01 23:06:54 INFO Job: map 25% reduce 0%
21/08/02 00:17:52 INFO Job: map 26% reduce 0%
21/08/02 01:27:55 INFO Job: map 27% reduce 0%
21/08/02 02:26:47 INFO Job: map 32% reduce 0%
21/08/02 02:58:18 INFO Job: map 33% reduce 0%
21/08/02 04:04:24 INFO Job: map 38% reduce 0%
21/08/02 04:29:40 INFO Job: map 39% reduce 0%
21/08/02 05:19:08 INFO Job: map 43% reduce 0%
21/08/02 05:19:13 INFO Job: map 44% reduce 0%
21/08/02 05:26:48 INFO Job: map 44% reduce 5%
21/08/02 05:26:56 INFO Job: map 44% reduce 14%
21/08/02 06:12:56 INFO Job: map 45% reduce 14%
21/08/02 07:51:33 INFO Job: map 46% reduce 14%
21/08/02 09:28:01 INFO Job: map 47% reduce 14%
21/08/02 11:08:03 INFO Job: map 48% reduce 14%
21/08/02 12:46:57 INFO Job: map 49% reduce 14%
21/08/02 14:24:27 INFO Job: map 50% reduce 14%
21/08/02 16:01:42 INFO Job: map 51% reduce 14%
21/08/02 17:39:58 INFO Job: map 52% reduce 14%
21/08/02 19:21:17 INFO Job: map 53% reduce 14%
21/08/02 21:00:45 INFO Job: map 54% reduce 14%
21/08/02 22:42:25 INFO Job: map 55% reduce 14%
21/08/03 00:26:06 INFO Job: map 56% reduce 14%
21/08/03 02:05:15 INFO Job: map 57% reduce 14%
21/08/03 03:43:07 INFO Job: map 58% reduce 14%
21/08/03 05:21:10 INFO Job: map 59% reduce 14%
21/08/03 07:00:10 INFO Job: map 60% reduce 14%
21/08/03 08:36:21 INFO Job: map 61% reduce 14%
21/08/03 09:37:38 INFO Job: map 66% reduce 14%
21/08/03 09:37:45 INFO Job: map 66% reduce 19%
21/08/03 10:43:57 INFO Job: map 67% reduce 19%
21/08/03 12:15:39 INFO Job: map 72% reduce 19%
21/08/03 12:15:48 INFO Job: map 72% reduce 24%
21/08/03 12:50:41 INFO Job: map 73% reduce 24%
21/08/03 14:19:16 INFO Job: map 74% reduce 24%
21/08/03 15:42:06 INFO Job: map 75% reduce 24%
21/08/03 17:04:23 INFO Job: map 76% reduce 24%
21/08/03 18:27:14 INFO Job: map 77% reduce 24%
21/08/03 19:54:41 INFO Job: map 78% reduce 24%
21/08/03 21:27:44 INFO Job: map 79% reduce 24%
21/08/03 22:58:56 INFO Job: map 80% reduce 24%
21/08/04 00:29:26 INFO Job: map 81% reduce 24%
21/08/04 02:00:34 INFO Job: map 82% reduce 24%
21/08/04 03:31:46 INFO Job: map 83% reduce 24%
21/08/04 05:01:33 INFO Job: map 84% reduce 24%
21/08/04 06:33:58 INFO Job: map 85% reduce 24%
21/08/04 08:05:33 INFO Job: map 86% reduce 24%
21/08/04 09:35:37 INFO Job: map 87% reduce 24%
21/08/04 11:03:59 INFO Job: map 88% reduce 24%
21/08/04 12:35:04 INFO Job: map 89% reduce 24%
21/08/04 14:05:36 INFO Job: map 90% reduce 24%
21/08/04 14:05:57 INFO Job: map 94% reduce 24%
21/08/04 14:06:05 INFO Job: map 94% reduce 29%
21/08/04 14:53:42 INFO Job: map 95% reduce 29%
21/08/04 17:23:03 INFO Job: map 99% reduce 29%
21/08/04 17:23:08 INFO Job: map 100% reduce 29%
21/08/04 17:23:18 INFO Job: map 100% reduce 35%
21/08/04 17:23:25 INFO Job: map 100% reduce 39%
21/08/04 17:24:32 INFO Job: map 100% reduce 67%
21/08/04 17:24:46 INFO Job: map 100% reduce 68%
21/08/04 17:25:07 INFO Job: map 100% reduce 69%
21/08/04 17:25:21 INFO Job: map 100% reduce 70%
21/08/04 17:25:34 INFO Job: map 100% reduce 71%
21/08/04 17:25:55 INFO Job: map 100% reduce 72%
21/08/04 17:26:15 INFO Job: map 100% reduce 73%
21/08/04 17:26:29 INFO Job: map 100% reduce 74%
21/08/04 17:26:49 INFO Job: map 100% reduce 75%
21/08/04 17:27:04 INFO Job: map 100% reduce 76%
21/08/04 17:27:25 INFO Job: map 100% reduce 77%
21/08/04 17:27:38 INFO Job: map 100% reduce 78%
21/08/04 17:28:00 INFO Job: map 100% reduce 79%
21/08/04 17:28:13 INFO Job: map 100% reduce 80%
21/08/04 17:28:33 INFO Job: map 100% reduce 81%
21/08/04 17:28:47 INFO Job: map 100% reduce 82%
21/08/04 17:29:08 INFO Job: map 100% reduce 83%
21/08/04 17:29:28 INFO Job: map 100% reduce 84%
21/08/04 17:29:42 INFO Job: map 100% reduce 85%
21/08/04 17:30:04 INFO Job: map 100% reduce 86%
21/08/04 17:30:17 INFO Job: map 100% reduce 87%
21/08/04 17:30:38 INFO Job: map 100% reduce 88%
21/08/04 17:30:51 INFO Job: map 100% reduce 89%
21/08/04 17:31:13 INFO Job: map 100% reduce 90%
21/08/04 17:31:33 INFO Job: map 100% reduce 91%
21/08/04 17:31:46 INFO Job: map 100% reduce 92%
21/08/04 17:32:07 INFO Job: map 100% reduce 93%
21/08/04 17:32:28 INFO Job: map 100% reduce 94%
21/08/04 17:32:41 INFO Job: map 100% reduce 95%
21/08/04 17:33:03 INFO Job: map 100% reduce 96%
21/08/04 17:33:23 INFO Job: map 100% reduce 97%
21/08/04 17:33:36 INFO Job: map 100% reduce 98%
21/08/04 17:33:57 INFO Job: map 100% reduce 99%
21/08/04 17:34:11 INFO Job: map 100% reduce 100%
21/08/04 17:34:41 INFO Job: Job job_1627751643629_0003 completed successfully
21/08/04 17:34:46 INFO ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
21/08/04 17:36:48 INFO Job: Counters: 50
    File System Counters
        FILE: Number of bytes read=710492864
        FILE: Number of bytes written=1253049379
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=920863953
        HDFS: Number of bytes written=541100693
        HDFS: Number of read operations=24
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Killed map tasks=1
        Launched map tasks=8
        Launched reduce tasks=1
        Data-local map tasks=8
        Total time spent by all maps in occupied slots (ms)=1606214008
        Total time spent by all reduces in occupied slots (ms)=433762602
        Total time spent by all map tasks (ms)=803107004
        Total time spent by all reduce tasks (ms)=216881301
        Total vcore-milliseconds taken by all map tasks=803107004
        Total vcore-milliseconds taken by all reduce tasks=216881301
        Total megabyte-milliseconds taken by all map tasks=1644763144192
        Total megabyte-milliseconds taken by all reduce tasks=444172904448
    Map-Reduce Framework
        Map input records=27404
        Map output records=9574
        Map output bytes=540834863
        Map output materialized bytes=540876210
        Input split bytes=686
        Combine input records=0
        Combine output records=0
        Reduce input groups=9574
        Reduce shuffle bytes=540876210
        Reduce input records=9574
        Reduce output records=9574
        Spilled Records=22042
        Shuffled Maps =7
        Failed Shuffles=0
        Merged Map outputs=7
        GC time elapsed (ms)=150363
        CPU time spent (ms)=837091310
        Physical memory (bytes) snapshot=2379509760
        Virtual memory (bytes) snapshot=19014438912
        Total committed heap usage (bytes)=1974968320
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=920863267
    File Output Format Counters
        Bytes Written=541100693
21/08/04 17:36:53 INFO MahoutDriver: Program took 342653951 ms (Minutes: 5710.8992)
21/08/04 17:37:24 WARN ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/04 17:37:54 WARN ShutdownHookManager: ShutdownHook 'ClientFinalizer' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/04 17:38:32 ERROR ShutdownHookManager: ShutdownHookManger shutdown forcefully after 30 seconds.
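The JobResourceUploader warning above ("Implement the Tool interface and execute your application with ToolRunner to remedy this") is Hadoop's standard hint that the submitting code did not go through ToolRunner's generic option parsing; the job still ran regardless. For reference, a minimal sketch of the Tool/ToolRunner pattern the warning refers to, where MyJobDriver is a hypothetical stand-in and not the Mahout driver used in this run:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    // Hypothetical driver illustrating the Tool/ToolRunner pattern named in the warning.
    public class MyJobDriver extends Configured implements Tool {

        @Override
        public int run(String[] args) throws Exception {
            // getConf() already carries any -D, -files, -libjars options
            // that ToolRunner parsed off the command line.
            Configuration conf = getConf();
            // ... configure and submit the MapReduce job with conf ...
            return 0;
        }

        public static void main(String[] args) throws Exception {
            // ToolRunner runs the generic option parser before calling run().
            System.exit(ToolRunner.run(new Configuration(), new MyJobDriver(), args));
        }
    }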
Mahout: seq2sparse
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/hadoop-2.10.1/bin/hadoop and HADOOP_CONF_DIR=/opt/hadoop-2.10.1/etc/hadoop
MAHOUT-JOB: /opt/mahout-0.13.0/mahout-examples-0.13.0-job.jar
21/08/04 17:53:34 INFO SparseVectorsFromSequenceFiles: Maximum n-gram size is: 2
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/08/04 17:53:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/08/04 17:55:52 INFO SparseVectorsFromSequenceFiles: Minimum LLR value: 1.0
21/08/04 17:55:52 INFO SparseVectorsFromSequenceFiles: Number of reduce tasks: 1
21/08/04 17:55:53 INFO SparseVectorsFromSequenceFiles: Tokenizing documents in /user/root/wiki-seq
21/08/04 17:56:19 INFO RMProxy: Connecting to ResourceManager at ubuntu/10.0.2.15:8032
21/08/04 17:59:23 INFO FileInputFormat: Total input files to process : 1
21/08/04 17:59:27 INFO JobSubmitter: number of splits:4
21/08/04 18:00:24 INFO JobSubmitter: Submitting tokens for job: job_1627751643629_0005
21/08/04 18:01:07 INFO Configuration: resource-types.xml not found
21/08/04 18:01:07 INFO ResourceUtils: Unable to find 'resource-types.xml'.
21/08/04 18:01:08 INFO ResourceUtils: Adding resource type - name = memory-mb, units = Mi, type = COUNTABLE
21/08/04 18:01:08 INFO ResourceUtils: Adding resource type - name = vcores, units = , type = COUNTABLE
21/08/04 18:01:19 INFO YarnClientImpl: Submitted application application_1627751643629_0005
21/08/04 18:01:27 INFO Job: The url to track the job: http://ubuntu:8088/proxy/application_1627751643629_0005/
21/08/04 18:01:27 INFO Job: Running job: job_1627751643629_0005
21/08/04 18:42:44 INFO Job: Job job_1627751643629_0005 running in uber mode : false
21/08/04 18:42:44 INFO Job: map 0% reduce 0%
21/08/04 18:53:30 INFO Job: map 1% reduce 0%
21/08/04 18:57:48 INFO Job: map 2% reduce 0%
21/08/04 19:02:18 INFO Job: map 3% reduce 0%
21/08/04 19:06:44 INFO Job: map 4% reduce 0%
21/08/04 19:11:09 INFO Job: map 5% reduce 0%
21/08/04 19:15:24 INFO Job: map 6% reduce 0%
21/08/04 19:20:02 INFO Job: map 7% reduce 0%
21/08/04 19:24:24 INFO Job: map 8% reduce 0%
21/08/04 19:28:49 INFO Job: map 9% reduce 0%
21/08/04 19:33:20 INFO Job: map 10% reduce 0%
21/08/04 19:37:46 INFO Job: map 11% reduce 0%
21/08/04 19:42:29 INFO Job: map 12% reduce 0%
21/08/04 19:46:52 INFO Job: map 13% reduce 0%
21/08/04 19:51:21 INFO Job: map 14% reduce 0%
21/08/04 19:56:05 INFO Job: map 15% reduce 0%
21/08/04 20:00:30 INFO Job: map 16% reduce 0%
21/08/04 20:05:05 INFO Job: map 17% reduce 0%
21/08/04 20:09:37 INFO Job: map 18% reduce 0%
21/08/04 20:14:01 INFO Job: map 19% reduce 0%
21/08/04 20:18:32 INFO Job: map 20% reduce 0%
21/08/04 20:23:13 INFO Job: map 21% reduce 0%
21/08/04 20:27:39 INFO Job: map 22% reduce 0%
21/08/04 20:32:18 INFO Job: map 23% reduce 0%
21/08/04 20:37:01 INFO Job: map 24% reduce 0%
21/08/04 20:41:34 INFO Job: map 25% reduce 0%
21/08/04 20:46:10 INFO Job: map 26% reduce 0%
21/08/04 20:50:36 INFO Job: map 27% reduce 0%
21/08/04 20:55:11 INFO Job: map 28% reduce 0%
21/08/04 20:59:35 INFO Job: map 29% reduce 0%
21/08/04 21:04:02 INFO Job: map 30% reduce 0%
21/08/04 21:08:38 INFO Job: map 31% reduce 0%
21/08/04 21:13:14 INFO Job: map 32% reduce 0%
21/08/04 21:17:48 INFO Job: map 33% reduce 0%
21/08/04 21:22:15 INFO Job: map 34% reduce 0%
21/08/04 21:26:39 INFO Job: map 35% reduce 0%
21/08/04 21:31:05 INFO Job: map 36% reduce 0%
21/08/04 21:35:38 INFO Job: map 37% reduce 0%
21/08/04 21:40:07 INFO Job: map 38% reduce 0%
21/08/04 21:44:40 INFO Job: map 39% reduce 0%
21/08/04 21:49:21 INFO Job: map 40% reduce 0%
21/08/04 21:53:51 INFO Job: map 41% reduce 0%
21/08/04 21:58:16 INFO Job: map 42% reduce 0%
21/08/04 22:02:59 INFO Job: map 43% reduce 0%
21/08/04 22:07:16 INFO Job: map 44% reduce 0%
21/08/04 22:11:45 INFO Job: map 45% reduce 0%
21/08/04 22:16:19 INFO Job: map 46% reduce 0%
21/08/04 22:21:03 INFO Job: map 47% reduce 0%
21/08/04 22:25:30 INFO Job: map 48% reduce 0%
21/08/04 22:30:17 INFO Job: map 49% reduce 0%
21/08/04 22:34:44 INFO Job: map 50% reduce 0%
21/08/04 22:39:18 INFO Job: map 51% reduce 0%
21/08/04 22:44:05 INFO Job: map 52% reduce 0%
21/08/04 22:48:42 INFO Job: map 53% reduce 0%
21/08/04 22:53:11 INFO Job: map 54% reduce 0%
21/08/04 22:57:40 INFO Job: map 55% reduce 0%
21/08/04 23:02:12 INFO Job: map 56% reduce 0%
21/08/04 23:06:59 INFO Job: map 57% reduce 0%
21/08/04 23:11:48 INFO Job: map 58% reduce 0%
21/08/04 23:16:25 INFO Job: map 59% reduce 0%
21/08/04 23:20:54 INFO Job: map 60% reduce 0%
21/08/04 23:25:30 INFO Job: map 61% reduce 0%
21/08/04 23:29:55 INFO Job: map 62% reduce 0%
21/08/04 23:34:42 INFO Job: map 63% reduce 0%
21/08/04 23:39:24 INFO Job: map 64% reduce 0%
21/08/04 23:43:52 INFO Job: map 65% reduce 0%
21/08/04 23:48:27 INFO Job: map 66% reduce 0%
21/08/04 23:53:10 INFO Job: map 67% reduce 0%
21/08/04 23:57:37 INFO Job: map 68% reduce 0%
21/08/05 00:02:16 INFO Job: map 69% reduce 0%
21/08/05 00:06:57 INFO Job: map 70% reduce 0%
21/08/05 00:11:24 INFO Job: map 71% reduce 0%
21/08/05 00:16:05 INFO Job: map 72% reduce 0%
21/08/05 00:22:50 INFO Job: map 73% reduce 0%
21/08/05 00:28:20 INFO Job: map 74% reduce 0%
21/08/05 00:33:26 INFO Job: map 75% reduce 0%
21/08/05 00:40:22 INFO Job: map 76% reduce 0%
21/08/05 00:47:11 INFO Job: map 77% reduce 0%
21/08/05 00:58:43 INFO Job: map 78% reduce 0%
21/08/05 01:11:18 INFO Job: map 79% reduce 0%
21/08/05 01:24:11 INFO Job: map 80% reduce 0%
21/08/05 01:37:21 INFO Job: map 81% reduce 0%
21/08/05 01:50:13 INFO Job: map 82% reduce 0%
21/08/05 02:03:18 INFO Job: map 83% reduce 0%
21/08/05 02:16:06 INFO Job: map 84% reduce 0%
21/08/05 02:28:31 INFO Job: map 85% reduce 0%
21/08/05 02:40:59 INFO Job: map 86% reduce 0%
21/08/05 02:54:07 INFO Job: map 87% reduce 0%
21/08/05 03:06:29 INFO Job: map 88% reduce 0%
21/08/05 03:19:09 INFO Job: map 89% reduce 0%
21/08/05 03:31:44 INFO Job: map 90% reduce 0%
21/08/05 03:44:32 INFO Job: map 91% reduce 0%
21/08/05 03:56:58 INFO Job: map 92% reduce 0%
21/08/05 04:09:07 INFO Job: map 93% reduce 0%
21/08/05 04:21:39 INFO Job: map 94% reduce 0%
21/08/05 04:33:44 INFO Job: map 95% reduce 0%
21/08/05 04:45:41 INFO Job: map 96% reduce 0%
21/08/05 04:58:27 INFO Job: map 97% reduce 0%
21/08/05 05:11:27 INFO Job: map 98% reduce 0%
21/08/05 05:24:03 INFO Job: map 99% reduce 0%
21/08/05 05:36:37 INFO Job: map 100% reduce 0%
21/08/05 05:43:29 INFO Job: Job job_1627751643629_0005 completed successfully
21/08/05 05:43:34 INFO ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
21/08/05 05:44:06 INFO Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=835416
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=541610287
        HDFS: Number of bytes written=439038042
        HDFS: Number of read operations=24
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=8
    Job Counters
        Killed map tasks=1
        Launched map tasks=5
        Data-local map tasks=5
        Total time spent by all maps in occupied slots (ms)=202159434
        Total time spent by all map tasks (ms)=101079717
        Total vcore-milliseconds taken by all map tasks=101079717
        Total megabyte-milliseconds taken by all map tasks=207011260416
    Map-Reduce Framework
        Map input records=9574
        Map output records=9574
        Input split bytes=460
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=28852
        CPU time spent (ms)=87452320
        Physical memory (bytes) snapshot=515133440
        Virtual memory (bytes) snapshot=9514283008
        Total committed heap usage (bytes)=308748288
    File Input Format Counters
        Bytes Read=541609827
    File Output Format Counters
        Bytes Written=439038042
21/08/05 05:44:07 INFO SparseVectorsFromSequenceFiles: Creating Term Frequency Vectors
21/08/05 05:44:07 INFO DictionaryVectorizer: Creating dictionary from /user/root/wiki-vectors/tokenized-documents and saving at /user/root/wiki-vectors/wordcount
21/08/05 05:44:17 INFO RMProxy: Connecting to ResourceManager at ubuntu/10.0.2.15:8032
21/08/05 05:45:42 INFO FileInputFormat: Total input files to process : 4
21/08/05 05:45:44 INFO JobSubmitter: number of splits:4
21/08/05 05:46:03 INFO JobSubmitter: Submitting tokens for job: job_1627751643629_0007
21/08/05 05:46:06 INFO YarnClientImpl: Submitted application application_1627751643629_0007
21/08/05 05:46:07 INFO Job: The url to track the job: http://ubuntu:8088/proxy/application_1627751643629_0007/
21/08/05 05:46:07 INFO Job: Running job: job_1627751643629_0007
21/08/05 06:19:11 INFO Job: Job job_1627751643629_0007 running in uber mode : false
21/08/05 06:19:11 INFO Job: map 0% reduce 0%
21/08/05 06:38:26 INFO Job: map 1% reduce 0%
21/08/05 07:03:53 INFO Job: map 2% reduce 0%
21/08/05 08:11:38 INFO Job: Task Id : attempt_1627751643629_0007_m_000001_1000, Status : FAILED
AttemptID:attempt_1627751643629_0007_m_000001_1000 Timed out after 600 secs
21/08/05 10:03:02 INFO Job: Task Id : attempt_1627751643629_0007_m_000001_1001, Status : FAILED
AttemptID:attempt_1627751643629_0007_m_000001_1001 Timed out after 600 secs
21/08/05 12:00:33 INFO Job: Task Id : attempt_1627751643629_0007_m_000001_1002, Status : FAILED
AttemptID:attempt_1627751643629_0007_m_000001_1002 Timed out after 600 secs
21/08/05 12:38:35 INFO Job: map 3% reduce 0%
21/08/05 13:14:20 INFO Job: map 4% reduce 0%
21/08/05 13:58:03 INFO Job: map 3% reduce 0%
21/08/05 13:58:06 INFO Job: map 50% reduce 100%
21/08/05 13:58:08 INFO Job: map 75% reduce 100%
21/08/05 13:58:09 INFO Job: map 100% reduce 100%
21/08/05 13:58:18 INFO Job: Job job_1627751643629_0007 failed with state FAILED due to: Task failed task_1627751643629_0007_m_000001
Job failed as tasks failed. failedMaps:1 failedReduces:0
21/08/05 13:58:23 INFO ClientServiceDelegate: Application state is completed. FinalApplicationStatus=FAILED. Redirecting to job history server
Exception in thread "main" java.lang.IllegalStateException: Job failed!
        at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateCollocations(CollocDriver.java:238)
        at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateAllGrams(CollocDriver.java:187)
        at org.apache.mahout.vectorizer.DictionaryVectorizer.createTermFrequencyVectors(DictionaryVectorizer.java:189)
        at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:274)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
        at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:56)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
21/08/05 13:59:03 WARN ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/05 13:59:33 WARN ShutdownHookManager: ShutdownHook 'ClientFinalizer' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/05 14:00:12 ERROR ShutdownHookManager: ShutdownHookManger shutdown forcefully after 30 seconds.
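The three failed attempts above were each killed after 600 seconds, which matches Hadoop's default mapreduce.task.timeout of 600000 ms, and once this CollocDriver stage fails, /user/root/wiki-vectors/tfidf-vectors is never written, which is exactly what the split, trainnb, and testnb stages below then trip over. One possible mitigation, sketched here under the assumption that the timeout (rather than a genuinely hung mapper) is the problem, is to raise the task timeout before the job is submitted; TaskTimeoutExample is a hypothetical name:

    import org.apache.hadoop.conf.Configuration;

    // Hypothetical snippet: raise the per-task progress timeout from the 600 s default.
    public class TaskTimeoutExample {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            conf.setLong("mapreduce.task.timeout", 30 * 60 * 1000L); // 30 minutes, in ms
            System.out.println("mapreduce.task.timeout=" + conf.get("mapreduce.task.timeout"));
        }
    }

Since the Mahout drivers in this log go through ToolRunner, passing the same property as -Dmapreduce.task.timeout=1800000 on the mahout command line should have the same effect, though that has not been verified here.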
Mahout: split
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/hadoop-2.10.1/bin/hadoop and HADOOP_CONF_DIR=/opt/hadoop-2.10.1/etc/hadoop
MAHOUT-JOB: /opt/mahout-0.13.0/mahout-examples-0.13.0-job.jar
21/08/05 14:14:00 WARN MahoutDriver: No split.props found on classpath, will use command-line arguments only
21/08/05 14:14:10 INFO AbstractJob: Command line arguments: {--endPhase=[2147483647], --input=[/user/root/wiki-vectors/tfidf-vectors], --method=[sequential], --overwrite=null, --randomSelectionPct=[20], --sequenceFiles=null, --startPhase=[0], --tempDir=[temp], --testOutput=[/user/root/testing], --trainingOutput=[/user/root/training]}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/08/05 14:14:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/08/05 14:16:25 INFO HadoopUtil: Deleting /user/root/training
21/08/05 14:16:25 INFO HadoopUtil: Deleting /user/root/testing
Exception in thread "main" java.io.FileNotFoundException: File does not exist: /user/root/wiki-vectors/tfidf-vectors
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1528)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1521)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1521)
        at org.apache.mahout.utils.SplitInput.splitDirectory(SplitInput.java:288)
        at org.apache.mahout.utils.SplitInput.splitDirectory(SplitInput.java:279)
        at org.apache.mahout.utils.SplitInput.splitDirectory(SplitInput.java:270)
        at org.apache.mahout.utils.SplitInput.run(SplitInput.java:134)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.mahout.utils.SplitInput.main(SplitInput.java:140)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
21/08/05 14:16:56 WARN ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/05 14:17:35 ERROR ShutdownHookManager: ShutdownHookManger shutdown forcefully after 30 seconds.
Mahout: trainnb
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/hadoop-2.10.1/bin/hadoop and HADOOP_CONF_DIR=/opt/hadoop-2.10.1/etc/hadoop
MAHOUT-JOB: /opt/mahout-0.13.0/mahout-examples-0.13.0-job.jar
21/08/05 14:31:57 WARN MahoutDriver: No trainnb.props found on classpath, will use command-line arguments only
21/08/05 14:32:06 INFO AbstractJob: Command line arguments: {--alphaI=[1.0], --endPhase=[2147483647], --input=[/user/root/training], --labelIndex=[/user/root/labelindex], --output=[/user/root/model], --overwrite=null, --startPhase=[0], --tempDir=[temp], --trainComplementary=null}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/08/05 14:32:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/08/05 14:34:18 INFO HadoopUtil: Deleting temp
21/08/05 14:34:28 INFO CodecPool: Got brand-new compressor [.deflate]
21/08/05 14:34:46 INFO deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
21/08/05 14:34:46 INFO deprecation: mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress
21/08/05 14:34:46 INFO deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
21/08/05 14:35:08 INFO RMProxy: Connecting to ResourceManager at ubuntu/10.0.2.15:8032
21/08/05 14:38:02 INFO FileInputFormat: Total input files to process : 0
21/08/05 14:38:05 INFO JobSubmitter: number of splits:0
21/08/05 14:39:01 INFO JobSubmitter: Submitting tokens for job: job_1627751643629_0008
21/08/05 14:39:43 INFO Configuration: resource-types.xml not found
21/08/05 14:39:43 INFO ResourceUtils: Unable to find 'resource-types.xml'.
21/08/05 14:39:43 INFO ResourceUtils: Adding resource type - name = memory-mb, units = Mi, type = COUNTABLE
21/08/05 14:39:43 INFO ResourceUtils: Adding resource type - name = vcores, units = , type = COUNTABLE
21/08/05 14:39:53 INFO YarnClientImpl: Submitted application application_1627751643629_0008
21/08/05 14:40:01 INFO Job: The url to track the job: http://ubuntu:8088/proxy/application_1627751643629_0008/
21/08/05 14:40:02 INFO Job: Running job: job_1627751643629_0008
21/08/05 15:01:59 INFO Job: Job job_1627751643629_0008 running in uber mode : false
21/08/05 15:01:59 INFO Job: map 0% reduce 0%
21/08/05 15:09:39 INFO Job: map 0% reduce 100%
21/08/05 15:09:54 INFO Job: Job job_1627751643629_0008 completed successfully
21/08/05 15:09:59 INFO ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
21/08/05 15:10:38 INFO Job: Counters: 37
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=210494
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=0
        HDFS: Number of bytes written=97
        HDFS: Number of read operations=3
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched reduce tasks=1
        Total time spent by all reduces in occupied slots (ms)=835526
        Total time spent by all reduce tasks (ms)=417763
        Total vcore-milliseconds taken by all reduce tasks=417763
        Total megabyte-milliseconds taken by all reduce tasks=855578624
    Map-Reduce Framework
        Combine input records=0
        Combine output records=0
        Reduce input groups=0
        Reduce shuffle bytes=0
        Reduce input records=0
        Reduce output records=0
        Spilled Records=0
        Shuffled Maps =0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=512
        CPU time spent (ms)=40930
        Physical memory (bytes) snapshot=105046016
        Virtual memory (bytes) snapshot=2384826368
        Total committed heap usage (bytes)=55595008
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Output Format Counters
        Bytes Written=97
21/08/05 15:10:47 INFO RMProxy: Connecting to ResourceManager at ubuntu/10.0.2.15:8032
21/08/05 15:12:18 INFO FileInputFormat: Total input files to process : 1
21/08/05 15:12:20 INFO JobSubmitter: number of splits:1
21/08/05 15:12:36 INFO JobSubmitter: Submitting tokens for job: job_1627751643629_0009
21/08/05 15:12:39 INFO YarnClientImpl: Submitted application application_1627751643629_0009
21/08/05 15:12:40 INFO Job: The url to track the job: http://ubuntu:8088/proxy/application_1627751643629_0009/
21/08/05 15:12:40 INFO Job: Running job: job_1627751643629_0009
21/08/05 15:35:59 INFO Job: Job job_1627751643629_0009 running in uber mode : false
21/08/05 15:35:59 INFO Job: map 0% reduce 0%
21/08/05 15:43:14 INFO Job: Task Id : attempt_1627751643629_0009_m_000000_1000, Status : FAILED
Error: java.lang.IllegalArgumentException: Wrong numLabels: 0. Must be > 0!
        at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
        at org.apache.mahout.classifier.naivebayes.training.WeightsMapper.setup(WeightsMapper.java:44)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:177)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:171)
21/08/05 15:49:54 INFO Job: Task Id : attempt_1627751643629_0009_m_000000_1001, Status : FAILED
Error: java.lang.IllegalArgumentException: Wrong numLabels: 0. Must be > 0!
        at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
        at org.apache.mahout.classifier.naivebayes.training.WeightsMapper.setup(WeightsMapper.java:44)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:177)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:171)
21/08/05 15:56:26 INFO Job: Task Id : attempt_1627751643629_0009_m_000000_1002, Status : FAILED
Error: java.lang.IllegalArgumentException: Wrong numLabels: 0. Must be > 0!
        at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
        at org.apache.mahout.classifier.naivebayes.training.WeightsMapper.setup(WeightsMapper.java:44)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:177)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:171)
21/08/05 16:03:03 INFO Job: map 100% reduce 0%
21/08/05 16:03:05 INFO Job: map 100% reduce 100%
21/08/05 16:03:16 INFO Job: Job job_1627751643629_0009 failed with state FAILED due to: Task failed task_1627751643629_0009_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
21/08/05 16:03:21 INFO ClientServiceDelegate: Application state is completed. FinalApplicationStatus=FAILED. Redirecting to job history server
21/08/05 16:03:30 INFO MahoutDriver: Program took 5492159 ms (Minutes: 91.53608333333334)
21/08/05 16:04:00 WARN ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/05 16:04:41 ERROR ShutdownHookManager: ShutdownHookManger shutdown forcefully after 30 seconds.
Mahout: testnb
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/hadoop-2.10.1/bin/hadoop and HADOOP_CONF_DIR=/opt/hadoop-2.10.1/etc/hadoop
MAHOUT-JOB: /opt/mahout-0.13.0/mahout-examples-0.13.0-job.jar
21/08/05 16:18:13 WARN MahoutDriver: No testnb.props found on classpath, will use command-line arguments only
21/08/05 16:18:22 INFO AbstractJob: Command line arguments: {--endPhase=[2147483647], --input=[/user/root/testing], --labelIndex=[/user/root/labelindex], --model=[/user/root/model], --output=[/user/root/output], --overwrite=null, --runSequential=null, --startPhase=[0], --tempDir=[temp]}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/hadoop-2.10.1/share/hadoop/common/lib/hadoop-auth-2.10.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/08/05 16:18:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.io.FileNotFoundException: File does not exist: /user/root/model/naiveBayesModel.bin
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:72)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:62)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:150)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1854)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:742)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:388)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:507)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1034)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1003)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:931)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2854)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:872)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:859)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:848)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:348)
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:307)
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:292)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1087)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:324)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:324)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:919)
        at org.apache.mahout.classifier.naivebayes.NaiveBayesModel.materialize(NaiveBayesModel.java:111)
        at org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver.runSequential(TestNaiveBayesDriver.java:113)
        at org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver.run(TestNaiveBayesDriver.java:89)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver.main(TestNaiveBayesDriver.java:66)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /user/root/model/naiveBayesModel.bin
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:72)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:62)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:150)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1854)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:742)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:388)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:507)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1034)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1003)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:931)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2854)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)
        at org.apache.hadoop.ipc.Client.call(Client.java:1495)
        at org.apache.hadoop.ipc.Client.call(Client.java:1394)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
        at com.sun.proxy.$Proxy13.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:263)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:870)
        ... 30 more
21/08/05 16:20:59 WARN ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:204)
        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
21/08/05 16:21:37 ERROR ShutdownHookManager: ShutdownHookManger shutdown forcefully after 30 seconds.
Benchmark time: 425350755ms
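For scale, the final figure works out to 425350755 ms / 3,600,000 ms per hour ≈ 118.2 hours, i.e. roughly 4.9 days of wall-clock time for the whole benchmark run; the seqwiki job alone accounts for 342653951 ms ≈ 95.2 hours of that (the 5710.9 minutes reported by MahoutDriver above).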