1 % Testing the JDK
  2 
  3 ## Overview
  4 
  5 The bulk of JDK tests use [jtreg](https://openjdk.org/jtreg/), a regression
  6 test framework and test runner built for the JDK's specific needs. Other test
  7 frameworks are also used. The different test frameworks can be executed
  8 directly, but there is also a set of make targets intended to simplify the
  9 interface, and figure out how to run your tests for you.
 10 
 11 ## Running tests locally with `make test`
 12 
 13 This is the easiest way to get started. Assuming you've built the JDK locally,
 14 execute:
 15 
 16     $ make test
 17 
 18 This will run a default set of tests against the JDK, and present you with the
 19 results. `make test` is part of a family of test-related make targets which
 20 simplify running tests, because they invoke the various test frameworks for
 21 you. The "make test framework" is simple to start with, but more complex ad-hoc
 22 combination of tests is also possible. You can always invoke the test
 23 frameworks directly if you want even more control.
 24 
 25 Some example command-lines:
 26 
 27     $ make test-tier1
 28     $ make test-jdk_lang JTREG="JOBS=8"
 29     $ make test TEST=jdk_lang
 30     $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1"
 31     $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"
 32     $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java"
 33     $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
 34     $ make exploded-test TEST=tier2
 35 
 36 "tier1" and "tier2" refer to tiered testing, see further down. "TEST" is a test
selection argument which the make test framework will use to try to find the
tests you want. It iterates over the available test frameworks, and if the
test isn't present in one, it tries the next one. The main target `test` uses
the jdk-image as the tested product. There is also an alternate target
`exploded-test` that uses the exploded image instead. Not all tests will run
successfully on the exploded image, but using this target can greatly improve
rebuild times for certain workflows.
 44 
 45 Previously, `make test` was used to invoke an old system for running tests, and
 46 `make run-test` was used for the new test framework. For backward compatibility
 47 with scripts and muscle memory, `run-test` and variants like
 48 `exploded-run-test` or `run-test-tier1` are kept as aliases.
 49 
 50 ### Configuration
 51 
 52 To be able to run JTReg tests, `configure` needs to know where to find the
 53 JTReg test framework. If it is not picked up automatically by configure, use
 54 the `--with-jtreg=<path to jtreg home>` option to point to the JTReg framework.
 55 Note that this option should point to the JTReg home, i.e. the top directory,
 56 containing `lib/jtreg.jar` etc. (An alternative is to set the `JT_HOME`
 57 environment variable to point to the JTReg home before running `configure`.)
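
For example, assuming jtreg has been unpacked under `/usr/local/jtreg` (an
arbitrary path used here only for illustration), either of these should work:

    $ bash configure --with-jtreg=/usr/local/jtreg
    $ JT_HOME=/usr/local/jtreg bash configure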
 58 
 59 To be able to run microbenchmarks, `configure` needs to know where to find the
 60 JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
 61 containing the core JMH and transitive dependencies. The recommended
 62 dependencies can be retrieved by running `sh make/devkit/createJMHBundle.sh`,
 63 after which `--with-jmh=build/jmh/jars` should work.
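
For example, a minimal sequence, run from the top of the source tree, might
look like:

    $ sh make/devkit/createJMHBundle.sh
    $ bash configure --with-jmh=build/jmh/jars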
 64 
 65 When tests fail or timeout, jtreg runs its failure handler to capture necessary
 66 data from the system where the test was run. This data can then be used to
 67 analyze the test failures. Collecting this data involves running various
 68 commands (which are listed in files residing in
 69 `test/failure_handler/src/share/conf`) and some of these commands use `sudo`.
If the system's `sudoers` file isn't configured to allow running these
commands, it can result in a password prompt during failure handler execution.
When running locally, collecting this additional data is often unnecessary. To
disable running the failure handler, use `--enable-jtreg-failure-handler=no`
when running `configure`. If, however, you want to let the failure handler run
and don't want to be prompted for a sudo password, then you can configure your
`sudoers` file appropriately. Please read the relevant documentation for your
operating system to see how to do that; here we only show one possible way of
doing it: edit the `/etc/sudoers.d/sudoers` file to include the following line:
 80 
 81 ```
 82 johndoe ALL=(ALL) NOPASSWD: /sbin/dmesg
 83 ```
 84 
This line configures `sudo` to _not_ prompt for a password for the
`/sbin/dmesg` command (one of the commands listed in the files at
`test/failure_handler/src/share/conf`) for the user `johndoe`. Here `johndoe`
is the user account under which the jtreg tests are run. Replace the username
with the relevant user account on your system.
 90 
 91 ## Test selection
 92 
All functionality is available using the `test` make target. In this use case,
the test or tests to be executed are controlled using the `TEST` variable. To
speed up subsequent test runs with no source code changes, `test-only` can be
used instead, which does not depend on the source and test image build.
 97 
 98 For some common top-level tests, direct make targets have been generated. This
 99 includes all JTReg test groups, the hotspot gtest, and custom tests (if
100 present). This means that `make test-tier1` is equivalent to `make test
TEST="tier1"`, but the former is more tab-completion friendly. For more complex
102 test runs, the `test TEST="x"` solution needs to be used.
103 
The test specifications given in `TEST` are parsed into fully qualified test
105 descriptors, which clearly and unambiguously show which tests will be run. As
106 an example, `:tier1` will expand to include all subcomponent test directories
107 that define `tier1`, for example: `jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1
108 jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1 ...`. You
109 can always submit a list of fully qualified test descriptors in the `TEST`
110 variable if you want to shortcut the parser.
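
For example, the following invocation (shown only as an illustration) lists
fully qualified descriptors directly to shortcut the parser:

    $ make test TEST="jtreg:test/jdk:tier1 gtest:all"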
111 
112 ### Common Test Groups
113 
Ideally, all tests would be run for every change, but this may not be practical
due to limited testing resources, the scope of the change, etc.
116 
117 The source tree currently defines a few common test groups in the relevant
118 `TEST.groups` files. There are test groups that cover a specific component, for
example `hotspot_gc`. It is a good idea to look into the `TEST.groups` files to
get a sense of what tests are relevant to a particular JDK component.
121 
122 Component-specific tests may miss some unintended consequences of a change, so
123 other tests should also be run. Again, it might be impractical to run all
124 tests, and therefore _tiered_ test groups exist. Tiered test groups are not
125 component-specific, but rather cover the significant parts of the entire JDK.
126 
127 Multiple tiers allow balancing test coverage and testing costs. Lower test
128 tiers are supposed to contain the simpler, quicker and more stable tests.
129 Higher tiers are supposed to contain progressively more thorough, slower, and
130 sometimes less stable tests, or the tests that require special configuration.
131 
132 Contributors are expected to run the tests for the areas that are changed, and
133 the first N tiers they can afford to run, but at least tier1.
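
For example, for a change to the garbage collectors, a reasonable (purely
illustrative) local run could be:

    $ make test TEST="tier1 hotspot_gc"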
134 
135 A brief description of the tiered test groups:
136 
137 - `tier1`: This is the most fundamental test tier. Roughly speaking, a failure
138   of a test in this tier has the potential to indicate a problem that would
139   affect many Java programs. Tests in `tier1` include tests of HotSpot, core
140   APIs in the `java.base` module, and the `javac` compiler. Multiple developers
141   run these tests every day. Because of the widespread use, the tests in
142   `tier1` are carefully selected and optimized to run fast, and to run in the
143   most stable manner. As a guideline, nearly all individual tests in `tier1`
144   are expected to run to completion in ten seconds or less when run on common
145   configurations used for development. Long-running tests, even of core
146   functionality, should occur in higher tiers or be covered in other kinds of
  testing. Test failures in `tier1` are usually followed up on quickly, either
  with fixes, or by adding the relevant tests to a problem list. GitHub Actions
149   workflows, if enabled, run `tier1` tests.
150 
- `tier2`: This test group covers even more ground. It contains, among other
  things, tests that run for too long to be in `tier1`, tests that may require
  special configuration, tests that are less stable, and tests covering the
  broader range of non-core JVM and JDK features/components (for example, XML).
155 
156 - `tier3`: This test group includes more stressful tests, the tests for corner
157   cases not covered by previous tiers, plus the tests that require GUIs. As
  such, this suite should either be run with low concurrency (`TEST_JOBS=1`),
  or without headful tests (`JTREG_KEYWORDS=\!headful`), or both.
160 
161 - `tier4`: This test group includes every other test not covered by previous
162   tiers. It includes, for example, `vmTestbase` suites for Hotspot, which run
163   for many hours even on large machines. It also runs GUI tests, so the same
164   `TEST_JOBS` and `JTREG_KEYWORDS` caveats apply.
165 
166 ### JTReg
167 
168 JTReg tests can be selected either by picking a JTReg test group, or a
169 selection of files or directories containing JTReg tests. Documentation can be
170 found at [https://openjdk.org/jtreg/](https://openjdk.org/jtreg/), note
171 especially the extensive [FAQ](https://openjdk.org/jtreg/faq.html).
172 
173 JTReg test groups can be specified either without a test root, e.g. `:tier1`
174 (or `tier1`, the initial colon is optional), or with, e.g. `hotspot:tier1`,
175 `test/jdk:jdk_util` or `$(TOPDIR)/test/hotspot/jtreg:hotspot_all`. The test
root can be specified either as an absolute path, or as a path relative to
either the JDK top directory or the `test` directory. For simplicity, the
hotspot JTReg test root, which really is `hotspot/jtreg`, can be abbreviated as
just `hotspot`.
179 
180 When specified without a test root, all matching groups from all test roots
181 will be added. Otherwise, only the group from the specified test root will be
182 added.
183 
184 Individual JTReg tests or directories containing JTReg tests can also be
185 specified, like `test/hotspot/jtreg/native_sanity/JniVersion.java` or
186 `hotspot/jtreg/native_sanity`. Just like for test root selection, you can
187 either specify an absolute path (which can even point to JTReg tests outside
188 the source tree), or a path relative to either the JDK top directory or the
189 `test` directory. `hotspot` can be used as an alias for `hotspot/jtreg` here as
190 well.
191 
192 As long as the test groups or test paths can be uniquely resolved, you do not
193 need to enter the `jtreg:` prefix. If this is not possible, or if you want to
194 use a fully qualified test descriptor, add `jtreg:`, e.g.
195 `jtreg:test/hotspot/jtreg/native_sanity`.
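
For example, the following invocations should select the same tests, with and
without the explicit `jtreg:` prefix:

    $ make test TEST=hotspot/jtreg/native_sanity
    $ make test TEST=jtreg:test/hotspot/jtreg/native_sanity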
196 
197 ### Gtest
198 
199 **Note:** To be able to run the Gtest suite, you need to configure your build
200 to be able to find a proper version of the gtest source. For details, see the
201 section ["Running Tests" in the build
202 documentation](building.html#running-tests).
203 
204 Since the Hotspot Gtest suite is so quick, the default is to run all tests.
205 This is specified by just `gtest`, or as a fully qualified test descriptor
206 `gtest:all`.
207 
208 If you want, you can single out an individual test or a group of tests, for
209 instance `gtest:LogDecorations` or `gtest:LogDecorations.level_test_vm`. This
210 can be particularly useful if you want to run a shaky test repeatedly.
211 
212 For Gtest, there is a separate test suite for each JVM variant. The JVM variant
213 is defined by adding `/<variant>` to the test descriptor, e.g.
214 `gtest:Log/client`. If you specify no variant, gtest will run once for each JVM
215 variant present (e.g. server, client). So if you only have the server JVM
216 present, then `gtest:all` will be equivalent to `gtest:all/server`.
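
For example, to repeat a single suite against only the server variant (an
illustrative combination of the selectors described above):

    $ make test TEST="gtest:LogDecorations/server" GTEST="REPEAT=-1"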
217 
218 ### Microbenchmarks
219 
220 Which microbenchmarks to run is selected using a regular expression following
221 the `micro:` test descriptor, e.g., `micro:java.lang.reflect`. This delegates
222 the test selection to JMH, meaning package name, class name and even benchmark
223 method names can be used to select tests.
224 
225 Using special characters like `|` in the regular expression is possible, but
226 needs to be escaped multiple times: `micro:ArrayCopy\\\\\|reflect`.
227 
228 ### Special tests
229 
230 A handful of odd tests that are not covered by any other testing framework are
231 accessible using the `special:` test descriptor. Currently, this includes
232 `failure-handler` and `make`.
233 
234   * Failure handler testing is run using `special:failure-handler` or just
235     `failure-handler` as test descriptor.
236 
237   * Tests for the build system, including both makefiles and related
    functionality, are run using `special:make` or just `make` as test
239     descriptor. This is equivalent to `special:make:all`.
240 
    A specific make test can be run by supplying it as an argument, e.g.
242     `special:make:idea`. As a special syntax, this can also be expressed as
243     `make-idea`, which allows for command lines as `make test-make-idea`.
244 
245 ## Test results and summary
246 
247 At the end of the test run, a summary of all tests run will be presented. This
248 will have a consistent look, regardless of what test suites were used. This is
249 a sample summary:
250 
251     ==============================
252     Test summary
253     ==============================
254        TEST                                          TOTAL  PASS  FAIL ERROR
255     >> jtreg:jdk/test:tier1                           1867  1865     2     0 <<
256        jtreg:langtools/test:tier1                     4711  4711     0     0
257        jtreg:nashorn/test:tier1                        133   133     0     0
258     ==============================
259     TEST FAILURE
260 
Any test suite where the number of TOTAL tests does not equal the number of
PASSed tests is considered a test failure. Such rows are marked with the
`>> ... <<` marker for easy identification.
264 
265 The classification of non-passed tests differs a bit between test suites. In
266 the summary, ERROR is used as a catch-all for tests that neither passed nor are
classified as failed by the framework. This might indicate a test framework
error, a timeout, or other problems.
269 
270 In case of test failures, `make test` will exit with a non-zero exit value.
271 
272 All tests have their result stored in `build/$BUILD/test-results/$TEST_ID`,
273 where TEST_ID is a path-safe conversion from the fully qualified test
274 descriptor, e.g. for `jtreg:jdk/test:tier1` the TEST_ID is
275 `jtreg_jdk_test_tier1`. This path is also printed in the log at the end of the
276 test run.
277 
278 Additional work data is stored in `build/$BUILD/test-support/$TEST_ID`. For
279 some frameworks, this directory might contain information that is useful in
280 determining the cause of a failed test.
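
For example, for the `jtreg:jdk/test:tier1` run shown in the summary above, the
results and support data would end up in directories like these (where
`<CONF-NAME>` is the name of your build configuration):

    build/<CONF-NAME>/test-results/jtreg_jdk_test_tier1/
    build/<CONF-NAME>/test-support/jtreg_jdk_test_tier1/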
281 
282 ## Test suite control
283 
284 It is possible to control various aspects of the test suites using make control
285 variables.
286 
287 These variables use a keyword=value approach to allow multiple values to be
288 set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
289 concurrency level to 1 and the timeout factor to 8. This is equivalent to
290 setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
291 means that the `JTREG` variable is parsed and verified for correctness, so
292 `JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
293 would just pass unnoticed.
294 
295 To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
296 normally eats `;`, the recommended usage is to write the assignment inside
297 quotes, e.g. `JTREG="...;..."`. This will also make sure spaces are preserved,
298 as in `JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"`.
299 
300 (Other ways are possible, e.g. using backslash:
301 `JTREG=JOBS=1\;TIMEOUT_FACTOR=8`. Also, as a special technique, the string
302 `%20` will be replaced with space for certain options, e.g.
303 `JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug`. This can be useful if
304 you have layers of scripts and have trouble getting proper quoting of command
305 line arguments through.)
306 
307 As far as possible, the names of the keywords have been standardized between
308 test suites.
309 
310 ### General keywords (TEST_OPTS)
311 
312 Some keywords are valid across different test suites. If you want to run tests
from multiple test suites, or just don't want to care which test-suite-specific
control variable to use, then you can use the general TEST_OPTS control
315 variable.
316 
There are also some keywords that apply globally to the test runner system,
rather than to any specific test suite. These are also available as TEST_OPTS
319 keywords.
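
For example, an illustrative run that applies the same concurrency and timeout
settings regardless of which test suites end up being selected:

    $ make test TEST=tier1 TEST_OPTS="JOBS=4;TIMEOUT_FACTOR=4"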
320 
321 #### JOBS
322 
323 Currently only applies to JTReg.
324 
325 #### TIMEOUT_FACTOR
326 
327 Currently only applies to [JTReg -timeoutFactor](#timeout_factor-1).
328 
329 #### JAVA_OPTIONS
330 
331 Applies to JTReg, GTest and Micro.
332 
333 #### VM_OPTIONS
334 
335 Applies to JTReg, GTest and Micro.
336 
337 #### JCOV
338 
339 This keyword applies globally to the test runner system. If set to `true`, it
enables JCov coverage reporting for all tests run. To be useful, the JDK under
test must be one built with JCov instrumentation (`configure
--with-jcov=<path to directory containing lib/jcov.jar>`, `make jcov-image`).
343 
344 The simplest way to run tests with JCov coverage report is to use the special
345 target `jcov-test` instead of `test`, e.g. `make jcov-test TEST=jdk_lang`. This
346 will make sure the JCov image is built, and that JCov reporting is enabled.
347 
To include JCov coverage for just a subset of all modules, you can use the
`--with-jcov-modules` argument to `configure`, e.g.
350 `--with-jcov-modules=jdk.compiler,java.desktop`.
351 
352 For more fine-grained control, you can pass arbitrary filters to JCov using
353 `--with-jcov-filters`, and you can specify a specific JDK to instrument
354 using `--with-jcov-input-jdk`.
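
Putting this together, a coverage run restricted to a couple of modules might
look like this (the `--with-jcov` path is a placeholder):

    $ bash configure --with-jcov=/path/to/jcov --with-jcov-modules=jdk.compiler,java.desktop
    $ make jcov-test TEST=jdk_lang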
355 
356 The JCov report is stored in `build/$BUILD/test-results/jcov-output/report`.
357 
358 Please note that running with JCov reporting can be very memory intensive.
359 
360 #### JCOV_DIFF_CHANGESET
361 
362 While collecting code coverage with JCov, it is also possible to find coverage
363 for only recently changed code. JCOV_DIFF_CHANGESET specifies a source
364 revision. A textual report will be generated showing coverage of the diff
365 between the specified revision and the repository tip.
366 
The report is stored in the
`build/$BUILD/test-results/jcov-output/diff_coverage_report` file.
369 
370 #### AOT_JDK
371 
372 See [Testing Ahead-of-time optimizations](#testing-ahead-of-time-optimizations).
373 
374 ### JTReg keywords
375 
376 #### JOBS
377 
378 The test concurrency (`-concurrency`).
379 
380 Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
381 JOBS, except for Hotspot, where the default is *number of CPU cores/2*, but
382 never more than *memory size in GB/2*.
383 
384 #### TIMEOUT_FACTOR
385 
The `TIMEOUT_FACTOR` is forwarded to the JTReg framework itself
(`-timeoutFactor`). Also, some test cases that programmatically wait a
certain amount of time will apply this factor. If we run in forced
compilation mode (`-Xcomp`), the build system will automatically
adjust this factor to compensate for the lower performance. Defaults to 1.
391 
392 #### FAILURE_HANDLER_TIMEOUT
393 
394 Sets the argument `-timeoutHandlerTimeout` for JTReg. The default value is 0.
395 This is only valid if the failure handler is built.
396 
397 #### TEST_THREAD_FACTORY
398 
399 Sets the `-testThreadFactory` for JTReg. It should be the fully qualified
classname of a class which implements `java.util.concurrent.ThreadFactory`. One
such implementation class, named `Virtual`, is currently part of the JDK build
in the `test/jtreg_test_thread_factory/` directory. This class gets compiled
during the test image build. The `Virtual` implementation creates a new virtual
thread for executing each test class.
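
For example, assuming the `Virtual` class described above is resolvable by that
name, a run of the `java.lang` tests on virtual threads might look like:

    $ make test TEST=jdk_lang JTREG="TEST_THREAD_FACTORY=Virtual"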
405 
406 #### JVMTI_STRESS_AGENT
407 
Executes JTReg tests with the JVM TI stress agent. The stress agent is part of
the test library and is located in
`test/lib/jdk/test/lib/jvmti/libJvmtiStressAgent.cpp`. The value of this
keyword is passed to the agent as its JVM TI agent options. This mode uses
`ProblemList-jvmti-stress-agent.txt` as an additional exclude list.
412 
413 #### TEST_MODE
414 
415 The test mode (`agentvm` or `othervm`).
416 
417 Defaults to `agentvm`.
418 
419 #### ASSERT
420 
421 Enable asserts (`-ea -esa`, or none).
422 
423 Set to `true` or `false`. If true, adds `-ea -esa`. Defaults to true, except
424 for hotspot.
425 
426 #### VERBOSE
427 
428 The verbosity level (`-verbose`).
429 
430 Defaults to `fail,error,summary`.
431 
432 #### RETAIN
433 
434 What test data to retain (`-retain`).
435 
436 Defaults to `fail,error`.
437 
438 #### MAX_MEM
439 
440 Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).
441 
Limit memory consumption for the JTReg test framework and the VM under test.
Set to 0 to disable the limits.
444 
445 Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).
446 
447 #### MAX_OUTPUT
448 
449 Set the property `javatest.maxOutputSize` for the launcher, to change the
450 default JTReg log limit.
451 
452 #### KEYWORDS
453 
JTReg keywords sent to JTReg using `-k`. Please be careful to ensure that
spaces and special characters (like `!`) are properly quoted. To avoid some
issues, the special value `%20` can be used instead of space.
457 
458 #### EXTRA_PROBLEM_LISTS
459 
Use one or more additional problem list files, in addition to the default
`ProblemList.txt` located at the JTReg test roots.
462 
463 If multiple file names are specified, they should be separated by space (or, to
464 help avoid quoting issues, the special value `%20`).
465 
466 The file names should be either absolute, or relative to the JTReg test root of
467 the tests to be run.
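
For example, to apply one extra list on top of the defaults (the file name
below is only an illustration of a list relative to the hotspot JTReg test
root):

    $ make test TEST=hotspot:tier1 JTREG="EXTRA_PROBLEM_LISTS=ProblemList-Xcomp.txt"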
468 
469 #### RUN_PROBLEM_LISTS
470 
471 Use the problem lists to select tests instead of excluding them.
472 
473 Set to `true` or `false`. If `true`, JTReg will use `-match:` option, otherwise
474 `-exclude:` will be used. Default is `false`.
475 
476 #### OPTIONS
477 
478 Additional options to the JTReg test framework.
479 
480 Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.
481 
482 #### JAVA_OPTIONS
483 
484 Additional Java options for running test classes (sent to JTReg as
485 `-javaoption`).
486 
487 #### VM_OPTIONS
488 
489 Additional Java options to be used when compiling and running classes (sent to
490 JTReg as `-vmoption`).
491 
492 This option is only needed in special circumstances. To pass Java options to
493 your test classes, use `JAVA_OPTIONS`.
494 
495 #### LAUNCHER_OPTIONS
496 
497 Additional Java options that are sent to the java launcher that starts the
498 JTReg harness.
499 
500 #### RETRY_COUNT
501 
Retry failed tests up to a set number of times, until they pass. This allows
tests with intermittent failures to eventually pass. Defaults to 0.
504 
505 #### REPEAT_COUNT
506 
Repeat the tests up to a set number of times, stopping at the first failure.
This helps to reproduce intermittent test failures. Defaults to 0.
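
For example, an illustrative way to try to reproduce an intermittent failure in
a single test:

    $ make test TEST=test/hotspot/jtreg/native_sanity/JniVersion.java JTREG="REPEAT_COUNT=10"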
509 
510 #### REPORT
511 
512 Use this report style when reporting test results (sent to JTReg as `-report`).
513 Defaults to `files`.
514 
515 ### Gtest keywords
516 
517 #### REPEAT
518 
519 The number of times to repeat the tests (`--gtest_repeat`).
520 
521 Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
522 combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
523 problem.
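
For example, an illustrative way of chasing such an intermittent gtest failure:

    $ make test TEST=gtest:LogTagSet GTEST="REPEAT=-1;OPTIONS=--gtest_break_on_failure"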
524 
525 #### OPTIONS
526 
527 Additional options to the Gtest test framework.
528 
529 Use `GTEST="OPTIONS=--help"` to see all available Gtest options.
530 
531 ### Microbenchmark keywords
532 
533 #### FORK
534 
535 Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.
536 
537 #### ITER
538 
539 Number of measurement iterations per fork. Same as specifying `-i <num>`.
540 
541 #### TIME
542 
Amount of time to spend in each measurement iteration, in seconds. Same as
specifying `-r <num>`.
545 
546 #### WARMUP_ITER
547 
548 Number of warmup iterations to run before the measurement phase in each fork.
549 Same as specifying `-wi <num>`.
550 
551 #### WARMUP_TIME
552 
553 Amount of time to spend in each warmup iteration. Same as specifying `-w
554 <num>`.
555 
556 #### RESULTS_FORMAT
557 
558 Specify to have the test run save a log of the values. Accepts the same values
559 as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.
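
For example, an illustrative quick run with reduced iteration counts and JSON
output:

    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2;ITER=3;TIME=2;RESULTS_FORMAT=json"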
560 
561 #### TEST_JDK
562 
563 The path to the JDK that will be used to run the benchmarks.
564 
565 Defaults to `build/<CONF-NAME>/jdk`.
566 
567 #### BENCHMARKS_JAR
568 
569 The path to the JAR containing the benchmarks.
570 
571 Defaults to `test/micro/benchmarks.jar`.
572 
573 #### VM_OPTIONS
574 
Additional VM arguments to provide to forked-off VMs. Same as specifying
`-jvmArgs <args>`.
576 
577 #### OPTIONS
578 
579 Additional arguments to send to JMH.
580 
581 ## Notes for Specific Tests
582 
583 ### Docker Tests
584 
585 Docker tests with default parameters may fail on systems with glibc versions
586 not compatible with the one used in the default docker image (e.g., Oracle
587 Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu
588 18.04 if run like this on x86:
589 
590 ```
591 $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"
592 ```
593 
To run these tests correctly on Ubuntu 18.04, additional parameters selecting
a suitable docker image must be passed using `JAVA_OPTIONS`:
596 
597 ```
598 $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \
599     JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
600     -Djdk.test.docker.image.version=latest"
601 ```
602 
603 ### Non-US locale
604 
605 If your locale is non-US, some tests are likely to fail. To work around this
606 you can set the locale to US. On Unix platforms simply setting `LANG="en_US"`
607 in the environment before running tests should work. On Windows or macOS,
608 setting `JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for
609 most, but not all test cases.
610 
611 For example:
612 
613 ```
614 $ export LANG="en_US" && make test TEST=...
615 $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...
616 ```
617 
618 ### PKCS11 Tests
619 
620 It is highly recommended to use the latest NSS version when running PKCS11
tests. An improper NSS version may lead to unexpected failures which are hard
to diagnose. For example, `sun/security/pkcs11/Secmod/AddTrustedCert.java` may
fail on Ubuntu 18.04 with the system's default NSS version. To run these tests
correctly, the system property `jdk.test.lib.artifacts.<NAME>` is required on
Ubuntu 18.04 to specify the alternative NSS lib directory. The `<NAME>`
component should be replaced with the name element of the appropriate
`@Artifact` class (see `test/jdk/sun/security/pkcs11/PKCS11Test.java`).
628 
629 For example:
630 
631 ```
632 $ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \
633     JTREG="JAVA_OPTIONS=-Djdk.test.lib.artifacts.nsslib-linux_aarch64=/path/to/NSS-libs"
634 ```
635 
For more notes about the PKCS11 tests, please refer to
`test/jdk/sun/security/pkcs11/README`.
638 
639 ### Testing Ahead-of-time Optimizations
640 
641 One way to improve test coverage of ahead-of-time (AOT) optimizations in
642 the JDK is to run existing jtreg test cases in a special "AOT_JDK" mode.
643 Example:
644 
645 ```
646 $ make test JTREG="AOT_JDK=onestep" \
647     TEST=open/test/hotspot/jtreg/runtime/invokedynamic
648 ```
649 
650 In this testing mode, we first perform an AOT training run
651 (see https://openjdk.org/jeps/483) of a special test program
652 ([test/setup_aot/TestSetupAOT.java](../test/setup_aot/TestSetupAOT.java))
that accesses about 5,000 classes in the JDK core libraries.
654 Optimization artifacts for these classes (such as pre-linked
655 lambda expressions, execution profiles, and pre-generated native code)
656 are stored into an AOT cache file, which will be used by all the JVMs
657 launched by the selected jtreg test cases.
658 
When the jtreg tests call into the core library classes that are in the AOT
cache, we will be able to test the AOT optimizations that were applied to those
classes.
662 
663 Please note that not all existing jtreg test cases can be executed with
664 the AOT_JDK mode. See
665 [test/hotspot/jtreg/ProblemList-AotJdk.txt](../test/hotspot/jtreg/ProblemList-AotJdk.txt)
666 and [test/jdk/ProblemList-AotJdk.txt](../test/jdk/ProblemList-AotJdk.txt).
667 
668 Also, test cases that were written specifically to test AOT, such as the tests
669 under [test/hotspot/jtreg/runtime/cds](../test/hotspot/jtreg/runtime/cds/),
670 cannot be executed with the AOT_JDK mode.
671 
672 Valid values for `AOT_JDK` are `onestep` and `twostep`. These control how
673 the AOT cache is generated. See https://openjdk.org/jeps/514 for details.
674 All other values are ignored.
675 
676 ### Testing with alternative security providers
677 
678 Some security tests use a hardcoded provider for `KeyFactory`, `Cipher`,
679 `KeyPairGenerator`, `KeyGenerator`, `AlgorithmParameterGenerator`,
680 `KeyAgreement`, `Mac`, `MessageDigest`, `SecureRandom`, `Signature`,
681 `AlgorithmParameters`, `Configuration`, `Policy`, or `SecretKeyFactory` objects.
682 Specify the `-Dtest.provider.name=NAME` property to use a different provider for
683 the service(s).
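
For example, a run of some of these tests against an alternative provider might
look like this (the test selection and provider name below are purely
illustrative):

```
$ make test TEST="jtreg:test/jdk/sun/security" \
    JTREG="JAVA_OPTIONS=-Dtest.provider.name=SunJCE"
```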
684 
685 ### Client UI Tests
686 
687 #### System key shortcuts
688 
Some Client UI tests use key sequences which may be reserved by the operating
system. Usually this causes the test to fail, so it is highly recommended to
disable system key shortcuts prior to testing. The steps to access and disable
system key shortcuts for various platforms are provided below.
693 
694 ##### macOS
695 
Choose Apple menu, then System Preferences; click Keyboard, then click
Shortcuts; select or deselect the desired shortcut.
698 
For example,
`test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java`
fails on macOS because it uses the `CTRL + F1` key sequence to show or hide the
tooltip message, but this key combination is reserved by the operating system.
To run the test correctly, the default global key shortcut should be disabled
using the steps described above: deselect the "Turn keyboard access on or off"
option, which is responsible for the `CTRL + F1` combination.
706 
707 ##### Linux
708 
Open the Activities overview and start typing Settings; choose Settings, click
Devices, then click Keyboard; set or override the desired shortcut.
711 
712 ##### Windows
713 
714 Type `gpedit` in the Search and then click Edit group policy; navigate to User
715 Configuration -> Administrative Templates -> Windows Components -> File
Explorer; in the right-side pane look for "Turn off Windows key hotkeys" and
double-click on it; enable or disable hotkeys.
718 
Note: a restart is required for these settings to take effect.
720 
721 #### Robot API
722 
Most automated Client UI tests use the `Robot` API to control the UI. Usually,
the default operating system settings need to be adjusted for `Robot` to work
correctly. The detailed steps for accessing and updating these settings on
different platforms are provided below.
727 
728 ##### macOS
729 
Since macOS 10.15, `Robot` is not permitted to control your Mac by default. To
allow it, choose Apple menu -> System Settings, click Privacy & Security; then
732 click Accessibility and ensure the following apps are allowed to control your
733 computer: *Java* and *Terminal*. If the tests are run from an IDE, the IDE
734 should be granted this permission too.
735 
736 ##### Windows
737 
On Windows, if a Cygwin terminal is used to run the tests, there is a delay in
focus transfer which usually causes automated UI test failures. To disable the
delay, type `regedit` in the Search and then select Registry Editor; navigate
to the following key: `HKEY_CURRENT_USER\Control Panel\Desktop`; make sure the
`ForegroundLockTimeout` value is set to 0.
743 
Additional information about Client UI test configuration for various operating
systems can be found at [Automated client GUI testing system set up
requirements](https://wiki.openjdk.org/display/ClientLibs/Automated+client+GUI+testing+system+set+up+requirements).
748 
749 ## Editing this document
750 
751 If you want to contribute changes to this document, edit `doc/testing.md` and
752 then run `make update-build-docs` to generate the same changes in
753 `doc/testing.html`.
754 
755 ---
756 # Override some definitions in the global css file that are not optimal for
757 # this document.
758 header-includes:
759  - '<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>'
760 ---