These specific test resources are already included in the "testArchives"
configuration of the "com.ibm.wala.core.tests" subproject, upon which
the "com.ibm.wala.dalvik.test" tests already depend. So there's no need
to copy these resources into the "com.ibm.wala.dalvik.test" test
resources area as well.
Previously I hadn't realized that Gradle's "java" plugin would generate
default "cleanTest" tasks for us. By defining my own "cleanTest" tasks
we were replacing the generated ones, but what we really wanted to do
was augment them with additional files to delete.
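A minimal Groovy sketch of the augmenting approach; the extra path below is a placeholder for whatever files the tests actually leave behind:

```groovy
// Configure the generated "cleanTest" task instead of redefining it.
// Referencing it by name triggers the base plugin's "clean<Task>" rule.
cleanTest {
  // placeholder path: extra files the default clean doesn't know about
  delete 'build/extra-test-output'
}
```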
Every dependency task listed here is already a dependency of at least
one subproject's "processTestResources" task, and each
"processTestResources" task already depends on the corresponding
"afterEclipseBuildshipImport" task. So listing these tasks here too
is unnecessary.
All Kawa-related downloads now use our existing VerifiedDownload task
class. This gives us Gradle-integrated progress reporting,
incremental build support, build caching, correct dependencies, etc.
Using Kawa to compile Scheme into bytecode now also has proper
dependency management, incremental build support, and build caching.
Same goes for bundling these compiled bytecode files into jar archives
for later use in regression tests.
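As a rough sketch of the compile-and-jar wiring (file names, paths, and the `kawaJar` reference are assumptions, not the project's actual script):

```groovy
// Compile a Scheme source to bytecode via Kawa's command-line entry point,
// with declared inputs/outputs so Gradle can skip or cache the work.
task compileChessScheme(type: JavaExec) {
  classpath files(kawaJar)   // placeholder: the downloaded Kawa jar
  main 'kawa.repl'           // Kawa's main class
  args '-d', "$buildDir/chess-classes", '-C', 'chess.scm'
  inputs.file 'chess.scm'
  outputs.dir "$buildDir/chess-classes"
}

// Bundle the compiled bytecode for later use in regression tests.
task chessJar(type: Jar) {
  from compileChessScheme    // implies a dependency on the compile task
  baseName 'kawachess'
}
```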
Also, when downloading kawa-chess, grab a specific commit hash rather
than whatever is the most recent master commit. If this project
changes in the future, we don't want our tests to break unexpectedly.
Perhaps we'd want to pick up any new kawa-chess commits; perhaps not.
Either way, that should be a conscious decision rather than something
that can happen behind our backs.
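A sketch of the pinned download, assuming our `VerifiedDownload` task class; the property names, repository URL, and hash values here are placeholders for illustration:

```groovy
// Fetch kawa-chess at one pinned commit instead of master HEAD.
def kawaChessCommit = '<pinned-commit-hash>'
task downloadKawaChess(type: VerifiedDownload) {
  url "https://github.com/ttu-fpclub/kawa-chess/archive/${kawaChessCommit}.zip"
  dest "$buildDir/kawa-chess.zip"
  checksum '<expected-checksum>'
}
```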
This specific task runs an external command, and we consider the task
successful if that command exits without error. We don't actually
examine the stdout or stderr of the command being run.
However, it is still useful to log the stdout and stderr to a file,
and to declare that file to be the output of the task. Otherwise, the
task has no declared outputs at all. A task with no outputs is
ineligible for caching and is always considered to be out-of-date.
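A sketch of the pattern, with the task and command names assumed: run an external command, tee its stdout and stderr into a log file, and declare that file as the task's output so the task becomes cacheable and can be up-to-date.

```groovy
task runExternalCheck(type: Exec) {
  commandLine 'some-tool', '--check'   // placeholder command
  def logFile = file("$buildDir/logs/runExternalCheck.log")
  outputs.file logFile                 // the task's only declared output
  doFirst {
    logFile.parentFile.mkdirs()
    def log = logFile.newOutputStream()
    standardOutput = log               // capture both streams into the log
    errorOutput = log
  }
}
```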
Declare a task's outputs, enabling incremental build and caching
The issue here is a planned change to how "publishing" blocks work.
Per
<https://docs.gradle.org/4.9/userguide/publishing_maven.html#publishing_maven:deferred_configuration>,
the right way to prepare for this change is to enable it and check for
unexpected changes in what gets published to a local repository. I
have done this, and find no unexpected changes.
So we are actually ready for Gradle 5.0; the warning is a false
positive for us. Leaving the future change enabled means we won't
keep seeing this warning. It also means that any further changes to
our use of "publishing" will be tested under that future change, which
is a good way to avoid surprises later.
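Concretely, the opt-in is a one-liner in settings.gradle:

```groovy
// Adopt the deferred "publishing" configuration that becomes
// the default behavior in Gradle 5.0.
enableFeaturePreview('STABLE_PUBLISHING')
```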
Gradle won't pass an absolute path when building libcast. We need to set
install_name manually; otherwise `dyld` would not be able to find libcast
at runtime.
This is only needed on macOS, since other platforms' dynamic loaders
search all runtime search paths automatically.
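A macOS-only sketch of the fix; the task type, property names, and library path are assumptions about the native build setup, not the actual script:

```groovy
tasks.withType(LinkSharedLibrary) { linkTask ->
  if (System.getProperty('os.name').startsWith('Mac')) {
    // Embed an absolute install_name so dyld can locate libcast at runtime.
    linkTask.linkerArgs.add("-Wl,-install_name,${buildDir}/libcast.dylib")
  }
}
```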
Fixes #328, which requested better diagnostic messages in the case of
a missing C/C++ compiler toolchain. Gradle actually has perfectly
good, informative messages in that case. Unfortunately, we were
killing the build by dereferencing null before Gradle had a chance to
explain. Now we bail out of some setup work early to avoid the null
dereference, thereby letting Gradle explain things properly.
Under some circumstances, Gradle seems to decide that the destination
file being absent is the download task's expected outcome. It caches
this state, and refuses to retry the download in the future since it
thinks the task is up-to-date. We can correct this by telling Gradle
that the task should not be considered up-to-date if the file is
missing, as recommended by
<https://discuss.gradle.org/t/task-up-to-date-but-outputfile-not-created/17568/2>.
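A sketch of that recommendation, with the download task's name assumed:

```groovy
// Never report up-to-date when the downloaded file is missing,
// so a failed or skipped download is retried on the next build.
downloadKawa.outputs.upToDateWhen {
  downloadKawa.outputs.files.singleFile.exists()
}
```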
In particular, using the "all" package (which includes source) allows
IntelliJ IDEA to provide autocompletion and other nice features that
are unavailable when using the "bin" package.
Fixes #322
We add an option `createPhantomSuperclasses` to `ClassHierarchy`. When set, if a superclass is missing, we create a new `PhantomClass` in its place and allow the subclass to be added.
To use it, create the `ClassHierarchy` with the new `ClassHierarchyFactory.makeWithPhantom` methods.
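A fragment showing the new factory in use; scope construction is elided and exception handling is omitted:

```java
import com.ibm.wala.ipa.callgraph.AnalysisScope;
import com.ibm.wala.ipa.cha.ClassHierarchy;
import com.ibm.wala.ipa.cha.ClassHierarchyFactory;

// Scope setup elided; see existing ClassHierarchyFactory.make call sites.
AnalysisScope scope = /* ... */;

// Missing superclasses are modeled as PhantomClass instances instead of
// aborting hierarchy construction.
ClassHierarchy cha = ClassHierarchyFactory.makeWithPhantom(scope);
```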