Merge "Reorder assert-max-image-size and AVB signing"
diff --git a/Changes.md b/Changes.md
new file mode 100644
index 0000000..05f54b8
--- /dev/null
+++ b/Changes.md
@@ -0,0 +1,105 @@
+# Build System Changes for Android.mk Writers
+
+## Deprecating envsetup.sh variables in Makefiles
+
+It is not required to source envsetup.sh before running a build. Many scripts,
+including a majority of our automated build systems, do not do so. Make
+transparently exposes every environment variable as a make variable, so relying
+on variables that are only set up by envsetup.sh produces different output for
+local users than for scripted users.
+
+Many of these variables also include absolute path names, which we'd like to
+keep out of the generated files, so that you don't need to do a full rebuild if
+you move the source tree.
+
+To fix this, we're marking the variables that are set in envsetup.sh as
+deprecated in the makefiles. This will trigger a warning every time one is read
+(or written) inside Kati. Once all the warnings have been removed, we'll switch
+this to obsolete, and any references will become errors.
+
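+For example, an Android.mk assignment like the following would now trigger a
+Kati deprecation warning (a minimal illustration; `my_tool` is a hypothetical
+module name):
+
+``` make
+# Before: reads an environment variable that only envsetup.sh sets up
+my_tool_bin := $(ANDROID_HOST_OUT)/bin/my_tool
+# After: use the make equivalent from the table below
+my_tool_bin := $(HOST_OUT)/bin/my_tool
+```
+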
+### envsetup.sh variables with make equivalents
+
+| instead of                                                   | use                  |
+|--------------------------------------------------------------|----------------------|
+| OUT {#OUT}                                                   | OUT_DIR              |
+| ANDROID_HOST_OUT {#ANDROID_HOST_OUT}                         | HOST_OUT             |
+| ANDROID_PRODUCT_OUT {#ANDROID_PRODUCT_OUT}                   | PRODUCT_OUT          |
+| ANDROID_HOST_OUT_TESTCASES {#ANDROID_HOST_OUT_TESTCASES}     | HOST_OUT_TESTCASES   |
+| ANDROID_TARGET_OUT_TESTCASES {#ANDROID_TARGET_OUT_TESTCASES} | TARGET_OUT_TESTCASES |
+
+All of these make variables may be relative paths from the current directory,
+or absolute paths if the output directory was specified as an absolute path. If
+you need an absolute path, convert it inside the rule, so that it's not
+expanded into the generated ninja file:
+
+``` make
+$(PRODUCT_OUT)/gen.img: my/src/path/gen.sh
+	export PRODUCT_OUT=$$(cd $(PRODUCT_OUT); pwd); cd my/src/path; ./gen.sh -o $${PRODUCT_OUT}/gen.img
+```
+
+### ANDROID_BUILD_TOP  {#ANDROID_BUILD_TOP}
+
+In Android.mk files, you can always assume that the current directory is the
+root of the source tree, so this can just be replaced with '.' (which is what
+$TOP is hardcoded to), or removed entirely. If you need an absolute path, see
+the instructions above.
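+
+A minimal sketch of the substitution, reusing the `gen.sh` rule from the
+example above:
+
+``` make
+# Before: the recipe referenced $(ANDROID_BUILD_TOP)/my/src/path/gen.sh
+# After: paths are already relative to the root of the source tree
+$(PRODUCT_OUT)/gen.img: my/src/path/gen.sh
+	my/src/path/gen.sh -o $@
+```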
+
+### Stop using PATH directly  {#PATH}
+
+PATH isn't set only by envsetup.sh, but envsetup.sh does modify it. Because of
+that, it can easily differ between shells, and it's not ideal to reread the
+makefiles every time it changes.
+
+In most cases, you shouldn't need to touch PATH at all. When a rule needs to
+reference a particular binary that's part of the source tree or the outputs,
+it's preferable to use the path to the file itself (since you should already be
+adding that as a dependency).
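+
+A minimal sketch of that approach (`my/path/binary` is a placeholder, matching
+the rule below):
+
+``` make
+$(TARGET): my/path/binary
+	my/path/binary -o $@
+```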
+
+Depending on the rule, passing the file path itself may not be feasible due to
+layers of unchangeable scripts/binaries. In that case, be sure to add the
+dependency, but modify PATH within the rule itself:
+
+``` make
+$(TARGET): myscript my/path/binary
+	PATH=my/path:$$PATH myscript -o $@
+```
+
+### Stop using PYTHONPATH directly  {#PYTHONPATH}
+
+Like PATH, this isn't set only by envsetup.sh, but envsetup.sh does modify it.
+Because of that, it can easily differ between shells, and it's not ideal to
+reread the makefiles every time it changes.
+
+The best solution here is to start switching to Soong's Python build support,
+which packages the Python interpreter, libraries, and script into a single file
+that no longer needs PYTHONPATH. See fontchain_lint for examples of this:
+
+* [external/fonttools/Lib/fontTools/Android.bp] for python_library_host
+* [frameworks/base/Android.bp] for python_binary_host
+* [frameworks/base/data/fonts/Android.mk] to execute the python binary
+
+If you still need to use PYTHONPATH, do so within the rule itself, just like
+PATH:
+
+``` make
+$(TARGET): myscript.py $(sort $(shell find my/python/lib -name '*.py'))
+	PYTHONPATH=my/python/lib:$$PYTHONPATH myscript.py -o $@
+```
+
+### Other envsetup.sh variables  {#other_envsetup_variables}
+
+* ANDROID_TOOLCHAIN
+* ANDROID_TOOLCHAIN_2ND_ARCH
+* ANDROID_DEV_SCRIPTS
+* ANDROID_EMULATOR_PREBUILTS
+* ANDROID_PRE_BUILD_PATHS
+
+These are all exported from envsetup.sh, but don't have clear equivalents within
+the makefile system. If you need one of them, you'll have to set up your own
+version.
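+
+For example, a hedged sketch (the variable name and path are hypothetical and
+should point at whatever your module actually needs):
+
+``` make
+# Define your own variable instead of reading ANDROID_DEV_SCRIPTS from the
+# environment
+my_dev_scripts := development/scripts
+```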
+
+
+[build/soong/Changes.md]: https://android.googlesource.com/platform/build/soong/+/master/Changes.md
+[external/fonttools/Lib/fontTools/Android.bp]: https://android.googlesource.com/platform/external/fonttools/+/master/Lib/fontTools/Android.bp
+[frameworks/base/Android.bp]: https://android.googlesource.com/platform/frameworks/base/+/master/Android.bp
+[frameworks/base/data/fonts/Android.mk]: https://android.googlesource.com/platform/frameworks/base/+/master/data/fonts/Android.mk
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..47809a9
--- /dev/null
+++ b/README.md
@@ -0,0 +1,23 @@
+# Android Make Build System
+
+This is the Makefile-based portion of the Android Build System.
+
+For documentation on how to run a build, see [Usage.txt](Usage.txt).
+
+For a list of behavioral changes useful to Android.mk writers, see
+[Changes.md](Changes.md).
+
+For an outdated reference on Android.mk files, see
+[build-system.html](/core/build-system.html). Our Android.mk files look similar
+to, but are entirely different from, the Android.mk files used by the NDK build
+system. When searching for documentation elsewhere, ensure that it covers the
+platform build system -- most of it does not.
+
+This Makefile-based system is in the process of being replaced by [Soong], a
+new build system written in Go. During the transition, all of these makefiles
+are read by [Kati] and used to generate a ninja file instead of being executed
+directly. That ninja file is combined with one generated by Soong so that the
+build graph of the two systems can be merged and run as one.
+
+[Kati]: https://github.com/google/kati
+[Soong]: https://android.googlesource.com/platform/build/soong/+/master
diff --git a/README.txt b/Usage.txt
similarity index 100%
rename from README.txt
rename to Usage.txt
diff --git a/core/Makefile b/core/Makefile
index bea9b3e..4a3f2c4 100644
--- a/core/Makefile
+++ b/core/Makefile
@@ -530,6 +530,20 @@
 $(call dist-for-goals,droidcore,$(SOONG_TO_CONVERT))
 
 # -----------------------------------------------------------------
+# Modules that use -Wno-error, or that had the default -Wall -Werror added
+WALL_WERROR := $(PRODUCT_OUT)/wall_werror.txt
+$(WALL_WERROR):
+	@rm -f $@
+	echo "# Modules using -Wno-error" >> $@
+	for m in $(sort $(SOONG_MODULES_USING_WNO_ERROR) $(MODULES_USING_WNO_ERROR)); do echo $$m >> $@; done
+	echo "# Modules added default -Wall -Werror" >> $@
+	for m in $(sort $(SOONG_MODULES_ADDED_WERROR) $(MODULES_ADDED_WERROR)); do echo $$m >> $@; done
+	echo "# Modules added default -Wall" >> $@
+	for m in $(sort $(SOONG_MODULES_ADDED_WALL) $(MODULES_ADDED_WALL)); do echo $$m >> $@; done
+
+$(call dist-for-goals,droidcore,$(WALL_WERROR))
+
+# -----------------------------------------------------------------
 # The dev key is used to sign this package, and as the key required
 # for future OTA packages installed by this system.  Actual product
 # deliverables will be re-signed by hand.  We expect this file to
@@ -627,6 +641,8 @@
 	@echo "make $@: ignoring dependencies"
 	$(hide) $(MKBOOTFS) -d $(TARGET_OUT) $(TARGET_ROOT_OUT) | $(MINIGZIP) > $(INSTALLED_RAMDISK_TARGET)
 
+INSTALLED_BOOTIMAGE_TARGET := $(PRODUCT_OUT)/boot.img
+
 ifneq ($(strip $(TARGET_NO_KERNEL)),true)
 
 # -----------------------------------------------------------------
@@ -665,8 +681,6 @@
     --os_version $(PLATFORM_VERSION) \
     --os_patch_level $(PLATFORM_SECURITY_PATCH)
 
-INSTALLED_BOOTIMAGE_TARGET := $(PRODUCT_OUT)/boot.img
-
 # BOARD_USES_RECOVERY_AS_BOOT = true must have BOARD_BUILD_SYSTEM_ROOT_IMAGE = true.
 ifeq ($(BOARD_USES_RECOVERY_AS_BOOT),true)
 ifneq ($(BOARD_BUILD_SYSTEM_ROOT_IMAGE),true)
@@ -749,6 +763,13 @@
 endif # BOARD_USES_RECOVERY_AS_BOOT
 
 else	# TARGET_NO_KERNEL
+ifdef BOARD_PREBUILT_BOOTIMAGE
+ifneq ($(BOARD_BUILD_SYSTEM_ROOT_IMAGE),true)
+# Remove when b/63676296 is resolved.
+$(error Prebuilt bootimage is only supported for AB targets)
+endif
+$(eval $(call copy-one-file,$(BOARD_PREBUILT_BOOTIMAGE),$(INSTALLED_BOOTIMAGE_TARGET)))
+else
 INTERNAL_KERNEL_CMDLINE := $(strip $(BOARD_KERNEL_CMDLINE))
 # HACK: The top-level targets depend on the bootimage.  Not all targets
 # can produce a bootimage, though, and emulator targets need the ramdisk
@@ -757,6 +778,7 @@
 #       kernel-less inputs.
 INSTALLED_BOOTIMAGE_TARGET := $(INSTALLED_RAMDISK_TARGET)
 endif
+endif
 
 # -----------------------------------------------------------------
 # NOTICE files
@@ -1189,14 +1211,6 @@
 RECOVERY_RESOURCE_ZIP :=
 endif
 
-ifeq ($(TARGET_PRIVATE_RES_DIRS),)
-  $(info No private recovery resources for TARGET_DEVICE $(TARGET_DEVICE))
-endif
-
-ifeq ($(recovery_fstab),)
-  $(info No recovery.fstab for TARGET_DEVICE $(TARGET_DEVICE))
-endif
-
 INTERNAL_RECOVERYIMAGE_ARGS := \
 	$(addprefix --second ,$(INSTALLED_2NDBOOTLOADER_TARGET)) \
 	--kernel $(recovery_kernel) \
@@ -2192,7 +2206,7 @@
   $(shell find system/update_engine/scripts -name \*.pyc -prune -o -type f -print | sort) \
   $(shell find build/target/product/security -type f -name \*.x509.pem -o -name \*.pk8 -o \
       -name verity_key | sort) \
-  $(shell find device vendor -type f -name \*.pk8 -o -name verifiedboot\* -o \
+  $(shell find device $(wildcard vendor) -type f -name \*.pk8 -o -name verifiedboot\* -o \
       -name \*.x509.pem -o -name oem\*.prop | sort)
 
 OTATOOLS_RELEASETOOLS := \
@@ -2314,6 +2328,7 @@
 		$(INSTALLED_USERDATAIMAGE_TARGET) \
 		$(INSTALLED_CACHEIMAGE_TARGET) \
 		$(INSTALLED_VENDORIMAGE_TARGET) \
+		$(INSTALLED_VBMETAIMAGE_TARGET) \
 		$(INSTALLED_DTBOIMAGE_TARGET) \
 		$(INTERNAL_SYSTEMOTHERIMAGE_FILES) \
 		$(INSTALLED_ANDROID_INFO_TXT_TARGET) \
@@ -2534,10 +2549,22 @@
 	@# If breakpad symbols have been generated, add them to the zip.
 	$(hide) $(ACP) -r $(TARGET_OUT_BREAKPAD) $(zip_root)/BREAKPAD
 endif
+# BOARD_BUILD_DISABLED_VBMETAIMAGE is used to build a special vbmeta.img
+# that disables AVB verification. Its content is fixed, so we can just copy
+# it to $(zip_root)/IMAGES without passing any info into misc_info.txt for
+# regeneration.
+ifeq (true,$(BOARD_BUILD_DISABLED_VBMETAIMAGE))
+	$(hide) mkdir -p $(zip_root)/IMAGES
+	$(hide) cp $(INSTALLED_VBMETAIMAGE_TARGET) $(zip_root)/IMAGES/
+endif
 ifdef BOARD_PREBUILT_VENDORIMAGE
 	$(hide) mkdir -p $(zip_root)/IMAGES
 	$(hide) cp $(INSTALLED_VENDORIMAGE_TARGET) $(zip_root)/IMAGES/
 endif
+ifdef BOARD_PREBUILT_BOOTIMAGE
+	$(hide) mkdir -p $(zip_root)/IMAGES
+	$(hide) cp $(INSTALLED_BOOTIMAGE_TARGET) $(zip_root)/IMAGES/
+endif
 ifdef BOARD_PREBUILT_DTBOIMAGE
 	$(hide) mkdir -p $(zip_root)/PREBUILT_IMAGES
 	$(hide) cp $(INSTALLED_DTBOIMAGE_TARGET) $(zip_root)/PREBUILT_IMAGES/
diff --git a/core/binary.mk b/core/binary.mk
index bd1e601..e54edbe 100644
--- a/core/binary.mk
+++ b/core/binary.mk
@@ -1330,6 +1330,22 @@
 asm_objects += $(asm_objects_asm)
 endif
 
+###################################################################
+## When compiling a CFI-enabled target, use the .cfi variant of any
+## static dependencies (where they exist).
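+## For example (hypothetical library names): "libfoo libbar" becomes
+## "libfoo.cfi libbar" when only libfoo is listed in SOONG_CFI_STATIC_LIBRARIES.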
+##################################################################
+define use_soong_cfi_static_libraries
+  $(foreach l,$(1),$(if $(filter $(l),$(SOONG_CFI_STATIC_LIBRARIES)),\
+      $(l).cfi,$(l)))
+endef
+
+ifneq ($(filter cfi,$(my_sanitize)),)
+  my_whole_static_libraries := $(call use_soong_cfi_static_libraries,\
+    $(my_whole_static_libraries))
+  my_static_libraries := $(call use_soong_cfi_static_libraries,\
+    $(my_static_libraries))
+endif
+
 ###########################################################
 ## When compiling against the VNDK, use LL-NDK libraries
 ###########################################################
@@ -1664,13 +1680,31 @@
     my_cflags += -DANDROID_STRICT
 endif
 
-# Add -Werror if LOCAL_PATH is in the WARNING_DISALLOWED project list,
-# or not in the WARNING_ALLOWED project list.
-ifneq (,$(strip $(call find_warning_disallowed_projects,$(LOCAL_PATH))))
-  my_cflags_no_override += -Werror
-else
-  ifeq (,$(strip $(call find_warning_allowed_projects,$(LOCAL_PATH))))
-    my_cflags_no_override += -Werror
+# Check if -Werror or -Wno-error is used in C compiler flags.
+# Modules defined in $(SOONG_ANDROID_MK) are checked in soong's cc.go.
+ifneq ($(LOCAL_MODULE_MAKEFILE),$(SOONG_ANDROID_MK))
+  # Header libraries do not need cflags.
+  ifneq (HEADER_LIBRARIES,$(LOCAL_MODULE_CLASS))
+    # Prebuilt modules do not need cflags.
+    ifeq (,$(LOCAL_PREBUILT_MODULE_FILE))
+      my_all_cflags := $(my_cflags) $(my_cppflags) $(my_cflags_no_override)
+      # Issue warning if -Wno-error is used.
+      ifneq (,$(filter -Wno-error,$(my_all_cflags)))
+        $(eval MODULES_USING_WNO_ERROR := $(MODULES_USING_WNO_ERROR) $(LOCAL_MODULE_MAKEFILE):$(LOCAL_MODULE))
+      else
+        # Issue warning if -Werror is not used. Add it.
+        ifeq (,$(filter -Werror,$(my_all_cflags)))
+          # Add -Wall -Werror unless the project is in the WARNING_ALLOWED project list.
+          ifeq (,$(strip $(call find_warning_allowed_projects,$(LOCAL_PATH))))
+            $(eval MODULES_ADDED_WERROR := $(MODULES_ADDED_WERROR) $(LOCAL_MODULE_MAKEFILE):$(LOCAL_MODULE))
+            my_cflags := -Wall -Werror $(my_cflags)
+          else
+            $(eval MODULES_ADDED_WALL := $(MODULES_ADDED_WALL) $(LOCAL_MODULE_MAKEFILE):$(LOCAL_MODULE))
+            my_cflags := -Wall $(my_cflags)
+          endif
+        endif
+      endif
+    endif
   endif
 endif
 
diff --git a/core/clear_vars.mk b/core/clear_vars.mk
index bd605ec..99bd691 100644
--- a/core/clear_vars.mk
+++ b/core/clear_vars.mk
@@ -226,10 +226,14 @@
 LOCAL_SDK_VERSION:=
 LOCAL_SHARED_ANDROID_LIBRARIES:=
 LOCAL_SHARED_LIBRARIES:=
+LOCAL_SOONG_HEADER_JAR :=
+LOCAL_SOONG_DEX_JAR :=
+LOCAL_SOONG_JACOCO_REPORT_CLASSES_JAR :=
 # '',true
 LOCAL_SOURCE_FILES_ALL_GENERATED:=
 LOCAL_SRC_FILES:=
 LOCAL_SRC_FILES_EXCLUDE:=
+LOCAL_SRCJARS:=
 LOCAL_STATIC_ANDROID_LIBRARIES:=
 LOCAL_STATIC_JAVA_AAR_LIBRARIES:=
 LOCAL_STATIC_JAVA_LIBRARIES:=
diff --git a/core/config.mk b/core/config.mk
index bebc186..4bae6b3 100644
--- a/core/config.mk
+++ b/core/config.mk
@@ -59,7 +59,24 @@
 .DELETE_ON_ERROR:
 
 # Mark variables deprecated/obsolete
-$(KATI_deprecated_var PATH,Do not use PATH directly)
+CHANGES_URL := https://android.googlesource.com/platform/build/+/master/Changes.md
+$(KATI_deprecated_var PATH,Do not use PATH directly. See $(CHANGES_URL)#PATH)
+$(KATI_deprecated_var PYTHONPATH,Do not use PYTHONPATH directly. See $(CHANGES_URL)#PYTHONPATH)
+$(KATI_deprecated_var OUT,Use OUT_DIR instead. See $(CHANGES_URL)#OUT)
+$(KATI_deprecated_var ANDROID_HOST_OUT,Use HOST_OUT instead. See $(CHANGES_URL)#ANDROID_HOST_OUT)
+$(KATI_deprecated_var ANDROID_PRODUCT_OUT,Use PRODUCT_OUT instead. See $(CHANGES_URL)#ANDROID_PRODUCT_OUT)
+$(KATI_deprecated_var ANDROID_HOST_OUT_TESTCASES,Use HOST_OUT_TESTCASES instead. See $(CHANGES_URL)#ANDROID_HOST_OUT_TESTCASES)
+$(KATI_deprecated_var ANDROID_TARGET_OUT_TESTCASES,Use TARGET_OUT_TESTCASES instead. See $(CHANGES_URL)#ANDROID_TARGET_OUT_TESTCASES)
+$(KATI_deprecated_var ANDROID_BUILD_TOP,Use '.' instead. See $(CHANGES_URL)#ANDROID_BUILD_TOP)
+$(KATI_deprecated_var \
+  ANDROID_TOOLCHAIN \
+  ANDROID_TOOLCHAIN_2ND_ARCH \
+  ANDROID_DEV_SCRIPTS \
+  ANDROID_EMULATOR_PREBUILTS \
+  ANDROID_PRE_BUILD_PATHS \
+  ,See $(CHANGES_URL)#other_envsetup_variables)
+
+CHANGES_URL :=
 
 # Used to force goals to build.  Only use for conditionally defined goals.
 .PHONY: FORCE
@@ -67,6 +84,9 @@
 
 ORIGINAL_MAKECMDGOALS := $(MAKECMDGOALS)
 
+dist_goal := $(strip $(filter dist,$(MAKECMDGOALS)))
+MAKECMDGOALS := $(strip $(filter-out dist,$(MAKECMDGOALS)))
+
 # Tell python not to spam the source tree with .pyc files.  This
 # only has an effect on python 2.6 and above.
 export PYTHONDONTWRITEBYTECODE := 1
@@ -658,7 +678,6 @@
 
 FINDBUGS_DIR := external/owasp/sanitizer/tools/findbugs/bin
 FINDBUGS := $(FINDBUGS_DIR)/findbugs
-JACOCO_CLI_JAR := $(HOST_OUT_JAVA_LIBRARIES)/jacoco-cli$(COMMON_JAVA_PACKAGE_SUFFIX)
 
 # Tool to merge AndroidManifest.xmls
 ANDROID_MANIFEST_MERGER := $(JAVA) -classpath prebuilts/devtools/tools/lib/manifest-merger.jar com.android.manifmerger.Main merge
@@ -898,38 +917,7 @@
 APPS_DEFAULT_VERSION_NAME := $(PLATFORM_VERSION)
 endif
 
-# Projects clean of compiler warnings should be compiled with -Werror.
-# If most modules in a directory such as external/ have warnings,
-# the directory should be in ANDROID_WARNING_ALLOWED_PROJECTS list.
-# When some of its subdirectories are cleaned up, the subdirectories
-# can be added into ANDROID_WARNING_DISALLOWED_PROJECTS list, e.g.
-# external/fio/.
-ANDROID_WARNING_DISALLOWED_PROJECTS := \
-    art/% \
-    bionic/% \
-    external/fio/% \
-    hardware/interfaces/% \
-
-define find_warning_disallowed_projects
-    $(filter $(ANDROID_WARNING_DISALLOWED_PROJECTS),$(1)/)
-endef
-
-# Projects with compiler warnings are compiled without -Werror.
-ANDROID_WARNING_ALLOWED_PROJECTS := \
-    bootable/% \
-    cts/% \
-    dalvik/% \
-    development/% \
-    device/% \
-    external/% \
-    frameworks/% \
-    hardware/% \
-    packages/% \
-    system/% \
-    test/vts/% \
-    tools/adt/idea/android/ultimate/get_modification_time/jni/% \
-    vendor/% \
-
+# ANDROID_WARNING_ALLOWED_PROJECTS is generated by build/soong.
 define find_warning_allowed_projects
     $(filter $(ANDROID_WARNING_ALLOWED_PROJECTS),$(1)/)
 endef
diff --git a/core/config_sanitizers.mk b/core/config_sanitizers.mk
index 8bd9248..9415143 100644
--- a/core/config_sanitizers.mk
+++ b/core/config_sanitizers.mk
@@ -132,6 +132,12 @@
   my_sanitize_diag := $(filter-out cfi,$(my_sanitize_diag))
 endif
 
+# Disable CFI for host targets
+ifdef LOCAL_IS_HOST_MODULE
+  my_sanitize := $(filter-out cfi,$(my_sanitize))
+  my_sanitize_diag := $(filter-out cfi,$(my_sanitize_diag))
+endif
+
 # Support for local sanitize blacklist paths.
 ifneq ($(my_sanitize)$(my_global_sanitize),)
   ifneq ($(LOCAL_SANITIZE_BLACKLIST),)
diff --git a/core/definitions.mk b/core/definitions.mk
index 13bd47c..c8368b4 100644
--- a/core/definitions.mk
+++ b/core/definitions.mk
@@ -2569,9 +2569,9 @@
 define uncompress-dexs
 $(hide) if (zipinfo $@ '*.dex' 2>/dev/null | grep -v ' stor ' >/dev/null) ; then \
   rm -rf $(dir $@)uncompresseddexs && mkdir $(dir $@)uncompresseddexs; \
-  unzip $@ '*.dex' -d $(dir $@)uncompresseddexs && \
-  zip -d $@ '*.dex' && \
-  ( cd $(dir $@)uncompresseddexs && find . -type f | sort | zip -D -X -0 ../$(notdir $@) -@ ) && \
+  unzip -q $@ '*.dex' -d $(dir $@)uncompresseddexs && \
+  zip -qd $@ '*.dex' && \
+  ( cd $(dir $@)uncompresseddexs && find . -type f | sort | zip -qD -X -0 ../$(notdir $@) -@ ) && \
   rm -rf $(dir $@)uncompresseddexs; \
   fi
 endef
@@ -2581,9 +2581,9 @@
 define uncompress-shared-libs
 $(hide) if (zipinfo $@ $(PRIVATE_EMBEDDED_JNI_LIBS) 2>/dev/null | grep -v ' stor ' >/dev/null) ; then \
   rm -rf $(dir $@)uncompressedlibs && mkdir $(dir $@)uncompressedlibs; \
-  unzip $@ $(PRIVATE_EMBEDDED_JNI_LIBS) -d $(dir $@)uncompressedlibs && \
-  zip -d $@ 'lib/*.so' && \
-  ( cd $(dir $@)uncompressedlibs && find lib -type f | sort | zip -D -X -0 ../$(notdir $@) -@ ) && \
+  unzip -q $@ $(PRIVATE_EMBEDDED_JNI_LIBS) -d $(dir $@)uncompressedlibs && \
+  zip -qd $@ 'lib/*.so' && \
+  ( cd $(dir $@)uncompressedlibs && find lib -type f | sort | zip -qD -X -0 ../$(notdir $@) -@ ) && \
   rm -rf $(dir $@)uncompressedlibs; \
   fi
 endef
@@ -3281,6 +3281,30 @@
 # run test
 $(strip $(call test-validate-paths-are-subdirs))
 
+###########################################################
+## Validate jacoco class filters and convert them to
+## file arguments
+## Jacoco class filters are comma-separated lists of class names
+## (e.g. android.app.Application), and may have '*' as the
+## last character to match all classes in a package,
+## including subpackages.
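+## For example (a hypothetical filter): "android.app.*,android.os.Debug"
+## becomes the file arguments "android/app/* android/os/Debug".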
+define jacoco-class-filter-to-file-args
+$(strip $(call jacoco-validate-file-args,\
+  $(subst $(comma),$(space),\
+    $(subst .,/,\
+      $(strip $(1))))))
+endef
+
+define jacoco-validate-file-args
+$(strip $(1)\
+  $(call validate-paths-are-subdirs,$(1))
+  $(foreach arg,$(1),\
+    $(if $(findstring ?,$(arg)),$(call pretty-error,\
+      '?' filters are not supported in LOCAL_JACK_COVERAGE_INCLUDE_FILTER or LOCAL_JACK_COVERAGE_EXCLUDE_FILTER))\
+    $(if $(findstring *,$(patsubst %*,%,$(arg))),$(call pretty-error,\
+      '*' is only supported at the end of a filter in LOCAL_JACK_COVERAGE_INCLUDE_FILTER or LOCAL_JACK_COVERAGE_EXCLUDE_FILTER))\
+  ))
+endef
 
 ###########################################################
 ## Other includes
diff --git a/core/dex_preopt_libart_boot.mk b/core/dex_preopt_libart_boot.mk
index 302cc8b..8b71198 100644
--- a/core/dex_preopt_libart_boot.mk
+++ b/core/dex_preopt_libart_boot.mk
@@ -97,6 +97,8 @@
 		--runtime-arg -Xnorelocate --compile-pic \
 		--no-generate-debug-info --generate-build-id \
 		--multi-image --no-inline-from=core-oj.jar \
+		--abort-on-hard-verifier-error \
+		--abort-on-soft-verifier-error \
 		$(PRODUCT_DEX_PREOPT_BOOT_FLAGS) $(GLOBAL_DEXPREOPT_FLAGS) $(ART_BOOT_IMAGE_EXTRA_ARGS)
 
 endif
diff --git a/core/distdir.mk b/core/distdir.mk
index 89c5966..c074186 100644
--- a/core/distdir.mk
+++ b/core/distdir.mk
@@ -17,9 +17,6 @@
 # When specifying "dist", the user has asked that we copy the important
 # files from this build into DIST_DIR.
 
-dist_goal := $(strip $(filter dist,$(MAKECMDGOALS)))
-MAKECMDGOALS := $(strip $(filter-out dist,$(MAKECMDGOALS)))
-
 ifdef dist_goal
 
 # $(1): source file
diff --git a/core/jacoco.mk b/core/jacoco.mk
index 9e6fd07..f51790d 100644
--- a/core/jacoco.mk
+++ b/core/jacoco.mk
@@ -19,41 +19,22 @@
 # (at the time of authorship, it is included by java.mk and
 # java_host_library.mk)
 
-my_include_filter :=
-my_exclude_filter :=
+# determine Jacoco include/exclude filters even when coverage is not enabled
+# to get syntax checking on LOCAL_JACK_COVERAGE_(INCLUDE|EXCLUDE)_FILTER
+# copy filters from Jack but also skip some known java packages
+my_include_filter := $(strip $(LOCAL_JACK_COVERAGE_INCLUDE_FILTER))
+my_exclude_filter := $(strip $(DEFAULT_JACOCO_EXCLUDE_FILTER),$(LOCAL_JACK_COVERAGE_EXCLUDE_FILTER))
+
+my_include_args := $(call jacoco-class-filter-to-file-args, $(my_include_filter))
+my_exclude_args := $(call jacoco-class-filter-to-file-args, $(my_exclude_filter))
+
+# single-quote each arg of the include args so the '*' gets evaluated by zip;
+# don't quote the exclude args, as they need to be evaluated by bash for rm -rf
+my_include_args := $(foreach arg,$(my_include_args),'$(arg)')
 
 ifeq ($(LOCAL_EMMA_INSTRUMENT),true)
-  # determine Jacoco include/exclude filters
-  DEFAULT_JACOCO_EXCLUDE_FILTER := org/junit/*,org/jacoco/*,org/mockito/*
-  # copy filters from Jack but also skip some known java packages
-  my_include_filter := $(strip $(LOCAL_JACK_COVERAGE_INCLUDE_FILTER))
-  my_exclude_filter := $(strip $(DEFAULT_JACOCO_EXCLUDE_FILTER),$(LOCAL_JACK_COVERAGE_EXCLUDE_FILTER))
-
-  # replace '.' with '/' and ',' with ' ', and quote each arg
-  ifneq ($(strip $(my_include_filter)),)
-    my_include_args := $(strip $(my_include_filter))
-
-    my_include_args := $(subst .,/,$(my_include_args))
-    my_include_args := '$(subst $(comma),' ',$(my_include_args))'
-  else
-    my_include_args :=
-  endif
-
-  # replace '.' with '/' and ',' with ' '
-  ifneq ($(strip $(my_exclude_filter)),)
-    my_exclude_args := $(my_exclude_filter)
-
-    my_exclude_args := $(subst .,/,$(my_exclude_args))
-    my_exclude_args := $(subst $(comma)$(comma),$(comma),$(my_exclude_args))
-    my_exclude_args := $(subst $(comma), ,$(my_exclude_args))
-  else
-    my_exclude_args :=
-  endif
-
   my_files := $(intermediates.COMMON)/jacoco
 
-  $(call validate-paths-are-subdirs,$(my_exclude_args))
-
   # make a task that unzips the classes that we want to instrument from the
   # input jar
   my_unzipped_path := $(my_files)/work/classes-to-instrument/classes
diff --git a/core/soong_config.mk b/core/soong_config.mk
index 5ebd123..a90e5af 100644
--- a/core/soong_config.mk
+++ b/core/soong_config.mk
@@ -119,6 +119,8 @@
 
 $(call add_json_bool, UseGoma,                           $(filter-out false,$(USE_GOMA)))
 
+$(call add_json_str,  DistDir,                           $(if $(dist_goal), $(DIST_DIR)))
+
 _contents := $(subst $(comma)$(newline)__SV_END,$(newline)}$(newline),$(_contents)__SV_END)
 
 $(file >$(SOONG_VARIABLES).tmp,$(_contents))
diff --git a/core/soong_java_prebuilt.mk b/core/soong_java_prebuilt.mk
index 53ad50d..ccbe745 100644
--- a/core/soong_java_prebuilt.mk
+++ b/core/soong_java_prebuilt.mk
@@ -2,6 +2,7 @@
 # Extra inputs:
 # LOCAL_SOONG_HEADER_JAR
 # LOCAL_SOONG_DEX_JAR
+# LOCAL_SOONG_JACOCO_REPORT_CLASSES_JAR
 
 ifneq ($(LOCAL_MODULE_MAKEFILE),$(SOONG_ANDROID_MK))
   $(call pretty-error,soong_java_prebuilt.mk may only be used from Soong)
@@ -18,13 +19,12 @@
 full_classes_header_jar := $(intermediates.COMMON)/classes-header.jar
 common_javalib.jar := $(intermediates.COMMON)/javalib.jar
 
-LOCAL_FULL_CLASSES_PRE_JACOCO_JAR := $(LOCAL_PREBUILT_MODULE_FILE)
+$(eval $(call copy-one-file,$(LOCAL_PREBUILT_MODULE_FILE),$(full_classes_jar)))
 
-#######################################
-include $(BUILD_SYSTEM)/jacoco.mk
-#######################################
-
-$(eval $(call copy-one-file,$(LOCAL_FULL_CLASSES_JACOCO_JAR),$(full_classes_jar)))
+ifdef LOCAL_SOONG_JACOCO_REPORT_CLASSES_JAR
+  $(eval $(call copy-one-file,$(LOCAL_SOONG_JACOCO_REPORT_CLASSES_JAR),\
+    $(intermediates.COMMON)/jacoco-report-classes.jar))
+endif
 
 ifneq ($(TURBINE_DISABLED),false)
 ifdef LOCAL_SOONG_HEADER_JAR
@@ -94,7 +94,3 @@
 my_common := COMMON
 include $(BUILD_SYSTEM)/link_type.mk
 endif # !LOCAL_IS_HOST_MODULE
-
-# Built in equivalent to include $(CLEAR_VARS)
-LOCAL_SOONG_HEADER_JAR :=
-LOCAL_SOONG_DEX_JAR :=
diff --git a/help.sh b/help.sh
index c031dcc..3f39c77 100755
--- a/help.sh
+++ b/help.sh
@@ -15,7 +15,7 @@
 m -j [<goals>]              # Execute the configured build.
 
 Usage of "m" imitates usage of the program "make".
-See '"${SCRIPT_DIR}"'/README.txt for more info about build usage and concepts.
+See '"${SCRIPT_DIR}"'/Usage.txt for more info about build usage and concepts.
 
 Common goals are:
 
diff --git a/navbar.md b/navbar.md
new file mode 100644
index 0000000..e218a78
--- /dev/null
+++ b/navbar.md
@@ -0,0 +1,4 @@
+* [Home](/README.md)
+* [Usage](/Usage.txt)
+* [Changes](/Changes.md)
+* [Outdated Reference](/core/build-system.html)
diff --git a/target/product/core_minimal.mk b/target/product/core_minimal.mk
index b2ee7bb..05e3b45 100644
--- a/target/product/core_minimal.mk
+++ b/target/product/core_minimal.mk
@@ -159,5 +159,9 @@
     tombstoned.max_tombstone_count=50
 endif
 
+PRODUCT_DEFAULT_PROPERTY_OVERRIDES += \
+    ro.logd.size.stats=64K \
+    log.tag.stats_log=I
+
 $(call inherit-product, $(SRC_TARGET_DIR)/product/runtime_libart.mk)
 $(call inherit-product, $(SRC_TARGET_DIR)/product/base.mk)
diff --git a/target/product/vndk/Android.mk b/target/product/vndk/Android.mk
index e7f07e7..6e8a85f 100644
--- a/target/product/vndk/Android.mk
+++ b/target/product/vndk/Android.mk
@@ -100,7 +100,9 @@
 LOCAL_REQUIRED_MODULES := \
     $(addsuffix .vendor,$(VNDK_CORE_LIBRARIES)) \
     $(addsuffix .vendor,$(VNDK_SAMEPROCESS_LIBRARIES)) \
-    $(LLNDK_LIBRARIES)
+    $(LLNDK_LIBRARIES) \
+    llndk.libraries.txt \
+    vndksp.libraries.txt
 
 include $(BUILD_PHONY_PACKAGE)
 endif # BOARD_VNDK_VERSION is set
diff --git a/tools/releasetools/add_img_to_target_files.py b/tools/releasetools/add_img_to_target_files.py
index bff6608..a882685 100755
--- a/tools/releasetools/add_img_to_target_files.py
+++ b/tools/releasetools/add_img_to_target_files.py
@@ -346,9 +346,15 @@
     cmd.extend(["--include_descriptors_from_image", img_path])
 
 
-def AddVBMeta(output_zip, boot_img_path, system_img_path, vendor_img_path,
-              dtbo_img_path, prefix="IMAGES/"):
-  """Create a VBMeta image and store it in output_zip."""
+def AddVBMeta(output_zip, partitions, prefix="IMAGES/"):
+  """Creates a VBMeta image and store it in output_zip.
+
+  Args:
+    output_zip: The output zip file, which needs to be already open.
+    partitions: A dict that's keyed by partition names with image paths as
+        values. Only valid partition names are accepted, which include 'boot',
+        'recovery', 'system', 'vendor', and 'dtbo'.
+  """
   img = OutputFile(output_zip, OPTIONS.input_tmp, prefix, "vbmeta.img")
   if os.path.exists(img.input_name):
     print("vbmeta.img already exists in %s; not rebuilding..." % (prefix,))
@@ -361,10 +367,12 @@
   public_key_dir = tempfile.mkdtemp(prefix="avbpubkey-")
   OPTIONS.tempfiles.append(public_key_dir)
 
-  AppendVBMetaArgsForPartition(cmd, "boot", boot_img_path, public_key_dir)
-  AppendVBMetaArgsForPartition(cmd, "system", system_img_path, public_key_dir)
-  AppendVBMetaArgsForPartition(cmd, "vendor", vendor_img_path, public_key_dir)
-  AppendVBMetaArgsForPartition(cmd, "dtbo", dtbo_img_path, public_key_dir)
+  for partition, path in partitions.items():
+    assert partition in common.AVB_PARTITIONS, 'Unknown partition: %s' % (
+        partition,)
+    assert os.path.exists(path), 'Failed to find %s for partition %s' % (
+        path, partition)
+    AppendVBMetaArgsForPartition(cmd, partition, path, public_key_dir)
 
   args = OPTIONS.info_dict.get("avb_vbmeta_args")
   if args and args.strip():
@@ -544,6 +552,10 @@
     if fp:
       OPTIONS.info_dict["avb_salt"] = hashlib.sha256(fp).hexdigest()
 
+  # A map between partition names and their paths, which could be used when
+  # generating the AVB vbmeta image.
+  partitions = dict()
+
   def banner(s):
     print("\n\n++++ " + s + " ++++\n\n")
 
@@ -553,8 +565,8 @@
       "IMAGES/boot.img", "boot.img", OPTIONS.input_tmp, "BOOT")
   # boot.img may be unavailable in some targets (e.g. aosp_arm64).
   if boot_image:
-    boot_img_path = os.path.join(OPTIONS.input_tmp, "IMAGES", "boot.img")
-    if not os.path.exists(boot_img_path):
+    partitions['boot'] = os.path.join(OPTIONS.input_tmp, "IMAGES", "boot.img")
+    if not os.path.exists(partitions['boot']):
       boot_image.WriteToDir(OPTIONS.input_tmp)
       if output_zip:
         boot_image.AddToZip(output_zip)
@@ -565,9 +577,9 @@
     recovery_image = common.GetBootableImage(
         "IMAGES/recovery.img", "recovery.img", OPTIONS.input_tmp, "RECOVERY")
     assert recovery_image, "Failed to create recovery.img."
-    recovery_img_path = os.path.join(
+    partitions['recovery'] = os.path.join(
         OPTIONS.input_tmp, "IMAGES", "recovery.img")
-    if not os.path.exists(recovery_img_path):
+    if not os.path.exists(partitions['recovery']):
       recovery_image.WriteToDir(OPTIONS.input_tmp)
       if output_zip:
         recovery_image.AddToZip(output_zip)
@@ -586,15 +598,17 @@
           recovery_two_step_image.AddToZip(output_zip)
 
   banner("system")
-  system_img_path = AddSystem(
+  partitions['system'] = system_img_path = AddSystem(
       output_zip, recovery_img=recovery_image, boot_img=boot_image)
-  vendor_img_path = None
+
   if has_vendor:
     banner("vendor")
-    vendor_img_path = AddVendor(output_zip)
+    partitions['vendor'] = vendor_img_path = AddVendor(output_zip)
+
   if has_system_other:
     banner("system_other")
     AddSystemOther(output_zip)
+
   if not OPTIONS.is_signing:
     banner("userdata")
     AddUserdata(output_zip)
@@ -605,15 +619,13 @@
     banner("partition-table")
     AddPartitionTable(output_zip)
 
-  dtbo_img_path = None
   if OPTIONS.info_dict.get("has_dtbo") == "true":
     banner("dtbo")
-    dtbo_img_path = AddDtbo(output_zip)
+    partitions['dtbo'] = AddDtbo(output_zip)
 
   if OPTIONS.info_dict.get("avb_enable") == "true":
     banner("vbmeta")
-    AddVBMeta(output_zip, boot_img_path, system_img_path, vendor_img_path,
-              dtbo_img_path)
+    AddVBMeta(output_zip, partitions)
 
   # For devices using A/B update, copy over images from RADIO/ and/or
   # VENDOR_IMAGES/ to IMAGES/ and make sure we have all the needed
diff --git a/tools/releasetools/blockimgdiff.py b/tools/releasetools/blockimgdiff.py
index 6bca99e..1ef55ff 100644
--- a/tools/releasetools/blockimgdiff.py
+++ b/tools/releasetools/blockimgdiff.py
@@ -1178,42 +1178,10 @@
   def FindTransfers(self):
     """Parse the file_map to generate all the transfers."""
 
-    def AddSplitTransfers(tgt_name, src_name, tgt_ranges, src_ranges,
-                          style, by_id):
-      """Add one or multiple Transfer()s by splitting large files.
-
-      For BBOTA v3, we need to stash source blocks for resumable feature.
-      However, with the growth of file size and the shrink of the cache
-      partition source blocks are too large to be stashed. If a file occupies
-      too many blocks, we split it into smaller pieces by getting multiple
-      Transfer()s.
-
-      The downside is that after splitting, we may increase the package size
-      since the split pieces don't align well. According to our experiments,
-      1/8 of the cache size as the per-piece limit appears to be optimal.
-      Compared to the fixed 1024-block limit, it reduces the overall package
-      size by 30% for volantis, and 20% for angler and bullhead."""
-
-      assert style == "diff"
-      # Possibly split large files into smaller chunks.
+    def AddSplitTransfers(tgt_name, src_name, tgt_ranges, src_ranges, style,
+                          by_id):
+      """Add one or multiple Transfer()s by splitting large files."""
       pieces = 0
-
-      # Change nothing for small files.
-      if (tgt_ranges.size() <= max_blocks_per_transfer and
-          src_ranges.size() <= max_blocks_per_transfer):
-        Transfer(tgt_name, src_name, tgt_ranges, src_ranges,
-                 self.tgt.RangeSha1(tgt_ranges), self.src.RangeSha1(src_ranges),
-                 style, by_id)
-        return
-
-      if tgt_name.split(".")[-1].lower() in ("apk", "jar", "zip"):
-        split_enable = (not self.disable_imgdiff and src_ranges.monotonic and
-                        tgt_ranges.monotonic)
-        if split_enable and (self.tgt.RangeSha1(tgt_ranges) !=
-                             self.src.RangeSha1(src_ranges)):
-          large_apks.append((tgt_name, src_name, tgt_ranges, src_ranges))
-          return
-
       while (tgt_ranges.size() > max_blocks_per_transfer and
              src_ranges.size() > max_blocks_per_transfer):
         tgt_split_name = "%s-%d" % (tgt_name, pieces)
@@ -1239,6 +1207,43 @@
                  self.tgt.RangeSha1(tgt_ranges), self.src.RangeSha1(src_ranges),
                  style, by_id)
 
+    def FindZipsAndAddSplitTransfers(tgt_name, src_name, tgt_ranges,
+                                     src_ranges, style, by_id):
+      """Find all the zip archives and add split transfers for the other files.
+
+      For BBOTA v3, we need to stash source blocks for resumable feature.
+      However, with the growth of file size and the shrink of the cache
+      partition, source blocks are too large to be stashed. If a file occupies
+      too many blocks, we split it into smaller pieces by getting multiple
+      Transfer()s.
+
+      The downside is that after splitting, we may increase the package size
+      since the split pieces don't align well. According to our experiments,
+      1/8 of the cache size as the per-piece limit appears to be optimal.
+      Compared to the fixed 1024-block limit, it reduces the overall package
+      size by 30% for volantis, and 20% for angler and bullhead."""
+
+      assert style == "diff"
+
+      # Change nothing for small files.
+      if (tgt_ranges.size() <= max_blocks_per_transfer and
+          src_ranges.size() <= max_blocks_per_transfer):
+        Transfer(tgt_name, src_name, tgt_ranges, src_ranges,
+                 self.tgt.RangeSha1(tgt_ranges), self.src.RangeSha1(src_ranges),
+                 style, by_id)
+        return
+
+      if tgt_name.split(".")[-1].lower() in ("apk", "jar", "zip"):
+        split_enable = (not self.disable_imgdiff and src_ranges.monotonic and
+                        tgt_ranges.monotonic)
+        if split_enable and (self.tgt.RangeSha1(tgt_ranges) !=
+                             self.src.RangeSha1(src_ranges)):
+          large_apks.append((tgt_name, src_name, tgt_ranges, src_ranges))
+          return
+
+      AddSplitTransfers(tgt_name, src_name, tgt_ranges, src_ranges,
+                        style, by_id)
+
     def AddTransfer(tgt_name, src_name, tgt_ranges, src_ranges, style, by_id,
                     split=False):
       """Wrapper function for adding a Transfer()."""
@@ -1287,7 +1292,7 @@
           assert tgt_changed + tgt_skipped.size() == tgt_size
           print('%10d %10d (%6.2f%%) %s' % (tgt_skipped.size(), tgt_size,
                 tgt_skipped.size() * 100.0 / tgt_size, tgt_name))
-          AddSplitTransfers(
+          FindZipsAndAddSplitTransfers(
               "%s-skipped" % (tgt_name,),
               "%s-skipped" % (src_name,),
               tgt_skipped, src_skipped, style, by_id)
@@ -1304,7 +1309,7 @@
             return
 
       # Add the transfer(s).
-      AddSplitTransfers(
+      FindZipsAndAddSplitTransfers(
           tgt_name, src_name, tgt_ranges, src_ranges, style, by_id)
 
     def ParseAndValidateSplitInfo(patch_size, tgt_ranges, src_ranges,
@@ -1403,10 +1408,13 @@
                src_file, tgt_file, patch_file]
         p = common.Run(cmd, stdout=subprocess.PIPE)
         p.communicate()
-        # TODO(xunchang) fall back to the normal split if imgdiff fails.
         if p.returncode != 0:
-          raise ValueError("Failed to create patch between {} and {}".format(
-              src_name, tgt_name))
+          print("Failed to create patch between {} and {},"
+                " falling back to bsdiff".format(src_name, tgt_name))
+          with transfer_lock:
+            AddSplitTransfers(tgt_name, src_name, tgt_ranges, src_ranges,
+                              "diff", self.transfers)
+          continue
 
         with open(patch_info_file) as patch_info:
           lines = patch_info.readlines()
diff --git a/tools/releasetools/check_ota_package_signature.py b/tools/releasetools/check_ota_package_signature.py
index 1f8b7bb..8106d06 100755
--- a/tools/releasetools/check_ota_package_signature.py
+++ b/tools/releasetools/check_ota_package_signature.py
@@ -22,7 +22,10 @@
 
 import argparse
 import common
+import os
+import os.path
 import re
+import site
 import subprocess
 import sys
 import tempfile
@@ -32,7 +35,12 @@
 from hashlib import sha256
 
 # 'update_payload' package is under 'system/update_engine/scripts/', which
-# should to be included in PYTHONPATH.
+# should be included in PYTHONPATH. Try to set it up automatically if
+# ANDROID_BUILD_TOP is available.
+top = os.getenv('ANDROID_BUILD_TOP')
+if top:
+  site.addsitedir(os.path.join(top, 'system', 'update_engine', 'scripts'))
+
 from update_payload.payload import Payload
 from update_payload.update_metadata_pb2 import Signatures
 
diff --git a/tools/releasetools/common.py b/tools/releasetools/common.py
index f80d0ec..75c86cc 100644
--- a/tools/releasetools/common.py
+++ b/tools/releasetools/common.py
@@ -452,10 +452,12 @@
   else:
     cmd.extend(["--output", img.name])
 
+  # "boot" or "recovery", without extension.
+  partition_name = os.path.basename(sourcedir).lower()
+
   p = Run(cmd, stdout=subprocess.PIPE)
   p.communicate()
-  assert p.returncode == 0, "mkbootimg of %s image failed" % (
-      os.path.basename(sourcedir),)
+  assert p.returncode == 0, "mkbootimg of %s image failed" % (partition_name,)
 
   if (info_dict.get("boot_signer", None) == "true" and
       info_dict.get("verity_key", None)):
@@ -464,7 +466,7 @@
     if two_step_image:
       path = "/boot"
     else:
-      path = "/" + os.path.basename(sourcedir).lower()
+      path = "/" + partition_name
     cmd = [OPTIONS.boot_signer_path]
     cmd.extend(OPTIONS.boot_signer_args)
     cmd.extend([path, img.name,
@@ -476,7 +478,7 @@
 
   # Sign the image if vboot is non-empty.
   elif info_dict.get("vboot", None):
-    path = "/" + os.path.basename(sourcedir).lower()
+    path = "/" + partition_name
     img_keyblock = tempfile.NamedTemporaryFile()
     # We have switched from the prebuilt futility binary to using the tool
     # (futility-host) built from the source. Override the setting in the old
@@ -503,15 +505,16 @@
     avbtool = os.getenv('AVBTOOL') or info_dict["avb_avbtool"]
     part_size = info_dict["boot_size"]
     cmd = [avbtool, "add_hash_footer", "--image", img.name,
-           "--partition_size", str(part_size), "--partition_name", "boot"]
-    AppendAVBSigningArgs(cmd, "boot")
+           "--partition_size", str(part_size), "--partition_name",
+           partition_name]
+    AppendAVBSigningArgs(cmd, partition_name)
     args = info_dict.get("avb_boot_add_hash_footer_args")
     if args and args.strip():
       cmd.extend(shlex.split(args))
     p = Run(cmd, stdout=subprocess.PIPE)
     p.communicate()
     assert p.returncode == 0, "avbtool add_hash_footer of %s failed" % (
-        os.path.basename(OPTIONS.input_tmp))
+        partition_name,)
 
   img.seek(os.SEEK_SET, 0)
   data = img.read()