Don't use implementation-defined format with CPU consumers
If the virtual display surface is being consumed by the CPU, its
buffers can't use HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, since the
CPU consumer has no way to find out which format gralloc actually
chose. So for CPU-consumer surfaces, use the BufferQueue's default
format instead, which the consumer can set.
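
A rough sketch of how that choice could be made when the surface is
set up (illustrative only; "sink" stands for the IGraphicBufferProducer
handed to the VirtualDisplaySurface, and the surrounding code is not
copied from this patch):

    // If the sink's consumer reads or writes buffers with the CPU, fall
    // back to the format it set on the BufferQueue; otherwise let gralloc
    // pick by requesting IMPLEMENTATION_DEFINED.
    int sinkUsage = 0;
    sink->query(NATIVE_WINDOW_CONSUMER_USAGE_BITS, &sinkUsage);
    if (sinkUsage & (GRALLOC_USAGE_SW_READ_MASK | GRALLOC_USAGE_SW_WRITE_MASK)) {
        int sinkFormat = 0;
        sink->query(NATIVE_WINDOW_FORMAT, &sinkFormat);
        mDefaultOutputFormat = sinkFormat;
    } else {
        mDefaultOutputFormat = HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED;
    }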
A better but more invasive change would be to let the consumer require
a particular format (or set of formats?) and prevent the producer from
requesting a different one.
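
Purely as an illustration of that idea (no such method exists on the
consumer interface today; the name and signature below are made up):

    // Hypothetical consumer-side API: the consumer declares the formats
    // it can handle, and the producer's dequeueBuffer() would reject any
    // request for a format outside this set.
    virtual status_t setAcceptableBufferFormats(
            const Vector<uint32_t>& formats) = 0;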
Bug: 11479817
Change-Id: I5b20ee6ac1146550e8799b806e14661d279670c0
diff --git a/services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.h b/services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.h
index 57b5554..1e85ac4 100644
--- a/services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.h
+++ b/services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.h
@@ -132,6 +132,7 @@
const int32_t mDisplayId;
const String8 mDisplayName;
sp<IGraphicBufferProducer> mSource[2]; // indexed by SOURCE_*
+ uint32_t mDefaultOutputFormat; // default format for output (sink) buffers
//
// Inter-frame state