Camera: reset presentation timestamp more aggressively
On some platforms, VSYNC may run at a slightly lower rate
than the camera output. With the current approach, the selected
timeline index may keep increasing until it reaches its maximum
value, resulting in long latency.
Address this by:
- Only use up to the first 3 expected presentation times.
- Use a bias factor that decreases the minimum presentation
interval as the timeline index increases (see the sketch below).
- Reset the presentation time as soon as the capture interval
exceeds 1.5 * 33ms.
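
A minimal standalone sketch of the bias computation (assuming
kMaxTimelines = 3, matching the diff below; this program is
illustrative only and is not part of the change):

#include <cstdio>

int main() {
    // Bias towards smaller timeline indices, as in the patched loop:
    //   i = 0 -> bias = +1.0 (require the full kTimelineThresholdNs margin)
    //   i = 1 -> bias =  0.0 (no extra margin)
    //   i = 2 -> bias = -1.0 (accept a present time kTimelineThresholdNs sooner)
    const int maxTimelines = 3;  // assumed value of kMaxTimelines
    for (int i = 0; i < maxTimelines; i++) {
        float bias = 1.0f - 2.0f * i / (maxTimelines - 1);
        std::printf("timeline %d: bias = %+.1f\n", i, bias);
    }
    return 0;
}
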
Test: Run GoogleCamera at 24, 30, and 60fps and analyze frame drop rate
Test: Run camcorder preview/record for 10 iterations and observe preview
Bug: 236707639
Change-Id: Iabd4f7b17a6ee1eb6772fbb943c8800f48a40869
diff --git a/services/camera/libcameraservice/device3/Camera3OutputStream.cpp b/services/camera/libcameraservice/device3/Camera3OutputStream.cpp
index 7e2ea49..1e20ee0 100644
--- a/services/camera/libcameraservice/device3/Camera3OutputStream.cpp
+++ b/services/camera/libcameraservice/device3/Camera3OutputStream.cpp
@@ -1432,16 +1432,27 @@
int minVsyncs = (mMinExpectedDuration - vsyncEventData.frameInterval / 2) /
vsyncEventData.frameInterval;
if (minVsyncs < 0) minVsyncs = 0;
- nsecs_t minInterval = minVsyncs * vsyncEventData.frameInterval + kTimelineThresholdNs;
- // Find best timestamp in the vsync timeline:
+ nsecs_t minInterval = minVsyncs * vsyncEventData.frameInterval;
+ // Find best timestamp in the vsync timelines:
+ // - Only use at most 3 timelines to avoid long latency
// - closest to the ideal present time,
// - deadline timestamp is greater than the current time, and
// - the candidate present time is at least minInterval in the future
// compared to last present time.
- for (const auto& vsyncTime : vsyncEventData.frameTimelines) {
+ int maxTimelines = std::min(kMaxTimelines, (int)VsyncEventData::kFrameTimelinesLength);
+ float biasForShortDelay = 1.0f;
+    for (int i = 0; i < maxTimelines; i++) {
+ const auto& vsyncTime = vsyncEventData.frameTimelines[i];
+ if (minVsyncs > 0) {
+ // Bias towards using smaller timeline index:
+ // i = 0: bias = 1
+ // i = maxTimelines-1: bias = -1
+ biasForShortDelay = 1.0 - 2.0 * i / (maxTimelines - 1);
+ }
if (std::abs(vsyncTime.expectedPresentationTime - idealPresentT) < minDiff &&
vsyncTime.deadlineTimestamp >= currentTime &&
- vsyncTime.expectedPresentationTime > mLastPresentTime + minInterval) {
+ vsyncTime.expectedPresentationTime >
+ mLastPresentTime + minInterval + biasForShortDelay * kTimelineThresholdNs) {
expectedPresentT = vsyncTime.expectedPresentationTime;
minDiff = std::abs(vsyncTime.expectedPresentationTime - idealPresentT);
}
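
Note: the reset path from the third bullet (1.5 * 33ms) is not part
of this hunk. A hedged sketch of what such a check could look like;
kNominalFrameDurationNs, captureIntervalNs, and the helper name are
assumptions, not identifiers taken from Camera3OutputStream.cpp:

#include <cstdint>

using nsecs_t = int64_t;  // Android's nanosecond timestamp type

// Assumed nominal 30fps frame duration from the commit message.
constexpr nsecs_t kNominalFrameDurationNs = 33000000;  // 33ms

// Sketch only: request a presentation-time reset once the capture
// interval exceeds 1.5x the nominal frame duration.
bool shouldResetPresentTime(nsecs_t captureIntervalNs) {
    return captureIntervalNs > kNominalFrameDurationNs * 3 / 2;
}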