Camera: reset presentation timestamp more aggressively

On some platforms, VSYNC may run at a slightly lower rate
than the camera output. With the current approach, the selected
timeline index may keep increasing until it reaches the maximum
value, resulting in long latency.

Address this by:
- Only using up to the first 3 expected presentation times.
- Using a bias factor that decreases the minimum presentation
  interval as the timeline index increases.
- Resetting the presentation time as soon as the capture interval
  exceeds 1.5 * 33ms.

Test: Run GoogleCamera at 24, 30, and 60fps and analyze frame drop rate
Test: Run camcorder preview/record for 10 iterations and observe preview
Bug: 236707639
Change-Id: Iabd4f7b17a6ee1eb6772fbb943c8800f48a40869
diff --git a/services/camera/libcameraservice/device3/Camera3OutputStream.h b/services/camera/libcameraservice/device3/Camera3OutputStream.h
index a3e8f63..e8065ce 100644
--- a/services/camera/libcameraservice/device3/Camera3OutputStream.h
+++ b/services/camera/libcameraservice/device3/Camera3OutputStream.h
@@ -421,9 +421,10 @@
     nsecs_t mLastPresentTime = 0;
     nsecs_t mCaptureToPresentOffset = 0;
     static constexpr size_t kDisplaySyncExtraBuffer = 2;
-    static constexpr nsecs_t kSpacingResetIntervalNs = 1000000000LL; // 1 second
+    static constexpr nsecs_t kSpacingResetIntervalNs = 50000000LL; // 50 milliseconds
     static constexpr nsecs_t kTimelineThresholdNs = 1000000LL; // 1 millisecond
     static constexpr float kMaxIntervalRatioDeviation = 0.05f;
+    static constexpr int kMaxTimelines = 3;
     nsecs_t syncTimestampToDisplayLocked(nsecs_t t);
 
     // Re-space frames by delaying queueBuffer so that frame delivery has