LSTM: require input layer norm weights to be omitted when CIFG is used.
For CIFG LSTM, the input layer norm weights are not used in the
computation.
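
A minimal sketch of the constraint being enforced, assuming hypothetical
names (LstmOperands, validateLayerNormWeights); this is illustrative
only, not the actual NNAPI validation code:

  #include <optional>
  #include <vector>

  struct LstmOperands {
      // CIFG is signaled by omitting the input-to-input weights tensor.
      std::optional<std::vector<float>> inputToInputWeights;
      std::optional<std::vector<float>> inputLayerNormWeights;
      std::optional<std::vector<float>> forgetLayerNormWeights;
      std::optional<std::vector<float>> cellLayerNormWeights;
      std::optional<std::vector<float>> outputLayerNormWeights;
  };

  bool validateLayerNormWeights(const LstmOperands& op) {
      const bool cifgUsed = !op.inputToInputWeights.has_value();
      const bool input = op.inputLayerNormWeights.has_value();
      const bool forget = op.forgetLayerNormWeights.has_value();
      const bool cell = op.cellLayerNormWeights.has_value();
      const bool output = op.outputLayerNormWeights.has_value();
      if (cifgUsed) {
          // With CIFG there is no input gate, so the input layer norm
          // weights must be omitted; the remaining three are all-or-none.
          return !input && forget == cell && cell == output;
      }
      // Without CIFG, all four layer norm weights are all-or-none.
      return input == forget && forget == cell && cell == output;
  }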
Bug: 129126572
Test: mma
Change-Id: I57bc578606132a2c44c71ab63dd7645dcc001302
Merged-In: I57bc578606132a2c44c71ab63dd7645dcc001302
(cherry picked from commit 0480af9aa84beba07061cfc5279ad6cc9a75af4c)
diff --git a/current.txt b/current.txt
index cdd6f68..dff607d 100644
--- a/current.txt
+++ b/current.txt
@@ -451,7 +451,7 @@
92714960d1a53fc2ec557302b41c7cc93d2636d8364a44bd0f85be0c92927ff8 android.hardware.neuralnetworks@1.2::IExecutionCallback
36e1064c869965dee533c537cefbe87e54db8bd8cd45be7e0e93e00e8a43863a android.hardware.neuralnetworks@1.2::IPreparedModel
e1c734d1545e1a4ae749ff1dd9704a8e594c59aea7c8363159dc258e93e0df3b android.hardware.neuralnetworks@1.2::IPreparedModelCallback
-209a5ee694b94328afb2af2768f1fe6a69148e2cbb85ec3c340a36eed818c697 android.hardware.neuralnetworks@1.2::types
+9b3963253e521cca19fd81aeca83aee6dcfe3bdf2805c07cb2d3f64381709b71 android.hardware.neuralnetworks@1.2::types
cf7a4ba516a638f9b82a249c91fb603042c2d9ca43fd5aad9cf6c0401ed2a5d7 android.hardware.nfc@1.2::INfc
abf98c2ae08bf765db54edc8068e36d52eb558cff6706b6fd7c18c65a1f3fc18 android.hardware.nfc@1.2::types
4cb252dc6372a874aef666b92a6e9529915aa187521a700f0789065c3c702ead android.hardware.power.stats@1.0::IPowerStats
diff --git a/neuralnetworks/1.2/types.hal b/neuralnetworks/1.2/types.hal
index 8c57796..67a61c1 100644
--- a/neuralnetworks/1.2/types.hal
+++ b/neuralnetworks/1.2/types.hal
@@ -1188,8 +1188,11 @@
* value if the recurrent projection layer exists, and should otherwise
* have no value.
* * (API level >= 29) The four layer normalization weights either all have
- * values or none of them have values. Layer normalization is used when
- * values are present.
+ *     values or none of them have values. Additionally, if CIFG is used,
+ *     the input layer normalization weights tensor must be omitted, and
+ *     the other three layer normalization weights either all have values
+ *     or none of them have values. Layer normalization is used when all
+ *     of the layer normalization weights have values.
*
* References:
*