/*
 * Copyright (C) 2017 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#ifndef ART_RUNTIME_SUBTYPE_CHECK_H_
#define ART_RUNTIME_SUBTYPE_CHECK_H_

#include "subtype_check_bits_and_status.h"
#include "subtype_check_info.h"

#include "base/locks.h"
#include "base/macros.h"
#include "mirror/class.h"
#include "runtime.h"

// Build flag for the bitstring subtype check runtime hooks.
constexpr bool kBitstringSubtypeCheckEnabled = false;

/**
 * Any node in a tree can have its path (from the root to the node) represented as a string by
 * concatenating the path of the parent to that of the current node.
 *
 * We can annotate each node with a `sibling-label` which is some value unique amongst all of the
 * node's siblings. As a special case, the root is empty.
 *
 *           (none)
 *         /   |   \
 *        A    B    C
 *      /   \
 *     A’    B’
 *           |
 *           A’’
 *           |
 *           A’’’
 *           |
 *           A’’’’
 *
 * Given these sibling-labels, we can now encode the path from any node to the root by starting at
 * the node and going up to the root, marking each node with this `path-label`. The special
 * character $ means "end of path".
 *
 *                $
 *           /    |    \
 *         A$    B$    C$
 *        /  \
 *    A’A$    B’A$
 *              |
 *          A’’B’A$
 *              |
 *        A’’’A’’B’A$
 *              |
 *      A’’’’A’’’A’’B’A$
 *
 * Given the above `path-label` we can express whether any two nodes are an offspring of the
 * other through an O(1) expression:
 *
 *    x <: y :=
 *      suffix(x, y) == y
 *
 * In the above example suffix(x,y) means the suffix of x that is as long as y (right-padded with
 * $s if x is shorter than y):
 *
 *    suffix(x,y) := x(x.length - y.length .. 0]
 *                     + repeat($, max(y.length - x.length, 0))
 *
 * A few generalities here to elaborate:
 *
 * - There can be at most D levels in the tree.
 * - Each level L has an alphabet A, and the maximum number of nodes at that level is
 *   determined by |A|.
 * - The alphabet A can be a subset, superset, equal, or unique with respect to the other alphabets
 *   without loss of generality. (In practice it would almost always be a subset of the previous
 *   level’s alphabet, as we assume most classes have fewer children the deeper they are.)
 * - The `sibling-label` doesn’t need to be stored as an explicit value. It can be a temporary when
 *   visiting every immediate child of a node. Only the `path-label` needs to be actually stored
 *   for every node.
 *
 * The path can also be reversed, using a prefix instead of a suffix to define the subchild
 * relation.
 *
 *                $
 *          /    |    \    \
 *        A$    B$    C$    D$
 *       /  \
 *   AA’$    AB’$
 *             |
 *          AB’A’’$
 *             |
 *        AB’A’’A’’’$
 *             |
 *      AB’A’’A’’’A’’’’$
 *
 *    x <: y :=
 *      prefix(x, y) == y
 *
 *    prefix(x,y) := x[0 .. y.length)
 *                     + repeat($, max(y.length - x.length, 0))
 *
 * In a dynamic tree, new nodes can be inserted at any time. This means that if a minimal
 * alphabet is selected to contain the initial tree hierarchy, later node insertions will be
 * illegal because there is no more room to encode the path.
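The prefix relation above (`x <: y := prefix(x, y) == y`) can be sketched as a masked integer compare, which is the shape of the fast check used later in this file. The following is a toy fixed-width encoding invented purely for illustration (4 bits per level, `0` playing the role of `$`); the real implementation uses the variable-length BitString defined elsewhere in the runtime:

```cpp
#include <cassert>
#include <cstdint>
#include <initializer_list>

// Toy layout: 4 bits per level, level-1 character in the low bits.
// A sibling-label is 1..15; 0 plays the role of "$" (end of path).
constexpr uint32_t kBitsPerLevel = 4;

// Encode a path-label from the root: labels[0] is the level-1 character.
// Missing deeper levels stay 0, i.e. the path is right-padded with $s.
uint32_t EncodePath(std::initializer_list<uint32_t> labels) {
  uint32_t encoded = 0;
  uint32_t shift = 0;
  for (uint32_t label : labels) {
    encoded |= label << shift;
    shift += kBitsPerLevel;
  }
  return encoded;
}

// Mask covering every level that a path of `depth` levels actually uses.
// (Assumes depth < 8 so the shift stays within 32 bits.)
uint32_t MaskFor(uint32_t depth) {
  return (uint32_t{1} << (depth * kBitsPerLevel)) - 1u;
}

// x <: y := prefix(x, y) == y, computed as a single masked compare.
bool IsSubtypeOf(uint32_t x, uint32_t y, uint32_t y_depth) {
  return (x & MaskFor(y_depth)) == y;
}
```

Note that the root has depth 0, so its mask is 0 and every node trivially compares as its offspring, matching the `$`-padding rule above.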
 *
 * In this simple example with an alphabet {A,B,C} and a max level of 1:
 *
 *  Level
 *  0:             $
 *            /    |    \    \
 *  1:       A$    B$    C$   D$   (illegal)
 *           |
 *  2:      AA$                    (illegal)
 *
 * Attempting to insert the sibling “D” at level 1 would be illegal because Alphabet(1) is
 * {A,B,C} and inserting an extra node would mean the `sibling-label` is no longer unique.
 * Attempting to insert “AA$” is illegal because its level (2) exceeds the max level (1).
 *
 * One solution to this would be to revisit the entire graph, select a larger alphabet so that
 * every `sibling-label` is unique, pick a larger max level count, and then store the updated
 * `path-label` accordingly.
 *
 * The more common approach would instead be to select a set of alphabets and max levels
 * statically, with large enough sizes, for example:
 *
 *  Alphabets = {{A,B,C,D}, {A,B,C}, {A,B}, {A}}
 *  Max Levels = |Alphabets|
 *
 * This would allow up to 4 levels, with each successive level having one fewer maximum sibling.
 *
 * Attempting to insert a new node into the graph which does not fit into that level’s alphabet
 * is represented by re-using the `path-label` of the parent. Such a `path-label` is considered
 * truncated (because it only has a prefix of the full path from the root to the node).
 *
 *  Level
 *  0:             $
 *            /    |    \    \
 *  1:       A$    B$    C$    $   (same as parent)
 *           |
 *  2:       A$                    (same as parent)
 *
 * The updated relation for offspring is then:
 *
 *    x <: y :=
 *      if !truncated_path(y):
 *        return prefix(x, y) == y               // O(1)
 *      else:
 *        return slow_check_is_offspring(x, y)   // worse than O(1)
 *
 * (Example definition of truncated_path -- any semantically equivalent way to check that the
 * node's `sibling-label` is not unique will do.)
 *
 *    truncated_path(y) :=
 *      return y == parent(y)
 *
 * (Example definition. Any slower-than-O(1) definition will do here. This is the
 * traversing-superclass-hierarchy solution.)
 *
 *    slow_check_is_offspring(x, y) :=
 *      if not x: return false
 *      else: return x == y || slow_check_is_offspring(parent(x), y)
 *
 * In other words, slow_check_is_offspring is some non-O(1) way to check if x is an offspring
 * of y.
 *
 * In addition, note that it doesn’t matter if the "x" from above is a unique sibling or not;
 * the relation will still be correct.
 *
 * ------------------------------------------------------------------------------------------------
 *
 * Leveraging truncated paths to minimize path lengths.
 *
 * As observed above, for any x <: y, it is sufficient to have a full path only for y,
 * and x can be truncated (to its nearest ancestor's full path).
 *
 * We call a node that stores a full path "Assigned", and a node that stores a truncated path
 * either "Initialized" or "Overflowed."
 *
 * "Initialized" means it is still possible to assign a full path to the node, and "Overflowed"
 * means there are insufficient characters left in the alphabet.
 *
 * In this example, assume that we attempt to "Assign" all non-leaf nodes if possible. Leaves
 * always get truncated (as either Initialized or Overflowed).
 *
 *  Alphabets = {{A,B,C,D}, {A,B}}
 *  Max Levels = |Alphabets|
 *
 *  Level
 *  0:             $
 *            /    |    \    \    \
 *  1:       A$    B$    C$   D$    $   (Overflowed: Too wide)
 *           |           |
 *  2:      AA$          C$             (Initialized)
 *           |
 *  3:      AA$                         (Overflowed: Too deep)
 *
 * (All un-annotated nodes are "Assigned".)
 * Above, the node at level 3 becomes overflowed because it exceeds the max levels. The
 * right-most node at level 1 becomes overflowed because there are no characters left in that
 * level's alphabet.
 *
 * The "C$" node is Initialized at level 2, but it can still be promoted to "Assigned" later on
 * if we wanted to.
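The truncated-path fallback above can be sketched in standalone form. The `Node` struct and its field names here are hypothetical, purely for illustration; in the runtime the equivalent data lives in SubtypeCheckInfo and the class hierarchy itself:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical node: `path` is the encoded path-label, `assigned` says whether
// the path is full (Assigned) or truncated (Initialized/Overflowed).
struct Node {
  const Node* parent = nullptr;
  uint32_t path = 0;      // Encoded path-label (0 == "$", the root).
  uint32_t mask = 0;      // Mask covering the levels `path` uses.
  bool assigned = false;  // True iff `path` is a full (non-truncated) path.
};

// Worse than O(1): walk the parent (superclass) chain.
bool SlowCheckIsOffspring(const Node* x, const Node* y) {
  for (; x != nullptr; x = x->parent) {
    if (x == y) return true;
  }
  return false;
}

// x <: y with the truncated-path fallback described above: the fast masked
// compare is valid only when the target's path is full.
bool IsOffspring(const Node* x, const Node* y) {
  if (y->assigned) {
    return (x->path & y->mask) == y->path;  // O(1) prefix check.
  }
  return SlowCheckIsOffspring(x, y);
}
```

As the comment notes, x itself may be truncated without affecting correctness: its truncated path is still a valid prefix of its own full path.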
 *
 * In particular, this is the strategy we use in our implementation
 * (SubtypeCheck::EnsureInitialized, SubtypeCheck::EnsureAssigned).
 *
 * Since the # of characters in our alphabet (BitString) is very limited, we want to avoid
 * allocating a character to a node until it's absolutely necessary.
 *
 * All node targets (in `src <: target`) get Assigned, and any parent of an Initialized
 * node also gets Assigned.
 */
namespace art HIDDEN {

struct MockSubtypeCheck;  // Forward declaration for testing.

// This class uses a template parameter to enable testability without losing performance.
// ClassPtr is almost always `mirror::Class*` or `ObjPtr<mirror::Class>`.
template <typename ClassPtr /* Pointer-like type to Class */>
struct SubtypeCheck {
  // Force this class's SubtypeCheckInfo state into at least Initialized.
  // As a side-effect, all parent classes also become Assigned|Overflowed.
  //
  // Cost: O(Depth(Class))
  //
  // Post-condition: State is >= Initialized.
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State EnsureInitialized(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return InitializeOrAssign(klass, /*assign=*/false).GetState();
  }

  // Force this class's SubtypeCheckInfo state into Assigned|Overflowed.
  // As a side-effect, all parent classes also become Assigned|Overflowed.
  //
  // Cost: O(Depth(Class))
  //
  // Post-condition: State is Assigned|Overflowed.
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State EnsureAssigned(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return InitializeOrAssign(klass, /*assign=*/true).GetState();
  }

  // Resets the SubtypeCheckInfo into the Uninitialized state.
  //
  // Intended only for the AOT image writer.
  // This is a static function to avoid calling klass.Depth(), which is unsupported
  // in some portions of the image writer.
  //
  // Cost: O(1).
  //
  // Returns: A state that is always Uninitialized.
  static SubtypeCheckInfo::State ForceUninitialize(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Trying to do this in a real runtime will break thread safety invariants
    // of existing live objects in the class hierarchy.
    // This is only safe as the last step when the classes are about to be
    // written out as an image and IsSubClass is never used again.
    DCHECK(Runtime::Current() == nullptr || Runtime::Current()->IsAotCompiler())
        << "This only makes sense when compiling an app image.";

    // Directly read/write the class field here.
    // As this method is used by image_writer on a copy,
    // the Class* there is not a real class and using it for anything
    // more complicated (e.g. ObjPtr or Depth call) will fail dchecks.

    // OK. Zero-initializing subtype_check_info_ puts us into the kUninitialized state.
    SubtypeCheckBits scb_uninitialized = SubtypeCheckBits{};
    WriteSubtypeCheckBits(klass, scb_uninitialized);

    // Do not use the "SubtypeCheckInfo" API here since that requires Depth(),
    // which would cause a dcheck failure.
    return SubtypeCheckInfo::kUninitialized;
  }

  // Retrieve the state of this class's SubtypeCheckInfo.
  //
  // Cost: O(Depth(Class)).
  //
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State GetState(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return GetSubtypeCheckInfo(klass).GetState();
  }

  // Retrieve the path-to-root bitstring as a plain uintN_t value that is amenable to
  // being used by the fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the encoded_src value. Must be >= Initialized (EnsureInitialized).
  static BitString::StorageType GetEncodedPathToRootForSource(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK_NE(SubtypeCheckInfo::kUninitialized, GetSubtypeCheckInfo(klass).GetState());
    return GetSubtypeCheckInfo(klass).GetEncodedPathToRoot();
  }

  // Retrieve the path-to-root bitstring as a plain uintN_t value that is amenable to
  // being used by the fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the encoded_target value. Must be Assigned (EnsureAssigned).
  static BitString::StorageType GetEncodedPathToRootForTarget(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass);
    DCHECK_EQ(SubtypeCheckInfo::kAssigned, sci.GetState());
    return sci.GetEncodedPathToRoot();
  }

  // Retrieve the path-to-root bitstring mask as a plain uintN_t value that is amenable to
  // being used by the fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the mask_target value. Must be Assigned (EnsureAssigned).
  static BitString::StorageType GetEncodedPathToRootMask(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass);
    DCHECK_EQ(SubtypeCheckInfo::kAssigned, sci.GetState());
    return sci.GetEncodedPathToRootMask();
  }

  // Is the source class a subclass of the target?
  //
  // The source state must be at least Initialized, and the target state
  // must be Assigned; otherwise the result is kUnknownSubtypeOf.
  //
  // See EnsureInitialized and EnsureAssigned. Ideally,
  // EnsureInitialized will have been called previously on all possible sources,
  // and EnsureAssigned will have been called previously on all possible targets.
  //
  // Runtime cost: O(Depth(Class)), but would be O(1) if the depth were known.
  //
  // If the result is known, returns kSubtypeOf or kNotSubtypeOf.
  static SubtypeCheckInfo::Result IsSubtypeOf(ClassPtr source, ClassPtr target)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(source);
    SubtypeCheckInfo target_sci = GetSubtypeCheckInfo(target);

    return sci.IsSubtypeOf(target_sci);
  }

  // Print SubtypeCheck bitstring and overflow to a stream (e.g. for oatdump).
  static std::ostream& Dump(ClassPtr klass, std::ostream& os)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return os << GetSubtypeCheckInfo(klass);
  }

  static void WriteStatus(ClassPtr klass, ClassStatus status)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    WriteStatusImpl(klass, status);
  }

 private:
  static ClassPtr GetParentClass(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK(klass->HasSuperClass());
    return ClassPtr(klass->GetSuperClass());
  }

  static SubtypeCheckInfo InitializeOrAssign(ClassPtr klass, bool assign)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    if (UNLIKELY(!klass->HasSuperClass())) {
      // The object root always goes directly from Uninitialized -> Assigned.

      SubtypeCheckInfo root_sci = GetSubtypeCheckInfo(klass);
      if (root_sci.GetState() != SubtypeCheckInfo::kUninitialized) {
        return root_sci;  // No change needed.
      }

      SubtypeCheckInfo new_root_sci = root_sci.CreateRoot();
      SetSubtypeCheckInfo(klass, new_root_sci);

      // The object root is always in the Uninitialized|Assigned state.
      DCHECK_EQ(SubtypeCheckInfo::kAssigned, GetSubtypeCheckInfo(klass).GetState())
          << "Invalid object root state, must be Assigned";
      return new_root_sci;
    }

    // Force all ancestors to Assigned | Overflowed.
    ClassPtr parent_klass = GetParentClass(klass);
    size_t parent_depth = InitializeOrAssign(parent_klass, /*assign=*/true).GetDepth();
    if (kIsDebugBuild) {
      SubtypeCheckInfo::State parent_state = GetSubtypeCheckInfo(parent_klass).GetState();
      DCHECK(parent_state == SubtypeCheckInfo::kAssigned ||
             parent_state == SubtypeCheckInfo::kOverflowed)
          << "Expected parent Assigned|Overflowed, but was: " << parent_state;
    }

    // Read.
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass, parent_depth + 1u);
    SubtypeCheckInfo parent_sci = GetSubtypeCheckInfo(parent_klass, parent_depth);

    // Modify.
    const SubtypeCheckInfo::State sci_state = sci.GetState();
    // Skip doing any work if the state is already up-to-date.
    // - assign == false -> Initialized or higher.
    // - assign == true -> Assigned or higher.
    if (sci_state == SubtypeCheckInfo::kUninitialized ||
        (sci_state == SubtypeCheckInfo::kInitialized && assign)) {
      // Copy the parent's path into the child.
      //
      // If assign==true, this also appends the Parent.Next value to the end.
      // Then the Parent.Next value is incremented to avoid allocating
      // the same value again to another node.
      sci = parent_sci.CreateChild(assign);  // Note: Parent could be mutated.
    } else {
      // Nothing to do, already >= Initialized.
      return sci;
    }

    // Post-condition: EnsureAssigned -> Assigned|Overflowed.
    // Post-condition: EnsureInitialized -> Not Uninitialized.
    DCHECK_NE(sci.GetState(), SubtypeCheckInfo::kUninitialized);

    if (assign) {
      DCHECK_NE(sci.GetState(), SubtypeCheckInfo::kInitialized);
    }

    // Write.
    SetSubtypeCheckInfo(klass, sci);                // self
    SetSubtypeCheckInfo(parent_klass, parent_sci);  // parent

    return sci;
  }

  static SubtypeCheckBitsAndStatus ReadField(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckBitsAndStatus current_bits_and_status;

    int32_t int32_data = klass->GetField32Volatile(klass->StatusOffset());
    current_bits_and_status.int32_alias_ = int32_data;

    if (kIsDebugBuild) {
      SubtypeCheckBitsAndStatus tmp;
      memcpy(&tmp, &int32_data, sizeof(tmp));
      DCHECK_EQ(0, memcmp(&tmp, &current_bits_and_status, sizeof(tmp))) << int32_data;
    }
    return current_bits_and_status;
  }

  static void WriteSubtypeCheckBits(ClassPtr klass, const SubtypeCheckBits& new_bits)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Use a "CAS" to write the SubtypeCheckBits in the class.
    // Although we have exclusive access to the bitstrings, because
    // ClassStatus and SubtypeCheckBits share the same word, another thread could
    // still potentially overwrite that word.

    SubtypeCheckBitsAndStatus new_value;
    ClassStatus old_status;
    SubtypeCheckBitsAndStatus full_old;
    while (true) {
      // TODO: Atomic compare-and-swap does not update the 'expected' parameter,
      // so we have to read it as a separate step instead.
      SubtypeCheckBitsAndStatus old_value = ReadField(klass);

      {
        SubtypeCheckBits old_bits = old_value.subtype_check_info_;
        if (memcmp(&old_bits, &new_bits, sizeof(old_bits)) == 0) {
          // Avoid dirtying memory when the data hasn't changed.
          return;
        }
      }

      full_old = old_value;
      old_status = old_value.status_;

      new_value = old_value;
      new_value.subtype_check_info_ = new_bits;

      if (kIsDebugBuild) {
        int32_t int32_data = 0;
        memcpy(&int32_data, &new_value, sizeof(int32_t));
        DCHECK_EQ(int32_data, new_value.int32_alias_) << int32_data;

        DCHECK_EQ(old_status, new_value.status_)
            << "full new: " << bit_cast<uint32_t>(new_value)
            << ", full old: " << bit_cast<uint32_t>(full_old);
      }

      if (CasFieldWeakSequentiallyConsistent32(klass,
                                               klass->StatusOffset(),
                                               old_value.int32_alias_,
                                               new_value.int32_alias_)) {
        break;
      }
    }
  }

  static void WriteStatusImpl(ClassPtr klass, ClassStatus status)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Despite not having a lock annotation, this is done with mutual exclusion.
    // See Class::SetStatus for more details.
    SubtypeCheckBitsAndStatus new_value;
    ClassStatus old_status;
    while (true) {
      // TODO: Atomic compare-and-swap does not update the 'expected' parameter,
      // so we have to read it as a separate step instead.
      SubtypeCheckBitsAndStatus old_value = ReadField(klass);
      old_status = old_value.status_;

      if (memcmp(&old_status, &status, sizeof(status)) == 0) {
        // Avoid dirtying memory when the data hasn't changed.
        return;
      }

      new_value = old_value;
      new_value.status_ = status;

      if (CasFieldWeakSequentiallyConsistent32(klass,
                                               klass->StatusOffset(),
                                               old_value.int32_alias_,
                                               new_value.int32_alias_)) {
        break;
      }
    }
  }

  static bool CasFieldWeakSequentiallyConsistent32(ClassPtr klass,
                                                   MemberOffset offset,
                                                   int32_t old_value,
                                                   int32_t new_value)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    if (Runtime::Current() != nullptr && Runtime::Current()->IsActiveTransaction()) {
      return klass->template CasField32</*kTransactionActive=*/true>(offset,
                                                                     old_value,
                                                                     new_value,
                                                                     CASMode::kWeak,
                                                                     std::memory_order_seq_cst);
    } else {
      return klass->template CasField32</*kTransactionActive=*/false>(offset,
                                                                      old_value,
                                                                      new_value,
                                                                      CASMode::kWeak,
                                                                      std::memory_order_seq_cst);
    }
  }

  // Get the SubtypeCheckInfo for a klass. O(Depth(Class)) since
  // it also requires calling klass->Depth().
  //
  // Anything calling this function will also be O(Depth(Class)).
  static SubtypeCheckInfo GetSubtypeCheckInfo(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return GetSubtypeCheckInfo(klass, klass->Depth());
  }

  // Get the SubtypeCheckInfo for a klass with known depth.
  static SubtypeCheckInfo GetSubtypeCheckInfo(ClassPtr klass, size_t depth)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK_EQ(depth, klass->Depth());
    SubtypeCheckBitsAndStatus current_bits_and_status = ReadField(klass);

    return SubtypeCheckInfo::Create(current_bits_and_status.subtype_check_info_, depth);
  }

  static void SetSubtypeCheckInfo(ClassPtr klass, const SubtypeCheckInfo& new_sci)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckBits new_bits = new_sci.GetSubtypeCheckBits();
    WriteSubtypeCheckBits(klass, new_bits);
  }

  // Tests can inherit this class. Normal code should use the static methods.
  SubtypeCheck() = default;
  SubtypeCheck(const SubtypeCheck& other) = default;
  SubtypeCheck(SubtypeCheck&& other) noexcept = default;
  ~SubtypeCheck() = default;

  friend struct MockSubtypeCheck;
};

}  // namespace art

#endif  // ART_RUNTIME_SUBTYPE_CHECK_H_