Commit 8640000

Restore width and height attributes for centered images
Restored the original width and height attributes (340x130 and 480x355) for the two centered images to maintain their fixed sizing.

Signed-off-by: Bram Wasti <bwasti@meta.com>
1 parent e4cb595 commit 8640000

1 file changed: +2 -2 lines changed
_posts/2025-11-10-bitwise-exact-rl.md

Lines changed: 2 additions & 2 deletions
@@ -16,13 +16,13 @@ Discussion on this can be found on ThinkingMachine’s post Defeating Nondetermi
 
 Floating point numbers are effectively a binary scientific notation. They utilize three components: a sign bit (s), a mantissa (M) and an exponent (e).
 <p align="center">
-<img src="/assets/figures/2025-11-10-bitwise-exact-rl/floating-point-representation.png" />
+<img width="340" height="130" src="/assets/figures/2025-11-10-bitwise-exact-rl/floating-point-representation.png" />
 </p>
 
 Each of these components are represented as integers and suffer from the exact same rounding errors you might expect. In bf16, the most commonly used representation for machine learning, 7 bits are dedicated to the mantissa. This is not very many bits! The value 3.0 can be represented exactly, but a value like 3.6 cannot…
 
 <p align="center">
-<img src="/assets/figures/2025-11-10-bitwise-exact-rl/bf16-rounding-example.png" />
+<img width="480" height="355" src="/assets/figures/2025-11-10-bitwise-exact-rl/bf16-rounding-example.png" />
 </p>
 
 When you want a new value in bf16 you end up rounding it to the nearest available value. What’s of particular interest today is the implication of this rounding process happening at different points in a sequence of additions.
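For readers skimming the diff: the context paragraphs above describe the bf16 layout (a sign bit, 8 exponent bits, 7 mantissa bits) and the rounding of a value like 3.6. A minimal sketch of that decomposition, assuming a recent PyTorch install (the helper name `bf16_bits` is illustrative, not from the post or this commit):

```python
import torch

# Reinterpret a bf16 value's 16-bit pattern as an integer to read off
# the sign (1 bit), exponent (8 bits) and mantissa (7 bits) fields.
def bf16_bits(x: float) -> str:
    raw = torch.tensor([x], dtype=torch.bfloat16).view(torch.int16)[0].item() & 0xFFFF
    s = f"{raw:016b}"
    return f"sign={s[0]} exponent={s[1:9]} mantissa={s[9:]}"

print(bf16_bits(3.0))  # 3.0 = 1.5 * 2^1 fits the 7 mantissa bits exactly
print(bf16_bits(3.6))  # 3.6 does not fit, so the stored value is rounded

# Round-tripping through bf16 shows what 3.6 actually becomes.
print(torch.tensor(3.6, dtype=torch.bfloat16).item())  # 3.59375, the nearest bf16 value
print(torch.tensor(3.0, dtype=torch.bfloat16).item())  # 3.0, exact
```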
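The closing paragraph's point, that rounding at different points in a sequence of additions changes the result, can also be checked directly. A small illustrative sketch, again assuming PyTorch and not taken from the commit:

```python
import torch

bf = torch.bfloat16
a = torch.tensor(256.0, dtype=bf)
b = torch.tensor(1.0, dtype=bf)

# 257 needs 9 significant bits, but bf16 carries only 8 (the implicit leading 1
# plus 7 mantissa bits), so each intermediate 256 + 1 rounds back down to 256
# under the usual round-to-nearest-even mode.
print(((a + b) + b).item())  # 256.0 -- both +1 steps are lost to rounding
print((a + (b + b)).item())  # 258.0 -- 1 + 1 = 2 is exact, and 256 + 2 is representable
```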
