QUOTE(Butterflyla @ Sep 14 2022, 08:45)

I'm curious what effect, if any, background radiation in steel forged post-1945 has on the strength of a blade. Do you happen to know off-hand?
It should be a non-issue. Isotopes of an element have the same physical and chemical properties apart from their atomic mass, and modern methods produce steel that has no significant impurities.

Fresh iron ore as dug out of the ground hasn't been contaminated by nuclear test fallout or reactor waste, and absorbs only a tiny amount of radioisotopes from the air during smelting. The main contaminant is cobalt-60, which has a half-life of only 5.3 years and produces beta & gamma radiation as it decays to stable nickel-60. Low-flux beta & gamma radiation do basically nothing to steel, just temporary excitation of the electrons.

Cobalt & nickel are intentionally added to many steel alloys, and the percentage by mass in the end product is tested and verified to be within spec for the alloy. So you get a slightly, slightly radioactive steel with no difference in its mechanical properties that only matters if it's around the most sensitive of detectors. Worldwide anthropogenic background radiation peaked in 1963 and has fallen to 1/20th since then with the end of most nuclear testing.
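The 5.3-year half-life makes the decay easy to quantify. Here's a minimal sketch of the standard half-life formula N(t) = N₀ · 0.5^(t / T½); the constant and function names are my own, and 5.27 years is the commonly cited Co-60 half-life:

```python
# Fraction of cobalt-60 activity remaining after t years,
# using the exponential half-life decay law N(t) = N0 * 0.5**(t / T_HALF).

T_HALF = 5.27  # Co-60 half-life in years (commonly cited value)

def co60_fraction_remaining(years: float) -> float:
    """Fraction of the original Co-60 activity left after `years` years."""
    return 0.5 ** (years / T_HALF)

# Steel forged around the 1963 fallout peak, checked ~60 years later:
# under 0.04% of the original Co-60 activity remains.
print(f"{co60_fraction_remaining(60):.6f}")
```

So even steel that picked up some Co-60 decades ago has had almost all of it decay away to stable nickel by now.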
On the other hand, steel produced too long ago runs into quality problems: even basic modern blade steel outperforms anything from feudal times.