So, moving on from being wilfully ignorant of blockchains, databases and the like, you are now dismissing hashing? A little thing that underpins security on the internet in all manner of transmission methods, has decades of solid maths behind it, and is widely accepted by courts anywhere people actually want to live as top quality evidence, and so on and so on. Your rants on the evils of blockchain technologies, smart contracts and the like are vaguely amusing (most luddite rants are), but misleading people on vital technologies is a very bad look.
Barring the introduction of quantum computers (which would be world changing in general, and would just mean you do the hashing on those instead), it is a fairly settled area of maths.
Hashes/checksums generally run the gamut of:
- Detection of random corruption (say in transmission or storage), which can also be combined with data rebuilding methods based on oversampling (see parity files).
- Detection of tampering.
With a tiny bit of asymmetric cryptography you can also prove that a message originated with the holder of a private key. This is called signing, though in practice you tend to hash the file and sign the hash rather than the whole file (only the original file should produce that hash, and it is a far smaller amount of data to send through the signing maths). The related symmetric technique, where sender and receiver share a secret key instead, is HMAC (hash-based message authentication code). Anything suitable for these uses gets dubbed cryptographic grade.
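A minimal sketch of the plain-hash-versus-keyed-hash distinction, using nothing but Python's standard library (the message and key here are made up for illustration):

```python
import hashlib
import hmac

message = b"the original file contents"

# A plain hash: anyone can recompute it, so on its own it only
# detects corruption, not deliberate tampering.
digest = hashlib.sha256(message).hexdigest()

# HMAC mixes a shared secret key into the hashing, so only the key
# holders can produce (or verify) the tag.
secret_key = b"shared secret"  # made-up key for illustration
tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

# Verification should use a constant-time comparison to avoid
# leaking information through timing.
recomputed = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, recomputed)
```

An attacker who can alter the message in transit can trivially recompute the plain SHA-256 digest to match, but cannot forge the HMAC tag without the key.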
As computing power rises and occasional leaps in mathematics* happen, some hashes fall further down the list of suggested uses. The now ancient MD5 method has fallen down that list (we covered it around here somewhere, as it was a PS3 cluster that managed to use MD5 collisions to generate a rogue SSL signing certificate, in conjunction with a less than stellar certificate authority that had other failings in randomness generation:
https://hackaday.com/2008/12/30/25c3-hackers-completely-break-ssl-using-200-ps3s/ ), such that it is now only suggested for corruption detection and light duty tamper detection where someone breaking it is no big deal (a single player save game, maybe). It is fast compared to stronger methods, though, so people still keep it around.
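That light duty corruption-detection role is easy to sketch (the payload here is made up; MD5 is fine in this scenario precisely because we only care about accidents, not attackers):

```python
import hashlib

original = b"payload bytes as stored or transmitted"

# Compute a checksum before sending/storing the data.
checksum = hashlib.md5(original).hexdigest()

# Simulate a single bit flipped in transit.
corrupted = bytes([original[0] ^ 0x01]) + original[1:]

# The receiver recomputes and compares.
assert hashlib.md5(original).hexdigest() == checksum    # intact copy passes
assert hashlib.md5(corrupted).hexdigest() != checksum   # corruption detected
```

Swap `hashlib.md5` for `hashlib.sha256` and the code is identical; the difference is purely how expensive it is for a deliberate attacker to produce a second input with the same checksum.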
If some NFT or underlying blockchain out there uses a weak hash then that is an implementation issue, though I am not aware of any that have been found to use one (much less any that anybody would care about).
*If those interested in maths but unfamiliar with all this want to step in, this is a fantastic introduction to the idea of computational complexity. The baseline is brute force: imagine, say, the entire output of the sun turned into computing power (see Dyson sphere, or perhaps matrioshka brain) changing one bit at a time to find a collision. For most hashes in common use today, even that setup would probably need to run longer than the universe has existed to break one. Anything that reduces that time is considered an attack, but the reduced time is still likely to be measured against how long the universe has been around or will be around. Only if an attack reduces it to something that could be run with a large (or near future) amount of computing power is the hash considered broken for a given purpose -- and the acceptable timeline depends on how bad a break would be: secret records sealed for 50 years are a different matter to yesterday's login to a website for which all the secret data has already expired.
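To make the scaling concrete, here is a toy brute force against a deliberately crippled hash: SHA-256 truncated to its first 16 bits. Finding a collision at 16 bits takes a few hundred attempts on average (the birthday effect); the identical loop against the full 256 bits is the outlast-the-universe scenario above. The helper name and inputs are made up for the demo:

```python
import hashlib

def truncated_hash(data: bytes, bits: int = 16) -> int:
    """First `bits` bits of SHA-256 -- a deliberately weakened toy hash."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest[:4], "big") >> (32 - bits)

# Brute force: keep hashing distinct inputs until two share an output.
seen = {}
collision = None
i = 0
while collision is None:
    msg = str(i).encode()
    h = truncated_hash(msg)
    if h in seen:
        collision = (seen[h], msg)  # two different inputs, same toy hash
    else:
        seen[h] = msg
    i += 1
```

Each extra bit of output doubles the work for a preimage and multiplies collision-finding effort by roughly the square root of two, which is why the jump from 16 bits to 256 bits moves you from milliseconds to ages of the universe.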
There is much more that could be covered. I haven't even mentioned trapdoor and one way functions yet -- given 30, what numbers did I multiply to make it? That toy version is easy, but do it with large prime numbers or with elliptic curves and you have the two major classes of asymmetric cryptography (asymmetric meaning you have one key to encode and another to decode, rather than one key for both). Nor have I mentioned the avalanche effect, aka why a small change in the underlying file should massively and unpredictably change the hash, which is also used in most blockchains as an underpinning concept to make each round random as it were. But I will leave it there for the time being.
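The avalanche effect is easy to see for yourself: flip a single input bit and roughly half of the 256 output bits change, in no predictable pattern (the sample message is made up):

```python
import hashlib

def bit_difference(a: bytes, b: bytes) -> int:
    """Count how many bits differ between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

msg = b"some file contents"
tweaked = bytes([msg[0] ^ 0x01]) + msg[1:]  # flip exactly one input bit

h1 = hashlib.sha256(msg).digest()
h2 = hashlib.sha256(tweaked).digest()

# One flipped input bit changes on the order of 128 of the 256
# output bits -- there is no way to nudge the hash incrementally.
diff = bit_difference(h1, h2)
```

This is exactly the property proof-of-work mining leans on: since tweaking the nonce scrambles the whole hash unpredictably, there is no shortcut past trying nonces one after another.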