I've found multiple sites suggesting the code:
md5(file_get_contents($filename));
Until recently I hadn't noticed any issues with this locally, but then I tried to hash a 700MB file with a 2048MB memory limit and kept getting out-of-memory errors.
The problem isn't md5() itself so much as the fact that file_get_contents() loads the entire file into memory as one string before hashing (and PHP may hold more than one copy of it during the call). md5_file() reads the file in chunks instead, so it is far more memory efficient. I would highly recommend that anyone who needs file hashing (for detecting duplicates, not security digests) use the md5_file() function and NOT the regular string md5() function!
md5_file($filename);
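If you want to see the difference for yourself, a quick sketch like the one below should show it, assuming $filename points at a reasonably large file (the ordering matters, since memory_get_peak_usage() is cumulative):

<?php
// Hash with the streaming function first; peak memory stays small.
md5_file($filename);
echo 'md5_file peak: ', memory_get_peak_usage(true), " bytes\n";

// Now load the whole file into a string; peak memory jumps by roughly the file size.
md5(file_get_contents($filename));
echo 'md5(string) peak: ', memory_get_peak_usage(true), " bytes\n";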
Note, for those interested: as this was for a local application, not a server, I was more concerned with results than memory efficiency. In a live environment you would never want to read an entire file into memory at once when it's avoidable. (At the time of coding, I did not know about the alternative function.)
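On that note, the same streaming behaviour is available for other algorithms through PHP's generic hash functions. A minimal sketch, again assuming $filename as above:

<?php
// hash_file() streams the file from disk, like md5_file(),
// but lets you pick the algorithm.
$sha256 = hash_file('sha256', $filename);

// For data that arrives in pieces, the incremental API keeps memory use flat.
$ctx = hash_init('md5');
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    hash_update($ctx, fread($handle, 8192)); // hash 8KB at a time
}
fclose($handle);
$md5 = hash_final($ctx);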