It is often possible to improve the human eye's perception of an image, but there is no real substitute for the quality of the original source material.
Indeed: crap in -> crap out.
But, slightly off topic, what I do find interesting in digital imaging is this: where is the difference between manipulating the pixels on the camera's sensor and manipulating them in software? It's all 0s and 1s.
Yes, binary data is just data, but the acquisition of that data begins in the analog world, before the light ever hits the sensor. Where the photographer aims the camera, how long the shutter stays open, what the f-stop is set to, the distance between the lens and the sensor: these are all analog. The quality of the lens, and the technology that determines its shape, are analog as well. Post-production software is not part of the acquisition process. A sharpening algorithm cannot compensate for aiming the camera wrong, leaving the shutter open far too long, and so on.
The sensor may be converting analog light into binary data, but it is still part of the acquisition process; a post-production sharpening tool is not.
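To make that concrete, here is a little toy sketch of my own (nothing to do with any real camera pipeline, and the numbers are made up): blur a signal hard, the way a badly botched exposure would, then run a classic unsharp mask over it. The sharpening boosts whatever detail survived, but it cannot bring back what the blur already destroyed.

    import numpy as np

    x = np.linspace(0, 1, 2000)
    fine_detail = np.sin(2 * np.pi * 200 * x)           # high-frequency "real" detail
    scene = np.sin(2 * np.pi * 3 * x) + 0.3 * fine_detail

    def box_blur(signal, width):
        # Simulate motion blur / missed focus with a wide box kernel.
        kernel = np.ones(width) / width
        return np.convolve(signal, kernel, mode="same")

    def unsharp_mask(signal, width=9, amount=1.5):
        # Classic unsharp mask: boost the difference from a local average.
        return signal + amount * (signal - box_blur(signal, width))

    blurred = box_blur(scene, 61)        # "shutter open way too long"
    sharpened = unsharp_mask(blurred)

    def detail_energy(signal):
        # Spectral energy near the fine-detail frequency (about 200 cycles).
        spectrum = np.abs(np.fft.rfft(signal))
        return spectrum[190:210].sum()

    print("detail in original :", detail_energy(scene))
    print("detail after blur  :", detail_energy(blurred))
    print("detail after USM   :", detail_energy(sharpened))
    # The blurred and sharpened numbers stay tiny: the detail is gone, not hidden.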
In theory, A/D conversion (and the reverse) should be seamless given enough resources: accurate converters, enough CPU and memory, and fast enough I/O. Pushed toward infinity it all becomes "almost" analog in quality, but never quite. I am not a math whiz, but I suspect there is a crossover point where digital becomes just as good as far as our own imperfect senses can tell, yet it will cost a lot.
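Just to put a rough number on the "push to infinity" idea, here is a quick toy calculation (my own sketch, nothing rigorous): quantize a smooth stand-in for an analog signal at increasing bit depths and watch the error shrink, without ever quite reaching zero.

    import numpy as np

    t = np.linspace(0, 1, 100_000)
    analog = np.sin(2 * np.pi * 5 * t)   # stand-in for a continuous signal in [-1, 1]

    for bits in (4, 8, 12, 16, 20, 24):
        step = 2.0 / (2 ** bits)                      # spacing between quantization levels
        digital = np.round(analog / step) * step      # uniform quantizer at this bit depth
        noise = analog - digital
        snr_db = 10 * np.log10(np.mean(analog**2) / np.mean(noise**2))
        print(f"{bits:2d} bits -> SNR ~ {snr_db:5.1f} dB")
    # Each extra bit buys roughly 6 dB of signal-to-noise; the gap to "true" analog
    # keeps shrinking but never closes, which is the "almost analog" point above.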