It's certainly true that, with the switch to a more PC-like architecture, backwards compatibility is far more feasible for the next round of consoles. Factoring in Sony's seemingly renewed interest in added-value functionality (part of why many of us bought launch PS4s was that Sony seemed more attuned to what gamers wanted, and their recent focus on delivering working cross-play adds value to the system), we're more likely to see it now than on the PS4 or earlier systems.
However, this patent's explicit function is different:
The unique identifier can be rendered by imposing a hash on the asset, and then the asset stored with its identifier in a data structure. An artist remasters the textures for presentation on a higher resolution display than envisioned in the original software, and stores them back in the data structure with their identifiers.
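As a rough sketch of what the patent describes (all names here are hypothetical, not from the patent itself), the identifier could be a hash of the original asset's bytes, with the remastered version stored under that same identifier:

```python
import hashlib

# Hypothetical lookup structure: identifier -> remastered asset bytes
remaster_db = {}

def asset_id(asset_bytes: bytes) -> str:
    # Derive the unique identifier by hashing the original asset data
    return hashlib.sha256(asset_bytes).hexdigest()

def store_remaster(original: bytes, remastered: bytes) -> None:
    # The artist's high-res replacement is keyed by the hash of the
    # ORIGINAL asset, so the original data alone can find it later
    remaster_db[asset_id(original)] = remastered

# Stand-in data: a low-res texture and its remastered counterpart
low_res = b"\x10\x20\x30"
high_res = b"\x10\x20\x30" * 4
store_remaster(low_res, high_res)
```

The key design point is that the identifier is computed from the original asset, so the original game never needs to know the remaster exists.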
So far this is just a way of better structuring remastered assets for higher-resolution screens; the next part is more interesting:
The original software is then played on the higher resolution display, with asset (such as texture) calls being intercepted, identified, and the data structure entered to retrieve the remastered asset having a matching identifier. The remastered asset is then inserted on the fly into the game presentation.
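The interception step could look something like the following sketch (again, hypothetical names; the patent does not specify an implementation): hash the asset the original software requested, check the data structure for a matching identifier, and substitute the remaster if one exists, falling back to the original otherwise.

```python
import hashlib

# Hypothetical lookup structure populated at remaster time:
# identifier (hash of original asset) -> remastered asset bytes
remaster_db = {}

def asset_id(asset_bytes: bytes) -> str:
    return hashlib.sha256(asset_bytes).hexdigest()

def load_asset(original_bytes: bytes) -> bytes:
    # Intercepted asset call: identify the requested asset by hash
    # and return the remastered version if one matches; otherwise
    # pass the original through unchanged
    return remaster_db.get(asset_id(original_bytes), original_bytes)

# Example: an original texture with a registered remaster gets
# swapped on the fly; an unmatched asset is returned as-is
remaster_db[asset_id(b"old texture")] = b"new texture"
```

The fallback path matters: assets without a remaster still render, so the scheme degrades gracefully to the original presentation.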
I'm not convinced this is explicitly part of the creation process for backwards-compatibility software; it reads more like a developer guideline for running software on present and future consoles. Rather than developing multiple versions of a game for various resolutions, it may be a way to ship ONE executable version: instead of the game presenting different assets to different systems, the system itself pulls the assets appropriate to the display resolution. Whether this is meant to help performance is unclear; it may even be a way to build more software security into the new hardware.