I thought going either direction needed active chips.
The DP standard includes the ability to "downshift" to HDMI (and/or DVI-D) signaling for compatibility reasons — that's the Dual-Mode "DP++" feature, and it's why a cheap passive DP-to-HDMI adapter works while going the other way (HDMI source to DP monitor) does need an active converter. At the moment, DP supports the highest bandwidth per port, so it is the preferred connection method for ridiculously large and/or high refresh rate monitors which need that much bandwidth to push the quantity and/or quality (HDR) of pixels required.
HDMI v1.4b (newest 1.x version) = 10.2Gb/s = 4K@30Hz or 1080p@120Hz max.
DP v1.4a (newest 1.x version) = 32.4Gb/s = 4K@120Hz (and even 5K@60Hz) or presumably 1080p@360Hz max.
To give an idea of how much DP is "ahead" of HDMI, DP 1.3 and newer actually embed/support the HDMI 2.0 standard.
HDMI v2.1b (newest AVAILABLE 2.x version) = 48Gb/s = 8K@50Hz (or up to 8K@120Hz with DSC/Display Stream Compression) and presumably 1080p at whatever refresh rate you want (the math suggests 540Hz but dunno about IRL).
DP v2.1a (newest AVAILABLE 2.x version) = 80Gb/s = 10K@60Hz SDR (or up to 16K@60Hz HDR with DSC) and the math says 1080p would be @900Hz but now it's just getting silly.
Both standards are about to update to newer/faster versions, with HDMI finalizing v2.2 only last month and DP releasing v2.1b sometime this spring.
(all info pulled from Wikipedia)
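If you want to sanity-check any of those resolution/refresh combos yourself, here's a rough back-of-the-envelope calculator (a sketch in Python, with numbers I'm assuming from the usual spec tables; it ignores blanking intervals, audio, FEC, and other protocol overhead, so real requirements run a bit higher, and the link rates are the effective data rates left over after line coding rather than the raw figures quoted above):

```python
def video_gbps(width, height, refresh_hz, bits_per_channel=8):
    """Approximate payload bandwidth in Gb/s for uncompressed RGB video."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

# Effective data rates after line-coding overhead (approximate):
links = {
    "HDMI 1.4b (10.2 raw, 8b/10b)":   8.16,
    "DP 1.4a   (32.4 raw, 8b/10b)":  25.92,
    "HDMI 2.1  (48 raw, 16b/18b)":   42.67,
    "DP 2.1    (80 raw, 128b/132b)": 77.37,
}

for name, (w, h, hz) in {
    "1080p@120": (1920, 1080, 120),
    "4K@30":     (3840, 2160, 30),
    "4K@120":    (3840, 2160, 120),
    "8K@50":     (7680, 4320, 50),
}.items():
    need = video_gbps(w, h, hz)
    fits = [link for link, rate in links.items() if rate >= need]
    print(f"{name}: ~{need:.1f} Gb/s payload -> fits on: {fits or 'none of these'}")
```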
Keep in mind that moving up from SDR (8 bpc) to HDR (10 bpc) content increases the bandwidth requirement for video by 25%, so there are those times where something says "This device outputs at 4K HDR" but it doesn't work, because it's trying to do HDR and the connection just doesn't have enough bits for it, so it drops back to SDR or else gives you a black screen.
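To put numbers on that 25% (same caveats as the sketch above — no blanking or protocol overhead counted):

```python
# 4K@120 payload, SDR vs HDR, against DP 1.4a's ~25.92 Gb/s effective rate:
sdr = 3840 * 2160 * 120 * 8 * 3 / 1e9   # ~23.9 Gb/s -> just squeaks in
hdr = 3840 * 2160 * 120 * 10 * 3 / 1e9  # ~29.9 Gb/s -> too much; falls back to SDR (or needs DSC)
```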
tl;dr: If you have the choice, choose DisplayPort.
--Patrick