Hello All,
This is my first post, and you all seem like a knowledgeable bunch, so I'm hopeful someone can help me. I'd like to create a 360 video that can play back inside an open source player for iOS/Android, where the audio playback depends on where the viewer is pointing their attention. As an example, if you shot a video in a restaurant with people sitting all around you, wherever the viewer focused (pointed) the playback device, the audio from that area would be louder, while the audio from the other areas would drop away to just background noise. If the viewer then focused on another table (another area of the video), that audio would take prominence and the others would drop away.
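To make the behaviour I'm after a bit more concrete, here's a rough sketch of the mixing math I have in mind (Python just for illustration, not tied to any particular player; the function names and the idea of tagging each table's track with an azimuth are my own assumptions, not any real API):

```python
# Hypothetical sketch: each table's audio is a separate mono track tagged with
# the azimuth (degrees) where it sits in the 360 video. The gain for each track
# depends on how far the viewer's current heading is from that azimuth.
import math

def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in degrees (0..180)."""
    d = abs(a - b) % 360.0
    return 360.0 - d if d > 180.0 else d

def source_gain(view_azimuth: float, source_azimuth: float,
                floor: float = 0.1) -> float:
    """1.0 when the viewer faces the source, fading toward a 'background
    noise' floor as they turn away (simple cosine falloff)."""
    diff = angular_difference(view_azimuth, source_azimuth)
    focus = max(0.0, math.cos(math.radians(diff)))  # 1 head-on, 0 at 90 deg or more
    return floor + (1.0 - floor) * focus

def mix(samples: dict, azimuths: dict, view_azimuth: float) -> float:
    """Weight each source's current sample by its direction-dependent gain."""
    return sum(s * source_gain(view_azimuth, azimuths[name])
               for name, s in samples.items())

# Example: pointing at 90 deg, the left table is heard clearly,
# the right table drops to the background floor.
azimuths = {"table_left": 90.0, "table_right": 270.0}
samples = {"table_left": 0.5, "table_right": 0.5}
print(mix(samples, azimuths, view_azimuth=90.0))
```

I realise a real implementation would probably use a smoother directional response than a plain cosine falloff; this is only meant to show the effect I'm describing.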
Is this even possible? Has anyone played with this?