In a chat I had recently with another communications geek, we talked about the well-known problem of videoconferencing systems: you look at a person on the screen, but the camera is not where you are looking, so eye contact is impossible.
A few solutions have been tried. You can put a beam-splitting mirror in front of the display so a camera can see a well-lit subject through it, at some cost in image quality, and you still have to keep the camera aligned with the eyes. There has also been experimentation with software that takes cameras at the left and right of the screen and combines the two images into one from a virtual camera at the eye point, or, more simply, rewrites the image of the eyes to move the pupils to the right place. That turns out to be hard to do, because we are very discerning about eyes looking “natural,” though it may become possible.
Another approach has been semi-transparent displays that a camera can look through, but we like our displays crisp and bright. A decade ago I saw people claiming they could build a display that could focus light without a lens, so that each display cell could also contain a sensor, but I have not seen anything come of it. In the end, most people just place the camera at the top of the screen and the image right under it.
Having the image under the camera makes the person appear to be looking down. Some women perceive this as something else they frequently see: men staring at their chests while talking to them. Yes, we’re pretty much all guilty of that.
So I came up with an amusing, not entirely serious answer: put the camera below the image and then, for men at least, stare at her chest, or an imaginary one below the edge of the screen. Then you would be looking into the camera, and thus at the other person.
Amusingly, when videophones are shown on TV, we almost always see the people staring right into them, because they are TV actors who know how to find their camera.