On my tech geek wish list this year is the holographic phone. Takee's holographic 3D phone was released to consumers last year and won a CES Innovation Award this January.
According to online reviews, the phone itself is just an average phone, but the holographic images are superb. However, the holograms can only be seen by the user, and I became curious about exactly how that works. Takee's promotional materials say the phone uses four cameras to determine the location of the user's eyes and transmit the images accordingly, but I decided to look into its patents to see what the phone actually does.
What I discovered is that the Takee holographic phone holds patents in aerial induction: in simple terms, touch-air control as opposed to touch-screen control. The cameras determine the user's position relative to the phone and create a user-touch interface parallel to the screen, so that objects can be selected in mid-air (see the sketch below). Last year Takee received a patent for a four-point aerial induction device (the current consumer release); this year it was granted a patent for a six-point version.
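Takee's patents don't publish the positioning algorithm, but locating a fingertip with two or more cameras is standard stereo triangulation. Here is a minimal Python sketch, assuming two calibrated cameras with known projection matrices; every matrix and number below is a hypothetical stand-in, not anything from Takee:

```python
# Minimal stereo-triangulation sketch for mid-air "touch" positioning.
# All camera parameters here are hypothetical, purely for illustration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2   : 3x4 camera projection matrices
    uv1, uv2 : (u, v) pixel coordinates of the fingertip in each view
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right null vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise to (x, y, z) in metres

# Hypothetical rig: two identical cameras 10 cm apart, looking along +Z.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.10], [0.0], [0.0]])])

# A fingertip hovering 30 cm in front of the screen...
finger = np.array([0.02, -0.01, 0.30, 1.0])
uv1 = (P1 @ finger)[:2] / (P1 @ finger)[2]  # ...projects to these pixels
uv2 = (P2 @ finger)[:2] / (P2 @ finger)[2]

print(triangulate(P1, P2, uv1, uv2))  # recovers ~ [0.02, -0.01, 0.30]
```

Once the fingertip's 3D position is known, deciding whether it has "pressed" a floating object reduces to comparing its distance from the plane of the projected interface.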
But where did the holographic imaging system come from? Why can the holograms be seen only by the user and not by bystanders, and why can't they be captured by a video camera?
Looking further into the patents, I discovered that the holographic interaction device based on signal method is covered by a patent held by Estar Display Tech Co. (the parent company of Takee):
Inventor(s): LIU MEIHONG; CHEN YIHUA
Applicant(s): SHENZHEN ESTAR DISPLAYTECH CO
International classification: G06F3/01
Application number: CN20141842580 20141229
Priority number(s): CN20141842580 20141229
In essence, the holographic images are produced via acoustic signals sent to the neocortex of the user's brain. This patent builds on another patent, filed by the Sony Corporation in 2000:
Method and system for forming an acoustic signal from neural timing difference data
US 6584357 B1
ABSTRACT: A non-invasive system and process for converting sensory data, e.g., visual, audio, taste, smell or touch, to neural firing time differences in a human brain and using acoustic signals to generate the neural firing time differences. Data related to neural firing time differences, the acoustic signals, and a user's response map may be stored in memory. The user's response map may be used to more accurately map the calculated neural firing time differences to the correct neural locations.
Publication number US6584357 B1
Publication type Grant
Application number US 09/690,786
Publication date Jun 24, 2003
Filing date Oct 17, 2000
Priority date Oct 17, 2000
Also published as US6889085, US7542805, US20030195584, US20050197679
Inventors Thomas P. Dawson
Original Assignee Sony Corporation, Sony Electronics, Inc.
In summary, what it does (a toy sketch of this loop follows the list):
1. non-invasively projecting a first acoustic signal into the brain, the first acoustic signal affecting a neural firing time at a first neural location in the brain;
2. storing a user sensory response and data related to the first acoustic signal in a memory;
3. non-invasively projecting a second acoustic signal into the brain; and
4. storing a user sensory response and data related to the second acoustic signal in the memory.
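Read as engineering, those four steps describe a calibration loop: emit a signal, record the sensation the user reports, store the pair, and repeat; the accumulated pairs are what the abstract calls the user's response map. Here is a toy Python sketch of just that bookkeeping; the signal parameters and example values are placeholders of my own, not Sony's actual method:

```python
# Toy sketch of the claimed calibrate-and-store loop. The signal fields
# and example values are invented placeholders, not from the patent.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class AcousticSignal:
    frequency_hz: float       # hypothetical signal parameter
    target_location: str      # labelled neural location, e.g. "V1-a"

@dataclass
class ResponseMap:
    """Memory of (signal -> reported sensation) pairs, per the claims."""
    entries: list = field(default_factory=list)

    def store(self, signal: AcousticSignal, sensory_response: str) -> None:
        self.entries.append((signal, sensory_response))

    def signals_for(self, sensory_response: str) -> list:
        # Use past responses to find a signal that produced a given
        # sensation -- the "more accurate mapping" the abstract mentions.
        return [s for s, r in self.entries if r == sensory_response]

memory = ResponseMap()
# Steps 1-2: project a first signal, store the user's reported response.
memory.store(AcousticSignal(700e3, "visual-cortex-a"), "flash of light")
# Steps 3-4: project a second signal, store that response as well.
memory.store(AcousticSignal(710e3, "visual-cortex-b"), "faint tone")

print(memory.signals_for("flash of light"))
```

The point of the loop is that each user's brain responds slightly differently, so the stored map lets the system correct its aim over successive signals.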
This Sony patent is quite remarkable: sensory information such as sight, smell, touch, and hearing can be transmitted remotely via acoustic signals directly into the brain. Wouldn't it be great if we could smell and touch things as we browse the internet? This certainly has a lot of potential for eCommerce, in addition to other areas.
In summary, the Takee holographic phone produces its holographic images via acoustic signaling into the neocortex of the user's brain, a technique first developed by Sony, which is why only the user can see the images being projected. Then, via its aerial induction interaction device, the user can select items with touch in the air (touch-air). In theory, the four to six cameras on the Takee phone determine the position of your touch-air area, but they are not strictly necessary for visualising the holographic images, as that is achieved by acoustic signaling directly into the neocortex of the user's brain.
Although the Takee phone was released for consumer use last year, I can already see some obstacles to releasing it into the US consumer market. The US has numerous legal regulations, and it is not certain that the FCC (Federal Communications Commission) would permit phone manufacturers to send signals directly into users' brains, despite the fact that analogous technology such as thermoacoustic signaling is already used in medical devices to create 3D images of people's organs (e.g., ultrasound).
In any case, this is an interesting sector to watch. To be honest, although I just ordered my first smartwatch from Sweden, I am not really a fan of wearable tech: I don't like electronic devices, electronic jewellery, or any wearable tech close to or in direct contact with my body (e.g., Google Glass, the Oculus VR headset). But I do like that the Takee phone marks a movement away from touch-screen towards touch-air, and perhaps the next generation of phones will use more of Sony's acoustic signaling technology to transmit music and sound, so that I wouldn't need headphones to hear music.
But of course, this technology will be best utilised for blind and deaf persons, and as the inventor Thomas P. Dawson (Sony Corp) wrote: