Each heading is linked to a YouTube video or an image of the technology.
Senseye
- Senseye is a cool piece of software that allows the user to control and interact with a mobile device using eye movements. Just by looking at the phone, the software can detect where your eyes are and how you are moving them, which in turn controls the movement on the screen of your mobile device.
- The image captured by the front-facing camera of the smartphone or tablet is analyzed using computer-vision algorithms. The software can then determine the location of the eyes and estimate where you're looking on the screen, with an accuracy good enough to know which icon you're looking at.
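The last step described above, turning an estimated gaze point into "which icon is the user looking at?", can be sketched in a few lines. This is a hypothetical illustration, not Senseye's actual code: the screen resolution, the 4x5 icon grid, and the `gaze_to_icon` function are all assumptions.

```python
# Hypothetical sketch: map an estimated on-screen gaze point to an icon cell.
# Screen size and grid layout are assumptions, not Senseye's real parameters.

SCREEN_W, SCREEN_H = 1080, 1920   # assumed portrait phone resolution (pixels)
GRID_COLS, GRID_ROWS = 4, 5       # assumed 4x5 home-screen icon grid

def gaze_to_icon(x, y):
    """Return the (col, row) of the icon cell containing gaze point (x, y)."""
    col = min(int(x / (SCREEN_W / GRID_COLS)), GRID_COLS - 1)
    row = min(int(y / (SCREEN_H / GRID_ROWS)), GRID_ROWS - 1)
    return col, row

# A gaze estimate near the top-left corner lands on the first icon:
print(gaze_to_icon(100, 150))   # (0, 0)
```

The point of the sketch is that the hard part is the computer vision that produces (x, y); once a gaze estimate exists, selecting the icon is simple geometry.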
- The Samsung Flexible AMOLED Display is a super-thin, colorful, flexible screen. The screen is so flexible that it can be rolled up. The display has even been tested with a hammer and surprisingly survived the impact.
- First up, we got to check out the super-thin and surprisingly bright and colorful flexible AMOLED displays. Thanks to a plastic encapsulation method, the flexible AMOLED displays can be rolled up like a newspaper and can even survive impacts with a hammer. Samsung says that they haven’t even begun to explore the possibilities that these types of displays can foster in the near future, but it’s clear that there is some really amazing potential here.
- [IntoMobile.com] also had a chance to check out the transparent AMOLED displays. By replacing some pixels of the display with blank, transparent elements, Samsung has managed to produce displays that offer the power consumption, brightness, color saturation and contrast benefits of AMOLED while allowing you to literally look through the display. Transparent displays compromise resolution, but the effect is amazing.
- According to Wikipedia, SixthSense is a wearable gestural interface device created by Pranav Mistry, a Ph.D. student at the MIT Media Lab. The technology can be used to take pictures with your fingers and also enables the user to project information onto various surfaces, including books. The YouTube video shows a detailed demo of this technology being used by Pranav Mistry.
- The SixthSense prototype comprises a pocket projector, a mirror and a camera contained in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques.[1] The software program processes the video stream data captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user’s fingers. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. SixthSense supports multi-touch and multi-user interaction.
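The fiducial-tracking idea described above can be sketched in miniature: find the centroid of the pixels matching a marker's color in each frame, then classify the centroid's motion between frames as a gesture. This is a toy illustration under assumed conventions (color labels instead of real camera pixels, and a made-up swipe rule), not Pranav Mistry's actual implementation.

```python
# Toy sketch of color-marker tracking and gesture classification.
# Frames are 2D grids of color labels; real systems threshold camera pixels.

def marker_centroid(frame, color):
    """Return the (row, col) centroid of all pixels matching `color`."""
    hits = [(r, c) for r, row in enumerate(frame)
                   for c, px in enumerate(row) if px == color]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

def classify_motion(p0, p1, min_move=1.0):
    """Label the marker's movement between two frames as a swipe direction."""
    dr, dc = p1[0] - p0[0], p1[1] - p0[1]
    if abs(dc) >= abs(dr) and abs(dc) >= min_move:
        return "swipe_right" if dc > 0 else "swipe_left"
    if abs(dr) >= min_move:
        return "swipe_down" if dr > 0 else "swipe_up"
    return "hold"

# Two tiny synthetic frames where a red marker ('R') moves one cell right:
f0 = [[".", "R", "."], [".", ".", "."]]
f1 = [[".", ".", "R"], [".", ".", "."]]
print(classify_motion(marker_centroid(f0, "R"), marker_centroid(f1, "R")))
# swipe_right
```

With several markers of different colors (one per fingertip), the same centroid-plus-motion approach extends to multi-finger gestures like pinching or framing a photo.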
- This technology was created by the Graphics Lab at the University of Southern California. This low-cost system displays 3D objects that many viewers can see simultaneously, each from their own angle.
- The display is:
- autostereoscopic - requires no special viewing glasses
- omnidirectional - generates simultaneous views accommodating large numbers of viewers
- interactive - can update content at 200Hz
- The system works by projecting high-speed video onto a rapidly spinning mirror. As the mirror turns, it reflects a different and accurate image to each potential viewer. Our rendering algorithm can recreate both virtual and real scenes with correct occlusion, horizontal and vertical perspective, and shading.
- While flat electronic displays represent a majority of user experiences, it is important to realize that flat surfaces represent only a small portion of our physical world. Our real world is made of objects, in all their three-dimensional glory. The next generation of displays will begin to represent the physical world around us, but this progression will not succeed unless it is completely invisible to the user: no special glasses, no fuzzy pictures, and no small viewing zones.
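The spinning-mirror mechanism described above can be reduced to one core step: as the mirror rotates, the projector must show the pre-rendered view whose direction matches the mirror's current angle. The sketch below illustrates that mapping; the number of views and the angle convention are assumptions for illustration, not USC's actual parameters.

```python
# Sketch of view selection for a spinning-mirror 3D display: map the mirror's
# rotation angle to the index of the pre-rendered view to project.
import math

NUM_VIEWS = 288   # assumed number of distinct horizontal views per revolution

def view_for_angle(angle_rad):
    """Return the view index for the mirror's current rotation angle."""
    frac = (angle_rad % (2 * math.pi)) / (2 * math.pi)
    return int(frac * NUM_VIEWS) % NUM_VIEWS

print(view_for_angle(0.0))        # view 0
print(view_for_angle(math.pi))    # the view facing the opposite side
```

Because the mirror spins many times per second and each revolution sweeps through every view, the projector must output frames at a very high rate, which is why the system relies on high-speed video projection.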
- These contact lenses were created by Professor Jin Zhang of the University of Western Ontario. The contact lenses change color according to the glucose levels of a diabetic patient. This technology can save diabetic patients from the painful process of drawing blood several times a day.
- Professor Jin Zhang developed the technology which uses engineered nanoparticles embedded into hydrogel lenses. The nanoparticles are engineered to react with the glucose molecules contained in tears. When sugar levels rise or fall, a chemical reaction causes the lens to change color, allowing the wearer to adjust their glucose accordingly.
- CellControl is a technology that stops drivers, particularly teenagers, from using their mobile devices while driving. The CellControl device activates when the vehicle is moving, preventing the driver from being distracted by phone calls or text messages. The driver can use his or her phone again once the vehicle has come to a stop.
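The blocking behavior described above amounts to a simple rule: lock the phone while the vehicle is in motion, unlock it when the vehicle stops. The sketch below shows that logic under assumptions of my own (the speed threshold, the emergency-call exception, and the class design are illustrative, not CellControl's actual implementation).

```python
# Hedged sketch of motion-based phone blocking: notifications are blocked
# while the vehicle is moving and allowed once it has stopped.

class DrivingPhoneLock:
    MOVING_THRESHOLD_KMH = 8  # assumed speed above which the phone locks

    def __init__(self):
        self.locked = False

    def update_speed(self, speed_kmh):
        """Called with periodic speed readings (e.g., from the vehicle)."""
        self.locked = speed_kmh > self.MOVING_THRESHOLD_KMH

    def allow(self, event):
        """Calls and texts are blocked while moving; emergencies pass."""
        if event == "emergency_call":
            return True
        return not self.locked

lock = DrivingPhoneLock()
lock.update_speed(60)                 # vehicle moving
print(lock.allow("incoming_text"))    # False
lock.update_speed(0)                  # vehicle stopped
print(lock.allow("incoming_text"))    # True
```

A small nonzero threshold (rather than locking at any speed above zero) avoids false locks from GPS jitter while the vehicle is actually parked.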
- Important Feature Comparisons
I have spent 2 hours on this assignment.