Technology Updates: June 2011

Can you imagine where this damn technology will take us?

Welcome, user. Have a crispy bite of new technology and a spice of new inventions regularly, and stay updated with the TechnoFreak world.

Sunday, June 26, 2011

Telepresence Robots are here.


Now it’s you at two different places at the same time:
you on your bed while your replica attends the meeting.
Humans were expecting teleportation in the 21st century, but it seems unlikely in the near future. Still, you know we humans are never going to stop; we find another way to satisfy that desire, and here we are. Now the scene is this: you sit down at your computer, and you can see, speak, and even move around at a remote location to a great extent.
Two robots, Anybots and VGo, were already set to rock the market, and now a third robot is ready to join them. Suitable Technologies has developed a robot named “Texai”, which is about to launch, probably next year. The problem with Anybots and VGo is that the people interacting with you at the remote place cannot see the operator clearly; in exchange, the operator gets a good feel of the surroundings and of the people at the remote location. Anybots shows only a still photo of the user, while VGo is just a 4-foot-tall robot with a low-resolution screen. This is the area where Texai can beat the earlier inventions and go ahead of them: proper two-way interaction will be possible.

VGo user says, "My robot body could do some of the basic things I would do in person: move around the office to talk and listen, see and be seen. But it couldn't do enough. In a group conversation, I would clumsily spin around attempting to take in the voices and body language outside my narrow range of vision. When I walked alongside people, I sometimes blundered into furniture, or neglected to turn when they did. Co-workers were tolerant at first, but they got frustrated with my mistakes."
The Texai robot has only one limitation: since it does two-way communication efficiently, it needs good bandwidth for an uninterrupted relay. But good high-speed Internet is now available in most houses and business workplaces, says Willow Garage, co-designer of Texai.

Friday, June 24, 2011

A Biologically Inspired Search Engine


Ever found a product in a store and wondered if you could get it cheaper somewhere else? Soon a visual search tool will be able to help. Take a snapshot of the product with your phone and it will automatically pull up online pricing information.
The technology, developed by Cortexica, a start-up spun out of research conducted at Imperial College London, has already been used to create a wine comparison app called WINEfindr. Last week, the company launched an application-programming interface (API) for the technology, which will allow others to build similar apps.
"It's a bit like the bar-code scanning apps that link a physical object in the real world to online content," says Anil Bharath, a researcher at Imperial and cofounder of Cortexica. "But rather than having to create a QR code, it recognizes the object itself," he says.
Cortexica's Visual Search platform uses techniques inspired by the human vision system to compensate for different lighting conditions. It identifies key features of an object irrespective of their orientation, size, or how dark or light they appear in the image. This makes it possible to identify products at a distance or even while they are moving. Cortexica's technology can also spot logos and objects in videos.
"The technology is interesting, but they aren't giving away much," says James Ferryman of the computer-vision group at the University of Reading, in the U.K.
Ferryman notes that other visual search tools already exist, such as Google's Goggles, which recognizes many objects, labels, and landmarks and automatically searches the Web for information about them; and TinEye, a service that lets users upload an image and search the Web to find webpages on which the thing pictured appears.
Another of Cortexica's cofounders, Jeffrey Ng, says his company's technology is more accurate and scalable than any other now available.
The human vision system compares different points of an image with their neighbours, a process known as "edge extraction", in order to identify features in a range of different conditions. "We have basically copied that architecture," says Bharath. Cortexica uses graphics processing units (GPUs) to handle the parallel processing.
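Bharath's neighbour-comparison idea can be sketched in a few lines of Python. This is a toy illustration of generic gradient-style edge extraction, not Cortexica's actual (proprietary) pipeline; the image, the threshold, and the function name are all invented here:

```python
# Toy "edge extraction": compare each pixel with its right and lower
# neighbours and flag large differences. A simplification for illustration,
# not Cortexica's algorithm.

def edge_map(image, threshold=50):
    """image: 2D list of grayscale values (0-255). Returns a 2D list of
    booleans marking pixels whose neighbour difference exceeds threshold."""
    h, w = len(image), len(image[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            dx = image[y][x + 1] - image[y][x]   # horizontal neighbour difference
            dy = image[y + 1][x] - image[y][x]   # vertical neighbour difference
            if (dx * dx + dy * dy) ** 0.5 > threshold:
                edges[y][x] = True
    return edges

# A dark square on a bright background: edges appear only at the boundary.
img = [[200] * 8 for _ in range(8)]
for y in range(2, 6):
    for x in range(2, 6):
        img[y][x] = 20
```

Each pixel's comparison is independent of every other pixel's, which is exactly why this kind of workload maps so naturally onto the many parallel cores of a GPU.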
Coping with variations and resolving them is a major issue in computational vision, says Ferryman. "It's crucial. If you can't have this invariance, then you can't do reliable matching," he says.

Sunday, June 12, 2011

You will say, “Ah! Yes my house runs on Android”


Oh, come on, I am not going to stand up and switch on the damn lights and fans. Have you ever thought that there should be something in our hand that can control all the electronics of our house? It should turn off your oven while you sit on the sofa, busy watching something interesting on TV. All the damn electrical appliances, managed from a small device in your hand: now that time is not too far off, friend.

Technological freaks at Google have come up with the concept of controlling your home via Android. Obviously it sounds weird, but it is actually becoming reality, and in a while we will be able to see it in our houses. “We'd like to think of Android as the operating system for your home,” says Joe Britt, who is leading this project at Google. A device called “Tungsten” will act as the interface between your tablet or Android phone and your home appliances, allowing you to operate anything remotely.
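Google has not published how Tungsten actually talks to appliances, so here is a purely hypothetical sketch in Python of what a controller app might look like. The `Hub` class, the device names, and the command strings are all invented for illustration:

```python
# Purely hypothetical sketch of a phone app addressing home appliances
# through a Tungsten-style hub. Google has not published the real protocol;
# everything here is invented for illustration.

class Hub:
    """Stands in for the Tungsten box: keeps a registry of appliances
    and relays commands to them over the home network."""
    def __init__(self):
        self.devices = {}            # appliance name -> current state

    def register(self, name):
        self.devices[name] = "off"   # appliances start switched off

    def send(self, name, command):
        if name not in self.devices:
            raise KeyError(f"unknown appliance: {name}")
        self.devices[name] = command
        return f"{name} -> {command}"

hub = Hub()
hub.register("living-room-light")
hub.register("oven")
hub.send("living-room-light", "on")   # a tap on the phone, the light responds
hub.send("oven", "off")               # turn the oven off from the sofa
```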

Researchers have demonstrated how this can change the gaming experience. You can have light-emitting bulbs and surround speakers that are in sync with your game via the Tungsten device. Now in Counter-Strike, whenever you shout “Fire in the hole!” and throw a grenade, your room’s ambience will make you feel the explosion and the firing.

Google encourages other companies to research and develop products based on this technology, because Android is open-source software. The device uses Wi-Fi to connect to the Internet, as well as a new low-power wireless standard of Google’s own invention to link with other devices. Moreover, in the demo, a built-in RFID reader let the device recognise a CD that was not inserted into any CD player or personal computer. Another wonderful use is as an alarm: just think of it, when the alarm triggers, the LED lighting in the room brightens like a sunrise and you wake up to sweet music rather than “te ne ne ta… te ne ne ta…”
Now we just need to wait and watch for things like this to arrive in our life. It will make man so comfortable that we will become lazier creatures than the croc.
“Humans and Technology, Always Unstoppable.”

Sunday, June 5, 2011

Provide snaps and a 3-dimensional model is ready


Autodesk, a software company, is going to launch a freeware named Photofly which can generate 3D models. The software is available for the Windows platform; it uploads your photos to a cloud server, which processes the photos and parameters and generates a 3D model out of them.
This technology produces a dense model of the object, much as a laser scanner does, but it won’t cost you thousands of bucks: all it needs is 35-40 photos from the required angles. For example, if you want to make a 3D model of a person, you capture snaps of the person’s head and shoulders and hand them to the cloud server to process into a 3D model. You can print the result with the help of a 3D printer, which is not so costly these days, and have print-outs of 3D models in ceramics, plastics, and metals.



Autodesk will be the first company to launch such an application; on the other hand, Microsoft researchers are coming up with a substitute for Photofly called ‘Photosynth’. In addition, it will be an application with which your cell phone can convert its photos into 3D models, says Mathews.
It might interest you to take a hawk’s-eye view of how this technology works. Photofly goes through several stages: it identifies the position of the camera for each photo and triangulates it based on the different views and certain distinctive features. Then, going sharper, it triangulates the views more precisely in a second round.
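The triangulation step can be illustrated with a toy calculation: given two camera centres and the viewing rays toward the same distinctive feature, the feature’s 3D position is (near) the point where the rays meet. This is a generic two-view sketch in Python, not Autodesk’s actual code:

```python
# Toy two-view triangulation: find the point closest to two viewing rays.
# A generic illustration of the principle, not Autodesk's implementation.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """p1, p2: camera centres; d1, d2: ray directions toward the feature.
    Returns the midpoint of the closest approach of the two rays."""
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only if the rays are parallel
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    q1 = [p + s * u for p, u in zip(p1, d1)]
    q2 = [p + t * v for p, v in zip(p2, d2)]
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two cameras one metre apart, both seeing a feature at (0.5, 0, 2):
point = triangulate([0, 0, 0], [0.5, 0, 2], [1, 0, 0], [-0.5, 0, 2])
# point is approximately [0.5, 0.0, 2.0]
```

With real photos the rays never intersect exactly (noise, lens distortion), which is why the midpoint of closest approach is used, and why Photofly’s second, sharper round of triangulation pays off.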
Yuan-Fang Wang, a computer scientist at the University of California, said, “The technology has become robust and simple enough to place in front of consumers, but there are some limitations which may be concerns.” “The efficiency of the technology will be tested hard when snaps are taken in dim light or in a too-flashy environment,” he added. As a major application of this technology, the paleontologist Louise Leakey in Kenya has captured early human bones with accurate measurements of the specimens, such as the spacing and sizing of teeth. On the other hand, we can see Photofly helping engineers retrofit buildings, saving their time, effort, and resources: with its help, they can quickly figure out what needs to be done to make a building more eco-friendly.

Saturday, June 4, 2011

Your palm becomes your iPhone

It will soon be the past, when you used your smartphone by holding it in your hand and tapping the screen. Now that sixth-sense technology implementation is going so strong, German researchers have decided to apply the same technology to simplify the usage of smartphones. You will have an imaginary phone screen on your palm; you just tap your palm to operate your phone, and the system sends the command to the actual phone.
The system relies on three wonderful pieces of technology: a depth-sensitive camera, software analysing the picture, and a wireless radio to send the instructions back to the actual phone. It “serves as a shortcut that frees users from the necessity to retrieve the actual physical device,” says Patrick Baudisch. Now if you are driving a car and need to answer a phone call, you don’t have to tap the phone screen; you can receive it with one tap of your finger on your hand. Guys, see here how it will work...


The concept is a very thoughtful and innovative implementation of the Sixth Sense technology developed by Pranav Mistry and Pattie Maes, but this is altogether a different way of exploiting it. First of all, there are no strict gestures to learn, unlike before. Moreover, this is a feedback-free way of using Sixth Sense technology, which means there is nothing like a projector or any interface providing a reflection of your acts. At first you may find it a bit weird, but it is not completely replacing the physical existence of the actual smartphone; rather, it is just a way of making interaction with the phone more convenient.

The crux of the technology is capturing the gestures of the person. The camera focuses on the finger position on the palm and subtracts the background; the technology has worked in harsh lighting conditions and in over-bright scenes too. The software identifies the finger positions and correlates them with the positions of objects on the actual iPhone screen. As mentioned earlier, a Wi-Fi radio transmits these movements to the phone.
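The last step, correlating a palm position with an icon on the screen, can be sketched as a simple grid lookup. The grid size, the coordinate convention, and the function name here are invented for illustration; the researchers have not published their actual mapping:

```python
# Hypothetical sketch of mapping a tap position on the palm to an icon on
# the phone's home screen. Grid size and coordinates are invented for
# illustration; this is not the researchers' actual system.

COLS, ROWS = 4, 5   # a typical iPhone home-screen icon grid

def palm_to_icon(x, y):
    """x, y: tap position on the palm, normalised to [0, 1).
    Returns (column, row) of the icon the tap corresponds to."""
    if not (0 <= x < 1 and 0 <= y < 1):
        raise ValueError("tap outside the imaginary screen")
    return int(x * COLS), int(y * ROWS)

# A tap near the top-left corner of the palm hits icon (0, 0);
# one near the centre hits (2, 2).
print(palm_to_icon(0.1, 0.05))   # (0, 0)
print(palm_to_icon(0.6, 0.5))    # (2, 2)
```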

In a study carried out in October, it was found that people usually locate the correct position of about two-thirds of the icons on a blank iPhone, and when tapping they are 80% accurate. Daniel Vogel, a postdoctoral fellow at the University of Waterloo, says, “It’s a little bit like learning to touch type on the keyboard, but without any formal system or the benefit of the feel of the keys.” He added that voice control could also be a good option, but it has limitations: it can fail in a noisy environment. Anyway, we will always be waiting for such wonderful pieces of innovation, which will make our life more lethargic but pretty interesting and full of technologies all around.