
The very important reason these small robots are taking pictures of cats

During a photo shoot inside a cozy cafe in San Francisco, the models were cute, playful, and uncooperative. Some sprawled shaggily across pillows, limbs lifting lazily. Across the room, one posed, statue-like, atop a small white table while another strolled along a wooden walkway. The photographers tracked their movements, clicking away quickly from different angles and vantage points. The pictures came out blurry, moody, and often furry.

This was no ordinary shoot. The subjects were cats with names like Passion, Shiloh, Buffy, and Blinx, residents of a cafe called KitTea, where visitors can pay for drinks and snacks while lounging with the resident felines. The photographers were engineers from the consumer-robotics company Anki, who had trained three small Vector robots on the furry creatures specifically for this task.

Cats lounging near a Vector robot at KitTea.

The job was to take as many pictures as possible to help Vector learn how to spot the animals living in people's homes. Data – such as candid cat photos – is crucial to building artificial intelligence, and the collection process is becoming increasingly important as we rely on AI for a growing number of things, from helping self-driving cars navigate streets to getting virtual assistants like Alexa to answer spoken questions. For AI to work well, it must first be trained on a lot of data – not just any data, but data that reflects the kinds of tasks the AI will be asked to perform. Collecting that data is not always easy. You might even say the process can be, well, like herding cats.

Vector, which costs $250 and starts shipping in October, sits at the intersection of companion robot and pint-sized assistant. It can give you a weather update, answer questions, take a picture of you, and play with the light-up cube it comes with. It is the latest robot from Anki, which has sold 2 million robots so far. Vector looks like a tiny black bulldozer with an articulated lift and bright, expressive eyes. The robot – its creators refer to it as "it" – scoots around and chatters whether or not anyone is playing with it, and comes across as a cross between WALL-E, a guinea pig, and a bug.

Vector depends on data to learn how to do all kinds of things, from using its front-facing camera to recognize people and avoid colliding with objects, to using its microphones to listen for commands that begin with "Hey Vector" and respond appropriately.

A snapshot taken by a Vector robot.

One thing Vector cannot do yet is spot pets. That is a problem for a robot meant to interact with the world around it, which in many homes will include cats or dogs, says Andrew Stein, chief computer vision engineer at Anki and a cat owner himself. "If it's supposed to be intelligent about its environment and respond to a cat differently than to a coffee mug sitting on the table, then it needs to know what a cat is, and that feels like a missing piece," says Stein, as Vector robots roamed a nearby carpet.

Anki's engineers are using artificial intelligence to teach Vector how to do it. An essential (and sometimes difficult) part of making that work involves collecting data – in this case, images of cats sitting, darting past, scratching, and sniffing. The company, which is also working on dog detection, hopes to launch a feature that lets Vector recognize cats and dogs early next year.
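
To give a concrete flavor of that collection step, here is a minimal Python sketch of a frame-capture loop that saves timestamped images for later hand-labeling. It is not Anki's actual pipeline: OpenCV's generic VideoCapture stands in for Vector's camera feed, and the output folder and timing constants are made-up placeholders.

# Minimal sketch of a frame-capture loop for building a raw training set.
# Not Anki's pipeline: OpenCV's VideoCapture stands in for the robot camera,
# and the folder name and timing constants are illustrative assumptions.
import time
from pathlib import Path

import cv2

OUTPUT_DIR = Path("raw_cat_frames")   # hypothetical folder for unlabeled frames
CAPTURE_SECONDS = 60                  # how long to record
FRAME_INTERVAL = 0.5                  # seconds between saved frames

def capture_frames() -> None:
    OUTPUT_DIR.mkdir(exist_ok=True)
    camera = cv2.VideoCapture(0)      # device 0 stands in for the robot's camera
    if not camera.isOpened():
        raise RuntimeError("could not open camera")

    start = time.time()
    saved = 0
    try:
        while time.time() - start < CAPTURE_SECONDS:
            ok, frame = camera.read()
            if not ok:
                continue
            # Timestamped filenames keep frames ordered for later hand-labeling.
            name = OUTPUT_DIR / f"frame_{int(time.time() * 1000)}.jpg"
            cv2.imwrite(str(name), frame)
            saved += 1
            time.sleep(FRAME_INTERVAL)
    finally:
        camera.release()
    print(f"saved {saved} frames to {OUTPUT_DIR}")

if __name__ == "__main__":
    capture_frames()

In a real effort, the saved frames would then be labeled (cat, dog, neither) before any training happens.
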
At first, Stein says, Vector will simply be able to detect that a cat or dog is in the home, and the company is weighing a range of simple reactions it might have, such as taking a picture that pet owners can view in the companion smartphone app, or interacting with the pets in some way.

Cool Cats

But getting Vector to notice the cat roaming around your living room is not as simple as showing it thousands of pictures of cats from existing databases on the internet. Anki's engineers have already used tens of thousands of such images to train a neural network – a kind of machine learning algorithm loosely modeled on the way neurons work in the brain – to detect cats in general. But, Stein says, the images in those databases look quite different from the way cats appear from Vector's point of view, which can be from high above an animal or right in front of its paws, and likely inside a home.

A cropped image captured by a Vector robot at KitTea.

"The key is getting data that represents what it will actually see when we deploy it in people's homes," he says. Stein believes these images will help fine-tune Anki's neural network, which Vector can then use to better locate furry friends.

That approach makes a lot of sense to Jason Corso, a professor at the University of Michigan who studies computer vision and video understanding. If Anki relied only on existing web datasets, YouTube videos, or Flickr photos of cats, its data would carry all the biases of those sources.
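
To illustrate the general idea of fine-tuning a network pretrained on generic web images with robot-perspective frames, here is a short PyTorch sketch. This is a sketch of the technique, not Anki's unpublished setup: the directory layout, model choice, and hyperparameters are all assumptions.

# Hedged sketch of fine-tuning a pretrained classifier on robot-perspective
# frames, using PyTorch/torchvision rather than Anki's (unpublished) stack.
# Assumed directory layout:
#   robot_frames/train/cat/*.jpg, robot_frames/train/no_cat/*.jpg
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing so the pretrained backbone sees familiar input.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_data = datasets.ImageFolder("robot_frames/train", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a network pretrained on generic web images...
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False          # freeze the generic features
# ...and retrain only the final layer on the domain-specific frames.
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))
model = model.to(device)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                   # a handful of epochs for a sketch
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")

Freezing the backbone and retraining only the final layer is one common way to adapt a pretrained model when the new, domain-specific dataset is relatively small.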
