
Does this robot chef make the perfect burger?

Future Tech: How to use your brainwaves to interact with robots, plus an augmented reality flashlight.

Automated burger station will make your mouth water

From Fast Company:

Through a glass case in a new restaurant in San Francisco, I’m watching the chef make lunch. That chef is a Rube Goldberg-like machine, slicing buns, adding condiments, grilling meat, and spitting out a fully prepared hamburger — all without any human intervention.

A row of brioche buns moves to the right, dropping one bun down a slot where a tiny saw slices it in half. The machine adds a little butter, toasts the bun and drops it in a box on a conveyor belt, where the machine squirts a precise amount of each sauce for the order, slices tomatoes and onions in real time, grates cheese and grinds beef to order before cooking the patty.
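To make the to-order customization concrete, here is a purely illustrative Python sketch of an order-driven pipeline like the sequence described above. Creator has not published its control software, so every station name, parameter, and order field here is invented.

```python
# Purely illustrative: station names, parameters, and the order format are
# invented; Creator's actual control software is not public.

def make_burger(order):
    """Run one order through a sequence of stations, customizing each step."""
    stations = [
        ("slice_bun", {}),
        ("butter_and_toast", {}),
        ("dispense_sauces", {"amounts_ml": order.get("sauces", {})}),
        ("slice_vegetables", {"items": order.get("vegetables", [])}),
        ("grate_cheese", {"kind": order.get("cheese", "cheddar")}),
        ("grind_and_grill", {"doneness": order.get("doneness", "medium")}),
    ]
    for name, params in stations:
        print(f"{name}: {params}")  # stand-in for driving the actual hardware

make_burger({
    "sauces": {"ketchup": 12, "mustard": 6},
    "vegetables": ["tomato", "onion"],
    "cheese": "cheddar",
    "doneness": "medium",
})
```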

In five minutes, your meal emerges. I’m a vegetarian, but the meat-eating colleague I brought with me declares that the burger is very, very good.

“Growing up, my job was to make hundreds of the same burger over and over,” says Alex Vardakostas, co-founder and CEO of Creator, the new restaurant, which is opening with a soft launch today. “I saw so many opportunities where I wanted to do it a bit better, slower or more personalized, but it’s impossible when you have to manually make that many burgers with rudimentary tools.”

As a college student using sophisticated instruments in a physics program, Vardakostas began to wonder whether similarly advanced tools could be used in restaurants. “We don’t see it as a robot,” says Vardakostas. “I see it as the ultimate kitchen instrument. It’s just a utensil. The whole thing started — if a better griddle makes a better burger, let’s go all the way. It just happens that it basically has to be as sophisticated as what some people call a robot.”

Read full story…

MIT has developed a way to correct robots through thought and hand gestures


From Popular Mechanics:

Even robots make mistakes sometimes. That’s why researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have built a system which allows robots to be corrected through thought and hand gestures.

The system monitors brain activity to determine whether a person has noticed an error in the machine’s work. If an error is detected, the system hands control over to the human. From that point, all it takes is a flick of the wrist to get the robot back on the right course.

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity,” says CSAIL Director Daniela Rus, who supervised the work, in a press statement.
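To make that loop concrete, here is a minimal Python sketch of the kind of supervisory cycle described above: let the robot act autonomously, pause when the EEG suggests the person noticed a mistake, then apply an EMG wrist gesture as the correction. The classifier stubs, thresholds, and robot interface are hypothetical stand-ins, not the CSAIL implementation.

```python
# Hypothetical sketch of the EEG/EMG supervisory loop described above.
# detect_error_potential() and read_wrist_gesture() stand in for trained
# classifiers over real biosignals; the robot interface is also invented.

def detect_error_potential(eeg_window):
    """Return True if this EEG window looks like an error-related potential."""
    return eeg_window.get("errp_score", 0.0) > 0.5  # placeholder threshold

def read_wrist_gesture(emg_window):
    """Classify an EMG window as a corrective gesture: 'left', 'right', or None."""
    score = emg_window.get("flex_score", 0.0)
    if score > 0.5:
        return "left"
    if score < -0.5:
        return "right"
    return None

def supervise(robot, eeg_stream, emg_stream):
    """Let the robot act on its own, but hand control to the human whenever
    the brain signal suggests the person has noticed an error."""
    for eeg_window, emg_window in zip(eeg_stream, emg_stream):
        robot.step()  # continue the current task autonomously
        if detect_error_potential(eeg_window):
            robot.pause()
            gesture = read_wrist_gesture(emg_window)
            if gesture is not None:
                robot.redirect(gesture)  # a flick of the wrist picks the new target
            robot.resume()
```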

The team used a worker bot called “Baxter” from Rethink Robotics for testing. With the MIT biofeedback system, the robot’s accuracy improved from 70 percent to 97 percent.

Read full story…

This flashlight projects AR images onto your surroundings

From Mental Floss:

Compared to sleek smartphones and augmented reality goggles, a flashlight looks pretty low-tech. But what if you used that familiar design as a vehicle for some of today’s most exciting technology? That’s what Arvind Sanjeev accomplished with Lumen. The master’s student at the Copenhagen Institute of Interaction Design has reimagined the handheld flashlight as a platform for augmented reality.

What sets Lumen apart from other AR products, like Microsoft’s HoloLens or even the apps on your iPhone, is the straightforward design. Most people know how to use a flashlight — pick it up by the handle, click it on and point the light at whatever you wish to see. Lumen operates on a similar principle, but instead of illuminating objects with light alone, it projects relevant information onto them that enhances the way users experience reality.

Using a built-in camera and a special algorithm, the flashlight can identify the objects in its path. Direct it at a stereo and it will project its own interface with dials you can actually use. Point it at the ground and it can show an arrow leading you to your destination like a maps app. Developers can work with the interface to program their own responses to appear when Lumen lands on a certain item.
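The developer-facing idea, as described, boils down to mapping recognized objects to projected responses. The Python sketch below shows that pattern in miniature; the class and method names are invented for illustration and are not Lumen’s actual interface.

```python
# Hypothetical object-to-overlay registry, invented for illustration;
# this is not Lumen's real developer interface.

class ARFlashlight:
    def __init__(self):
        self._handlers = {}

    def on_object(self, label, handler):
        """Register a function that returns the overlay to project when the
        camera recognizes an object with this label."""
        self._handlers[label] = handler

    def handle_frame(self, recognized_label, context=None):
        """Render the registered overlay for whatever the camera identified."""
        handler = self._handlers.get(recognized_label)
        if handler is None:
            return None  # nothing registered, so behave like a plain flashlight
        return handler(context)

# Example: dials on a stereo, a guidance arrow on the floor.
lumen = ARFlashlight()
lumen.on_object("stereo", lambda ctx: {"overlay": "volume_dials"})
lumen.on_object("floor", lambda ctx: {"overlay": "navigation_arrow"})
print(lumen.handle_frame("stereo"))  # {'overlay': 'volume_dials'}
```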

Read full story…


See more IT & Tech innovation stories, and let us know about the interesting technology stories you come across.

The Editors

The Editorial Team develops articles, company profiles and resources for the Business Hub to bring IT, tech and innovation stories to the Manitoba business community.
