November 4, 2014 - TECHNOLOGY - Years ago, Ray Kurzweil popularized the Terminator-like moment he called the 'singularity', when artificial intelligence overtakes human thinking. Nowadays, Kurzweil's vision, and the quest for "conscious", sentient robots, seem to be just around the corner, judging by the following reports:
Microchip breakthrough enables emotional response in AI robots and consumer electronic devices
EMOSHAPE (www.emospark.com) has announced a major technology breakthrough: an EPU (emotional processing unit), a patent-pending technology which creates a synthesised emotional response in machines. This represents a significant advancement in the field of artificial intelligence devices and technologies.
Based on the 8 primary emotions identified by Robert Plutchik's psycho-evolutionary theory, the ground-breaking EPU algorithms effectively enable machines to respond to stimuli in line with one of the 8 primary emotions - anger, fear, sadness, disgust, surprise, anticipation, trust, and joy. This video demonstrates how the EPU empowers machines with empathy: http://youtu.be/nMRjSijBsLc
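Emoshape has not published how the EPU maps stimuli onto Plutchik's eight primary emotions, but the idea of classifying a stimulus into one of a fixed set of emotions can be sketched in a few lines. Everything below is illustrative: the (valence, arousal) coordinates and the nearest-prototype rule are invented for this example, not taken from the EPU.

```python
import math

# Plutchik's 8 primary emotions, placed at hypothetical (valence, arousal)
# coordinates purely for illustration -- the real EPU algorithms are not public.
PROTOTYPES = {
    "joy":          ( 0.9,  0.6),
    "trust":        ( 0.7,  0.2),
    "fear":         (-0.6,  0.9),
    "surprise":     ( 0.1,  0.9),
    "sadness":      (-0.8, -0.5),
    "disgust":      (-0.7,  0.1),
    "anger":        (-0.8,  0.8),
    "anticipation": ( 0.4,  0.5),
}

def classify_emotion(valence, arousal):
    """Return the primary emotion whose prototype is nearest to the stimulus."""
    return min(PROTOTYPES,
               key=lambda e: math.dist((valence, arousal), PROTOTYPES[e]))

# A strongly positive, energetic stimulus lands nearest the "joy" prototype.
print(classify_emotion(0.85, 0.55))  # -> joy
```

A real system would of course derive valence and arousal from sensor or language input rather than take them as given; this only shows the final classification step.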
This is the first time that the science and technology industry has empowered machines to respond with human emotions, which is set to deliver a yet undiscovered level of user experience between people and emotionally enabled technology.
Patrick Levy Rosenthal, CEO of Emoshape, said: “How can any inanimate object interact with humans and learn how to please them without empathy? The EPU advancement represents a step change for the future of technological goods such as smartphones, computers, toys, medicine, finance and robotics.”
The US and London-based company is now set to launch the production of their first A.I. home console in time for Christmas 2014. The company is also seeking private investment to help roll out mass production of the EmoSPARK cube. Emoshape has now opened up its capital to investors via fundable.com (http://fundable.com/emoshape-lic).
The EmoSPARK cube can fit in the palm of the hand with a purpose of being a digital friend. It monitors a person’s facial expressions and emotions by capturing images through an external camera. The images are then processed until the cube can recognise who the person using it is, and their relationship to others. It will monitor users’ responses to the world around them with a focus on the user’s reaction to music.
EmoSPARK will learn what the user likes and dislikes, using eight emotions to create a personal map of the user’s personality. It can track joy, sadness, disgust, fear, anger, trust, anticipation and surprise.
Founded in 2014 and privately funded, Emoshape is dedicated to providing powerful and easy-to-use emotional technologies. Emoshape is a company associated with evolutionary technology that will realise people’s dreams. - Herald Online.
WATCH: Foretelling/Predictive Programming - "Chappie" Trailer starring Hugh Jackman.
Artificial Intelligence Outperforms Average High School Senior
Artificial intelligence in Japan is getting closer to entering college. AI software scored higher on the English section of Japan’s standardized college entrance test than the average Japanese high school senior, its developers said.
The software, known as To-Robo, almost doubled its score on a multiple choice test from its performance a year ago, indicating progress toward a goal set by its developers to eventually pass the entrance exam for Tokyo University, Japan’s most prestigious college.
“The average score for the English section of the standardized entrance exam was 93.1 (out of 200), but the AI scored 95,” a spokesman for NTT Science and Core Technology Laboratory Group said. Last year the software scored 52.
The NTT lab is developing the software alongside the National Institute of Informatics, and is in charge of developing the software’s English capabilities. The project began in 2011 with a 10-year time frame for reaching its goal.
Questions from the test were turned into data that could be recognized by the software. To-Robo then processed the information, distinguishing the logic of exchanges and correctly identifying the right answer out of multiple choices.
For example, it was able to correctly choose the answer that best fits the following conversation:
A: I hear your father is in the hospital.
B: Yes, and he has to have an operation next week.
A: 【 】. Let me know if I can do anything.
B: Thanks a lot.
–That’s a relief.
–That’s too bad.
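(The fitting reply here is “That’s too bad.”) NTT has not published To-Robo’s internals, but the task itself, scoring each candidate reply against the dialogue context and picking the best fit, can be sketched with a toy word-association table. The table, keywords, and scoring rule below are all invented for illustration.

```python
# Toy sketch of the multiple-choice task above: score each candidate reply
# against the dialogue context. To-Robo's actual NLP pipeline is not public.

CONTEXT = ("I hear your father is in the hospital. "
           "Yes, and he has to have an operation next week.")

# Hypothetical association scores: how strongly a context word supports
# a sympathetic ("bad") vs. a relieved ("relief") reply.
ASSOCIATIONS = {
    ("hospital", "bad"): 2.0,
    ("operation", "bad"): 2.0,
    ("hospital", "relief"): -1.0,
    ("operation", "relief"): -1.0,
}

CANDIDATE_KEYWORDS = {
    "That's a relief.": "relief",
    "That's too bad.": "bad",
}

def score(context, candidate):
    """Sum the association scores between context words and the reply's keyword."""
    keyword = CANDIDATE_KEYWORDS[candidate]
    return sum(ASSOCIATIONS.get((word.strip(".,").lower(), keyword), 0.0)
               for word in context.split())

best = max(CANDIDATE_KEYWORDS, key=lambda c: score(CONTEXT, c))
print(best)  # -> That's too bad.
```

A system at To-Robo’s level would use far richer semantic and discourse features, but the structure, turn questions into data, score each option, select the maximum, is the same.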
While NTT lab said To-Robo was getting better at completing conversations, structuring sentences appropriately and grasping the context of a dialogue, it added that the software still needs to improve at understanding more complex exchanges and comprehending the emotions of speakers.
The technology may be developed for human use in the future, the lab said, with translation seen among its possible applications. - WSJ.
DARPA-Funded Researchers Have Tested a Drone That Can Learn
Almost seven years ago, we learned that DARPA was investing millions of dollars in neuromorphic chips. That's a fancy term for a computer chip that mimics a biological cortex—a brain chip. Today, researchers are getting closer. And of course, they're putting those brain chips in drones.
Responding to DARPA's challenge, HRL Laboratories' Center for Neural and Emergent Systems just tested a tiny drone with a prototype neuromorphic chip. The drone packs 576 silicon neurons that communicate through spikes in electricity and respond to data from optical, ultrasound, and infrared sensors. And thanks to that brain-like chip, the little robot doesn't necessarily need a human to tell it what to do. It can learn and act on its own.
It sounds like something out of a science fiction movie: a tiny aircraft that flies around deciding what to surveil or, more frighteningly, what to shoot. MIT's Technology Review explains how the test worked:
The first time the drone was flown into each room, the unique pattern of incoming sensor data from the walls, furniture, and other objects caused a pattern of electrical activity in the neurons that the chip had never experienced before. That triggered it to report that it was in a new space, and also caused the ways its neurons connected to one another to change, in a crude mimic of learning in a real brain. Those changes meant that next time the craft entered the same room, it recognized it and signaled as such.
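The behaviour described above, flagging an unfamiliar spike pattern as a new room and storing it so the room is recognised on re-entry, can be sketched very simply. This is a minimal, invented illustration: HRL's 576-neuron chip and its plasticity rule are far more involved, and the pattern size, similarity measure, and threshold below are assumptions.

```python
# Minimal sketch of spike-pattern novelty detection: each "room" is reduced
# to a binary spike pattern across (here) 16 model neurons.

def similarity(a, b):
    """Fraction of neurons in the same spike/no-spike state in both patterns."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

class NoveltyDetector:
    def __init__(self, threshold=0.8):
        self.known = []              # stored spike patterns, one per seen room
        self.threshold = threshold   # invented recognition threshold

    def observe(self, pattern):
        """Return True if the pattern looks like a new room, storing it if so."""
        if any(similarity(pattern, k) >= self.threshold for k in self.known):
            return False             # close enough to a remembered room
        self.known.append(pattern)   # "learn" the new room
        return True

det = NoveltyDetector()
room_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
print(det.observe(room_a))  # -> True  (first visit: novel)
print(det.observe(room_a))  # -> False (recognised on re-entry)
```

On the real chip, "storing" a pattern happens through changes in how neurons connect to one another rather than an explicit list, but the signal-novelty-then-recognise loop is the same.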
So that's pretty cool. No, seriously, that kind of technological prowess is nothing short of astonishing. However, it's hard to deny that a future full of drones with tiny electronic brains is a little bit frightening. They'll surely do lots of good. But that conversation about the ethics of artificial intelligence will only escalate as AI takes to the skies. - Gizmodo.