
      Google’s RT-2 AI model brings us one step closer to WALL-E

      news.movim.eu / ArsTechnica · Friday, 28 July, 2023 - 21:32

    A Google robot controlled by RT-2. (credit: Google)

    On Friday, Google DeepMind announced Robotic Transformer 2 (RT-2), a "first-of-its-kind" vision-language-action (VLA) model that uses data scraped from the Internet to enable better robotic control through plain language commands. The ultimate goal is to create general-purpose robots that can navigate human environments, similar to fictional robots like WALL-E or C-3PO.

    When humans want to learn a task, they often read and observe. In a similar way, RT-2 utilizes a large language model (the tech behind ChatGPT) that has been trained on text and images found online. RT-2 uses this information to recognize patterns and perform actions even if the robot hasn't been specifically trained to do those tasks—a concept called generalization.

    For example, Google says that RT-2 can allow a robot to recognize and throw away trash without having been specifically trained to do so. It uses its understanding of what trash is and how it is usually disposed of to guide its actions. RT-2 even recognizes discarded food packaging or banana peels as trash, despite the potential ambiguity.
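    To make the idea concrete, here is a minimal sketch of how a vision-language-action model can emit robot commands as ordinary text tokens. This illustrates the general VLA recipe, not Google's code; the model interface, bin count, and workspace bounds are all assumptions.

```python
# Illustrative sketch of the vision-language-action (VLA) idea behind
# models like RT-2: a transformer maps an image plus a text instruction
# to discrete "action tokens" that are de-tokenized into robot commands.
# The model interface below is a placeholder, not Google's actual API.

from dataclasses import dataclass

@dataclass
class Action:
    dx: float    # gripper translation (meters)
    dy: float
    dz: float
    grip: float  # 0.0 = open, 1.0 = closed

# Suppose each action dimension is discretized into 256 bins, so the
# model can emit actions as ordinary vocabulary tokens (an assumption).
NUM_BINS = 256
WORKSPACE = 0.5  # assumed +/- workspace bound, meters

def detokenize(tokens: list[int]) -> Action:
    """Map four discrete tokens back to a continuous action."""
    def to_float(t: int, lo: float, hi: float) -> float:
        return lo + (t / (NUM_BINS - 1)) * (hi - lo)
    return Action(
        dx=to_float(tokens[0], -WORKSPACE, WORKSPACE),
        dy=to_float(tokens[1], -WORKSPACE, WORKSPACE),
        dz=to_float(tokens[2], -WORKSPACE, WORKSPACE),
        grip=to_float(tokens[3], 0.0, 1.0),
    )

# In a real system, `vla_model` would be a fine-tuned vision-language
# model; here it is just a stand-in showing the shape of the loop.
def control_step(vla_model, camera_image, instruction: str) -> Action:
    tokens = vla_model.predict(camera_image, instruction)  # hypothetical
    return detokenize(tokens)
```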



      Hypersensitive robot hand is eerily human in how it can feel things

      news.movim.eu / ArsTechnica · Monday, 22 May, 2023 - 16:53

    Robotic fingers gripping a mirrored disco ball. (credit: Columbia University ROAM Lab)

    From bionic limbs to sentient androids, robotic entities in science fiction blur the boundaries between biology and machine. Real-life robots are far behind in comparison. While we aren’t going to reach the level of Star Trek’s Data anytime soon, there is now a robot hand with a sense of touch that is almost human.

    One thing robots have not been able to achieve is a level of sensitivity and dexterity high enough to feel and handle things as humans do. Enter a robot hand developed by a team of researchers at Columbia University. (We covered their work five years ago, when this achievement was still only a concept.)

    This hand doesn’t just pick things up and put them down on command. It is so sensitive that it can actually “feel” what it is touching, and it is dexterous enough to easily reposition its fingers to better hold objects, a maneuver known as “finger gaiting.” It can even do all of this in the dark, figuring everything out by touch.
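    As a rough illustration of how tactile feedback can drive a grasp, here is a minimal control-loop sketch. It is not the Columbia team's controller; the force setpoint, gain, and gaiting rule are invented for the example.

```python
# A minimal sketch of touch-driven grasping, loosely in the spirit of
# the Columbia hand (not their controller). Each fingertip reports a
# contact force; the controller tightens slipping fingers, and a finger
# is "gaited" to a new contact point only while the others hold on.

TARGET_FORCE = 2.0   # newtons, assumed grip setpoint
TOLERANCE = 0.3      # deadband around the setpoint
GAIN = 0.05          # proportional position correction per step

def adjust_grip(finger_forces: list[float]) -> list[float]:
    """Return per-finger position corrections from tactile readings."""
    corrections = []
    for f in finger_forces:
        error = TARGET_FORCE - f
        # Positive correction closes the finger, negative opens it.
        corrections.append(GAIN * error if abs(error) > TOLERANCE else 0.0)
    return corrections

def finger_gait_step(holding: set[int], to_move: int) -> list[str]:
    """One gaiting step: lift a finger only if the rest keep contact."""
    if len(holding - {to_move}) < 2:
        return ["abort: too few supporting fingers"]
    return [f"lift finger {to_move}",
            f"reposition finger {to_move}",
            f"re-establish contact with finger {to_move}"]
```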



      Credible Handwriting Machine

      news.movim.eu / Schneier · Friday, 19 May, 2023 - 20:19 · 1 minute

    In case you don’t have enough to worry about, someone has built a credible handwriting machine:

    This is still a work in progress, but the project seeks to solve one of the biggest problems with other homework machines, such as this one that I covered a few months ago after it blew up on social media. The problem with most homework machines is that they’re too perfect. Not only is their content output too well-written for most students, but they also have perfect grammar and punctuation, something even we professional writers fail to consistently achieve. Most importantly, the machine’s “handwriting” is too consistent. Humans always include small variations in their writing, no matter how honed their penmanship.

    Devadath is on a quest to fix the issue with perfect penmanship by making his machine mimic human handwriting. Even better, it will reflect the handwriting of its specific user so that AI-written submissions match those written by the student themselves.

    Like other machines, this starts with asking ChatGPT to write an essay based on the assignment prompt. That generates a chunk of text, which would normally be stylized with a script-style font and then output as G-code for a pen plotter. But instead, Devadath created custom software that records examples of the user’s own handwriting. The software then uses that as a font, with small random variations, to create a document image that looks like it was actually handwritten.
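    A minimal sketch of that last step, assuming Pillow is available: each character is drawn from a scanned sample of the user's writing and perturbed slightly so no two glyphs come out identical. The glyph dictionary and the specific jitter ranges are assumptions, not details from Devadath's software.

```python
# A rough sketch of the "handwriting font with random variation" idea
# described above (not Devadath's actual software). Each character is
# rendered from a scanned sample of the user's writing, with small
# random offsets, rotations, and scale changes per glyph.

import random
from PIL import Image

def render_line(text: str, glyphs: dict[str, Image.Image],
                base_height: int = 40) -> Image.Image:
    """Compose a line image from per-character glyph samples."""
    canvas = Image.new("L", (base_height * len(text), base_height * 2), 255)
    x = 0
    for ch in text:
        if ch == " ":
            x += base_height // 2
            continue
        glyph = glyphs.get(ch)
        if glyph is None:
            continue
        # Small random perturbations make the output look handwritten.
        scale = random.uniform(0.9, 1.1)
        angle = random.uniform(-4, 4)         # degrees of tilt
        jitter_y = random.randint(-3, 3)      # vertical baseline wobble
        h = int(base_height * scale)
        g = glyph.resize((int(glyph.width * h / glyph.height), h))
        g = g.rotate(angle, expand=True, fillcolor=255)
        canvas.paste(g, (x, base_height // 2 + jitter_y))
        x += g.width + random.randint(-2, 4)  # variable letter spacing
    return canvas.crop((0, 0, x + 5, base_height * 2))
```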

    Watch the video.

    My guess is that this is another detection/detection avoidance arms race.


      A grasshopper-like soft material can jump 200 times above its thickness

      news.movim.eu / ArsTechnica · Saturday, 11 March, 2023 - 12:55

    Grasshopper on green leaves. (credit: Stefania Pelfini, La Waziya Photography)

    Superhumans don't exist in the real world, but someday you might see super robots. Obviously, robots can be made that are stronger, faster, and better than humans, but do you think there is a limit to how much better we can make them?

    Thanks to ongoing developments in materials science and soft robotics, scientists are now developing technologies that could allow future robots to push past the limits of biology. For instance, a team of researchers at the University of Colorado Boulder recently developed a material that could give rise to soft robots capable of jumping 200 times their own thickness. Grasshoppers, among the most astonishing leapers on Earth, can leap only up to 20 times their body length.

    Although the material outperforms the insects, the researchers behind the rubber-like jumper say they took their inspiration from grasshoppers. Like the insect, the material stores a large amount of elastic energy and then releases it all at once to make a jump.
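    A quick back-of-envelope check shows what those ratios imply. The body length and sheet thickness below are assumed round numbers, not figures from the paper; the only physics used is the ballistic relation v = sqrt(2gh).

```python
# Back-of-envelope comparison of the two jump figures quoted above.
# The sizes are assumptions for illustration, not from the paper.

g = 9.81                      # gravitational acceleration, m/s^2

# Grasshopper: ~20 body lengths, body length ~4 cm (assumed).
hopper_length = 0.04          # m
hopper_jump = 20 * hopper_length

# Soft jumper: ~200x its own thickness, thickness ~1 mm (assumed).
sheet_thickness = 0.001       # m
sheet_jump = 200 * sheet_thickness

# Takeoff speed needed to reach height h: from m*g*h = (1/2)*m*v^2,
# v = sqrt(2*g*h).
for name, h in [("grasshopper", hopper_jump), ("soft jumper", sheet_jump)]:
    v = (2 * g * h) ** 0.5
    print(f"{name}: jump {h:.2f} m, takeoff speed {v:.2f} m/s")
```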



      Scientists built a tiny robot to mimic the mantis shrimp’s knock-out punch

      Jennifer Ouellette · news.movim.eu / ArsTechnica · Monday, 30 August, 2021 - 22:05 · 1 minute

    An interdisciplinary team of roboticists, engineers, and biologists modeled the mechanics of the mantis shrimp’s punch and built a robot that mimics the movement. (credit: Second Bay Studios and Roy Caldwell/Harvard SEAS)

    The mantis shrimp boasts one of the most powerful, ultrafast punches in nature—it's on par with the force generated by a .22 caliber bullet. This makes the creature an attractive object of study for scientists eager to learn more about the relevant biomechanics. Among other uses, that knowledge could lead to small robots capable of equally fast, powerful movements. Now a team of Harvard University researchers has come up with a new biomechanical model for the mantis shrimp's mighty appendage and built a tiny robot to mimic that movement, according to a recent paper published in the Proceedings of the National Academy of Sciences (PNAS).

    “We are fascinated by so many remarkable behaviors we see in nature, in particular when these behaviors meet or exceed what can be achieved by human-made devices,” said senior author Robert Wood, a roboticist at Harvard University's John A. Paulson School of Engineering and Applied Sciences (SEAS). “The speed and force of mantis shrimp strikes, for example, are a consequence of a complex underlying mechanism. By constructing a robotic model of a mantis shrimp striking appendage, we are able to study these mechanisms in unprecedented detail.”
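    Strikes like this are commonly described as spring-and-latch systems: muscle loads a spring slowly while a latch holds the appendage in place, then the latch releases the stored energy almost instantly. Here is a toy simulation of that principle, with arbitrary illustrative parameters rather than values from the paper.

```python
# A toy simulation of the spring-and-latch principle behind ultrafast
# strikes like the mantis shrimp's. All parameters are arbitrary
# illustrations, not values from the PNAS paper.

k = 500.0      # spring stiffness, N/m (assumed)
x0 = 0.01      # loaded spring compression, m (assumed)
m = 0.001      # appendage mass, kg (assumed)
dt = 1e-6      # integration time step, s

# While latched: the spring is loaded slowly; stored energy
# E = (1/2) * k * x0^2 accumulates with no motion of the appendage.
stored_energy = 0.5 * k * x0**2

# On release: integrate m*a = k*x until the spring reaches rest length.
x, v, t = x0, 0.0, 0.0
while x > 0:
    a = k * x / m      # spring force accelerates the appendage
    v += a * dt
    x -= v * dt        # compression shrinks as the appendage moves
    t += dt

print(f"stored energy: {stored_energy * 1000:.1f} mJ")
print(f"release time: {t * 1e6:.0f} us, peak speed: {v:.1f} m/s")
```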

    Wood's research group made headlines several years ago when they constructed RoboBee, a tiny robot capable of partially untethered flight. The ultimate goal of that initiative is to build a swarm of tiny interconnected robots capable of sustained untethered flight—a significant technological challenge, given the insect-sized scale, which changes the various forces at play. In 2019, Wood's group announced the lightest insect-scale robot yet to achieve sustained, untethered flight—an improved version called the RoboBee X-Wing. (Kenny Breuer, writing in Nature, described it as "a tour de force of system design and engineering.")



      Programming a robot to teach itself how to move

      John Timmer · news.movim.eu / ArsTechnica · Tuesday, 11 May, 2021 - 16:19 · 1 minute

    The robotic train: three small units connected by tubes. (credit: Oliveri et al.)

    One of the most impressive developments in recent years has been the production of AI systems that can teach themselves to master the rules of a larger system. Notable successes have included experiments with chess and StarCraft. Given that self-teaching capability, it's tempting to think that computer-controlled systems should be able to teach themselves everything they need to know to operate. Obviously, for a complex system like a self-driving car, we're not there yet. But it should be much easier with a simpler system, right?

    Maybe not. A group of researchers in Amsterdam attempted to take a very simple mobile robot and create a system that would learn to optimize its movement through a learn-by-doing process. While the system the researchers developed was flexible and could be effective, it ran into trouble due to some basic features of the real world, like friction.

    Roving robots

    The robots in the study were incredibly simple and were formed from a varying number of identical units. Each had an on-board controller, battery, and motion sensor. A pump controlled a piece of inflatable tubing that connected a unit to a neighboring unit. When inflated, the tubing generated a force that pushed the two units apart. When deflated, the tubing would pull the units back together.
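    The learn-by-doing loop itself can be sketched in a few lines: pick actuation parameters, run the gait, and keep whatever moved the robot farthest. The phase-offset parameterization and the stand-in scoring function below are assumptions for illustration, not the Amsterdam group's method.

```python
# A minimal sketch of the learn-by-doing idea: each unit's pump is
# driven by a phase offset, the robot executes the gait, and the
# controller keeps whichever parameters produced the most movement.

import random

N_UNITS = 3
TRIALS = 100

def run_gait(phases: list[float]) -> float:
    """Stand-in for a real trial: on hardware this would execute the
    gait and return measured displacement. Here, a made-up function
    that rewards evenly staggered pump phases."""
    ordered = sorted(phases)
    gaps = [b - a for a, b in zip(ordered, ordered[1:])]
    return 1.0 - sum(abs(gap - 1.0 / N_UNITS) for gap in gaps)

best = [random.random() for _ in range(N_UNITS)]
best_score = run_gait(best)

for _ in range(TRIALS):
    # Perturb the current best phases slightly and re-test.
    candidate = [(p + random.gauss(0, 0.05)) % 1.0 for p in best]
    score = run_gait(candidate)
    if score > best_score:          # keep improvements only
        best, best_score = candidate, score

print(f"best phase offsets: {[round(p, 2) for p in best]}")
```

    In the real world, friction and other unmodeled effects make the measured score noisy, which is exactly where, as the article notes, this kind of naive trial-and-error runs into trouble.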



      Amazon to roll out tools to monitor factory workers and machines

      Financial Times · news.movim.eu / ArsTechnica · Tuesday, 1 December, 2020 - 19:55

    (credit: Emanuele Cremaschi | Getty Images)

    Amazon is rolling out cheap new tools that will allow factories everywhere to monitor their workers and machines, as the tech giant looks to boost its presence in the industrial sector.

    Launched by Amazon’s cloud arm AWS, the new machine learning-based services include hardware to monitor the health of heavy machinery, and computer vision capable of detecting whether workers are complying with social distancing.

    Amazon said it had created a two-inch, low-cost sensor—Monitron—that can be attached to equipment to monitor abnormal vibrations or temperatures and predict future faults.
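    AWS has not published Monitron's internals, but the general technique, flagging readings that deviate sharply from a rolling baseline, can be sketched simply. The window size and threshold below are arbitrary.

```python
# A simple sketch of the kind of anomaly detection a vibration sensor
# could feed: flag readings far outside a rolling baseline. This
# illustrates the general technique only, not Monitron's internals.

from collections import deque
from statistics import mean, stdev

WINDOW = 50        # samples in the rolling baseline
THRESHOLD = 4.0    # z-score considered anomalous

def detect_anomalies(readings):
    """Yield (index, value, z) for readings far outside the baseline."""
    window = deque(maxlen=WINDOW)
    for i, x in enumerate(readings):
        if len(window) == WINDOW:
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(x - mu) / sigma > THRESHOLD:
                yield i, x, (x - mu) / sigma
                continue  # don't fold anomalies into the baseline
        window.append(x)

# Example: steady vibration readings with one spike at sample 150.
data = [1.0 + 0.01 * (i % 5) for i in range(200)]
data[150] = 3.0
for i, x, z in detect_anomalies(data):
    print(f"sample {i}: value {x:.2f}, z-score {z:.1f}")
```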

