Software automatically generates knitting instructions for 3-D shapes

29.03.2018

CMU researchers foresee machines capable of on-demand knitting

Carnegie Mellon University computer scientists have developed a system that can translate a wide variety of 3-D shapes into stitch-by-stitch instructions that enable a computer-controlled knitting machine to automatically produce those shapes.


James McCann, assistant professor of robotics, and Carnegie Mellon graduate students Lea Albaugh and Vidya Narayanan check a computer-controlled knitting machine. Their system translates 3-D shapes into stitch-by-stitch instructions so the machine can automatically produce them.

Credit: Carnegie Mellon University/Michael Henninger

Researchers in the Carnegie Mellon Textiles Lab have used the system to produce a variety of plush toys and garments. What's more, James McCann, assistant professor in the Robotics Institute and leader of the lab, said this ability to generate knitting instructions without the need for human expertise could make on-demand machine knitting possible.

McCann's vision is to use the same machines that routinely crank out thousands of knitted hats, gloves and other apparel to produce customized pieces one at a time or in small quantities. Gloves, for instance, might be designed to precisely fit a customer's hands. Athletic shoe uppers, sweaters and hats might have unique color patterns or ornamentation.

"Knitting machines could become as easy to use as 3-D printers," McCann said.

That's in stark contrast to the world of knitting today.

"Now, if you run a floor of knitting machines, you also have a department of engineers," said McCann, who noted that garment designers rarely have the specialized expertise necessary to program the machines. "It's not a sustainable way of doing one-off customized pieces.

In their latest work, to be presented this summer at SIGGRAPH 2018, the Conference on Computer Graphics and Interactive Techniques in Vancouver, Canada, McCann and his colleagues developed a method for transforming 3-D meshes -- a common method for modeling 3-D shapes -- into instructions for V-bed knitting machines.
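
As a rough illustration of what such a translation involves, the Python sketch below slices a shape into rows of stitches ("courses") whose counts track the shape's changing circumference. This is a conceptual sketch only, not the CMU algorithm; the gauge constants and the circumference_at helper are assumptions made for the example.

    # Conceptual sketch -- NOT the CMU algorithm. It illustrates the kind of
    # transformation described: turning a 3-D shape into rows of stitches
    # whose widths follow the shape's circumference at each height.

    STITCH_WIDTH = 0.5   # cm of fabric per stitch (hypothetical gauge)
    ROW_HEIGHT   = 0.4   # cm of fabric per row (hypothetical gauge)

    def shape_to_courses(z_min, z_max, circumference_at):
        """Return a list of stitch counts, one per knitted row, bottom to top."""
        courses = []
        z = z_min
        while z <= z_max:
            # Enough stitches to span the circumference at this height.
            stitches = max(1, round(circumference_at(z) / STITCH_WIDTH))
            courses.append(stitches)
            z += ROW_HEIGHT
        return courses

    # Example: a cone 10 cm tall whose circumference shrinks from 20 cm to 2 cm.
    cone = lambda z: 20.0 - 1.8 * z
    print(shape_to_courses(0.0, 10.0, cone))

A real system must also decide where rows join, split and wrap around the shape; the point here is only that stitch counts per row are derived from the geometry rather than specified by hand.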

These widely used machines manipulate loops of yarn with hook-shaped needles, which lie in parallel needle beds angled toward each other in an inverted V shape. The machines are highly capable, but are limited in comparison with hand knitting, said Vidya Narayanan, a Ph.D. student in computer science.

The CMU algorithm takes these constraints into account, she said, producing instructions for patterns that work within the limits of the machine and reduce the risk of yarn breaks or jams.
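
To make one such constraint concrete, the sketch below models the two needle beds of a V-bed machine and checks whether a loop can be transferred to the opposite bed. It is a minimal illustration, not the representation used in the paper; the class, its method names, and the assumed racking limit of two needles are all hypothetical (the real limit is machine-specific).

    # Minimal model of V-bed machine state -- an illustrative sketch, not the
    # paper's representation. Each bed is an array of needles holding loops.

    MAX_RACKING = 2   # assumed maximum bed offset, in needles (machine-specific)

    class VBed:
        def __init__(self, width):
            self.front = [0] * width   # loops held on each front-bed needle
            self.back  = [0] * width   # loops held on each back-bed needle

        def can_transfer(self, from_front, src, dst):
            """True if a loop at needle src may move to needle dst opposite."""
            if abs(dst - src) > MAX_RACKING:
                return False           # the beds cannot be racked that far
            source_bed = self.front if from_front else self.back
            return source_bed[src] > 0  # there must be a loop to move

    bed = VBed(10)
    bed.front[4] = 1
    print(bed.can_transfer(True, 4, 6))   # True: within the racking range
    print(bed.can_transfer(True, 4, 8))   # False: too far for one transfer

Scheduling every stitch so that no operation ever violates checks like this one is what keeps the generated instructions from breaking yarn or jamming the machine.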

A front-end design system such as this is common in 3-D printing and in computer-driven machine shops, but not in the knitting world, McCann said. Likewise, 3-D printing and machine shops use common languages and file formats to run their equipment, while knitting machines use a variety of languages and tools that are specific to particular brands. McCann led an earlier effort to create a common knitting format, called Knitout, which can be implemented with any brand of knitting machine.
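
Knitout is a plain-text format: a version line followed by one operation per line (an operation name, a direction, a needle, a yarn carrier). The small Python helper below emits one plain knitted row in that style; the operation syntax follows the published Knitout specification, but the helper itself and the particular row are illustrative, not part of the CMU toolchain.

    # Emits one plain row in the style of the Knitout text format.
    # The ";!knitout-2" version line and "knit DIRECTION NEEDLE CARRIER"
    # syntax follow the published spec; this helper is a hypothetical example.

    def knit_row(needles, carrier, direction="-"):
        """Knit one row on the front bed, right to left by default."""
        lines = [";!knitout-2"]
        order = reversed(list(needles)) if direction == "-" else list(needles)
        for n in order:
            lines.append(f"knit {direction} f{n} {carrier}")
        return "\n".join(lines)

    print(knit_row(range(1, 5), carrier=3))

Because the format is machine-independent, a design tool can emit lines like these once and leave the brand-specific translation to a separate back end.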

Further work is needed to make on-demand knitting a reality. For instance, the system now only produces smooth knitted cloth, without the patterned stitching that can make knitted garments distinctive. The knitting ecosystem also needs to be expanded, with design tools that will work with any machine. But progress could be rapid at this point, McCann said.

"The knitting hardware is already really good," he explained. "It's the software that needs a little push. And software can improve rapidly because we can iterate so much faster."

In addition to McCann and Narayanan, the research team included Jessica Hodgins, professor of computer science and robotics; Lea Albaugh, a Ph.D. student in the Human-Computer Interaction Institute; and Stelian Coros, a faculty member at ETH Zurich and an adjunct professor of robotics at CMU.

The research paper, along with a video, is available on GitHub.

Media Contact

Byron Spice
bspice@cs.cmu.edu
412-268-9068

 @CMUScience

http://www.cmu.edu 

Byron Spice | EurekAlert!


