
Pokémon Appear in Reality, and People Stroll Inside Paintings

정준모

Algorithm Clones Artists’ Styles, Turning Videos into Living Paintings




Have you ever wished you could watch Blue Sky Studios’ Ice Age rendered in the post-Impressionist painting style of Vincent van Gogh, or Star Wars in the expressionistic style of Edvard Munch’s The Scream? No? Well, apparently, researchers at Germany’s University of Freiburg have: they’ve created a computer algorithm that essentially clones the styles of famous artists and turns them into video filters, making footage look like living paintings.


To demonstrate their findings, the team transformed clips from films like Cloud Atlas, The Jungle Book, and the British TV program Miss Marple into animated paintings in the styles of Munch, Matisse, Picasso, Kandinsky, and Turner. These are just a few examples –– the transfer could be applied to any given style.



To some art history nerds, the idea might seem like sacrilege –– van Gogh didn’t paint “Starry Night” so it could later be superimposed onto a talking woolly mammoth. And the robot art forger’s results are not as impressive as the recent human attempt to animate van Gogh with 12 oil paintings per second.

Whatever you think of the results, though, they represent a sophisticated new application of computer vision technology. Researchers Manuel Ruder, Alexey Dosovitskiy, and Thomas Brox describe their work in a paper, “Artistic style transfer for videos.” The style transfer uses deep-learning algorithms, which allow a computer to analyze images layer by layer, identifying complex relationships between tiny bits of visual data. For example, the first layers extract broad patterns, like color, while the deeper layers focus on details of line and shape. This process relies on artificial neural networks –– so-called because they operate similarly to networks of neurons in the human brain.
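As a toy illustration of that shallow-versus-deep distinction (our own sketch, not the researchers’ code), here are two hand-made numpy “layers”: an early one that pools color over neighborhoods, and a deeper one that responds to edges.

```python
import numpy as np

def shallow_layer(img):
    """Early layer: 2x2 average pooling -- keeps broad color, drops fine detail."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def deep_layer(img):
    """Deeper layer: horizontal-difference filter -- responds to lines and edges."""
    return np.abs(np.diff(img, axis=1))

# A tiny grayscale "image": dark left half, bright right half.
img = np.zeros((4, 4))
img[:, 2:] = 1.0

colors = shallow_layer(img)  # broad pattern: left blocks dark, right blocks bright
edges = deep_layer(img)      # detail: a single vertical edge down the middle
```

Real networks stack dozens of such learned filters, but the division of labor is the same: early stages summarize color and texture, later stages pick out structure.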





Last year, researchers at the University of Tübingen discovered that such algorithms can effectively separate an image’s style from its content. These findings gave rise to apps like AI Painter –– basically, an art-historical makeover of Instagram filters. But until now, the technique could only be applied to still images; previous attempts to apply it to video were choppy and visually illegible. Ruder and his team built upon this work to create classic art-styled video filters, leading to the examples you see here.
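The separation the Tübingen team exploited can be sketched in a few lines of numpy (a simplification of their approach, using made-up feature maps): content is carried by the feature maps themselves, while style is summarized by a Gram matrix of channel correlations, which discards spatial layout. Shuffling an image’s regions changes its content representation but leaves the style representation untouched.

```python
import numpy as np

def gram_matrix(features):
    """Style representation: correlations between channels, positions pooled away."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

rng = np.random.default_rng(0)
features = rng.normal(size=(3, 8, 8))  # pretend feature maps: 3 channels, 8x8 grid

# Shuffle spatial positions identically across all channels.
perm = rng.permutation(64)
shuffled = features.reshape(3, 64)[:, perm].reshape(3, 8, 8)

# Content (the maps themselves) changes, but style (the Gram matrix) does not.
assert not np.allclose(features, shuffled)
assert np.allclose(gram_matrix(features), gram_matrix(shuffled))
```

This layout-invariance is why a style can be lifted off one image and painted onto entirely different content.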


Tech details aside, the research raises some questions about art-making and artificial intelligence. Of course, these experiments capture none of the emotion and spiritual depth of the paintings they mimic; they just approximate the brushwork and color schemes. But if van Gogh’s style can be approximated by an algorithm, what does that say about the nature of artistic style? And if computers can “understand” art like this, what does that say about how humans understand art? MIT Technology Review suggests the next step in this kind of computer vision is applying artistic styles to virtual reality –– not that artists haven’t already experimented with the idea.


We’ll have to wait for the practical applications of this technique. It’s been suggested that we’re less than a decade away from smart TVs equipped with these arty filter options, so you’ll finally be able to watch The Real Housewives of New Jersey covered in Jackson Pollock splatter.


h/t Engadget 




http://hyperallergic.com/298574/algorithm-clones-artists-styles-turning-videos-into-living-paintings/



Kim Daljin Art Research Institute, 4 Hongjimun 1-gil, Jongno-gu, Seoul 03015 (44 Hongji-dong) T +82.2.730.6214 F +82.2.730.9218