{"id":366,"date":"2019-11-02T16:16:18","date_gmt":"2019-11-02T15:16:18","guid":{"rendered":"https:\/\/teatronika.org\/en\/?p=366"},"modified":"2019-11-02T16:16:42","modified_gmt":"2019-11-02T15:16:42","slug":"neural-network-storytelling","status":"publish","type":"post","link":"https:\/\/teatronika.org\/en\/2019\/11\/02\/neural-network-storytelling\/","title":{"rendered":"Neural Network storytelling"},"content":{"rendered":"<p>You may have heard about GPT2 and its generated <a href=\"https:\/\/openai.com\/blog\/better-language-models\/#sample1\">Unicorn press release<\/a>. We have been meaning to write a post about text-generating neural networks like GPT2 for a while now.<\/p>\n<p>Following a <a href=\"http:\/\/teatronika.org\/en\/2018\/09\/14\/conversaciones-con-una-i-a\/\">previous post<\/a> on the subject, we believe that these could be the next Teatronika tool for live robot improvisation! And if you are skeptical, take a look at what it can do in its online version, <a href=\"http:\/\/talktotransformer.com\">talktotransformer.com<\/a>. Just follow the link and type the beginning of a story. For instance, we input these two sentences:<\/p>\n<p><span style=\"color: #008000\">\u201cTeatronika is a project which tries to bring robotics closer to arts. Teatronika is building performances in which robots are the only actors\u2026\u201d<\/span><\/p>\n<p class=\"p1\">GPT2 continued the story (<span style=\"color: #993300\"><strong>to be clear, we didn\u2019t write this<\/strong><\/span>; the text was generated automatically at the press of a button. Really!):<\/p>\n<p class=\"p1\">So, we left it at <span style=\"color: #008000\">\u201c\u2026 are the only actors\u201d<\/span> and it continued: <span style=\"color: #008080\">\u201cin the story, in a variety of formats, including on stage. 
Our program was an experiment in bringing people together to play with computers and to experience some of life as it actually happened by using the physical tools.<\/span><\/p>\n<p class=\"p1\"><span style=\"color: #008080\">What is the idea behind Teatronika?<\/span><\/p>\n<p class=\"p1\"><span style=\"color: #008080\">The idea is to bring robots and artificial intelligence closer to us. To find things that we need to know on a basic level (or, if you prefer, to know about us by understanding our behaviors) and also, how to work together and create a collaborative culture.<\/span><\/p>\n<p class=\"p1\"><span style=\"color: #008080\">How have you been inspired by this idea in the past few years? What technologies or events have influenced the project your organization has been involved in lately?\u201d<\/span><\/p>\n<p>Mind-blowing&#8230;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-368 \" src=\"https:\/\/teatronika.org\/wp-content\/uploads\/2019\/11\/TeatronikaTalkToTransformer-1024x824.png\" alt=\"\" width=\"607\" height=\"488\" srcset=\"https:\/\/teatronika.org\/wp-content\/uploads\/2019\/11\/TeatronikaTalkToTransformer-1024x824.png 1024w, https:\/\/teatronika.org\/wp-content\/uploads\/2019\/11\/TeatronikaTalkToTransformer-250x201.png 250w, https:\/\/teatronika.org\/wp-content\/uploads\/2019\/11\/TeatronikaTalkToTransformer-768x618.png 768w, https:\/\/teatronika.org\/wp-content\/uploads\/2019\/11\/TeatronikaTalkToTransformer.png 1786w\" sizes=\"auto, (max-width: 607px) 100vw, 607px\" \/><\/p>\n<p>A bit of context: <a href=\"https:\/\/openai.com\/blog\/better-language-models\/\">GPT2<\/a> is a &#8220;trained large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarisation\u2014all without task-specific training.&#8221; (from the OpenAI website). More details can be found in <a href=\"https:\/\/towardsdatascience.com\/examining-the-transformer-architecture-part-1-the-openai-gpt-2-controversy-feceda4363bb\">this Medium article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>You may have heard about GPT2 and its generated Unicorn press release. We have been meaning to write a post about text-generating neural networks like GPT2 for a while now. Following a previous post on the subject, we believe that these could be the next Teatronika tool for live robot improvisation! And if &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/teatronika.org\/en\/2019\/11\/02\/neural-network-storytelling\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Neural Network storytelling&#8221;<\/span><\/a><\/p>\n","protected":false},"author":9,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-366","post","type-post","status-publish","format-standard","hentry","category-sin-categoria"],"acf":[],"_links":{"self":[{"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/posts\/366","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/comments?post=366"}],"version-history":[{"count":5,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/posts\/366\/revisions"}],"predecessor-version":[{"id":373,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/posts\/366\/revisions\/373"}],"wp:attachment":[{"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/media?parent=366"}],"wp:term":[{"taxonomy":"category",
"embeddable":true,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/categories?post=366"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/teatronika.org\/en\/wp-json\/wp\/v2\/tags?post=366"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}