How Prime Video updates its app for more than 8,000 device types

{"value":"At ++[Prime Video](https://www.amazon.science/tag/amazon-prime-video)++, we’re delivering content to millions of customers on more than 8,000 device types, such as gaming consoles, TVs, set-top boxes, and USB-powered streaming sticks. When we want to do an update, every one of those devices requires a separate native release, posing a difficult trade-off between updatability and performance.\n\nIn the past year, we’ve been using WebAssembly (Wasm), a framework that allows code written in high-level languages to run efficiently on any device, to help resolve that trade-off. Because we are excited to contribute to the Wasm ecosystem, Amazon has joined the Bytecode Alliance, a consortium dedicated to developing secure, efficient, modular, and portable runtime environments built atop standards such as Wasm.\n\nBy using Wasm instead of JavaScript for certain elements of the Prime Video app, we’ve reduced the average frame times on a mid-range TV from 28 milliseconds to 18. The worst-case frame times also decreased from 40 milliseconds to 25. And in ongoing work we’re driving the frame down still further.\n\n#### **Division of labor**\n\nTo enable efficient updates on a wide variety of devices while still maintaining performance, the Prime Video app has two parts: a high-performance engine written in C++ that is stored on-device and an easy-to-update component that is downloaded every time the app launches.\n\n![image.png](https://dev-media.amazoncloud.cn/0c9c6246f19f40609499fb3d81d5eff2_image.png)\n\nThe original architecture of the Prime Video app, with a layer of C++ code stored on-device and layers of JavaScript code downloaded at run time.\n\nIn the figure above, the stuff on device is a thin C++ layer that includes a JavaScript virtual machine (VM) and the components required to run the Prime Video application, which handle input, the media pipeline, and such processes as such as network access, image decoding, and window events handling.\n\nThe stuff we download (at run time) includes the application code, along with low-level components that handle scene management, the animation system, graphics rendering, layout, and resource management, among other things. Historically, these components all used JavaScript\n\nThis architecture split allows us to deliver new features and bug fixes without having to go through the very slow process of updating the C++ layer. The downloadable code is delivered through a fully automated continuous integration and delivery pipeline that can release updates as often as every few hours. However, we have a constant tension between writing code that’s performant (C++) and writing less-performant code that we can update with ease (JavaScript).\n\n#### **WebAssembly**\n\nWasm provides a compilation target for programming languages that offer more expressivity than JavaScript does, such as C or Rust. Like JavaScript code, compiled Wasm binaries run on a VM that provides a uniform interface between code and hardware, regardless of device.\n\nWasm was initially intended for web browsers, but there are now standalone applications of Wasm VMs outside the browser, such as running Internet-of-things software, game mods, and server-side workloads.\n\nOur Wasm investigations started in August 2020, when we built some prototypes to compare the performance of Wasm VMs and JavaScript VMs in simulations involving the type of work our low-level JavaScript components were doing. 
Wasm was initially intended for web browsers, but there are now standalone applications of Wasm VMs outside the browser, such as running Internet-of-things software, game mods, and server-side workloads.

Our Wasm investigations started in August 2020, when we built some prototypes to compare the performance of Wasm VMs and JavaScript VMs in simulations involving the type of work our low-level JavaScript components were doing. In those experiments, code written in Rust and compiled to Wasm was 10 to 25 times as fast as JavaScript.

We can’t just rewrite the Prime Video application in Rust and run it on a Wasm VM, however, as it still needs to run on legacy devices and browsers that don’t have Wasm support. We also don’t want to create a new app only for the new architecture, as we value deploying the same application across environments.

This is why we moved only the low-level systems from JavaScript to Wasm. In this way, we still bring performance benefits to the application, without the application teams’ having to know or care that we run certain systems on a Wasm VM.

This is what our new architecture looks like:

![image.png](https://dev-media.amazoncloud.cn/5bafc47677e74d39826e401b7c25225e_image.png)

The new architecture, with WebAssembly.

The Wasm binaries are deployed with the JavaScript code, through the same fully automated pipeline that can take a program from code commit to running on customers’ devices in a few hours.

#### **The switch**

The figure above shows the new architecture with a Wasm VM and a JavaScript VM running in separate threads. But how did we transition from the first architecture to the second one without rewriting the app?

The first step is updating the stuff on device to include the Wasm VM, so it can now run both versions of a given software component (JavaScript only or JavaScript and Wasm). This allows us to gradually release the Wasm components to a subset of customers.

We had to modify how the Prime Video application communicates with these components. At a high level, the application works by creating a scene — a representation of a visual scene — which consists of nodes whose implementations are device specific. A host node (e.g., view, image, text) is a data structure that has all the necessary information to update and render a component of the visual scene.
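
In code, a host node can be pictured as a plain data structure keyed by its kind. The sketch below is a simplified, hypothetical model; the field and type names are assumptions, not the actual Prime Video types:

```rust
/// Opaque identifier assigned when a node is created (hypothetical).
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
pub struct NodeId(pub u32);

/// The node kinds mentioned above: view, image, and text.
pub enum HostNodeKind {
    View,
    Image { url: String },
    Text { content: String, font_size: f32 },
}

/// Everything needed to update and render one component of the visual scene.
pub struct HostNode {
    pub id: NodeId,
    pub x: f32,
    pub y: f32,
    pub width: f32,
    pub height: f32,
    pub opacity: f32,
    pub kind: HostNodeKind,
    pub children: Vec<NodeId>,
}
```
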
At startup, we check if we’re running on a device that has Wasm available. If it does, we create lightweight host nodes in JavaScript that don’t do anything other than send commands to the Wasm VM. The “real” host nodes are created in Wasm when these commands are handled.

We use messages to communicate between the two VMs because we don’t want the JavaScript VM work to interrupt the Wasm VM work. The job of the Wasm components is to update nodes and pump frames out to the screen as fast as possible without any interruptions.
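
One way to picture that message flow, reusing the hypothetical types from the sketch above: the JavaScript proxies enqueue small commands, and the Wasm side drains the queue once per frame before it updates and renders the scene. The command set and method names here are illustrative assumptions, not the actual protocol:

```rust
use std::collections::HashMap;

/// Hypothetical commands that a lightweight JavaScript host node might send
/// across to the Wasm VM.
pub enum Command {
    CreateNode { id: NodeId, kind: HostNodeKind },
    SetPosition { id: NodeId, x: f32, y: f32 },
    SetOpacity { id: NodeId, opacity: f32 },
    DestroyNode { id: NodeId },
}

/// Wasm-side scene state: applies queued commands, then produces a frame.
pub struct Scene {
    nodes: HashMap<NodeId, HostNode>,
}

impl Scene {
    /// Drain all commands received since the last frame, then draw.
    pub fn run_frame(&mut self, commands: impl Iterator<Item = Command>) {
        for command in commands {
            self.apply(command);
        }
        // Animation, layout, and rendering would run here, never interrupted
        // by JavaScript VM work, since the two VMs live on separate threads.
    }

    fn apply(&mut self, command: Command) {
        match command {
            Command::CreateNode { id, kind } => {
                self.nodes.insert(
                    id,
                    HostNode {
                        id,
                        x: 0.0,
                        y: 0.0,
                        width: 0.0,
                        height: 0.0,
                        opacity: 1.0,
                        kind,
                        children: Vec::new(),
                    },
                );
            }
            Command::SetPosition { id, x, y } => {
                if let Some(node) = self.nodes.get_mut(&id) {
                    node.x = x;
                    node.y = y;
                }
            }
            Command::SetOpacity { id, opacity } => {
                if let Some(node) = self.nodes.get_mut(&id) {
                    node.opacity = opacity;
                }
            }
            Command::DestroyNode { id } => {
                self.nodes.remove(&id);
            }
        }
    }
}
```
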
The hard part was doing this switch in a way that preserves the behavior of the JavaScript systems. We sometimes had to duplicate the “incorrect” behavior of the JavaScript renderer in the new Wasm version, because the app relied on it for some edge cases. Making sure the JavaScript VM code never calls any dangerous function on the wrong thread has also added extra difficulties.

#### **Results**

As I mentioned, the switch to Rust and Wasm has improved the application’s frame rate stability and speed. To reach our goal of reliable 60-frame-per-second frame generation and improve input latency, we will move more systems to Wasm, such as focus management and layout.
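
To put that goal in numbers: a steady 60 frames per second leaves a budget of 1000 ms / 60 ≈ 16.7 ms per frame, so the 18-millisecond average and 25-millisecond worst case reported above still overshoot it. A trivial check using only the figures from this article:

```rust
fn main() {
    // Frame budget for a steady 60 frames per second, in milliseconds.
    let budget_ms = 1000.0 / 60.0; // ~16.7 ms

    // Frame times on a mid-range TV after the Wasm switch, as reported above.
    let average_ms = 18.0;
    let worst_case_ms = 25.0;

    // Both still exceed the 60 fps budget, hence moving more systems to Wasm.
    assert!(average_ms > budget_ms && worst_case_ms > budget_ms);
    println!("budget {budget_ms:.1} ms, average {average_ms} ms, worst case {worst_case_ms} ms");
}
```
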
The total memory consumption for the Wasm VM, including the module instance, environment, and the module itself, is at most 7.5 megabytes. By moving these systems to Wasm, we have saved a total of 30 megabytes of JavaScript heap memory. Memory is a scarce resource on most of the devices we deploy on, so this is a welcome reduction.

The binary size of our Wasm module is 150 kilobytes when compressed (750 kilobytes uncompressed, after symbol stripping). The module’s small size, coupled with the fast VM start time, means that the addition of Wasm doesn’t affect the app start-up time.

Using Rust has enabled programmers of all experience levels to contribute code without requiring reviewers to carefully scrutinize every line for safety pitfalls. We trust the compiler, and we can focus our code reviews on functionality, not language corner cases.

Another benefit of using Rust is having access to an ecosystem of high-quality libraries. For instance, we built an application that overlays debugger information on an application scene render using ++[egui](https://github.com/emilk/egui)++, a Rust GUI library. Integrating egui with our Wasm renderer took a couple of hours of work and offers us an easy way to gain insights into the engine’s internals.

![image.png](https://dev-media.amazoncloud.cn/145e444eb788488481a4ceeba7030967_image.png)

An image generated by the renderer, with debugging information overlaid.
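
To give a sense of how little glue such an overlay needs, the sketch below runs one egui frame and draws a small debug window. The statistics shown are placeholders, the exact egui API differs slightly between versions, and the engine-side rendering of egui's output is assumed rather than shown:

```rust
/// Hypothetical debug overlay built with egui (https://github.com/emilk/egui).
/// `frame_time_ms` and `node_count` stand in for real engine statistics.
fn debug_overlay(ctx: &egui::Context, frame_time_ms: f32, node_count: usize) -> egui::FullOutput {
    ctx.run(egui::RawInput::default(), |ctx| {
        egui::Window::new("Engine internals").show(ctx, |ui| {
            ui.label(format!("frame time: {frame_time_ms:.1} ms"));
            ui.label(format!("scene nodes: {node_count}"));
        });
    })
    // The returned FullOutput holds the shapes to tessellate and hand to the
    // engine's own renderer, which draws them on top of the scene.
}
```
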
Overall, we think that this investment in Rust and WebAssembly has paid off: after a year and 37,000 lines of Rust code, we have significantly improved performance and stability and reduced CPU and memory consumption.

ABOUT THE AUTHOR

#### **[Alexandru Ene](https://www.amazon.science/author/alexandru-ene)**

Alexandru Ene is a principal software engineer with Amazon Prime Video.