ONNX (Verified account)

@onnxai

ONNX is an open format for representing deep learning models, allowing AI developers to more easily move models between state-of-the-art tools.

Joined September 2017

Tweets

  1. 24 Jan

    ONNX.js is an open-source library for running ONNX models in browsers and on Node.js. It uses multi-threading in its JavaScript inference engine to deliver significant performance improvements. Learn more here:

  2. 22 Jan

    ONNX benefits AI developers by enabling them to more easily switch between frameworks. Learn more:

  3. 11 Jan

    The preview release of 1.5 adds multiple new features and bug fixes. Learn more here:

  4. 23 Dec 2019

    ONNX provides interoperability by enabling model development in one framework and deployment in another. Read more about deploying ONNX models in production.

  5. 20 Dec 2019

    See how performance compares across different libraries designed for image recognition inside browsers, including ONNX.js.

  6. 18 Dec 2019

    NVIDIA's DeepStream is an SDK that provides a seamless end-to-end service which converts raw streaming data into actionable insights on Jetson or any T4 platform. Learn more:

  7. 12 Dec 2019

    Azure SQL Database Edge lets you put cloud-trained models on the edge to detect anomalies and apply business logic. It supports multiple languages and uses ONNX to convert models.

  8. 5 Dec 2019

    Learn how to run an ONNX model on a Raspberry Pi or other edge device using the DNN Compiler.

  9. 26 Nov 2019

    OLive (ONNX Go Live) automates model shipping. This sequence of Docker images integrates model conversion, correctness testing, and performance tuning into a single pipeline that outputs production-ready ONNX models.

  10. 20 Nov 2019

    ONNX is joining the LF AI Foundation to continue the open-governance model, which encourages community participation and contributions. Learn how LF AI will support and foster the next wave of innovation and adoption for ONNX.

  11. 16 Nov 2019

    The ONNX exporter allows trained models to be easily exported to the ONNX model format. Learn about the latest updates, including increased model coverage, improved performance, and support for multiple ONNX opset versions for multiple backends. (A minimal export sketch in Python follows this list.)

  12. 4 Nov 2019

    ONNX Runtime 1.0 is available. Our new release brings CPU/GPU performance optimizations, expands the list of execution providers, accelerates ONNX model shipping, and adds a host of other features. Learn more details here: (An inference sketch in Python follows this list.)

  13. 23 Oct 2019

    Learn how to build a “Visual Alert” system for an IoT camera by using Cognitive Services to train an image classifier, and exporting that model to run locally on the IoT device.

  14. 17 Oct 2019

    The next ONNX Community Workshop will be held 11/18 in Shanghai. If you are using ONNX in your services & applications, building software or hardware that supports ONNX, or contributing to ONNX, we invite you to join us. Submit a proposal to attend:

  15. 4 Oct 2019

    ONNX v1.6 is now available. The major updates since the last release include availability of sequence and map types and support for sparse tensors. Learn more: (A model-inspection sketch in Python follows this list.)

  16. 26 Sep 2019

    Visualize your machine learning models with Netron, an open-source model visualization tool that supports the ONNX model format. Learn more here: (A short usage sketch follows this list.)

  17. 20 Sep 2019

    Accelerate and optimize your machine learning models using ONNX Runtime. Hear from experts on how ONNX Runtime accelerates Bing Semantic Precise Image Search in this video:

  18. 18 Sep 2019

    ONNC is a retargetable compilation framework designed for proprietary deep learning accelerators. Check out the new features included in the v1.2.0 release here:

  19. 5 Sep 2019

    ONNX Runtime 0.5 is now available with support for edge hardware acceleration developed in collaboration with hardware partners. The release also includes new features aimed at improving ease of use for experimentation and deployment. To learn more: (An execution-provider sketch in Python follows this list.)

  20. 29 Aug 2019

    Intel and Microsoft continue to collaborate to simplify AI deployments at the edge with the new integration of the OpenVINO Toolkit with ONNX Runtime. Read on for more here:

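As referenced in items 11 and 12 above, trained models can be exported to the ONNX format and then run with ONNX Runtime. The following is a minimal, hypothetical sketch of the export step using PyTorch's torch.onnx.export; the TinyNet module, the file name tinynet.onnx, and the chosen opset version are illustrative placeholders rather than details taken from the tweets.

    # Hypothetical example: exporting a small PyTorch model to ONNX.
    # TinyNet, "tinynet.onnx", and opset_version=11 are illustrative choices.
    import torch
    import torch.nn as nn

    class TinyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(4, 2)

        def forward(self, x):
            return torch.relu(self.fc(x))

    model = TinyNet().eval()
    dummy_input = torch.randn(1, 4)   # example input; fixes the traced graph's shapes

    torch.onnx.export(
        model,                        # model to export
        dummy_input,                  # example input used to trace the model
        "tinynet.onnx",               # output file
        input_names=["input"],
        output_names=["output"],
        opset_version=11,             # ONNX operator set targeted by the exporter
    )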
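Item 12 announces an ONNX Runtime release. As a hedged sketch of its Python inference API, the snippet below loads the placeholder tinynet.onnx file from the export sketch and runs it on random input; some newer onnxruntime builds also expect an explicit providers list (see the next sketch).

    # Hypothetical example: running an ONNX model with the onnxruntime Python API.
    # "tinynet.onnx" is the placeholder file produced in the export sketch above.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("tinynet.onnx")

    input_name = session.get_inputs()[0].name      # "input" in the export sketch
    x = np.random.randn(1, 4).astype(np.float32)

    # Passing None as the output list asks for all model outputs.
    outputs = session.run(None, {input_name: x})
    print(outputs[0].shape)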
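Items 12 and 19 mention execution providers, ONNX Runtime's pluggable hardware backends. The sketch below is one assumed way to query which providers the installed build offers and to state a preference order when creating a session; the OpenVINO provider name is only an example and is present only in builds compiled with that support.

    # Hypothetical example: selecting ONNX Runtime execution providers.
    import onnxruntime as ort

    available = ort.get_available_providers()
    print("available providers:", available)

    # Keep only the preferred providers this build actually offers,
    # falling back to the default CPU provider.
    preferred = ["OpenVINOExecutionProvider", "CPUExecutionProvider"]
    providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

    session = ort.InferenceSession("tinynet.onnx", providers=providers)
    print("session is using:", session.get_providers())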
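Item 15 announces ONNX v1.6; the new sequence, map, and sparse-tensor features are described at the link in that tweet. As a smaller, assumed starting point, the sketch below only shows how the onnx Python package loads a model, validates it, and reports the IR version and operator sets it declares, again using the placeholder tinynet.onnx.

    # Hypothetical example: validating an ONNX model and inspecting its opsets.
    import onnx

    model = onnx.load("tinynet.onnx")
    onnx.checker.check_model(model)          # raises if the model is malformed

    print("IR version:", model.ir_version)
    for opset in model.opset_import:
        print("domain:", opset.domain or "ai.onnx", "version:", opset.version)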
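Item 16 refers to an open-source model viewer, filled in above as Netron. Assuming Netron is the tool in question, the sketch below opens the placeholder tinynet.onnx in an interactive graph view; Netron can also be launched from a shell with the command netron tinynet.onnx.

    # Hypothetical example: visualizing an ONNX model with the netron package
    # (installable with: pip install netron).
    import netron

    # Serves the model graph and opens it in the default web browser.
    netron.start("tinynet.onnx")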
