We've been doing some research into how phones and AR glasses can recognize products at a glance.
This lets you bring up relevant information like price, description, stock quantities, and reviews.
[1/6] 👇
Have you ever thought of using another editor and expanding your mind 🤯?
No? There are amazing editors out there worth checking out... *cough* not VSCode *cough*...
The exemplary
I built a little command line tool to help me organize and quickly access the code snippets I use frequently. If you enjoy terminal tools, check it out:
https://github.com/maaslalani/nap
New release! 📼 VHS: a tool for generating terminal GIFs with code. VHS lets you script actions in a terminal, then renders the result as a GIF. Use it to demo or integration-test your CLIs and TUIs!
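The "write code" part is a plain-text tape file that VHS plays back and records. A minimal sketch (the exact command set is best checked against the VHS README; the echo line is just a placeholder):

```
# demo.tape — render with: vhs demo.tape
Output demo.gif

Set FontSize 22
Set Width 800
Set Height 400

Type "echo 'Hello, VHS!'"
Enter
Sleep 2s
```

Running `vhs demo.tape` types the command into a virtual terminal, waits, and writes the recording to demo.gif.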
The @charmcli crew showing open source how it’s done. Less than 30 minutes between me asking the question and the feature being PR’d and code reviewed.
Looking to level up your shell scripts? All you need is a bit of gum! See the README at https://github.com/charmbracelet/gum… for lots of great and practical examples.
New release! 🎀 Gum: a tool for glamorous shell scripts. It provides highly configurable utilities to help you write useful and delightful scripts with just a few lines of code.
with a minimal—yet very intuitive—UI. We can’t stop scribbling in our terminals!
https://github.com/maaslalani/draw
Built with 🧋 Bubble Tea and 💄 Lip Gloss
🎫 Check out Slides, a terminal-based Markdown presentation tool written in Go with Bubble Tea, Bubbles, Lip Gloss, and Glamour.
Do your next deck on the command line! 😎
🔭 New This Week: The Project Overview page gives you a single place to view all activity across a project.
- Track how many tests you've conducted
- Check team activity like notes and session invites
- Jump to new sessions, heatmaps, and testers
As more merchants start getting 3D models, they'll be able to unlock new and powerful opportunities like this. From interactive web content, to AR, to ML models, to CG photography, the possibilities are endless. https://shopify.ca/blog/3d-models-video…
[6/6]
Why is all this useful? Using computer vision to recognize products allows buyers to get information quickly and enables retail staff to surface meaningful information even if it's their first day on the job.
[5/6]
We can even recognize what part of a product you're looking at. This could bring up contextual information about that part or play audio to guide you through a product demonstration.
[4/6]
The identification is done by taking a photo or video and sending it to our ML server. By offloading the processing to the cloud, we can power sophisticated ML models on less powerful devices (like AR glasses). The first example in this thread is running in a web browser!
[3/6]
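The capture-and-upload flow described above can be sketched in a few lines. This is a minimal Python illustration, not Shopify's actual API: the JSON-over-HTTP shape and the field names are assumptions for the sake of the example.

```python
import base64
import json

def build_recognition_request(image_bytes, content_type="image/jpeg"):
    # Client side: package a captured frame as a JSON payload for a
    # (hypothetical) cloud recognition endpoint. Base64 keeps the binary
    # image safe inside JSON.
    return json.dumps({
        "content_type": content_type,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    })

def parse_recognition_request(payload):
    # Server side: recover the raw image bytes before running inference.
    body = json.loads(payload)
    return base64.b64decode(body["image_b64"])
```

The resulting payload could then be POSTed to the inference server with `urllib.request` or any HTTP client; the heavy model never has to run on the glasses themselves.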
Building a machine learning model to recognize a product requires hundreds of images. We needed a shortcut.
We discovered we could use a realistic 3D model of the product to generate thousands of images from different angles and lighting conditions.
[2/6]
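The synthetic-data shortcut boils down to sampling many camera and lighting configurations, then rendering the 3D model once per sample. A hedged Python sketch of the sampling step; the parameter names and ranges are illustrative, not the actual pipeline:

```python
import random

def sample_render_params(n, seed=0):
    # Draw n camera + lighting configurations for synthetic renders.
    # Each dict would drive one render of the product's 3D model.
    rng = random.Random(seed)  # seeded for reproducible datasets
    params = []
    for _ in range(n):
        params.append({
            "azimuth_deg": rng.uniform(0, 360),     # orbit around product
            "elevation_deg": rng.uniform(-15, 75),  # camera height angle
            "distance_m": rng.uniform(0.3, 1.5),    # camera distance
            "light_intensity": rng.uniform(0.2, 1.0),
            "light_angle_deg": rng.uniform(0, 360),
        })
    return params
```

Feeding a few thousand of these configurations to a renderer yields a labeled training set without photographing the physical product hundreds of times.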
🔥 Introducing Aggregated Heatmaps
Hawkeye now automatically aggregates test results across all your testers. Learn where your users look the most and hover over individual UI elements to see detailed stats.
Try it for yourself: http://dashboard.usehawkeye.com
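Aggregating results across testers usually means binning every tester's gaze points into one shared grid of counts. A minimal Python sketch, assuming gaze points are (x, y) pixel coordinates; this is an illustration of the idea, not Hawkeye's implementation:

```python
def aggregate_heatmap(sessions, width, height, cell=10):
    # Bin gaze points from all testers into one grid of counts.
    # sessions: list of per-tester lists of (x, y) pixel coordinates.
    cols = (width + cell - 1) // cell
    rows = (height + cell - 1) // cell
    grid = [[0] * cols for _ in range(rows)]
    for points in sessions:
        for x, y in points:
            if 0 <= x < width and 0 <= y < height:
                grid[y // cell][x // cell] += 1  # drop off-screen samples
    return grid
```

Hot cells in the grid are where users looked most; mapping cells back to the UI elements under them gives the per-element stats.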
Excited to introduce Hawkeye 2.0: a new testing platform that uses eye tracking to give you unparalleled insights into user behavior. ✨👀
Sign up for free at http://usehawkeye.com
We're excited to introduce our new blog series Looking Deeper! In each post, we'll conduct an eye tracking study of a different product and report back our findings.
To kick things off, we asked twenty UCSB students to shop the Urban Outfitters website https://blog.usehawkeye.com/looking-deeper-urban-outfitters/…
In the past few months, I've spent 10+ hours a day mostly reviewing PRs and triaging issues. I maintain 1,100+ npm packages with 2 billion npm downloads a month in total. Most popular JS projects rely on some of my packages.