Stories by mkagenius
Gemini CLI has native sandboxing support via Docker, but this integration[1] extends it to Apple Containers on M1/M2/M3 Macs.

1. https://github.com/BandarLabs/coderunner?tab=readme-ov-file#option-3-gemini-cli
Hello HN,

Keeping up with security vulnerabilities each week can be overwhelming. To make it easier, we've created The Exploit Podcast: an automated weekly podcast covering CVEs and security news.

You can listen on:

Apple Podcasts: https://podcasts.apple.com/us/podc...
Show HN: Can this model run on my Mac mini?
(mac-ai-model-compatibility-checker-911416006313.us-west1.run.app)
We used Apple Containers to run LLM-generated code locally.

Leverage remote LLMs (like ChatGPT or Claude Sonnet 4) to work with your local files, such as videos, securely on your Mac. The LLM runs in a sandboxed environment where it can install external libraries, generate code, and execute it local...
The combination of a browser, an LLM, and a sandbox seems like the perfect Turing machine for the AI-agent world: it can probably achieve any task there is to automate.

There can be many ways to play around with this combination, but using a visual workflow with nodes seemed the most intuitive way, so ...
Created a tool to get the latest CVEs in an LLM-friendly form. You can ingest them into NotebookLM, for example, and generate a podcast.

I have used it to create a podcast channel for weekly CVE discussions (https://podcasts.apple.com/us/podcast/the-exploit-podcast-c...
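The "LLM-friendly form" step amounts to flattening structured CVE records into compact plain text that a model (or NotebookLM) can digest. A minimal sketch, assuming the NVD API 2.0 JSON record shape; the function name and the sample record are hypothetical illustrations, not the tool's actual code:

```python
import textwrap


def cve_to_llm_text(cve: dict) -> str:
    """Flatten one NVD-style CVE record into a compact plain-text summary.

    Field layout follows the NVD API 2.0 JSON shape (an assumption for
    illustration), not the actual implementation of the tool above.
    """
    cve_id = cve["id"]
    # NVD descriptions are a list of {lang, value}; pick the English one.
    desc = next(
        (d["value"] for d in cve.get("descriptions", []) if d.get("lang") == "en"),
        "No description available.",
    )
    # CVSS location varies by version; this sketch only checks v3.1 metrics.
    metrics = cve.get("metrics", {}).get("cvssMetricV31", [])
    score = metrics[0]["cvssData"]["baseScore"] if metrics else "n/a"
    # Wrap long descriptions so the output stays readable as plain text.
    body = textwrap.fill(desc, width=80)
    return f"{cve_id} (CVSS {score}):\n{body}"


# Hypothetical sample record in the NVD 2.0 shape.
sample = {
    "id": "CVE-2024-0000",
    "descriptions": [{"lang": "en", "value": "Example buffer overflow in a parser."}],
    "metrics": {"cvssMetricV31": [{"cvssData": {"baseScore": 9.8}}]},
}

print(cve_to_llm_text(sample))
# → CVE-2024-0000 (CVSS 9.8):
#   Example buffer overflow in a parser.
```

Summaries like this can be concatenated into one text file per week and uploaded to NotebookLM as a source for podcast generation.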