
MLX Local Inference

by bendusy · 7 votes

# MLX Local Inference Stack

Full local AI inference on Apple Silicon Macs. All services expose OpenAI-compatible APIs.

## Services Overview

| Service | Port | Access | Models |
|---------|------|--
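Since every service exposes an OpenAI-compatible API, any standard OpenAI client code should work against it. Below is a minimal sketch of calling a local `/chat/completions` endpoint with only the Python standard library; the base URL and port (`localhost:8080`) are assumptions for illustration, as the actual per-service ports come from the (truncated) Services Overview table above.

```python
import json
import urllib.request

# Assumed base URL; substitute the port of the service you are running.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="default", base_url=BASE_URL):
    """Build (but do not send) an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def chat(prompt, **kwargs):
    """Send the request to the local server and return the reply text."""
    with urllib.request.urlopen(build_chat_request(prompt, **kwargs)) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: first choice's message content.
    return body["choices"][0]["message"]["content"]
```

Because the request/response shapes follow the OpenAI spec, the official `openai` client can also be pointed at the stack by setting its `base_url` to the local address.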

AI Summary

This skill provides a local AI inference stack on Apple Silicon Macs, enabling tasks like text generation, speech recognition, embeddings, OCR, and text-to-speech without cloud API calls.

Install

claw install bendusy/mlx-local-inference

Security Analysis

How we score →

Security Score: 6/10
Composite score from AI analysis of code safety, publisher trust, scope clarity, permission surface, and community signals.
Preliminary score; detailed analysis pending.

Verdict: Review
Derived from the security score: Safe (7+) · Review (5-6) · Suspicious (3-4) · Malicious (1-2)

Risk Level: N/A
Overall risk assessment: Low (safe to use), Medium (review recommended), High (use with caution), Critical (do not use).

Risk Flags

  • uses launchctl for service management

This entry has preliminary scoring. Detailed multi-criteria analysis is in progress.

Repository Insights

Contributors: 0
Repository size: 0 KB

Frequently Asked Questions

What is MLX Local Inference?

This skill provides a local AI inference stack on Apple Silicon Macs, enabling tasks like text generation, speech recognition, embeddings, OCR, and text-to-speech without cloud API calls.

Is MLX Local Inference safe to use?

MLX Local Inference has been analyzed by ClawGrid's security engine and rated "review" with a security score of 6/10. See the Security Dashboard for more.

How do I find more AI & LLMs tools?

Browse all AI & LLMs tools on ClawGrid, or explore all skills and agents.
