Why Not Python? The Language Everyone Expected Me to Use for Drengr
When I started building Drengr — a tool that gives AI agents eyes and hands on mobile devices — the default choice was obvious. Every AI agent project in 2025-2026 is Python. LangChain is Python. CrewAI is Python. AutoGen is Python. Most MCP server implementations are Python. The ecosystem, the tutorials, the community, the hiring market — all Python.
I chose Rust instead. This is the honest explanation of why, what it cost me, and whether I'd make the same choice again.
The Distribution Problem
The single biggest reason I didn't use Python is distribution. Drengr is a developer tool that other people need to install on machines I'll never see. The install experience is the first impression. And with Python, that first impression is often painful.
Consider what a Python-based Drengr install looks like:
pip install drengr
# ERROR: requires Python 3.11+, you have 3.9
# or: conflicts with existing package versions
# or: needs a virtual environment
# or: pip install fails because of a C extension dependency
Every Android engineer has been on the receiving end of this. You follow the install instructions for some Python-based tooling — a test runner, a code generator, a device farm client — and you're greeted with a ModuleNotFoundError or a version conflict with something else in your environment. I've lost count of the hours I've spent debugging other people's dependency trees instead of doing my actual work.
The Rust alternative:
curl -fsSL https://drengr.dev/install.sh | bash
drengr doctor
One binary. No runtime. No dependencies. No virtual environment. No version conflicts. It either works or it doesn't, and if it doesn't, it's a bug I can actually reproduce and fix — because the binary is the same on every machine.
This isn't a theoretical concern. Drengr interacts with ADB, simctl, and Appium — tools that already have their own dependency and version requirements. Adding Python's dependency management on top of that would create a combinatorial explosion of "works on my machine" problems.
Cold Start Matters for MCP
Drengr runs as an MCP server. When Claude Desktop or Cursor connects to it, the server needs to start and respond to the first tool call. The user is waiting. The AI agent is waiting. Every millisecond of startup time is friction.
Drengr's cold start to first MCP response: ~15ms.
A Python MCP server with typical imports (json, asyncio, an HTTP client, a CLI framework) starts in 200-500ms. Add heavier libraries — image processing, XML parsing, the Anthropic SDK — and you're looking at 1-2 seconds.
For a one-off script, nobody cares. For a tool that an AI agent might start and stop multiple times during a session, or that needs to respond to tool calls in real time during an autonomous OODA loop, the difference is significant. The agent's thinking time is already the bottleneck — the tool layer shouldn't add to it.
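The shape of that hot path can be sketched in a few lines of dependency-free Rust: read a JSON-RPC request from stdin, reply on stdout, flush immediately. This is a toy, not Drengr's actual server — the ID extraction is a deliberate simplification of real JSON parsing, and a real MCP server would dispatch on the request's method field (initialize, tools/list, tools/call, and so on).

```rust
use std::io::{self, BufRead, Write};

// Extract the numeric "id" field from a JSON-RPC request line.
// (A real server would use a JSON parser; this keeps the sketch dependency-free.)
fn extract_id(line: &str) -> Option<i64> {
    let idx = line.find("\"id\"")?;
    let rest = &line[idx + 4..];
    let rest = rest.trim_start_matches(|c: char| c == ':' || c.is_whitespace());
    let digits: String = rest.chars().take_while(|c| c.is_ascii_digit()).collect();
    digits.parse().ok()
}

// Build a stub JSON-RPC response; a real MCP server dispatches on "method".
fn handle(line: &str) -> Option<String> {
    let id = extract_id(line)?;
    Some(format!(r#"{{"jsonrpc":"2.0","id":{},"result":{{}}}}"#, id))
}

fn main() {
    let stdin = io::stdin();
    let mut stdout = io::stdout();
    for line in stdin.lock().lines() {
        let line = line.expect("stdin read failed");
        if let Some(resp) = handle(&line) {
            writeln!(stdout, "{}", resp).unwrap();
            stdout.flush().unwrap(); // respond immediately; the agent is waiting
        }
    }
}
```

The point is that a compiled binary reaches this loop in milliseconds: there's no interpreter to boot and no module tree to import before the first request can be answered.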
Memory and Reliability
Drengr manages long-running device sessions. An autonomous test run might interact with a device for 30 minutes or more, capturing hundreds of screenshots, parsing hundreds of UI trees, maintaining situation engine state. This is the kind of workload where Python's memory management gets interesting.
Python's garbage collector is good enough for most applications. But "good enough" means occasional GC pauses. It means memory growing over time as objects are allocated and collected. It means that a screenshot buffer you thought was freed is actually being held by a reference cycle until the GC gets around to collecting it.
Rust's ownership model means memory is freed deterministically — at the exact point where the owning variable goes out of scope. No GC pauses. No reference cycles. No "why is my process using 2GB after running for an hour?" investigations. Drengr's memory usage is flat and predictable regardless of session length.
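A minimal illustration of that determinism — the ScreenshotBuffer type here is a stand-in for demonstration, not one of Drengr's actual types:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Counts bytes released, so the demo can observe exactly when Drop runs.
static FREED: AtomicUsize = AtomicUsize::new(0);

struct ScreenshotBuffer {
    pixels: Vec<u8>,
}

impl Drop for ScreenshotBuffer {
    fn drop(&mut self) {
        // Runs at the exact point the owning variable leaves scope --
        // no GC pause, no reference cycle keeping the buffer alive.
        FREED.fetch_add(self.pixels.len(), Ordering::SeqCst);
    }
}

// Returns the number of bytes freed once the buffer's scope ends.
fn demo() -> usize {
    FREED.store(0, Ordering::SeqCst);
    {
        let shot = ScreenshotBuffer { pixels: vec![0u8; 4096] };
        assert_eq!(FREED.load(Ordering::SeqCst), 0); // still owned, still allocated
        let _ = shot.pixels.len();
    } // `shot` dropped here, deterministically
    FREED.load(Ordering::SeqCst)
}

fn main() {
    println!("freed deterministically: {} bytes", demo());
}
```

In a loop that captures hundreds of screenshots, each buffer is released at the end of its iteration, so peak memory stays at one buffer regardless of how long the session runs.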
What Python Would Have Given Me
I want to be fair. Python would have given me real advantages.
Prototyping speed. The first working version of Drengr took me about three weeks in Rust. In Python, I estimate it would have taken one week. The borrow checker adds friction during exploration — when I'm trying three different approaches to screen parsing, Rust demands I think through ownership at each step. Python lets me hack first and clean up later.
The AI ecosystem. When I needed to add LLM integration for the OODA loop, the Python path was obvious: pip install anthropic, call the API, get structured responses. In Rust, I made raw HTTP calls to the API and wrote my own response parsing. It works fine, but it was more work than it needed to be.
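For flavor, here is roughly what the request-construction half of that looks like without an SDK. The endpoint and header names mentioned in the comments come from Anthropic's public Messages API documentation; the helper names and the model string are placeholders, and escape_json is a minimal stand-in for real JSON serialization:

```rust
// Minimal JSON string escaping -- a stand-in for a real serializer like serde_json.
fn escape_json(s: &str) -> String {
    s.replace('\\', "\\\\").replace('"', "\\\"").replace('\n', "\\n")
}

// Build a Messages API request body by hand. Model name is a placeholder.
fn build_messages_request(model: &str, prompt: &str, max_tokens: u32) -> String {
    format!(
        r#"{{"model":"{}","max_tokens":{},"messages":[{{"role":"user","content":"{}"}}]}}"#,
        escape_json(model),
        max_tokens,
        escape_json(prompt)
    )
}

fn main() {
    let body = build_messages_request("claude-sonnet-4-5", "Describe this screen.", 1024);
    // A real implementation would POST `body` to https://api.anthropic.com/v1/messages
    // with `x-api-key` and `anthropic-version` headers, then parse the JSON response.
    println!("{}", body);
}
```

None of this is hard, but it's exactly the kind of plumbing that pip install anthropic gives you for free.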
Community and contributors. More developers know Python than Rust. If Drengr were Python, more people could read the code, understand it, and potentially contribute. Rust's learning curve is a barrier to contribution.
Faster iteration on AI-adjacent features. Some of Drengr's planned features — smarter situation analysis, better stuck detection, screen diffing — would benefit from rapid experimentation. Python is better for that kind of exploratory work.
What Python Would Have Cost Me
But the costs are real too.
Every user becomes a debugger. With a Python tool, a meaningful percentage of support interactions would be "it doesn't install on my machine" or "it crashes with this import error." I've been that user enough times with other people's Python tools to know exactly how it goes. The first time someone opens an issue about a dependency conflict, I'd spend a day I could have spent building features.
The packaging problem is unsolved. PyInstaller, Nuitka, cx_Freeze, Briefcase — Python has many tools for creating standalone executables, and all of them have sharp edges. Platform-specific behavior, missing dependencies at runtime, binary size inflation. Rust's cargo build --release produces a binary that just works.
Concurrency complexity. Drengr's MCP server, SDK event listener, and OODA loop can all be active simultaneously, sharing state about the current device session. In Python, this means threading (with the GIL), multiprocessing (with serialization overhead), or asyncio (with the colored function problem). In Rust, the type system enforces safe concurrency. The compiler catches data races. I don't have to choose between correctness and performance.
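A sketch of what "the type system enforces safe concurrency" means in practice: shared mutable session state has to be wrapped in something like Arc plus Mutex before the program will compile across threads. The field names here are illustrative, not Drengr's actual types:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Shared session state touched by the MCP server, the SDK listener,
// and the OODA loop. Fields are illustrative.
#[derive(Default)]
struct SessionState {
    screenshots_captured: u32,
    last_screen: String,
}

// Spawns three "subsystems" that all mutate the same session state.
fn run_session() -> u32 {
    let state = Arc::new(Mutex::new(SessionState::default()));

    let handles: Vec<_> = (0..3)
        .map(|i| {
            let state = Arc::clone(&state);
            // Without the Arc + Mutex, this closure would not compile:
            // the borrow checker rejects unsynchronized shared mutation.
            thread::spawn(move || {
                let mut s = state.lock().unwrap();
                s.screenshots_captured += 1;
                s.last_screen = format!("screen-{}", i);
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    let s = state.lock().unwrap();
    s.screenshots_captured // always 3: no lost updates, no data races
}

fn main() {
    println!("captured: {}", run_session());
}
```

The key property is that forgetting the lock is a compile error, not a flaky bug that surfaces in a 30-minute test run.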
Process weight. A Python process carries the interpreter, the standard library, and all imported modules. Drengr running as a Rust binary uses about 8MB of resident memory. An equivalent Python process would use 40-80MB. For a background tool running alongside an IDE, a browser, and whatever else the developer has open, this matters.
The Decision Framework
Here's how I think about it now, after having shipped Drengr in Rust:
Use Python when: You're building an AI application where the AI logic is the product. When you need rapid experimentation with LLM APIs, prompt engineering, and agent orchestration. When your users are data scientists or ML engineers who already have Python installed and configured. When distribution is pip or Docker.
Use Rust when: You're building infrastructure that AI applications consume. When the tool needs to install in one command on any machine. When cold start time matters. When the tool runs for long periods and memory predictability matters. When you're a solo developer and can't afford to spend time debugging environment issues on machines you've never seen.
Drengr is infrastructure, not an application. It doesn't contain AI logic — it provides tools that AI logic consumes. That distinction made Rust the right choice.
Would I Do It Again?
Yes. Without hesitation.
The three weeks of additional development time cost me once. The zero-friction install experience pays off every time someone tries Drengr. The 15ms cold start pays off on every MCP tool call. The predictable memory usage pays off on every long-running test session.
Python is the right language for AI applications. Rust is the right language for AI infrastructure. Drengr is infrastructure.
If you want to read about why I chose Rust over C and C++ — the other systems languages I seriously considered — I've written about that here.