Every design decision is a trade-off. Here are mine.

Building a privacy-focused app means saying no to things that make apps easier to build, easier to use, and easier to monetize. Every decision I made for Tarn has a cost. I want to be transparent about what those costs are and why I chose to pay them.

PIN, not biometrics

Tarn uses a 4-6 digit PIN. Not Face ID. Not fingerprint. Not a password manager integration.

Biometrics are convenient. They're also compellable. In many jurisdictions, law enforcement can force you to unlock your phone with your face or fingerprint. The legal status of compelling someone to provide a memorized PIN is more contested, and several circuit courts have held that it is protected under the Fifth Amendment.

More importantly for the primary threat model: an abusive partner can hold your phone up to your face while you're sleeping. They cannot extract a PIN from your sleeping brain.

The trade-off is friction. You enter your PIN every time. Every single time. There's no shortcut. I know this is annoying. I know it's a real daily usability cost. For the people Tarn is built for, the alternative is worse.

Zero network calls

Tarn makes no internet connections. None. No analytics. No crash reporting. No telemetry. No update checks. No server pings. Nothing.

Most apps justify network calls as necessary for basic functionality — sync, backup, error tracking. Each of those calls is a data point. Each data point is a record. Each record can be subpoenaed, intercepted, or leaked. The metadata alone — that this device contacted this server at this time — establishes that the app exists on the device.

The trade-off is that there's no cloud backup. If you lose your phone, your data is gone unless you've manually exported an encrypted backup. There's no sync between devices. There's no automatic error reporting, which means I rely on users to tell me when things break.

I considered adding optional, encrypted sync as a feature. I decided against it because whether an app makes network calls at all is a binary property, detectable from outside the app. Either it does or it doesn't. "Sometimes" is harder to verify than "never."
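
To give a sense of what "verifiable" means here: on Android, an app that never declares the INTERNET permission can't open a socket at all, so the guarantee is enforced by the platform rather than by promise. The check below is an illustration of that idea, not Tarn's actual code:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Illustrative check, not Tarn's code: confirm the installed build requests
// no network-related permissions. Without android.permission.INTERNET, the
// platform refuses to open sockets for the app at all.
fun assertNoNetworkPermissions(context: Context) {
    val info = context.packageManager.getPackageInfo(
        context.packageName, PackageManager.GET_PERMISSIONS
    )
    val requested = info.requestedPermissions?.toList().orEmpty()
    val networkPermissions = setOf(
        "android.permission.INTERNET",
        "android.permission.ACCESS_NETWORK_STATE"
    )
    require(requested.none { it in networkPermissions }) {
        "Build unexpectedly requests a network permission: $requested"
    }
}
```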

Self-destruct by default

After a configurable number of wrong PIN attempts, Tarn overwrites all data with random bytes and deletes it. This is enabled by default.
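
To show the shape of the mechanism, here's a simplified sketch. The names and structure are hypothetical and Tarn's real implementation differs in detail, but the idea is the same: count failures, and past the threshold, overwrite every file with random bytes before deleting it.

```kotlin
import java.io.File
import java.security.SecureRandom

// Simplified sketch of wipe-on-threshold. Class and property names are
// hypothetical; the real implementation differs in detail.
class SelfDestructGuard(
    private val dataDir: File,
    private val maxAttempts: Int = 5  // configurable threshold
) {
    private var failedAttempts = 0
    private val random = SecureRandom()

    fun onPinAttempt(correct: Boolean) {
        if (correct) {
            failedAttempts = 0
            return
        }
        if (++failedAttempts >= maxAttempts) wipe()
    }

    // Overwrite every file with random bytes before deleting it, then
    // remove the directory tree itself.
    private fun wipe() {
        dataDir.walkBottomUp()
            .filter { it.isFile }
            .forEach { file ->
                val noise = ByteArray(file.length().toInt())
                random.nextBytes(noise)
                file.writeBytes(noise)
                file.delete()
            }
        dataDir.deleteRecursively()
    }
}
```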

The trade-off is that legitimate users can accidentally destroy their own data. A child borrowing a phone. A clumsy morning. A moment of panic. The data is gone, and there's no recovery. I've heard from people who find this terrifying. They're not wrong to feel that way.

I made it the default because the people who need it most are the least likely to find it in settings and enable it themselves. A domestic violence survivor who downloads this app during a three-minute bathroom break is not going to explore settings menus. The protection needs to be there from the first moment.

The threshold is configurable. If the default makes you uncomfortable, raise it.

Disguised icon, not hidden app

Tarn offers alternative app icons — calculator, notes, weather — so the app doesn't advertise itself on your home screen. This is a disguise, not invisibility. The app still appears in system settings, battery stats, and app store history.
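
On Android, the usual way to build this is a set of activity-alias entries in the manifest, one per icon, with exactly one enabled at a time. The sketch below shows that common technique with made-up alias names; it's illustrative, not Tarn's exact code.

```kotlin
import android.content.ComponentName
import android.content.Context
import android.content.pm.PackageManager

// Illustrative sketch of the common Android technique: one <activity-alias>
// per icon in the manifest, enabled one at a time. Alias names here are
// hypothetical, not Tarn's actual component names.
fun applyDisguise(context: Context, chosenAlias: String) {
    val pm = context.packageManager
    val pkg = context.packageName
    val aliases = listOf(
        "LauncherDefault", "LauncherCalculator", "LauncherNotes", "LauncherWeather"
    )
    for (alias in aliases) {
        val state = if (alias == chosenAlias)
            PackageManager.COMPONENT_ENABLED_STATE_ENABLED
        else
            PackageManager.COMPONENT_ENABLED_STATE_DISABLED
        pm.setComponentEnabledSetting(
            ComponentName(pkg, "$pkg.$alias"),
            state,
            PackageManager.DONT_KILL_APP
        )
    }
}
```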

I chose disguise over more aggressive hiding because true app hiding on iOS and Android requires either jailbreaking or system-level exploits, both of which create larger security problems than they solve. A disguise that works 95% of the time against a casual observer is more practical than invisibility that requires compromising the operating system.

The trade-off is that a determined, technically sophisticated adversary can find the app through system menus. I'm working on documentation that walks users through covering those tracks manually.

No account, no identity

Tarn doesn't ask for your email. Doesn't ask for your name. Doesn't ask for a phone number. There is no account. There is no way for me to know who uses the app, how many people use the app, or anything about them.

The trade-off is that I have no usage analytics. I don't know which features are popular. I don't know where users drop off. I don't know what's broken unless someone tells me. I'm building based on research, user interviews, and feedback from people who choose to reach out — not data dashboards.

This also means there's no password recovery. No "forgot your PIN" email. No support ticket that can reset your access. If you forget your PIN, your data is cryptographically inaccessible. That's not a bug. It's the point. If I could recover your data, so could someone else.
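
To make "cryptographically inaccessible" concrete, here's a minimal sketch of a PIN-derived key. The KDF and its parameters are assumptions for illustration, not a description of Tarn's actual scheme; the structural point is that the key is re-derived from the PIN and a stored random salt every time, and never stored itself.

```kotlin
import java.security.SecureRandom
import javax.crypto.SecretKeyFactory
import javax.crypto.spec.PBEKeySpec
import javax.crypto.spec.SecretKeySpec

// Minimal sketch: the data-encryption key exists only as a function of the
// PIN and a stored random salt. The KDF and parameters below are illustrative
// assumptions, not Tarn's actual scheme. Because the key itself is never
// stored, there is nothing for anyone, me included, to reset or recover.
fun deriveKey(pin: CharArray, salt: ByteArray): SecretKeySpec {
    val spec = PBEKeySpec(pin, salt, 310_000, 256)  // iterations, key bits
    val factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
    val keyBytes = factory.generateSecret(spec).encoded
    return SecretKeySpec(keyBytes, "AES")
}

// Generated once and stored alongside the data; useless without the PIN.
fun newSalt(): ByteArray = ByteArray(16).also { SecureRandom().nextBytes(it) }
```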

Open source, not open governance

The code is GPL-3.0 and public. Anyone can read it, audit it, fork it. Every security claim is verifiable.

But I'm not running a committee. Design decisions, feature priorities, and security trade-offs are mine. I listen to feedback, but the final calls are mine, and I'm accountable for them.

The trade-off is single-developer risk. If I'm compromised, compelled, or hit by a bus, the project's trajectory depends on whoever picks it up. Open source mitigates this partially — the code exists independently of me. But the active development, the security focus, the specific threat model priorities — those are currently tied to one person.

I don't have a great answer for this yet. I'm thinking about it.

The meta trade-off

Every one of these decisions trades convenience for safety. That's the fundamental bargain of Tarn. The app is less convenient than other trackers. It's harder to recover from mistakes. It asks more of you.

In exchange, it doesn't ask anything of your trust. It doesn't ask you to believe that a company will honor its privacy policy. It doesn't ask you to hope that a server won't be breached. It doesn't ask you to trust that a legal team will fight a subpoena on your behalf.

It asks you to set a PIN and remember it. Everything else is math.

Whether that trade-off is worth it depends on your situation. For some people, it isn't. For others, it's the only trade-off that makes sense. I built Tarn for the second group, and I'm trying to make it better for everyone else without compromising what makes it matter.