April 21, 2026

Privacy as a design constraint

Treating privacy as a hard constraint rather than a feature checkbox produces better software — and earns trust that is difficult to fake.

privacy · design · principles

There is a version of privacy that is a marketing checkbox. The app has a privacy policy. There is a consent banner. The settings screen has a toggle. None of this necessarily means the product is private.

There is another version of privacy that is an engineering constraint. You decide, early, that you will not collect data you do not need. You architect the system so that sensitive information does not leave the device unless the user explicitly requests it. You choose third-party services based on what they know about your users, not just their pricing tier.

The second version is harder upfront and simpler later. When you do not collect data, you do not need to protect it, audit it, respond to requests about it, or explain a breach. The privacy story is shorter because there is less to explain.

For Dossier, this meant designing the data layer before designing the UI. Where does a note live? On-device by default. What leaves the device? Only what the user has explicitly synced via iCloud, which is Apple's infrastructure, not ours. What do we log? Crash reports, opt-in. Nothing else.
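Those defaults can be made explicit in the type system, so that "on-device, no logging" is the state you get without writing any configuration at all. A minimal sketch, assuming hypothetical names (`SyncPolicy`, `TelemetryPolicy`, `PrivacyDefaults` are illustrative, not Dossier's actual API):

```swift
import Foundation

// Hypothetical sketch: privacy-preserving behavior encoded as default values,
// so any new code path inherits the private configuration unless the user opts in.
enum SyncPolicy {
    case localOnly      // data never leaves the device
    case iCloudOptIn    // user explicitly enabled iCloud sync
}

enum TelemetryPolicy {
    case none           // nothing is logged
    case crashReports   // user opted in to crash reporting
}

struct PrivacyDefaults {
    // On-device by default; sync is something the user turns on, not off.
    var sync: SyncPolicy = .localOnly
    // No telemetry unless explicitly requested.
    var telemetry: TelemetryPolicy = .none
}

let defaults = PrivacyDefaults()
print(defaults.sync == .localOnly)   // true
print(defaults.telemetry == .none)   // true
```

The design point is that the zero-argument initializer is the private configuration: forgetting to configure something fails safe, rather than failing open.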

This is not privacy theatre. It is privacy as a requirement with real consequences for architecture. The result is a product that is genuinely easier to trust, not just easier to market.