
Making Missed Workouts Cost Money With Smart Contracts

Most fitness plans fail for the boring reasons: motivation fades, life happens, and “I’ll do it tomorrow” becomes an everyday excuse.

This post is about an experiment I’m running to prevent one specific failure mode: the ability to negotiate with myself.

I built FitVow, a small system where I lock real money into a smart contract and, each week, I need to hit my physical activity goals. If I don’t, I pay a fine. The twist: anyone can trigger enforcement and collect part of the fine as a reward, and the rest goes to a charity wallet (Giveth).

I have no control over the enforcement process whatsoever. If I miss my weekly goals, I’m automatically eligible for a fine—no pleading, no explaining, just contract rules.

If you want to verify (not trust) the current state of the experiment, the live dashboard is here: fitvow.pedroaugusto.dev. It reads directly from the blockchain (no backend) and shows the initial stake, week-by-week pass/fail results, and how much has been lost to fines so far.

FitVow is not a product. It’s a personal experiment in motivation: seeing if putting real money at risk succeeds where willpower alone usually fails.


The basic idea

  1. I stake funds into a smart contract for a fixed time window (e.g., 12 weeks).
  2. Every week has a set of activity goals (e.g., running, gym, sleep, screen time).
  3. If I hit the goals, the week is marked as complete.
  4. If I miss, a penalty can be applied for that week.
  5. Enforcement is permissionless: anyone can call the enforcement function on the smart contract and get part of the staked funds as a reward for doing so.
  6. At the end of the fixed time window, I get back whatever is left of the stake.

Think of it as a gym buddy you hand a pile of money to for safekeeping—except he’s allowed to give some of it away every time you skip a workout.
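To make the mechanics concrete, here is a minimal Kotlin sketch of that lifecycle. It is a toy model, not the deployed code: the real logic lives in the Solidity contracts, and the fine and reward percentages below are placeholders I picked for illustration.

```kotlin
// Toy model of the vow lifecycle. The real logic is a Solidity contract on
// Arbitrum; the percentages below are illustrative placeholders, not the
// actual experiment parameters.
data class Vow(
    var stake: Long,                                        // staked funds, in smallest units
    val finePerMissBps: Int = 1_000,                        // assumed: fine = 10% of remaining stake
    val enforcerShareBps: Int = 2_000,                      // assumed: 20% of the fine rewards the enforcer
    val weekPassed: MutableMap<Int, Boolean> = mutableMapOf(),
    val enforced: MutableSet<Int> = mutableSetOf(),
)

// Permissionless enforcement: anyone may call this for a failed, unenforced week.
fun enforce(vow: Vow, week: Int): Pair<Long, Long> {
    require(vow.weekPassed[week] == false) { "week $week did not fail" }
    require(week !in vow.enforced) { "week $week was already enforced" }
    val fine = vow.stake * vow.finePerMissBps / 10_000
    val toEnforcer = fine * vow.enforcerShareBps / 10_000   // reward for calling enforce
    val toCharity = fine - toEnforcer                       // the rest goes to the charity wallet
    vow.stake -= fine
    vow.enforced += week
    return toEnforcer to toCharity
}
```

Note the shape of the incentive: the charity share leaves the contract no matter who calls `enforce`, so even self-enforcing a missed week costs real money.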


Why permissionless enforcement?

Because anything that depends on my future self being honest and diligent is exactly what I’m trying to avoid.

Permissionless enforcement turns this from “self-tracking with extra steps” into a credible commitment: strangers have a financial incentive to notice a missed week and trigger the penalty, so enforcement never depends on me.

That incentive is also why the penalty is split. If I could enforce my own failure and get 100% of the penalty back, the system becomes theater. Splitting it between the enforcer and charity ensures that even if I try to “cheat” by enforcing my own missed week, I still lose real money to charity.


Architecture overview


FitVow is split into three independent components:

  1. FitVow-Sync (Android app): A custom Android app I built that integrates with Android Health Connect to read workout sessions recorded by my Samsung Galaxy Watch. It signs weekly activity summaries using a non-exportable, hardware-backed key and publishes them on-chain.

  2. Fitness Unbreakable Vow (smart contracts): A set of contracts deployed on Arbitrum (Ethereum L2) that verify the signed activity data, track weekly pass/fail outcomes, and manage the stake by applying fines when a week is missed.

  3. FitVow Dashboard (website): A simple frontend that reads from the blockchain and displays the live state of the experiment (stake, weekly results, and fines paid). It also lets anyone connect a wallet with Metamask and enforce a missed week when enforcement is available.
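One way to see how the three pieces fit together is the payload that travels between them. A hypothetical sketch of its shape in Kotlin (the field names are mine; the real encoding is whatever the contracts define):

```kotlin
// Hypothetical shape of the weekly payload: produced and signed by FitVow-Sync,
// verified by the contracts, rendered by the dashboard. Field names are
// illustrative, not the actual on-chain encoding.
data class WeeklySummary(
    val weekIndex: Int,              // which week of the fixed time window
    val goalResults: List<Boolean>,  // pass/fail per weekly goal
    val signature: ByteArray,        // P-256 ECDSA signature from the phone's TEE-backed key
)
```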

Minimal mental model

* FitVow-Sync filters Health Connect records to only accept data originating from Samsung Health (the Galaxy Watch companion app).
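A sketch of what that filter could look like against the Health Connect API. The decision to filter on `dataOrigin` and the Samsung Health package name are my assumptions about how FitVow-Sync does it:

```kotlin
import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.ExerciseSessionRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant

// Assumed package name of Samsung Health, the app that writes the watch's data.
const val SAMSUNG_HEALTH = "com.sec.android.app.shealth"

suspend fun readWatchSessions(
    context: Context,
    start: Instant,
    end: Instant,
): List<ExerciseSessionRecord> {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = ExerciseSessionRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end),
        )
    )
    // Keep only records whose origin is Samsung Health; manual entries written
    // by other apps are ignored.
    return response.records.filter { it.metadata.dataOrigin.packageName == SAMSUNG_HEALTH }
}
```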


Preventing cheating

If the system were just “check a box saying you went to the gym”, it would be pointless. I could click the box while watching Netflix in bed and call it a day.

So the real challenge is:

How do you make cheating annoying enough that the path of least resistance is simply doing the workout?

FitVow’s approach isn’t perfect, but it tries to raise the cost of cheating by anchoring trust in three places:

  1. A wearable-backed data source
    Activity data comes from my Galaxy Watch via Samsung Health → Health Connect. A watch produces a bundle of signals (duration, heart-rate patterns, calories/energy estimates, timestamps, etc.) that are much harder to fake convincingly than a manual checkbox. Could someone fabricate it? Probably. But it’s already a much higher-effort attack than “tap to confirm”.

  2. A hardware-backed signing key (Android Keystore / TEE)
    FitVow-Sync doesn’t just upload raw activity records; it uploads a signed weekly summary. The private key used to sign is non-exportable and lives in Android’s Trusted Execution Environment (TEE). This defeats the “I’ll just run a script on my laptop and submit fake workouts” attack: the contracts only accept updates that prove they came from the enrolled device.

    To make this verifiable, I also publish the key’s Android Key Attestation on-chain, so anyone can independently inspect the attestation chain and key properties [1] [2].

  3. Public, verifiable execution
    The rules and state live on-chain, and the code is public. That doesn’t magically prevent cheating, but it does make the experiment auditable. If I try to game the system, I’ll be doing it publicly—with a permanent on-chain paper trail.

The goal isn’t “impossible to cheat.” It’s cheating being more effort than the workout.


Security model (high level)

This system only works if the contract can distinguish “data produced by my enrolled phone + app” from “some script submitting whatever it wants” (i.e., anything running in a software-only sandbox instead of on the enrolled device). The model is simple: trust the wearable data pipeline, and strongly authenticate the publisher.

Hardware-backed keys (P-256, non-exportable)

On first install, the FitVow-Sync app generates a P-256 private key that is non-exportable and stored inside the Android TEE. The corresponding public key is registered on-chain. From that point on, the contracts only accept updates signed by that key [3].

In other words: the contract doesn’t trust “who sent the transaction”; it trusts “who can produce a valid signature”.
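At the Android API level, the enrollment and signing flow looks roughly like this. The key alias and function names are placeholders of mine, not FitVow-Sync’s actual code:

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore
import java.security.Signature

// Placeholder alias; the real app's alias may differ.
const val KEY_ALIAS = "fitvow-device-key"

// Generate a non-exportable EC key (P-256 is the Keystore default) inside the TEE.
fun generateEnrollmentKey(attestationChallenge: ByteArray) {
    val kpg = KeyPairGenerator.getInstance(KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore")
    kpg.initialize(
        KeyGenParameterSpec.Builder(KEY_ALIAS, KeyProperties.PURPOSE_SIGN)
            .setDigests(KeyProperties.DIGEST_SHA256)        // sign SHA-256 digests (ECDSA)
            .setAttestationChallenge(attestationChallenge)  // request a Key Attestation certificate
            .build()
    )
    kpg.generateKeyPair()  // the private key is created in, and never leaves, the TEE
}

// Sign a weekly summary with the TEE-backed key.
fun signWeeklySummary(payload: ByteArray): ByteArray {
    val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    val key = (ks.getEntry(KEY_ALIAS, null) as KeyStore.PrivateKeyEntry).privateKey
    return Signature.getInstance("SHA256withECDSA").run {
        initSign(key)  // signing works even though the key material is not extractable
        update(payload)
        sign()
    }
}
```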

Android Key Attestation (verifiable by anyone)

To make that claim auditable, I publish the key’s Android Key Attestation on-chain. This gives third parties a way to verify (via Google’s attestation chain) properties about the enrolled key and environment—e.g. that the key is hardware-backed and tied to an expected app identity/build [4].

This primarily targets the “I’ll just run a modified build or a desktop script” attack.
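The attestation data itself comes straight out of the Keystore as a certificate chain. A minimal sketch, reusing the hypothetical `KEY_ALIAS` from the previous snippet:

```kotlin
import java.security.KeyStore
import java.security.cert.X509Certificate

// Export the attestation certificate chain (leaf first, rooted in a Google
// attestation root) so it can be published on-chain for anyone to verify.
fun attestationChain(): List<ByteArray> {
    val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    return ks.getCertificateChain(KEY_ALIAS).map { (it as X509Certificate).encoded }
}
```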

“Sign-and-forget” builds (amnesiac releases)

A subtle failure mode in mobile security is silent drift over time: new builds, different signing keys, debug toggles, or “temporary” shortcuts that accidentally become permanent.

FitVow treats enrollment of a public key as a one-way door. The enrolled key/app identity isn’t something I casually swap. If I do rotate it, it’s intentionally expensive, so “upgrading into a cheat” has a real cost.

The mechanism: amnesiac APK signing

The FitVow app uses an intentionally non-upgradable signing model.

During release, the APK is signed with a one-time, ephemeral signing key, and then the signing keystore is permanently deleted. After that, it’s cryptographically impossible to produce another APK with the same signature [5].

This is not a mistake — it’s the security guarantee.

Why it matters: on Android, the app’s signing key is the app’s identity. If you ship a future APK signed with a different key, Android treats it as a different app. That forces a reinstall, as opposed to an update, and, crucially, destroys the original app’s hardware-backed (TEE) keys.

FitVow relies on this property: by making the APK signing key unrecoverable, I remove my own ability to ship a modified build that Android would accept as an update to the enrolled app, which is the build that holds the TEE keys.

If I want to change behavior, I must pay the full price: reinstall (losing the TEE keys) and use the explicit on-chain key-rotation mechanism.

Auditable releases

To make releases verifiable, official FitVow-Sync APKs are built by GitHub Actions. Anyone can inspect the workflow, review the logs, and download the exact artifacts produced by CI. The version currently running on my phone comes from this build:


Current experiment parameters

This first run is deliberately sized to sting without being life-changing.

To put those numbers in context (Brazil): my monthly internet bill is R$99 (≈$18), natural gas is R$70 (≈$14), and the monthly national minimum wage is R$1518 (≈$286 💀). So yes — the fines are small, but they’re real enough to matter to me.

...in retrospect, not using stablecoins (I skipped them to keep things simple) was not the best idea...

How a week is counted

Each week has three goals. The oracle records pass/fail for each one, and the vow contract uses that outcome to decide whether enforcement is allowed.

The rule for this run is simple:

The three weekly goals


The “stolen phone” problem (and why recovery is expensive)

If the system only trusts a specific hardware-backed key, then “my phone got stolen” becomes a real operational problem (I’ve had a phone stolen before in São Paulo).

So FitVow has an emergency escape hatch: a one-time key rotation. But it’s intentionally expensive: rotating the enrolled key costs 35% of the current contract balance, and that fee is donated to charity [6].

This is deliberate: recovery has to be possible (phones really do get stolen), but it must never be cheap enough to double as a loophole.

Think of it as: you can recover access, but you will absolutely feel it.
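In numbers (units are arbitrary; the 35% is the actual parameter from the contract):

```kotlin
// 35% of the current contract balance is donated to charity on rotation;
// only the remainder stays staked under the new key.
fun rotationFee(contractBalance: Long): Long = contractBalance * 35 / 100

// Example: rotating with 1_000_000 units staked donates 350_000 to charity
// and leaves 650_000 in the vow.
```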


Automatic enforcement (the “no one showed up” safeguard)

Permissionless enforcement works because other people have an incentive to call the enforcement function when a week fails. But there’s an edge case: what if nobody interacts with the contract?

To prevent “winning by inactivity”, FitVow also has automatic enforcement powered by Chainlink Automation. The key point is that the Upkeep is owned and configured by the Fitness Unbreakable Vow contract itself — once registered, I (the pledger) can’t pause it, modify it, or disable it.

How it works: Chainlink’s automation network periodically checks the contract for a failed week that nobody has enforced yet and, when it finds one, submits the enforcement transaction itself.

In normal circumstances, humans enforce and get rewarded. Automation only exists as a backstop to make sure the vow can’t be bypassed by simply staying inactive. Access the Chainlink upkeep page here.
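Conceptually, the backstop is the standard Chainlink Automation check/perform pair, modeled here in Kotlin on top of the toy `Vow` from the first sketch. The real implementation is Solidity, and the week-selection logic is my guess at the shape, not the actual code:

```kotlin
// Off-chain, Chainlink nodes repeatedly evaluate a check; when it signals work,
// they send a transaction that performs it. Modeled here as two functions.
fun checkUpkeep(vow: Vow, currentWeek: Int): Int? =
    (0 until currentWeek).firstOrNull { week ->
        vow.weekPassed[week] == false && week !in vow.enforced  // failed, nobody enforced it
    }

fun performUpkeep(vow: Vow, currentWeek: Int) {
    val week = checkUpkeep(vow, currentWeek) ?: return  // nothing to enforce
    enforce(vow, week)  // same permissionless path a human enforcer would take
}
```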


What this is (and what it isn’t)

This is: a personal experiment in motivation, with real money at stake.

This isn’t: a product.

It’s a personal project built to answer a narrow question:

Can I design a system where the easiest path is just doing the workout?


Limitations and open problems

A few honest ones:

There’s also a philosophical limit here: you can’t outsource discipline completely. At best, this is a nudge in the sense of nudge theory—it doesn’t force behavior, it just changes the incentives and reduces the room for “I’ll do it tomorrow” negotiations.


TL;DR

I locked real money into a smart contract and wrote a system that penalizes missed workout weeks. Activity data comes from a wearable via Android, results are published on-chain, and enforcement is permissionless: anyone can trigger penalties and collect a reward, while the rest goes to charity.

If it works, the “easy” choice becomes doing the workout.

