
HindsightMobile

  1. Takes a screenshot every 2 seconds (only of apps you choose)
  2. Reads and embeds the text from each screenshot with OCR
  3. Lets you chat with anything you've seen on your phone via a local LLM (my favorite: llama3.2 1B)
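
Below is a minimal Kotlin sketch of that capture, OCR, and embed loop. captureScreenshot(), runOcr(), embedText(), and store() are hypothetical placeholders standing in for the real components, not HindsightMobile's actual API.

```kotlin
import kotlinx.coroutines.delay

// Hypothetical container for one captured screenshot.
data class Frame(val appPackage: String, val pixels: ByteArray, val timestampMs: Long)

// Sketch of the recording loop: capture, filter to selected apps, OCR, embed, store locally.
suspend fun captureLoop(
    isAppSelected: (String) -> Boolean,          // only record apps the user opted into
    captureScreenshot: () -> Frame?,             // hypothetical screenshot source
    runOcr: (Frame) -> String,                   // hypothetical OCR step
    embedText: (String) -> FloatArray,           // hypothetical text-embedding model
    store: (Frame, String, FloatArray) -> Unit,  // hypothetical on-device database write
) {
    while (true) {
        captureScreenshot()
            ?.takeIf { isAppSelected(it.appPackage) }
            ?.let { frame ->
                val text = runOcr(frame)      // read the on-screen text
                val vector = embedText(text)  // embed it for later retrieval
                store(frame, text, vector)    // everything stays on the device
            }
        delay(2_000L)                         // roughly one screenshot every 2 seconds
    }
}
```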

Demos

Installation

Become a Google Play Tester

  1. Join the Discord below and message in the google-play-testing channel, or DM me directly
  2. If you don't have Discord, you can email connor@hindsight.life

Build from Source

  1. git clone --recursive https://github.com/cparish312/HindsightMobile.git
  2. Open the Project in Android Studio
  3. Connect your Android Device
  4. You need to do a release build for the LLM to run quickly:
    • Go to View -> Tool Windows -> Build Variants, then select release from the drop-down
  5. Run the application using Run > Run 'app' or the play button in Android Studio
    • If you get an incompatible AGP version error, install the newest version of Android Studio

Communication

Join us on Discord

Set up an onboarding session or just chat about the project here

Settings

  • Ingest Screenshots: runs a manual ingestion of screenshots
    • Add to db
    • OCR
    • Embed
  • Manage Recordings: takes you to the manage recordings screen
    • If checked, the app will be recorded
    • Delete all content (screenshots, videos, embeddings, OCR Results) for a given app
  • Delete Screenshots From the Last:
    • Lets you delete recent screenshots and OCR results
  • Chat: opens the chat screen
  • Screen Recording: starts the screen recording background process (you may have to tap Stop on the notification to stop it)
  • Auto Ingest:
    • Runs ingestion automatically every time your phone screen turns off (see the sketch after this list)
  • Auto Ingest When Not Charging:
    • If off, auto ingestion will only run while your phone is charging
  • Record New Apps By Default (IMPORTANT, PLEASE READ THIS): when you enter an app that has not been recorded yet, it will automatically start recording
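
A minimal sketch of the auto-ingest gating described above, assuming a hypothetical runIngestion() callback that performs the add-to-db, OCR, and embed steps. The screen-off trigger and charging check use standard Android APIs, but this is an illustration, not the app's actual implementation.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.os.BatteryManager

// Triggers ingestion when the screen turns off, optionally only while charging.
class AutoIngestReceiver(
    private val autoIngestWhenNotCharging: Boolean,  // mirrors the setting above
    private val runIngestion: () -> Unit,            // hypothetical: add to db -> OCR -> embed
) : BroadcastReceiver() {

    override fun onReceive(context: Context, intent: Intent) {
        if (intent.action != Intent.ACTION_SCREEN_OFF) return
        val charging = context.getSystemService(BatteryManager::class.java).isCharging
        // If "Auto Ingest When Not Charging" is off, only ingest while charging.
        if (charging || autoIngestWhenNotCharging) {
            runIngestion()
        }
    }
}

// ACTION_SCREEN_OFF can only be received by a receiver registered at runtime.
fun registerAutoIngest(context: Context, receiver: AutoIngestReceiver) {
    context.registerReceiver(receiver, IntentFilter(Intent.ACTION_SCREEN_OFF))
}
```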

Bonus

  • If you click on the Assistant's response, you can see the exact prompt that went into the LLM (a sketch of how such a prompt might be assembled follows)
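
For a sense of what that prompt can contain, here is a hedged sketch of how a chat prompt could be assembled from the stored OCR text closest to your question. retrieveNearest() and the prompt wording are assumptions for illustration, not the app's actual retrieval code or template.

```kotlin
// Hypothetical record of one ingested screenshot's OCR text.
data class OcrChunk(val app: String, val timestampMs: Long, val text: String)

// Builds a prompt by prepending the most relevant on-screen text to the question.
fun buildPrompt(
    question: String,
    retrieveNearest: (String, Int) -> List<OcrChunk>,  // hypothetical vector search over the embeddings
    topK: Int = 5,
): String {
    val context = retrieveNearest(question, topK)
        .joinToString("\n") { "[${it.app} @ ${it.timestampMs}] ${it.text}" }
    return "You are an assistant with access to text the user has seen on their phone.\n" +
        "Context:\n" + context + "\n\n" +
        "Question: " + question
}
```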

Shoutouts