
AssistiveForms

An assistive app for visually impaired individuals to easily fill out forms, built on a microservice architecture with Java, Next.js, and Swift. Utilizes CoreML and LLMs for real-time assistance on mobile devices, aiming to boost accessibility and independence with advanced technology.

Project Concept

This project is envisioned as a mobile application that leverages CoreML and an on-device Large Language Model (LLM) to help disabled persons fill out research forms. Through voice commands and spoken guidance, the app aims to make completing required forms, including consent documents, more accessible.
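
As a rough illustration of the voice-guided flow, the sketch below uses Apple's Speech and AVFoundation frameworks to read a field's prompt aloud and capture one spoken answer. The FormField type and VoiceFormFiller class are hypothetical names invented for this example, and on-device recognition is requested so that audio never leaves the phone.

    import AVFoundation
    import Speech

    // Hypothetical representation of a single form field to be filled by voice.
    struct FormField {
        let prompt: String   // e.g. "Please say your full name"
    }

    final class VoiceFormFiller {
        private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
        private let synthesizer = AVSpeechSynthesizer()
        private let audioEngine = AVAudioEngine()

        // Ask permission, read the prompt aloud, then transcribe one spoken answer.
        func capture(field: FormField, completion: @escaping (String) -> Void) {
            SFSpeechRecognizer.requestAuthorization { status in
                guard status == .authorized, let recognizer = self.recognizer else { return }

                // Speak the prompt (a real app would wait for it to finish before recording).
                self.synthesizer.speak(AVSpeechUtterance(string: field.prompt))

                let request = SFSpeechAudioBufferRecognitionRequest()
                request.requiresOnDeviceRecognition = true   // keep audio on the device

                let inputNode = self.audioEngine.inputNode
                let format = inputNode.outputFormat(forBus: 0)
                inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
                    request.append(buffer)
                }
                self.audioEngine.prepare()
                try? self.audioEngine.start()

                // A real app would keep a reference to the task so it can be cancelled.
                _ = recognizer.recognitionTask(with: request) { result, _ in
                    if let result = result, result.isFinal {
                        inputNode.removeTap(onBus: 0)
                        self.audioEngine.stop()
                        completion(result.bestTranscription.formattedString)
                    }
                }
            }
        }
    }

The transcribed answer could then be handed to the on-device LLM for interpretation and written into the corresponding field of the form.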

Proposed Features

  • Voice-Guided Form Completion: An on-device LLM will interpret voice commands and guide users through the form-filling process, keeping it both efficient and accessible.
  • CoreML Integration: By employing CoreML, the app will perform efficient on-device machine learning, prioritizing user privacy and real-time interaction (see the sketch after this list).
  • Consent Form Assistance: The application will help users understand and sign consent forms and other important documents with minimal effort.
  • Accessibility-First Design: The app will be designed with accessibility as a primary requirement, catering to a wide range of disabilities to ensure broad usability.
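
For the CoreML piece, the sketch below shows one way an on-device prediction might look: loading a compiled Core ML model bundled with the app and running a single inference. The model name FormFieldClassifier, its "text" input, and its "label" output are hypothetical placeholders; the real model and feature names would depend on what the project eventually ships.

    import CoreML

    // Load a bundled, compiled Core ML model and classify a snippet of form text
    // (for example, mapping it to a field type such as name, date, or signature).
    func classifyFieldText(_ text: String) throws -> String? {
        guard let url = Bundle.main.url(forResource: "FormFieldClassifier",
                                        withExtension: "mlmodelc") else { return nil }

        let config = MLModelConfiguration()
        config.computeUnits = .all   // let Core ML choose CPU, GPU, or Neural Engine

        let model = try MLModel(contentsOf: url, configuration: config)

        // Wrap the text as a feature, run the prediction, and read the label back out.
        let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
        let output = try model.prediction(from: input)
        return output.featureValue(for: "label")?.stringValue
    }

Setting computeUnits to .all lets Core ML schedule the work on the CPU, GPU, or Neural Engine, which is what keeps inference fast enough for the real-time, privacy-preserving interaction described above.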

Stage of Development

Currently, this project is in the ideation phase. The features and implementations described are conceptual and subject to change as the project evolves.

How to Get Involved

Feedback and Suggestions

We're open to feedback and suggestions! If you have ideas on how to improve this concept or want to suggest new features, please feel free to open an issue or pull request.

Collaboration

We're looking for collaborators! If you're interested in contributing to the development of this idea, whether through coding, design, user experience, or any other aspect, please reach out.

Stay Updated

To stay updated on the project's progress or to express your interest in contributing, please star or watch this repository. We'll post updates as the project develops from an idea into a fully-fledged application.

Vision and Goals

Our goal is to create an application that significantly eases the process of filling out forms for disabled persons, enhancing their independence and participation in research opportunities. By leveraging advanced technologies like CoreML and LLMs in a privacy-conscious and accessible manner, we aim to deliver a solution that respects users' needs and dignity.

Contact

For more information, to offer support, or to join our team, please contact us at anantab@clemson.edu or menam@clemson.edu.
