FERPA-compliant student data protection
Tools like ChatGPT (and the thousands of consumer-facing AI tools out there) do not protect student data. The hidden cost of these tools is that any data students enter (e.g. personal experiences, questions about sensitive topics) can be used by AI companies to train future versions of their models. As with social media, the "product" for these AI companies is your data.
Flint, on the other hand, is FERPA compliant. This means Flint...
Keeps student data safe
Flint does not share any student data (or any other user data, for that matter) with third parties. It also doesn't use any student data to train AI models. The data for each school is kept separate and is not accessible by providers like OpenAI.
Enables specific data access
Flint stores data only so that students, parents, teachers, and school administrators can access it. Administrators can view student assignment submissions on Flint, and a student or their parent can access the student’s data via the student’s login. Alternatively, we can export a particular student’s data at an administrator’s request for sharing with a parent if necessary.
Additional safety assurance
We have already signed data processing agreements (DPAs) that cover schools in Illinois, Massachusetts, Maine, New Hampshire, New York, Rhode Island, and Vermont. We’re happy to work with your school during the procurement process to ensure that a DPA for your state is in place.
ChatGPT requires users to be at least 13 years old, which protects the privacy of younger students but denies them the opportunity to develop AI literacy early on. Additionally, the vast difference in quality between free (e.g. regular ChatGPT) and paid (e.g. ChatGPT Plus) tiers means that not all students are on a level playing field when using AI.
Flint allows students under the age of 13 to access AI while working directly with schools to ensure COPPA compliance. When a school rolls out Flint, every student gets access to the most robust and accurate AI models (e.g. GPT-4) at no cost to themselves.
Regarding COPPA, Flint can be used by students under the age of 13 as long as your school has a waiver in place that parents sign to give administrators the right to choose which tools students can use. We’re working with a number of elementary and middle schools, and are happy to review the language in your school’s technology waiver to confirm whether the use of Flint by students under the age of 13 would be COPPA compliant.
Controlling the behavior of the AI
AI tools like ChatGPT can answer any question about anything — or at least they attempt to. This may be great for personal productivity but makes these tools difficult to use educationally. Because tools like ChatGPT are designed to satisfy the end user, they'll give students the immediate answer to any question instead of challenging students to think through problems themselves.
Flint allows teachers to control exactly how the AI interacts with students. This control is established in the assignment creation panel, where teachers can edit rules for what role the AI should take on and how it should respond to students. This means that the AI...
Won’t give away the answer
Unless the teacher explicitly asks the AI to give answers, Flint’s AI will challenge students to think critically and guide them to an answer. The AI can also adjust the difficulty of assessment or review on the fly.
Stays on topic
If a student strays off topic, the AI will gently nudge the student back to the learning objective that the teacher has provided, and all chat data can be viewed by the teacher.
Ensuring accuracy and relevance
AI models are trained on millions of documents, ranging from textbooks to news articles to online forums. This makes them seem knowledgeable on any topic, but dig deeper and you'll find that AI may make content up, or “hallucinate,” when it doesn't know the answer. Most AI tools also fail to provide specific sources for their knowledge.
On Flint, teachers can provide the AI with content from their class — whether that's a worksheet, textbook chapter, lecture slides, lesson plan, or link to a recent news article. The AI will then pull from the materials provided by the teacher when interacting with students, ensuring accuracy and relevant context.
Educator oversight of student AI use
Many of the concerns with irresponsible AI use by students today (e.g. using ChatGPT to write their essays) stem from the fact that there's no teacher oversight. Any individual with an email address can sign up for an AI tool and easily lie about their age if they’re under 13.
Flint allows educators to view exactly how students are using AI. Every student interaction with AI is tracked and can be viewed by teachers as well as school administrators. In special cases (e.g. a teacher concern or a parent request), administrators can request an export of student data from Flint.
Teacher augmentation, not replacement
Students are already using AI for homework help, through tools like ChatGPT and many other apps (just search "AI homework" on the App Store). In the long run, if AI use continues to be driven primarily by students, there's a risk that the importance of teachers as the source of learning is undermined in the minds of students.
Flint changes the equation by letting teachers augment themselves with AI instead of feeling as if they are competing against it. Teachers have incredible context on the needs of their students, and can use Flint to multiply their own effectiveness, such as by providing students with more personalized help in the classroom.
Unregulated AI use currently risks students jumping straight to answers and generating unoriginal writing. Flint seeks to help teachers employ AI as a learning resource. By giving teachers control to emphasize the learning process rather than a “correct answer,” Flint can help create an AI-powered classroom that emphasizes both student creativity and agency.