Flick

Anonymous campus community platform for verified college students to post, discuss, and interact within their college network.

Overview

Flick is an anonymous campus discussion platform designed for students to share experiences, ask sensitive questions, and discuss campus life without revealing their identity.

Many student conversations — such as academic pressure, campus politics, relationship issues, or struggles with studies and career choices — are difficult to discuss openly on identity-based platforms.

Flick enables students to participate in honest conversations while maintaining anonymity and protecting their identity.

The platform focuses on enabling open discussion while using moderation and controlled access to prevent the chaos typically associated with anonymous forums.

Problem

Students often hesitate to discuss sensitive campus topics on platforms where their real identity is visible.

Discussions about academic pressure, conflicts with faculty, personal struggles, campus culture, or career uncertainty rarely happen openly when users are tied to their real profiles.

Existing platforms either:

  • require real identity, which discourages honest conversation, or
  • allow full anonymity, which often leads to spam, trolling, and abusive behavior.

The goal of Flick was to build a system where students can speak freely while maintaining accountability and community safety.

Constraints

  • Anonymity vs accountability – the platform must hide public identity while still allowing moderation and abuse control.
  • Spam and abuse risk – anonymous communities attract low-quality content if moderation systems are not designed early.
  • Real-time interaction – discussions should feel live to encourage engagement.
  • Lightweight infrastructure – the system should remain simple to operate without requiring heavy distributed systems.

Key Engineering Decisions

Controlled anonymous identity

Instead of completely anonymous sessions, Flick assigns internal user identifiers after verification. These identifiers are never exposed publicly but allow the backend to maintain session continuity and enforce moderation rules.

Tradeoff: Requires strict separation between identity data and public activity data to prevent correlation.
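The write-up doesn't specify the stack or schema, but the separation described above might look like the following Python sketch: a random internal identifier is minted after verification, identity data and activity data live in separate stores, and (as an additional, hypothetical hardening step) the publicly visible handle is derived per thread so posts cannot be correlated across threads.

```python
import secrets
import hashlib

# Hypothetical illustration: the identity store and the activity store are
# kept strictly separate; only the identity store links back to a real person.
identity_store: dict[str, str] = {}  # verified email -> internal_id (never exposed)
activity_store: dict[str, list] = {}  # pseudonym -> public posts

def register_verified_user(email: str) -> str:
    """Assign a random internal id after verification (not derived from the email)."""
    internal_id = secrets.token_hex(16)
    identity_store[email] = internal_id
    return internal_id

def public_handle(internal_id: str, thread_id: str) -> str:
    """Derive a per-thread pseudonym so activity across threads cannot be correlated."""
    return hashlib.sha256(f"{internal_id}:{thread_id}".encode()).hexdigest()[:8]
```

Because the internal id is random rather than a hash of the email, leaking the activity store alone reveals nothing about who posted.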

Institutional email verification

Users verify their campus identity using institutional email addresses. Once verified, the system allows anonymous participation in discussions.

Reason: Ensures the platform is restricted to real students while preserving anonymity.

Tradeoff: Adds verification flow complexity but significantly improves community quality.
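A minimal sketch of such a flow, assuming a configurable allow-list of campus domains and a one-time token that would be emailed to the student (domain names and function names here are illustrative, not from the source):

```python
import secrets

ALLOWED_DOMAINS = {"college.edu"}  # hypothetical list of institutional domains

pending_tokens: dict[str, str] = {}  # email -> one-time verification token

def start_verification(email: str) -> str:
    """Reject non-institutional addresses, then issue a one-time token."""
    domain = email.rsplit("@", 1)[-1].lower()
    if domain not in ALLOWED_DOMAINS:
        raise ValueError("not an institutional email address")
    token = secrets.token_urlsafe(16)
    pending_tokens[email] = token  # in practice this would be emailed, not returned
    return token

def confirm_verification(email: str, token: str) -> bool:
    """Consume the token so it cannot be replayed."""
    return pending_tokens.pop(email, None) == token
```

Once `confirm_verification` succeeds, the email is discarded from the public path and only the internal identifier (previous section) is carried forward.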

Threaded discussion model

Posts and replies follow a threaded conversation structure rather than flat comment streams.

Reason: Improves readability and enables deeper discussions.

Tradeoff: Thread hierarchy management becomes more complex as conversations grow.

Real-time discussion updates

Real-time updates allow users to see replies and interactions without refreshing.

Reason: Live discussions significantly improve engagement in community platforms.

Tradeoff: Maintaining persistent connections increases backend resource usage.
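The transport isn't specified, but whatever carries the persistent connections (WebSockets, server-sent events, etc.), the fan-out step behind live updates can be sketched as a simple per-thread publish/subscribe registry:

```python
from collections import defaultdict
from typing import Callable

# Hypothetical sketch: each open connection registers a callback for the
# thread it is viewing; a new reply is pushed to every subscriber of that thread.
subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(thread_id: str, callback: Callable[[dict], None]) -> None:
    subscribers[thread_id].append(callback)

def publish_reply(thread_id: str, reply: dict) -> None:
    for callback in subscribers[thread_id]:
        callback(reply)
```

The resource cost noted in the tradeoff comes from holding one live callback (connection) per viewer for the lifetime of their session.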

Moderation layer from day one

Anonymous platforms fail quickly without moderation. Flick includes reporting mechanisms and automated filters to detect abusive or spam content.

The system uses:

  • Perspective API for toxicity detection
  • rule-based filters for banned phrases
  • wildcard word matching
  • rate limiting
  • structured audit logging

Tradeoff: Adds operational complexity but is necessary to maintain a healthy community.

Results

  • Enabled anonymous discussions while protecting student identities.
  • Restricted participation to verified campus users.
  • Maintained structured conversations through threaded discussions.
  • Improved interaction and engagement through real-time updates.
  • Helped reduce spam and abusive content through built-in moderation.

Takeaways

Building Flick highlighted the challenges of designing systems that balance privacy with accountability.

Key lessons:

  • Anonymous systems require moderation infrastructure from the start.
  • Separating identity data from public activity is critical to protect users.
  • Real-time interaction significantly improves engagement in community platforms.
  • Controlled anonymity can enable honest conversations without sacrificing community safety.