New York, New York, United States

Categories: Market research, Marketing strategy, Information technology, Media
Recent projects

Empowerment Dashboard Design with Synthetic Data
The Feminine Intelligence Agency (FIA) is a social innovation startup that builds tools to help women and other vulnerable populations recognize coercion, build psychological resilience, and protect themselves in relationships, workplaces, and digital environments. We've developed a suite of educational tools based on years of psychological research, including:

- The Agency Calculator – a diagnostic tool that scores users on 20 dimensions of personal agency (e.g., boundaries, emotional regulation, critical thinking, autonomy).
- The BlindSpot Quiz – a self-awareness tool that reveals a user's potential manipulation blind spots or psychological vulnerabilities.
- The Player Identifier Chatbot – an AI-guided conversation tool that scores patterns in past or present relationships to detect early warning signs of coercive control.

All tools are grounded in a custom-built trait scoring system, and the backend is currently being migrated to a graph database (Neo4j) to support interactive analysis and visualization.

The Goal of This Project: Your job is to take our structured psychological framework and design a user-facing data dashboard using synthetic data. This dashboard should help users:

- Understand their strengths and areas for growth across the 20 agency traits
- See how their relationship history or quiz patterns affect their risk profile
- Get personalized suggestions for chatbot training modules or education content
- Track their learning journey or improvement over time

The dashboard should be visually intuitive, empowering, and friendly to curious users. Think of it as the bridge between psychological insight and actionable growth.

Data Provided: You'll be given structured synthetic datasets that mirror real patterns from FIA's tools (no real user data, no PII). These datasets will include:

- Trait scores from the Agency Calculator (0–10 scale, 20 traits)
- Vulnerability cluster types from the BlindSpot Quiz
- Player-type pattern flags from relationship assessments
- Sample chatbot session logs (e.g., user selected "Egocentric Evan," answered 6/10 confrontation questions)
- Suggested learning modules and growth paths
- (Optional) simulated session timestamps for visualizing progress over time
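As one illustration of what the trait-score portion of such a synthetic dataset could look like, the sketch below generates reproducible fake user records with 20 scores on a 0–10 scale. Note that the trait names, field names, and score distribution here are hypothetical placeholders chosen for this example, not FIA's actual schema or scoring model.

```python
import random

# Hypothetical placeholder trait names; FIA's real 20-trait framework
# is not reproduced here.
TRAITS = [f"trait_{i:02d}" for i in range(1, 21)]

def make_synthetic_user(user_id: int, rng: random.Random) -> dict:
    """Generate one synthetic user record: 20 trait scores on a 0-10 scale."""
    return {
        "user_id": user_id,
        # Gaussian scores rounded to one decimal, clamped into [0, 10].
        "scores": {
            t: round(min(10.0, max(0.0, rng.gauss(5.5, 2.0))), 1)
            for t in TRAITS
        },
    }

def make_dataset(n_users: int, seed: int = 42) -> list[dict]:
    """Build a reproducible synthetic dataset (same seed -> same data)."""
    rng = random.Random(seed)
    return [make_synthetic_user(i, rng) for i in range(n_users)]
```

Seeding the generator keeps the dataset reproducible, which matters for a dashboard prototype: designers and reviewers can look at identical data while iterating on visualizations.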

Scaling the Listening Lab Initiative
The Listening Lab is a non-clinical, psychologist-reviewed group wellness experience designed for emotional regulation, especially for individuals who are emotionally overwhelmed but not seeking formal therapy. Participants gather in structured "pods" with designated roles that discourage advice-giving and encourage expressive, coaching-style, and creative interaction. This project offers students an opportunity to develop operational, technical, and strategic plans to scale the Listening Lab in healthcare-adjacent environments such as community wellness centers, university mental health programs, and domestic violence prevention services.

Key Capstone Focus Areas – students will choose one or more of the following to develop as a team:

- AI Integration – Chatbot Support System: Design a training framework for a chatbot that can help keep pods on track (e.g., when members go off-topic, offer grounding techniques or redirect questions). Propose ethical and practical guardrails (e.g., data privacy, non-therapeutic scope, prompt design).
- Technology & Application Design: Develop requirements and mockups for a digital platform or mobile app to host and manage pod participation. Consider calendar integration, onboarding, role assignment, session structure, and light analytics (e.g., post-session check-ins).
- Strategic Marketing & Partnership Plan: Identify potential healthcare or wellness organizations as partners. Draft an outreach and communication strategy for user acquisition and institutional support. Design a marketing plan tailored to underserved or therapy-averse populations.
- Training Methodology & Implementation Playbook: Create clear, accessible training materials that explain how to participate in a pod (e.g., facilitator guides, sample scripts, role explanations). Recommend digital onboarding and role rotation processes that minimize facilitation burden.

The Listening Lab - Marketing Strategy Development
The Feminine Intelligence Agency is launching a new wellness tool called The Listening Lab, designed to support women's mental health and wellbeing through authenticity-based supports. The project aims to develop a comprehensive marketing strategy that effectively communicates the benefits of this tool to the target audience. The learners will analyze current market trends, identify key demographics, and craft a strategic plan that aligns with the agency's brand values. The project will involve researching competitors, understanding consumer behavior, and exploring innovative marketing channels. The goal is to create a strategy that not only raises awareness but also encourages engagement and adoption of the wellness tool. This project provides an opportunity for learners to apply their marketing knowledge and skills in a real-world context, focusing on a product that promotes mental well-being.

Policy Strategy for Combating AI-Assisted Coercive Control
The Feminine Intelligence Agency (FIA) is a research and public policy organization dedicated to advancing Social Discernment: the ability to detect manipulation early, respond effectively, and protect personal and collective agency in complex social environments. FIA is developing a comprehensive curriculum, scalable tools, and legal frameworks to address coercive control, emotional exploitation, and tech-enabled manipulation in relationships and digital life. Our work focuses on:

- 🧠 Social Discernment Curriculum: A new field of emotional intelligence that helps people recognize coercion, power games, and deceptive language before harm occurs.
- 📱 Player Identifier Chatbot: A predictive tool that analyzes early relationship patterns to help users recognize emotionally dangerous individuals.
- 🔍 AI Language Training App (ChatBoy): An app that teaches users to spot manipulative language across different contexts: dating, work, friendship, and family.
- 🧘‍♀️ Peer-to-Peer Wellness Platform: A non-therapeutic, group-based support system designed for women navigating emotional burnout and frustration in high-conflict or confusing relationships.
- 🔐 Cybersecurity for Women: A decentralized, peer-led training program that focuses on emotional safety, social engineering, and digital resilience, especially for those targeted by psychological abuse or coercion online.

FIA's work bridges public policy, digital safety, and emotional intelligence to help individuals and institutions detect invisible threats and respond with clarity, ethics, and courage.

This project explores how the United States might adopt policies to criminalize coercive control in intimate and institutional relationships, following the lead of jurisdictions such as England and Wales, Scotland, and Ireland. Specifically, it will examine how emerging technologies, particularly generative AI, are amplifying patterns of manipulation, isolation, and psychological domination, a phenomenon we refer to as AI-Assisted Psychological Exploitation (AIPEx). The student will assess:

- How Britain successfully passed coercive control laws (strategy, framing, resistance)
- Why similar efforts have struggled in the U.S. (cultural, legal, and political factors)
- How new threats, especially tech-enabled coercion, may change the narrative or legislative appetite

The outcome will be a policy roadmap for how coercive control could be reframed, regulated, or outlawed in the U.S., including legal frameworks, political feasibility, and public engagement strategies.

📘 Policy Problem: The U.S. lacks legal frameworks to address non-physical forms of abuse like gaslighting, digital surveillance, emotional manipulation, and isolation tactics, despite their well-documented psychological and economic impacts. The problem is compounded by AI tools that are being misused to amplify these tactics at scale (e.g., deepfakes, real-time surveillance, chatbot mirroring, voice cloning). This project addresses both a market failure (unregulated tools enabling harm) and a government failure (lack of legal protections and prevention strategies for psychological abuse).

🧩 Key Questions to Explore:

- What legislative, advocacy, and cultural strategies made coercive control laws pass in the UK and Ireland?
- What would a U.S.-specific policy roadmap need to look like, legally, politically, and culturally?
- Could technology-driven manipulation (AIPEx) be a wedge issue for reform or regulation?
- What risks or resistance would this proposal face from political, tech, or legal sectors?