ÒreAyò

Safety & Privacy

ÒreAyò is being built with privacy, safety, and cultural respect as foundational requirements — not features to add later.

Core principles

These principles guide every decision about how ÒreAyò is designed, built, and deployed.

Privacy by design

Privacy is built into the system from the beginning, not added as a feature. Data handling decisions prioritise user control and transparency.

Cultural respect

Nigerian languages and speech data are treated with care and respect. Data is not extracted carelessly or used without consideration for cultural context.

Honest about limitations

Safety includes clear communication about what the system can and cannot do — especially during research and early development.

No hidden surprises

Users should never be surprised about what data is collected, how it's used, or who has access. Clarity is non-negotiable.

How we think about data

ÒreAyò is currently in research and development. No user data is being collected yet because the product is not publicly available.

When data collection becomes necessary, these principles will apply:

• Data is collected only when necessary for functionality, not for convenience
• Speech data is handled with the same care as sensitive personal information
• Users will have clear control over their data and how it's used
• Language data from Nigerian communities is treated as culturally significant, not just training material
• Transparency about data retention, deletion, and usage is built in by default

Safety in practice

Safety is not just about data protection. It's about building AI that people can trust to behave predictably and respectfully.

This means:

Clear communication about capabilities. The system will not imply it can do things it cannot. Limitations are stated honestly.

Designed for everyday use. ÒreAyò is being built to be used by parents, elders, and families — people who should not need to understand AI to use it safely.

Tested with real-world scenarios. Evaluation includes how the system behaves in noisy environments, with mixed languages, and with diverse accents — not just clean test data.

No hidden behaviour. The system should behave the way people expect it to, without surprising actions or unexpected results.

Research ethics

During the research and development phase, ethical practices are especially important.

This includes:

Responsible data sourcing

Speech data used for research is ethically sourced, properly licensed, and handled with cultural sensitivity.

Transparent progress

Research updates are shared publicly without exposing sensitive details that could compromise privacy or safety.

No exploitation

Nigerian language data is not treated as a resource to extract. Communities deserve respect and should benefit from the systems built with their languages.

Long-term accountability

Decisions made during research affect future users. Ethics cannot be an afterthought.

Where we are now

ÒreAyò is not publicly available yet. No personal data, speech recordings, or user information is being collected from the public.

Research is focused on building the technical foundation — speech recognition, language understanding, and conversational intelligence for Nigerian languages.

When the system is ready for early testing, safety and privacy practices will be clearly communicated before anyone provides data.

This is deliberate. Safety cannot be rushed.

Questions about safety or privacy?

If you have concerns or questions about how ÒreAyò handles safety, privacy, or cultural respect, we want to hear from you.

Clear communication about these topics is part of building trust. Feedback helps make the system better.

Join the waitlist to stay updated →

Follow the journey

Join the waitlist to receive updates as safety and privacy practices are implemented, or follow research progress.

We share openly and honestly, without hype.