Newsroom

We are making some noise!

This Is to Prevent AI from Treating You Unfairly

leiwand.ai was interviewed by FutureZone about the Algorithmic Bias Risk Radar (ABRRA), our first in-house technology. With it, leiwand.ai will be able to identify potential adverse effects of AI systems early in their development, procurement, and certification processes. (In German)

What remains is AI

Artificial intelligence makes it possible to have conversations with deceased people through apps. In an interview with the magazine "Woman", leiwand.ai CEO Gertraud Leimüller talked about these apps and what they are capable of - and what they are not. (In German)

AI Reinforces Stereotypical Images of Women

Data sets carry social prejudices into artificial intelligence systems, which can lead to discrimination. For this article, leiwand.ai CEO Gertraud Leimüller spoke about how bias in AI can affect women's career chances. (In German)

Leiwand.ai: "Fairness in AI affects everyone, not just women and minorities"

We cannot expect algorithms developed and trained on historical data, with all its biases, to make fair decisions. That's why, at leiwand.ai, we have made it our goal to make AI trustworthy. Check out the first article about our company, which delves into our work and mission. (In German)

Press Kit

Are you a media representative or partner who needs our brand material? The press kit contains all relevant downloads for the leiwand.ai brand.

Media Contact:

Patrick Kosmider - Communications Manager

patrick.kosmider@leiwand.ai or contact@leiwand.ai

Linke Wienzeile 42

1060 Vienna