I would like to submit feedback regarding the current Fortnite Creative/UEFN map submission and moderation/sanction process, and to propose a few solutions. While there are many excellent features already in place, such as the appeal system and AI-driven moderation for lighter tasks, there are concerns that need to be addressed to ensure the system remains fair, unbiased, and transparent for all creators.
These suggestions are offered with the utmost positivity and in support of the community, reflecting key concerns shared by a variety of creators who are facing similar challenges or experiences within the creative space.
Current Concerns:
1. [Alleged] Bias in Human Moderation:
There have been instances where maps were rejected not due to any violation of the Fortnite Creative guidelines but [allegedly] due to subjective opinions or perceived bias on the part of human moderators. This includes cases where thumbnails that do not violate any rules have caused maps to be flagged, or instances where maps are mistakenly rejected without clear or reasonable explanations. These decisions create frustration for creators, particularly when there is no transparency about the reasons behind a rejection.
2. Lack of Transparency:
Creators often do not receive clear, specific feedback on why their maps were rejected. The lack of detailed explanations leaves room for confusion, as creators struggle to understand whether their maps were rejected for genuine guideline violations or due to a potentially biased or incorrect decision made by the moderator.
3. Inconsistency with Scam Maps:
Another issue is the apparent inconsistency in how scam maps are treated in the Discover tab. Some scam maps that clearly violate Fortnite’s guidelines continue to exist in Discover, while legitimate maps that don’t break any rules are rejected. This undermines the credibility of the moderation system and creates a sense of unfairness within the community.
4. Lack of Disclosure Regarding AI or Human Moderation:
There is currently no clear disclosure of whether a map is reviewed by AI or by a human moderator. If a map is reviewed by a human, providing the name of the moderator would be a helpful step towards accountability. This could foster trust in the process, as creators could feel more confident knowing that their maps are being reviewed fairly by an individual whose actions are transparent. Knowing which moderator made a decision could also help identify potential errors or inconsistencies and allow for further investigation if necessary.

There is also a broader reason to disclose whether a map is being reviewed by AI or a human. In some jurisdictions, AI-driven decisions in moderation practices must be disclosed to users. In the European Union, for instance, the General Data Protection Regulation (GDPR) requires that automated decisions be explained to individuals, especially when they have a legal or similarly significant effect on the user (Article 22 of the GDPR). While this law primarily addresses profiling and automated decision-making in the context of data processing, the same principles of transparency and accountability can be applied to the moderation process, ensuring that creators know whether AI or a human made the decision.
[I am not a lawyer and am not offering legal advice; I am simply referencing information that may be relevant to the discussion.]
Links to references:
- GDPR, Article 22 – Automated individual decision-making, including profiling
- European Commission's Guidelines on AI Transparency: https://ec.europa.eu/digital-strategy/our-policies/european-approach-artificial-intelligence_en
5. Longevity of Sanctions:
Currently, sanctions are applied without a clear end date, and some sanctions remain in place even after the issue that caused them has been resolved. This can create frustration for creators, especially when they have made efforts to correct any issues with their maps or content. There should be a defined “fall-off” date for sanctions, particularly when the original issues have been addressed.
A clear timeline for when sanctions will be lifted would help creators feel that they are being treated fairly, and it would encourage them to continue engaging with the platform without the fear of resolved penalties lingering indefinitely. This would also help reduce the potential for punitive measures to disproportionately affect creators who have taken steps to rectify their mistakes.
Lastly, the FAQ should publicly state which types of sanctions are held permanently on an account.
6. Concerns about the Terminology Regarding Unfounded Appeals:
It is known that repeatedly submitting unfounded appeals may result in serious consequences, including a permanent account ban. However, this terminology makes it difficult for creators to judge whether an appeal will be considered unfounded when they are not given proper context or specific reasons for the rejection. Without transparency or clear feedback on why a map was rejected, it is difficult for creators to assess the validity of their appeal. Additionally, this policy may be confusing for younger players within the Fortnite 1.0 space. Many younger creators may not fully understand the complexities of the moderation process and could misjudge whether their legitimate concerns will be treated as unfounded, leading to unnecessary confusion, discouragement, and unfair account bans. Providing more context around rejections and appeals would help prevent misunderstandings and ensure that all creators, regardless of age or experience, can navigate the process effectively.
These issues are not just minor inconveniences; they are causing significant frustration and, for some, even pushing creators to abandon their creative pursuits. We must ensure that the process is fair, transparent, and responsive, so creators don't feel their work is being dismissed for reasons that seem arbitrary or unclear. Sadly, right now many creators who abide by the rules are starving, not thriving. As a community, we have to ask: is this what it has all become? Is this the metaverse that was promised or envisioned?
If we allow these issues to persist, we risk losing talented creators who are passionate about the platform. It’s not just about fairness—it’s about protecting the heart of the Fortnite Creative/UEFN community. I believe that with these changes, we can create a system where every creator feels empowered to contribute and innovate without fear of bias or frustration.
Proposed Solutions:
1. Clear, Detailed Feedback for Map Rejections
When a map is rejected, whether through AI moderation or human moderation, creators should receive clear, specific feedback on the reason for the rejection. The feedback could include:
- A breakdown of why the map violated specific guidelines (e.g., thumbnail issues, map content, inappropriate elements).
- A reference to the specific guideline(s) that the map violated.
- A suggestion for how the map can be revised to meet the guidelines.
- Disclosure of whether the map was reviewed by AI or by a human moderator, including the moderator's name for human reviews.
Providing this information would allow creators to understand what went wrong and how to correct it, making the process feel more transparent and constructive.
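Purely as an illustration of what "clear, specific feedback" could look like in practice, here is a minimal sketch of a structured rejection notice. The field names, values, and the Python representation are my own assumptions for discussion purposes and do not reflect any existing Epic Games system or API.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical structure for a rejection notice. Every field name here is an
# illustrative assumption, not a real Epic Games / UEFN API.
@dataclass
class RejectionNotice:
    map_code: str                          # the island code that was reviewed
    violated_guidelines: List[str]         # references to the specific guideline section(s)
    explanation: str                       # plain-language breakdown of what was flagged
    suggested_fixes: List[str]             # concrete steps to bring the map into compliance
    reviewed_by_ai: bool                   # True if the decision was automated
    moderator_name: Optional[str] = None   # populated only when a human reviewed the map

# Example of the kind of notice a creator could receive (placeholder values):
notice = RejectionNotice(
    map_code="0000-0000-0000",
    violated_guidelines=["Thumbnail policy (hypothetical section reference)"],
    explanation="The thumbnail was flagged as not representing actual gameplay.",
    suggested_fixes=["Replace the thumbnail with an in-game screenshot of real gameplay."],
    reviewed_by_ai=False,
    moderator_name="(moderator identifier)",
)
```

Even a simple structured notice like this would answer the three questions creators keep asking: what was violated, how to fix it, and who (or what) made the call.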
2. Audit System for Human Moderation
To ensure fairness and consistency, and to prevent [alleged] bias or profiling tactics, an audit system should be implemented for human moderation. This system would log all map rejections and conduct periodic reviews to ensure that decisions are made based on objective criteria. The audit process could involve random checks, or could require rejected maps to be re-reviewed by a different set of moderators, to reduce the risk of bias. Additionally, creators should be informed that their rejections may undergo such reviews, to ensure transparency.
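Purely as a sketch of the idea, and not a claim about how Epic's internal tooling works or should work, the core of such an audit system could be as simple as logging every rejection and randomly sampling a portion of them for independent re-review. All names and values below are hypothetical.

```python
import random
from dataclasses import dataclass
from typing import List

# Hypothetical audit-log entry; field names are illustrative assumptions only.
@dataclass
class RejectionRecord:
    map_code: str
    moderator_id: str
    guideline_cited: str
    timestamp: str

def sample_for_reaudit(records: List[RejectionRecord], rate: float = 0.1) -> List[RejectionRecord]:
    """Randomly select a share of logged rejections for re-review by a different moderator."""
    if not records:
        return []
    sample_size = max(1, int(len(records) * rate))
    return random.sample(records, sample_size)

# Example: every rejection is logged, and a random sample is queued for
# independent re-review so no single moderator's decisions go unchecked.
log = [
    RejectionRecord("0000-0000-0001", "moderator_a", "Thumbnail policy", "2025-01-01T12:00:00Z"),
    RejectionRecord("0000-0000-0002", "moderator_b", "IP guidelines", "2025-01-02T09:30:00Z"),
]
reaudit_queue = sample_for_reaudit(log)
```

The exact sampling rate and review workflow would of course be Epic's to decide; the point is simply that rejections are recorded and a second, independent set of eyes checks a portion of them.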
3. Regular Monitoring of the Discover Tab
Scam maps that clearly violate Fortnite’s guidelines should be flagged and removed from the Discover tab, regardless of their popularity. Regular checks and enforcement will prevent the perception that scam maps are allowed to thrive while legitimate creators face penalties for minor issues. This would help to restore trust in the moderation system and ensure that all creators are treated equally.
4. Protection for Smaller Creators
Smaller creators often face additional challenges, such as false reports or having their maps unfairly rejected. With the current system allowing for [alleged] biases or inconsistencies, it becomes even harder for them to succeed. By improving transparency and accountability, Epic Games can create a more supportive environment for these creators, ensuring that they are not discouraged by arbitrary rejections or a system that feels stacked against them.
By implementing these changes, the Fortnite Creative community will feel that their creations are treated fairly and without bias. Transparency, clear communication, and accountability will go a long way in ensuring that map creators trust the moderation system and are empowered to continue innovating without fear of arbitrary or unfair rejection.
Epic Games has always prided itself on creating an inclusive, thriving community of creators. By addressing these concerns, we can strengthen that community, ensuring that both new and veteran creators feel valued and heard. This will foster trust and encourage even more talent to flourish within the Fortnite Creative space.
As stated above, these suggestions are offered with the utmost positivity and in support of the community, reflecting key concerns shared by a variety of creators who are facing similar challenges or experiences within the creative space.
Thank you for taking the time to read these suggestions. If implemented, these changes have the potential to greatly enhance the Fortnite Creative/UEFN moderation and sanction experience for both map creators and players alike.