AI Detection Bypass: Uncovering the Only Method That Works! I Tried Them All!

Andy Stapleton
22 May 2023 · 10:47

TLDR: The video discusses the challenges of detecting AI-generated content and the methods people use to avoid plagiarism and AI detection. It highlights the effectiveness of an unnamed bypass tool in lowering AI detection scores and plagiarism rates. The speaker emphasizes the importance of original writing and of using AI responsibly as an editing tool, not a content generator, and suggests transparency in academic work regarding AI assistance.


  • 🚫 AI-generated content can be detected, and current AI detection tools are effective at identifying such content.
  • 🔍 Plagiarism and originality detection tools like Unicheck and Originality.ai are used to assess the uniqueness of texts.
  • 💡 Retaining domain-specific details and using synonyms do not effectively reduce AI detection rates.
  • 🎨 Changing the tone of AI-generated content, such as mimicking Albert Einstein's style, does not prevent detection.
  • 📝 Manual paraphrasing of AI-generated text can reduce, but not eliminate, the AI detection score.
  • 🔄 Resequencing information does not help in avoiding AI detection.
  • 📈 Adding more details to the AI prompt does not influence AI detection outcomes.
  • 💬 Increasing perplexity and burstiness in AI-generated text does not significantly impact AI detection.
  • 🛠️ The only currently effective method the speaker found was a dedicated bypass tool.
  • 📋 Disclosing the use of AI tools like GPT-4 in academic papers is encouraged as part of a responsible academic workflow.
  • 🚀 The landscape of AI and its role in academia is rapidly evolving, and best practices are continually being shaped.

Q & A

  • What is the main concern regarding the use of AI tools for generating academic content?

    -The main concern is that AI-generated content can be detected, leading to issues with plagiarism and originality. The tools are becoming increasingly sophisticated at identifying such content, making it risky to use them for academic writing without proper attribution or editing.

  • How does the script's author demonstrate the effectiveness of AI detection tools?

    -The author demonstrates this by using two different AI detection tools, Unicheck and Originality.ai, to analyze AI-generated text. The results show high AI detection scores, indicating that the tools are effective at identifying content generated by AI.

  • What strategies have been tried to evade AI detection?

    -The strategies tried include using synonyms, retaining domain-specific details, changing the tone of the content, using paraphrasing tools like Quillbot, manual paraphrasing, resequencing the information, and increasing perplexity and burstiness in the response.

  • Why did changing the tone of the AI-generated content to that of Albert Einstein not work?

    -It did not work because, despite the change in tone, both Unicheck and Originality.ai still detected the content as 100% AI-generated. The attempt was also impractical, as the content's purpose was not aligned with Einstein's writing style or expertise.

  • What was the outcome of using paraphrasing tools like Quillbot on AI-generated content?

    -The outcome was not entirely positive. While the paraphrasing reduced the plagiarism score to zero, the AI detection remained at 100%, indicating that the core AI-generated nature of the content was not effectively masked.

  • What did the author find as the most effective method to avoid AI detection?

    -The author found that an unnamed bypass tool was the most effective method. It managed to reduce the AI detection score to 29% and the plagiarism score to 2.19%, a significant improvement compared with the other methods tested.

  • What is the author's advice for using AI tools in academic writing?

    -The author advises against relying on AI for content generation and instead suggests using AI as a tool for editing, checking for inconsistencies, and making the writing more concise. They emphasize the importance of original writing and proper attribution when using AI.

  • How might the use of AI in academic writing be disclosed in the future?

    -The author suggests that we may see more disclosures in academic papers about the use of AI tools, similar to conflict of interest statements. This would involve acknowledging the use of AI and how it contributed to the work, including any limitations or potential issues it introduced.

  • What resources does the author offer for improving academic writing?

    -The author offers two ebooks, 'The Ultimate Academic Writing Toolkit' and 'The PhD Survival Guide', through their website. They also provide a blog, a forum, and a resource pack for those applying for PhD or grad school.

  • How can one stay updated with the author's insights and resources?

    -By signing up for the author's newsletter, one can receive exclusive content, including tools used by the author, podcast appearances, tips on writing perfect abstracts, and more.

  • What is the overall message the author conveys about AI and academic integrity?

    -The author emphasizes that while AI tools are powerful and can be helpful for certain tasks, they should not be used to generate original academic content. Instead, they should be used responsibly as assistive tools to enhance and refine one's own writing, with transparency and integrity in academic work.



🚨 Challenges in AI Detection and Plagiarism

The paragraph discusses the challenges of using AI tools like ChatGPT to generate content for academic purposes. It highlights the risks of AI-generated content detection and plagiarism, emphasizing that current AI detection tools are effective at identifying such content. The speaker shares their experience with various methods to evade AI detection, including using synonyms, retaining domain-specific details, changing tone, paraphrasing, and resequencing, but these attempts mostly proved unsuccessful. The paragraph underscores the importance of original writing and the potential of AI as an editing tool rather than a content generator.


πŸ” Exploring Solutions to Bypass AI Detection

This paragraph delves into the exploration of solutions to bypass AI detection in content generation. The speaker describes their experimentation with different approaches, such as increasing perplexity and burstiness to make the language less robotic. However, most attempts were unsuccessful. The paragraph reveals that the only effective method found was an unnamed bypass tool, which managed to significantly reduce both AI detection and plagiarism scores. The speaker warns against relying solely on AI for content generation and emphasizes the value of original work, suggesting that AI should be used as an assistive tool rather than a primary content creator.


πŸ“š Resources and Future of AI in Academia

The final paragraph shifts focus to academic resources and the future role of AI in academia. The speaker promotes their project, which offers ebooks, a blog, and resources for PhD and grad school applicants. They also encourage signing up for their newsletter for exclusive content and insights. The paragraph concludes with a reflection on the rapid evolution of AI tools and their potential impact on academic writing and research, suggesting that transparent disclosure of AI usage in academic work may become more common.



💡AI tools

AI tools refer to artificial intelligence software programs designed to perform tasks that typically require human intelligence. In the context of the video, AI tools are used for text generation, such as creating literature reviews or academic papers. The video discusses the capabilities and limitations of these tools, particularly in relation to AI detection and plagiarism.


💡Plagiarism

Plagiarism is the act of using someone else's words, ideas, or work without proper attribution or permission, thereby presenting it as one's own. In the video, the concern is about AI-generated content being detected as plagiarized due to its similarity to existing texts. The speaker uses tools to check for plagiarism to ensure originality in academic writing.

💡AI detection

AI detection refers to the process of identifying content that has been generated by artificial intelligence rather than by a human. The video discusses various methods people use to try to avoid AI detection and how these methods are often unsuccessful against advanced detection tools.


💡Synonyms

Synonyms are words or phrases that have similar meanings. In the context of the video, using synonyms is suggested as a method to evade AI detection by changing the words in the AI-generated text to different ones with the same meaning, in an attempt to make the content appear more original.
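To make the idea concrete, here is a minimal Python sketch of the naive synonym-substitution approach the video describes (and finds ineffective); the replacement table and example sentence are toy assumptions, not a real thesaurus API:

```python
# A toy synonym table; a real attempt would use a thesaurus,
# but the principle is the same word-for-word swap.
SYNONYMS = {
    "utilize": "use",
    "demonstrate": "show",
    "significant": "notable",
}

def swap_synonyms(text: str) -> str:
    """Replace each word that has an entry in SYNONYMS, leaving
    everything else (including word order and structure) intact."""
    words = text.split()
    return " ".join(SYNONYMS.get(w.lower(), w) for w in words)

print(swap_synonyms("We utilize data to demonstrate significant gains"))
# → "We use data to show notable gains"
```

Because the sentence structure and information flow are untouched, detectors that model phrasing patterns rather than individual word choices still flag the output, which matches the video's finding.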


💡Tone

Tone refers to the attitude or emotion conveyed in a piece of writing. The video discusses changing the tone of AI-generated content, such as mimicking the tone of Albert Einstein, as a strategy to confuse AI detection tools. However, this method was found to be ineffective.

💡Paraphrasing tools

Paraphrasing tools are software applications that assist in rewording or rephrasing text to create a new version that conveys the same meaning with different wording. The video explores the use of tools like Quillbot to alter AI-generated content and avoid detection.

💡Manual paraphrasing

Manual paraphrasing involves an individual taking the original text and rewriting it in their own words while maintaining the original meaning. The video suggests that this method might be more effective than automated tools in avoiding AI detection.


💡Resequencing

Resequencing is the process of rearranging the order of sentences or paragraphs in a text. The video discusses trying this method to alter the structure of the AI-generated content in hopes of avoiding AI detection.

💡Perplexity and burstiness

Perplexity and burstiness are linguistic terms referring to the complexity and variability in language use. In the context of the video, increasing these elements in AI-generated text is suggested as a way to make it sound less robotic and potentially avoid AI detection.
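To illustrate burstiness, here is a rough Python sketch of one common proxy, the variation in sentence length; the metric and the example sentences are illustrative assumptions, not the actual internals of any detector:

```python
import re
import statistics

def burstiness(text: str) -> float:
    """Proxy for 'burstiness': how much sentence lengths vary.
    Human prose tends to mix short and long sentences; uniform
    lengths read as more machine-like."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    # Coefficient of variation: std dev relative to the mean length.
    return statistics.stdev(lengths) / statistics.fmean(lengths)

uniform = "The cell absorbs light. The charge then moves. The current then flows."
varied = ("Light hits the cell. Excitons split at the donor-acceptor interface, "
          "and the freed charges drift toward opposite electrodes. Current flows.")

print(burstiness(uniform) < burstiness(varied))  # varied prose scores higher
```

A score near zero means every sentence is about the same length; as the video notes, deliberately inflating this kind of variation in AI output did not meaningfully change detection results.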

💡The bypass tool

An unnamed tool is mentioned in the video that claims to help users get around AI detection. It is presented as the only effective method tested by the speaker, reducing the AI detection score significantly.

💡Academic integrity

Academic integrity refers to the ethical principles and standards that scholars must follow in their academic work, including honest and accurate representation of sources and original work. The video emphasizes the importance of writing original content and using AI tools ethically, in line with academic integrity.


AI tools are becoming more powerful, leading to temptation for misuse in generating academic content.

AI-generated content can be detected, and attempts to circumvent detection are mostly unsuccessful.

ChatGPT was used to generate content on organic photovoltaic devices, showing a high AI detection rate.

Plagiarism and originality checks were conducted using Unicheck and Originality.ai.

Using synonyms and domain-specific details does not effectively reduce AI detection or plagiarism scores.

Changing the tone of AI-generated content, such as mimicking Albert Einstein, does not evade AI detection.

Paraphrasing tools like Quillbot and manual paraphrasing still result in high AI detection rates.

Resequencing information and adding more details to the prompt do not bypass AI detection.

Increasing perplexity and burstiness in AI-generated text slightly reduces AI detection but is not very effective.

The use of AI as an editing tool, rather than a content generator, is a recommended practice.

The academic community may see more transparency regarding AI usage in research, with explicit statements in papers.

The unnamed bypass tool is currently the only method found to effectively bypass AI detection.

It is crucial to generate original content and use AI responsibly to aid in editing and refining work.

The landscape of AI tools and their impact on academia is rapidly evolving.

Resources like the Ultimate Academic Writing Toolkit and the PhD Survival Guide are available for academic support.

Ethical AI usage matters in academic writing, and the standards governing it will continue to evolve.